I'm currently planning to purchase a server for our MySQL database. I'm deciding whether to get two quad-core Intel Core 2 (Core 2 Quad) processors or two dual-core Xeons. Any feedback would be appreciated.
Hi All,
Could anyone please let me know where I can download free video learning materials on Oracle Database Administration for 9i or higher?
Awaiting your comments
Regards,
Srijith
Hi,
I'm configuring Postfix to use an external SMTP relay. For this, I used the following tutorial: http://carlton.oriley.net/blog/?p=31
After following it, I found in the logs that /etc/postfix/sasl_passwd.db couldn't be read; the file didn't exist. I ran postmap hash:/etc/postfix/sasl_passwd (per http://postfix.state-of-mind.de/patrick.koetter/smtpauth/smtp_auth_mailservers.html) as root, but I get:
postmap: fatal: open database /etc/postfix/sasl_passwd.db: Permission denied
Why do I get this?
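For reference, the sequence I understand the docs to intend, run as root (the relay name and credentials are placeholders, and the chmod step is my own assumption about sane permissions), which is why the permission error surprises me:

# create the plain-text credentials file, then compile it to a .db map
echo "smtp.example.com username:password" > /etc/postfix/sasl_passwd
postmap hash:/etc/postfix/sasl_passwd
chmod 600 /etc/postfix/sasl_passwd /etc/postfix/sasl_passwd.db
postfix reload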
Thanks.
Folks,
Apologies for asking what might be a very simple question, but can I back up a database and ignore the database's FTI catalog?
These backups will be restored to a different server, and the FTI catalog isn't necessary.
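In case it helps frame the question: my understanding is that in SQL Server 2005 a full-text catalog lives in its own filegroup, so I imagine something like a filegroup-level backup that skips it, roughly (database, filegroup, and path names are hypothetical):

sqlcmd -Q "BACKUP DATABASE MyDb FILEGROUP = 'PRIMARY' TO DISK = 'D:\backups\MyDb_primary.bak'"

But I don't know whether such a backup restores cleanly on the other server without the catalog's filegroup.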
Thanks
BW
I have a database server with many poorly written queries that cause the SQL Server CPU to spike and then drop constantly (a massive rewrite from scratch is underway). I need to know whether letting the VM's CPU allocation expand as needed is best practice for a case like this. I am wondering whether the ESXi platform can expand as fast as the spikes happen.
In short: what is best practice for VM CPU allocation on a SQL Server with horribly written queries?
Currently, every day before I start work, I complete the following procedure:
ssh to the production server
gzip our daily database dump file
scp the gzipped dump file over to my computer
gunzip the dump file
dropdb mydatabase
createdb mydatabase
psql mydatabase < dump.sql
Is it possible (I'm sure it is) to automate this process on Mac OS X, so that it is done by the time I get to work in the morning? If so, what is the quickest and easiest way?
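A minimal sketch of how I imagine scripting it (host name and dump path are placeholders; assumes passwordless SSH keys are already set up):

#!/bin/sh
# refresh the local dev database from the nightly production dump
ssh prod.example.com 'gzip -f /var/backups/dump.sql'    # compress on the server
scp prod.example.com:/var/backups/dump.sql.gz .
gunzip -f dump.sql.gz
dropdb mydatabase
createdb mydatabase
psql mydatabase < dump.sql

Scheduling it would then be one crontab line, e.g. 0 6 * * * $HOME/bin/refresh_db.sh (launchd is the more Mac-native option, but cron also works on OS X).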
For Oracle 9i and 10g (database), there used to be a really helpful script you could run to validate that your environment was correct before you did the install.
Does anyone know whether such a thing exists for 11g, or is the validation finally done by the OUI? If something similar does still exist, can someone please point me in the right direction?
I'm trying to do a nightly backup of a SQL Server 2005 database to get the leanest, smallest file that includes all the table data, stored procedures, views, and functions.
I'm downloading and saving this daily in case the server isn't available and I need to just get some data out. I don't care about the logs or any other parts of the DB; only the data, procs, tables, and functions matter.
What's the easiest way to get just this information backed up?
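For what it's worth, the plainest thing I know is a full backup, which in 2005 has no compression option, so the file is only as lean as the data itself (database name and path are hypothetical):

sqlcmd -Q "BACKUP DATABASE MyDb TO DISK = 'D:\backups\MyDb.bak' WITH INIT"

With the database in the SIMPLE recovery model the log stays small and the .bak is essentially data plus object definitions; gzipping the file afterwards is the usual way to shrink it further. I'm hoping there is something leaner than this.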
Hi all,
I am trying to connect to a MySQL database on Amazon through a PHP script, but I am shown this error:
Warning: mysql_connect() [function.mysql-connect]: Lost connection to MySQL server at 'reading initial communication packet', system error: 111
I have searched around and tried the following things (shown concretely below):
In "/etc/mysql/my.cnf" I commented out the bind-address = 127.0.0.1 line to allow access from all hosts.
Checked /etc/hosts.allow and /etc/hosts.deny and made sure there are no rules present that might refuse the connection.
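Concretely, the my.cnf change, plus a connectivity probe that might help narrow things down (the host name is a placeholder):

# /etc/mysql/my.cnf: listen on all interfaces instead of only loopback
# bind-address = 127.0.0.1

# restart MySQL, then probe port 3306 from the web host
sudo /etc/init.d/mysql restart
nc -zv db.example.com 3306

Since this is on Amazon, I also wonder whether the security group needs port 3306 opened; I haven't ruled that out.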
But still no luck. Please suggest anything else I could try.
Thanks,
Madhup
Can someone recommend a Linux flavor to install an Oracle database on? I have no preference for any particular flavor.
Is there a flavor that is easier than the others, or that has fewer issues?
I have a Postgres database on one server and I need to access it from another server.
I need to edit the pg_hba.conf file, but have no idea what the steps are to do so. The server runs CentOS.
I need to add the following line to the file:
host all all 10.0.2.12 255.255.255.255 trust
I located the file in /var/lib/pgsql/data/.
Basically, I'm not sure what the correct steps are; my rough understanding is sketched below.
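A minimal sketch of what I believe the procedure is (the reload command may differ by PostgreSQL version and init system; treat it as an assumption):

sudo -u postgres vi /var/lib/pgsql/data/pg_hba.conf
# append the line: host all all 10.0.2.12 255.255.255.255 trust
sudo service postgresql reload    # re-read pg_hba.conf without a full restart

I'd also expect listen_addresses in postgresql.conf to need changing from 'localhost' before remote connections work at all.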
I'm managing a small website which allows people to register as members. Previously I used a Google form to manage member registration, but now that the number of users has become quite big I am switching to MySQL. Currently I have around 500 members, saved in a Google spreadsheet. How can I do a bulk import from the Google spreadsheet into a MySQL table? BTW, I'm using phpMyAdmin, so a phpMyAdmin solution is preferable :)
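In case it clarifies what I'm after: I assume I'd first export the sheet as CSV (File > Download in Google Sheets) and then import that. phpMyAdmin has a CSV option on its Import tab; the command-line equivalent would be roughly this (table and column layout hypothetical, header row skipped):

mysql --local-infile=1 -u root -p mydb -e \
  "LOAD DATA LOCAL INFILE 'members.csv' INTO TABLE members \
   FIELDS TERMINATED BY ',' ENCLOSED BY '\"' IGNORE 1 LINES"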
Thanks.
I'm db_owner on certain databases (my account has only the public role at the server level, but the db_owner role on those databases). Now, when I try to add logins from the server logins, I can only see the sa account and my own account. How can I add users to my databases from the server logins?
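For context, what I expected to be able to run is something like this (login name hypothetical); I suspect the catch is that merely seeing other server logins requires a server-level permission, not just db_owner:

sqlcmd -d MyDb -Q "CREATE USER [someLogin] FOR LOGIN [someLogin]"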
My wife uses an application which stores persistent information in files that were created by FileMaker v6. The files recently became corrupted, and I'm trying to find a way to recover whatever I can.
Generally, I'm looking for any tool that would let me programmatically query the database files without paying for the full version of FileMaker. If anyone knows of a tool that can do this, please let me know!
I want to downgrade my database, with all its views and stored procedures, from SQL Server 2005 to SQL Server 2000, but the problem is that the views are not restored properly.
We have a database on a SQL Server 2005 server that is set to the Simple recovery model. The log file starts at 1 MB and is set to grow by 10% when it needs to.
We keep running into an issue where the transaction log fills up and we need to shrink it. What could cause the transaction log to fill up when the database is in Simple recovery and unrestricted growth is allowed?
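A guess worth checking: even in the Simple model the log can only be truncated back to the oldest open transaction, so a single long-running transaction can keep the log growing. Two quick probes (database name hypothetical):

sqlcmd -Q "DBCC OPENTRAN('MyDb')"     # report the oldest active transaction, if any
sqlcmd -Q "DBCC SQLPERF(LOGSPACE)"    # show log size and percent used per database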
Right now I am able to do the backup using mysqldump, but I have to take down the web server, and the backup takes around 5 minutes. If I don't take down the web server, it takes forever and never finishes, and the website becomes inaccessible during the backup.
Is there a quicker/better way to backup my 22 GB and growing database?
All the tables are MyISAM.
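One direction I'm considering (an assumption on my part, not something I've tested): since the tables are all MyISAM, a file-level copy via mysqlhotcopy should avoid the long table scans that mysqldump does, roughly:

# MyISAM only: locks tables briefly, then copies the .frm/.MYD/.MYI files
mysqlhotcopy --user=root --password=secret mydb /backups/mydb

The other common approach I've read about is dumping from a replication slave so the master is never locked at all.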
Is there a way to export the database schema of a MS SQL Server 2000 database as well-formed XML? I'm looking for just the structure, not the data, and the more detailed the better. The XML may be used in a migration process. I'm more familiar with MySQL than with SQL Server, so please be detailed if you have time.
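To show the level of detail I mean, I was picturing something in the spirit of this (osql is the SQL 2000 command-line tool; this is only a sketch, and I don't know how well FOR XML output suits a migration):

osql -E -d MyDb -Q "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE, IS_NULLABLE FROM INFORMATION_SCHEMA.COLUMNS FOR XML AUTO"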
Thanks
Hi, my question is:
I tried to get the data from the database using:
String SQL_QUERY = "from dat_personal_info";
Query query = session.createQuery(SQL_QUERY);
for (Iterator it = query.iterate(); it.hasNext();) {
    Object[] rowObject = (Object[]) it.next();   // this cast is where it fails
}
but this error occurs:
Hibernate: select dat_person0_.row_id as col_0_0_ from dat_personal_info dat_person0_
java.lang.ClassCastException: bn.com.server.database.maptables.dat_personal_info$$EnhancerByCGLIB$$e1ffd36e cannot be cast to [Ljava.lang.Object;
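If I read the trace right, an HQL query of the form "from Entity" returns the entity instances themselves, not Object[] rows, so I suspect the loop should be (entity class name taken from the stack trace):

for (Iterator it = query.iterate(); it.hasNext();) {
    dat_personal_info row = (dat_personal_info) it.next();
    // read the fields via the entity's getters here
}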
Please reply if anyone can confirm this or point out what I'm doing wrong.
Thanks & Regards
Sudhir gudhe
When I try to create a new database connection, the Data Sources (ODBC) program hangs or takes a very long time to find the list of available SQL Servers.
This only happens when there are other computers on the network; when my machine (a standard Windows 7 laptop) is alone, it works just fine.
My question is: what should I look for in terms of SQL Server or ODBC configuration that will take away this random behaviour?
We have large multi-gigabyte data sets on which we run very complex queries, for example
{
$or: [ { id: 30000001, ... }, { id: 30000005, ... }, ..., { id: 30001005, ... } ]
}
It seems that CPU is actually a bottleneck at this point, so it would be advantageous to be able to run multiple mongod instances on the same set of database files.
We've considered using replica sets to this end, but would prefer to not require the extra disk space simply for CPU reasons.
I have two variables, for example x and y.
I have also made a model with three fields (longitude, latitude, name) and synced it to the MySQL database.
I need to send these two variables (x, y) to the Django server so as to search whether there is an object with longitude=x and latitude=y. If there is one, I want to get back its name.
How can I do this?
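The lookup itself, as far as I understand the ORM, would be something like this, run here through manage.py's shell just for illustration (the app and model names are hypothetical):

echo 'from places.models import Place
qs = Place.objects.filter(longitude=1.23, latitude=4.56)
if qs:
    print qs[0].name' | python manage.py shell

What I'm unsure about is the right way to get x and y from the client to a view in the first place.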
Attempting to pull my database from Heroku gives an error partway through the process (below).
Using: Snow Leopard; heroku-1.8.2; taps-0.2.26; rails-2.3.5; mysql-5.1.42. The database is smallish, as you can see from the error message.
Heroku tech support says it's a problem on my system, but offers nothing in the way of how to solve it.
I've seen the issue reported before - for example here. How can I get around this problem?
The error:
$ heroku db:pull
Auto-detected local database: mysql://[...]@localhost/[...]?encoding=utf8
Receiving schema
Receiving data
17 tables, 9,609 records
[...]
/Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:166:in `query': Mysql::Error MySQL server has gone away (Sequel::DatabaseError)
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:166:in `_execute'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:125:in `execute'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/connection_pool.rb:101:in `hold'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/database.rb:461:in `synchronize'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:125:in `execute'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/database.rb:296:in `execute_dui'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/dataset.rb:276:in `execute_dui'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:365:in `execute_dui'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/dataset/convenience.rb:126:in `import'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/dataset/convenience.rb:126:in `each'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/dataset/convenience.rb:126:in `import'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:144:in `transaction'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/connection_pool.rb:108:in `hold'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/database.rb:461:in `synchronize'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/adapters/mysql.rb:138:in `transaction'
from /Library/Ruby/Gems/1.8/gems/sequel-3.0.0/lib/sequel/dataset/convenience.rb:126:in `import'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:211:in `cmd_receive_data'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:203:in `loop'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:203:in `cmd_receive_data'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:196:in `each'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:196:in `cmd_receive_data'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:175:in `cmd_receive'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/commands/db.rb:17:in `pull'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/commands/db.rb:119:in `taps_client'
from /Library/Ruby/Gems/1.8/gems/taps-0.2.26/lib/taps/client_session.rb:21:in `start'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/commands/db.rb:115:in `taps_client'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/commands/db.rb:16:in `pull'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/command.rb:45:in `send'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/command.rb:45:in `run_internal'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/../lib/heroku/command.rb:17:in `run'
from /Library/Ruby/Gems/1.8/gems/heroku-1.8.2/bin/heroku:14
from /usr/bin/heroku:19:in `load'
from /usr/bin/heroku:19
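One avenue I intend to try, since "MySQL server has gone away" during bulk inserts is classically a packet-size limit on the local MySQL rather than anything on Heroku's side (an assumption, not a confirmed fix): raise max_allowed_packet in my.cnf and restart mysqld.

# /etc/my.cnf (the file's location varies between OS X MySQL installs)
[mysqld]
max_allowed_packet = 64M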