Search Results

Search found 31356 results on 1255 pages for 'database backups'.


  • MongoDB vs Ehcache caching advice for speeding up a read-only MySQL database

    - by paddydub
    I'm building a route planner webapp using Spring/Hibernate/Tomcat and a MySQL database. The database contains read-only data that is never updated, such as bus stop coordinates and bus times. I'm trying to make the app run faster: each time a route is calculated, the application performs approximately 1000 reads against the database. I have set up Ehcache, which greatly improves read times. I'm now setting up Terracotta + Ehcache distributed caching to share the cache across multiple Tomcat JVMs, but this seems a bit complicated. I've tried memcached, but it was not as fast as Ehcache. I'm wondering if MongoDB would be better suited. I have no experience with NoSQL, but I would appreciate any ideas. All I need is quick access to the read-only database.

    Read the article

  • Cannot log in to Postgres database despite setting password for user 'postgres'

    - by Serg
    I'm trying to use pgAdmin III to manage my Postgres database. Here are the commands I've run on my machine: sudo apt-get install postgresql Then I installed the pgAdmin III application: sudo apt-get install pgadmin3 Next I focused on setting my username and password in order to log in: sudo -u postgres psql postgres Here I set my password: \password postgres Finally I created my database: sudo -u postgres createdb repairsdatabase When I try to log in using pgAdmin III, I get the error: An error has occurred: Error connecting to the server: FATAL: Peer authentication failed for user "postgres"
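
    A likely cause, judging from the error text alone: pgAdmin is connecting through the local Unix socket, whose default pg_hba.conf entry uses peer authentication and ignores the password entirely. A minimal sketch of the usual workaround, assuming the role is named postgres: set a password in SQL, then register the server in pgAdmin with host 127.0.0.1, which the default pg_hba.conf rules typically authenticate by password (md5) rather than peer.

        -- Run inside psql as the postgres OS user; the password value is a placeholder.
        ALTER ROLE postgres WITH PASSWORD 'choose-a-strong-password';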

    Read the article

  • Export ASPNETDB data to another database

    - by raziiq
    Hi there. I am developing in Visual Web Developer 2008. I have SQL Express 2005 and SQL Management Studio 2008 installed on my PC. I purchased an MS SQL 2008 database on DiscountASP.net. The host provides only 1 database, but my project has 2 databases: one is ASPNETDB, which contains the roles and users etc. (created using the Website Configuration Wizard), and the other is my database containing my website's data, named MainDB. Since the host allows only 1 database, I exported ASPNETDB's tables and stored procedures to MainDB using aspnet_regsql.exe. The problem is that the stored procedures and tables are exported to MainDB but the data is not; I mean there are no users in the tables. My question is how to export everything from ASPNETDB, including stored procedures, tables and data, to my MainDB?
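
    aspnet_regsql.exe creates the schema and stored procedures but does not copy rows. A hedged sketch of one way to move the data, assuming both databases are attached to the same local instance and the destination aspnet_* tables are still empty (only a few tables are shown; copy them in dependency order):

        INSERT INTO MainDB.dbo.aspnet_Applications SELECT * FROM ASPNETDB.dbo.aspnet_Applications;
        INSERT INTO MainDB.dbo.aspnet_Users        SELECT * FROM ASPNETDB.dbo.aspnet_Users;
        INSERT INTO MainDB.dbo.aspnet_Membership   SELECT * FROM ASPNETDB.dbo.aspnet_Membership;
        INSERT INTO MainDB.dbo.aspnet_Roles        SELECT * FROM ASPNETDB.dbo.aspnet_Roles;
        INSERT INTO MainDB.dbo.aspnet_UsersInRoles SELECT * FROM ASPNETDB.dbo.aspnet_UsersInRoles;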

    Read the article

  • SQL Server 2008 Copy Database Wizard: Fail

    - by Nai
    I am trying to use the SQL Server 2008 Copy Database Wizard to copy a SQL Server 2008 database, using the SQL Management Object method. However, the copy fails with the following error: ERROR : errorCode=-1073548784 description=Executing the query "/* '==============================================..." failed with the following error: "Cannot use a CONTAINS or FREETEXT predicate on table or indexed view 'Product' because it is not full-text indexed." Any ideas on how I can proceed with this would be super helpful. Kind regards, Nai
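
    The failing step appears to be a scripted object that uses CONTAINS/FREETEXT, run against the destination copy of 'Product' before any full-text index exists there. One workaround is to create the full-text catalog and index on the destination by hand and then re-run the transfer; a minimal sketch, where the indexed column and key-index names are assumptions about the real schema:

        CREATE FULLTEXT CATALOG ProductCatalog AS DEFAULT;
        CREATE FULLTEXT INDEX ON dbo.Product (Name)
            KEY INDEX PK_Product           -- must be a unique, single-column, non-nullable index
            WITH CHANGE_TRACKING AUTO;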

    Read the article

  • Google Gears - Database - VACUUM

    - by Sirber
    With this code: var db = google.gears.factory.create('beta.database'); db.open('cominar'); db.execute('CREATE TABLE IF NOT EXISTS Ajax (AJAX_ID INTEGER PRIMARY KEY AUTOINCREMENT , MODULE TEXT, FUNCTION TEXT, CONTENT_JSON TEXT);'); db.execute('VACUUM;'); // cleans the DB I'm trying to clean (VACUUM) the database at each initialisation, but I get this error: Uncaught Error: Database operation failed. ERROR: authorization denied DETAILS: not authorized The database was created by me (the same page). Thank you!

    Read the article

  • How do I continuously update data on an ASP page?

    - by Lori
    Hi, I have an ASP page based on a very simple database. It references a single table of probably 30 records and maybe 12 data fields, and everything works great, since I only upload a new database every week or so. I now have a special circumstance where I would like to upload new data to the database and have it display automatically on the page every 20 to 30 seconds, without the user having to refresh their screen. I would expect up to 1000 concurrent users accessing the data. I have been manually uploading the database via FTP, which will obviously not work on this timeline and would also run the risk of error pages while the database is being replaced. So, can anyone point me in the right direction to set up this scenario? Other details that might be helpful: The database is an Access database (but I could change to another format if needed), running on a Windows platform hosted by an ISP, not my own server. Thanks in advance for any help on this! Lori

    Read the article

  • How would the conversion of a custom CMS using a text-file-based database to Drupal be tackled?

    - by James Morris
    Just today I've started using Drupal for a site I'm designing/developing. For my own site http://jwm-art.net I wrote a user-unfriendly CMS in PHP. My brief experience with Drupal is making me want to convert from the CMS I wrote. A CMS whose sole method (other than comments) of publishing content is to log in via SSH and use NANO to create a plain text file in a format like so*: head<<END_HEAD title = Audio keywords= open,source,audio,sequencing,sampling,synthesis descr = Music, noise, and audio, created by James W. Morris. parent = home END_HEAD main<<END_MAIN text<<END_TEXT Digital music, noise, and audio made exclusively with @=xlink=http://www.linux-sound.org@:Linux Audio Software@_=@. END_TEXT image=gfb@--@;Accompanying image for penonpaper-c@right ilink=audio_2008 br= ilink=audio_2007 br= ilink=audio_2006 END_MAIN info=text<<END_TEXT I've been making PC based music since the early nineties - fortunately most of it only exists as tape recordings. END_TEXT ( http://jwm-art.net/dark.php?p=audio - There's just over 400 pages on there. ) *The journal-entry form, which takes some of the work out of it, has mysteriously broken. And it still required SSH access to copy the file to the main dat dir and to check I had actually remembered the format correctly and the code hadn't mis-formatted anything (which it always does). I don't want to drop all the old content (just some), but how much work would be involved in converting it, taking into account that I've been using Drupal for a day, have not written any PHP for a couple of years, and have zero knowledge of SQL? How might a team of developers tackle this? How do-able is it for one guy in his spare time?

    Read the article

  • PHP/MySQL database connection priority?

    - by Josh
    Hello, I have a production database that holds usage statistics for a service we're working on. I use PHP to periodically roll up interesting statistics at different resolutions (day, week, month, year) into buckets dictated by the resolution. The PHP application I've written "completes" its data when it's run, such that it calculates all the rolled-up statistics for the resolutions and periods since it was last run. This is useful if we want to turn it off to debug database performance issues, because I can turn it back on and have it complete its data set. The problem I have is that the calculations are fairly intensive and drive up the QPS of the production database server. Is there a way to set a "priority" on a particular database connection so that it will only use "off-cycles" to do these calculations? Maybe a proper response would be to replicate the tables I'm working on into a different stats database, but unfortunately I don't have the resources in place to attempt such a thing (yet). Thanks in advance for any help, Josh
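
    MySQL has no true per-connection priority, but for table-locking engines such as MyISAM the LOW_PRIORITY modifier makes a write wait until no other client is reading the table, which roughly approximates "off-cycle" work. A sketch with made-up table names:

        -- Defer the roll-up write while readers hold the table (MyISAM table locks only).
        INSERT LOW_PRIORITY INTO stats_daily (bucket_date, hits)
        SELECT DATE(created_at), COUNT(*)
        FROM   raw_events
        GROUP  BY DATE(created_at);

    For InnoDB the usual answer is the one already suspected above: run the roll-ups against a replica so the production primary never sees the heavy reads.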

    Read the article

  • Tool for documenting the fields of a database?

    - by Jerome WAGNER
    Hello, I need to add documentation to all fields of 2 databases (1 PostgreSQL & 1 SQL Server), around 100 tables each. What tool would be convenient for doing that (reverse the schema + add documentation manually on all fields)? My preference would be an open source tool with graphical & XML output. Thanks for your help Jerome WAGNER
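
    One complementary approach is to store the descriptions in the database catalog itself, since most documentation tools can read them back out when reversing the schema. A small sketch for both engines, with table and column names as placeholders:

        -- PostgreSQL
        COMMENT ON COLUMN customer.email IS 'Primary contact address';

        -- SQL Server (the MS_Description extended property is what most tools read)
        EXEC sys.sp_addextendedproperty
             @name = N'MS_Description', @value = N'Primary contact address',
             @level0type = N'SCHEMA', @level0name = N'dbo',
             @level1type = N'TABLE',  @level1name = N'Customer',
             @level2type = N'COLUMN', @level2name = N'Email';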

    Read the article

  • Setting up NHibernate with an Oracle database and Visual Studio 2010

    - by Geoff
    I'm creating an ASP.NET project and I would like to set up NHibernate as my ORM tool. I will be using an existing Oracle database and Visual Studio 2010. ORM tools are very new to me, and I really could use any advice to better understand the tool and the process required to implement it. I've been following an article at http://nhforge.org/wikis/howtonh/your-first-nhibernate-based-application.aspx to learn about it and am stuck where they say to create a local database, as mine only gives me the option to create a SQL Server database (perhaps this is new for Visual Studio 2010?). Is the purpose of this database just to cache results from the live database? Thanks for your help! Geoff

    Read the article

  • MySQL: Load database to memory

    - by Adam Matan
    Hi, Is there a way to load an entire MySQL database into RAM, especially on an EC2 server? The database is quite small (~500 megabytes), I have enough memory, and speed is crucial: the resulting queries are used to serve a dynamic webpage. Thanks, Adam
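
    Two common approaches, sketched with placeholder names: keep a MEMORY-engine copy of the hot tables (contents are lost on restart and TEXT/BLOB columns are not supported), or give InnoDB a buffer pool larger than the data set so everything stays cached after the first read.

        -- Copy a small read-only table into a MEMORY (HEAP) table at startup.
        CREATE TABLE routes_mem ENGINE=MEMORY SELECT * FROM routes;

        -- Or size the InnoDB buffer pool above the ~500 MB data set (in my.cnf, or at
        -- runtime on versions where the variable is dynamic).
        SET GLOBAL innodb_buffer_pool_size = 1024 * 1024 * 1024;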

    Read the article

  • Normalize or denormalize in high-traffic websites

    - by Inam Jameel
    What is the best practice for database design for high-traffic websites like this one, Stack Overflow? Should one use a fully normalized database for record keeping, a denormalized one, or a combination of both? Is it sensible to design a normalized database as the main database for record keeping, to reduce redundancy, and at the same time maintain a denormalized copy of the database for fast searching? Or should the main database be denormalized, with normalized views built at the application level for fast database operations? Or is there some approach besides those mentioned above? What is the best practice for designing high-traffic websites?
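
    A sketch of the two shapes being weighed against each other, with illustrative names only: a normalized source of truth for writes, plus a denormalized summary table maintained asynchronously for the hot read path.

        -- Normalized record keeping
        CREATE TABLE posts (post_id INT PRIMARY KEY, author_id INT, title VARCHAR(255));
        CREATE TABLE votes (vote_id INT PRIMARY KEY, post_id INT, value TINYINT);

        -- Denormalized read model, rebuilt or updated in the background
        CREATE TABLE post_summary (
            post_id    INT PRIMARY KEY,
            title      VARCHAR(255),   -- duplicated on purpose
            vote_total INT             -- precomputed aggregate
        );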

    Read the article

  • Database import problem with SQL Server

    - by tibin mathew
    Hi, I have a database working in my local SQL Server 2005 Express Edition. I have to import my local database to a remote server database. I established a connection to that remote server, and I can now see that database, but when I tried to restore the database from my local machine I got an error message when I gave the backup file location. Below is the error message: The EXECUTE permission was denied on the object 'xp_availablemedia', database 'mssqlsystemresource', schema 'sys'. The user does not have permission to perform this action. The statement has been terminated. (Microsoft SQL Server, Error: 229) What is the problem, and how can I solve this? Please help me
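
    The file-browse dialog in Management Studio is what calls xp_availablemedia, so the denied permission is about listing the server's drives, not about the restore itself. Scripting the restore with an explicit path avoids the browse entirely; a hedged sketch with placeholder paths and logical file names (the remote login still needs restore rights, which shared hosts often grant only through their own control panel):

        RESTORE DATABASE MyDatabase
        FROM DISK = N'D:\Backups\MyDatabase.bak'
        WITH MOVE N'MyDatabase_Data' TO N'D:\Data\MyDatabase.mdf',
             MOVE N'MyDatabase_Log'  TO N'D:\Data\MyDatabase_log.ldf',
             REPLACE;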

    Read the article

  • How do you deal with denormalization / secondary indexes in database sharding?

    - by Continuation
    Say I have a "message" table with 2 secondary indexes: "recipient_id" "sender_id" I want to shard the "message" table by "recipient_id". That way to retrieve all messages sent to a certain recipient I only need to query one shard. But at the same time, I want to be able to make a query that ask for all messages sent by a certain sender. Now I don't want to send that query to every single shard of the "message" table. One way to do this is to duplicate the data and have a "message_by_sender" table sharded by "sender_id". The problem with that approach is that every time a message has been sent, I need to insert the message into both "message" and "message_by_sender" tables. But what if after inserting into "message" the insertion into "message_by_sender" fail? In that case the message exists in "message" but not in "message_by_sender". How do I make sure that if a message exists in "message" then it also exists in "message_by_sender" without resorting to 2 phase commit? This must be a very common issue for anyone who shards their databases. How do you deal woth it?

    Read the article

  • Changing SQL Server database collation

    - by plaisthos
    I have a request to change the collation of a SQL Server database: ALTER DATABASE solarwind95 collate SQL_Latin1_General_CP1_CI_AS but I get this strange error: Meldung 5075, Ebene 16, Status 1, Zeile 1 Das 'Spalte'-Objekt 'CustomPollerAssignment.PollerID' ist von 'Datenbanksortierung' abhängig. Die Datenbanksortierung kann nicht geändert werden, wenn ein schemagebundenes Objekt von ihr abhängig ist. Entfernen Sie die Anhängigkeiten der Datenbanksortierung, und wiederholen Sie den Vorgang. Sorry for the German error message. I do not know how to switch the language to English, but here is a translation: Message 5075, Level 16, State 1, Line 1 The 'column' object 'CustomPollerAssignment.PollerID' depends on the database collation. The database collation cannot be changed while a schema-bound object depends on it. Remove the database collation dependencies and retry the operation. I got a ton more errors like that.
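
    The usual path is to find every schema-bound object that references the affected columns, drop those objects, run ALTER DATABASE ... COLLATE, and re-create them afterwards. A hedged sketch of the discovery step, assuming SQL Server 2008 or later:

        SELECT OBJECT_NAME(d.referencing_id) AS referencing_object
        FROM   sys.sql_expression_dependencies AS d
        WHERE  d.referenced_entity_name = N'CustomPollerAssignment'
          AND  d.is_schema_bound_reference = 1;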

    Read the article

  • How to set up a customer table with multiple phone numbers? - Relational database design

    - by user311509
    CREATE TABLE Phone ( phoneID - PK . . . ); CREATE TABLE PhoneDetail ( phoneDetailID - PK phoneID - FK points to Phone phoneTypeID ... phoneNumber ... . . . ); CREATE TABLE Customer ( customerID - PK firstName phoneID - Unique FK points to Phone . . . ); A customer can have multiple phone numbers, e.g. cell, work, etc. phoneID in the Customer table is unique and points to phoneID in the Phone table. If a customer record is deleted, the phoneID in the Phone table should also be deleted. Do you have any concerns about my design? Is this designed properly? My problem is that phoneID in the Customer table is a child, and if the child record is deleted then I cannot delete the parent (Phone) record automatically.
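
    An alternative layout, sketched with placeholder column types, is to reverse the relationship so that each phone row points at its customer; deleting a customer then cascades to all of that customer's numbers and no orphaned Phone rows are left behind.

        CREATE TABLE Customer (
            customerID INT PRIMARY KEY,
            firstName  VARCHAR(100)
        );
        CREATE TABLE Phone (
            phoneID     INT PRIMARY KEY,
            customerID  INT NOT NULL,
            phoneTypeID INT,             -- cell, work, ...
            phoneNumber VARCHAR(30),
            FOREIGN KEY (customerID) REFERENCES Customer (customerID) ON DELETE CASCADE
        );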

    Read the article

  • Storing data in a database using EditText and Button

    - by user1841444
    Hi, I'm trying to insert data into a database using an EditText and a Button I have created. I'm stuck at the Activity part of the code: I am unable to work out how to write the onClick action part for the Button and how to read the EditText values. Please help me; I'm new to Android. DBAdapter.java package com.example.database1; import android.content.ContentValues; import android.content.Context; import android.database.Cursor; import android.database.SQLException; import android.database.sqlite.SQLiteDatabase; import android.database.sqlite.SQLiteOpenHelper; import android.util.Log; public class DBAdapter { public static final String KEY_ROWID = "_id"; public static final String KEY_ISBN = "isbn"; public static final String KEY_TITLE = "title"; public static final String KEY_PUBLISHER = "publisher"; private static final String TAG = "DBAdapter"; private static final String DATABASE_NAME = "books"; private static final String DATABASE_TABLE = "titles"; private static final int DATABASE_VERSION = 1; private static final String DATABASE_CREATE = "create table titles (_id integer primary key autoincrement, " + "isbn text not null, title text not null, " + "publisher text not null);"; private final Context context; private DatabaseHelper DBHelper; private SQLiteDatabase db; public DBAdapter(Context ctx) { this.context = ctx; DBHelper = new DatabaseHelper(context); } private static class DatabaseHelper extends SQLiteOpenHelper { DatabaseHelper(Context context) { super(context, DATABASE_NAME, null, DATABASE_VERSION); } @Override public void onCreate(SQLiteDatabase db) { db.execSQL(DATABASE_CREATE); } @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { Log.w(TAG, "Upgrading database from version " + oldVersion + " to " + newVersion + ", which will destroy all old data"); db.execSQL("DROP TABLE IF EXISTS titles"); onCreate(db); } } //---opens the database--- public DBAdapter open() throws SQLException { db = DBHelper.getWritableDatabase(); return this; } //---closes the database--- public void close() { DBHelper.close(); } //---insert a title into the database--- public long insertTitle(String isbn, String title, String publisher) { ContentValues initialValues = new ContentValues(); initialValues.put(KEY_ISBN, isbn); initialValues.put(KEY_TITLE, title); initialValues.put(KEY_PUBLISHER, publisher); return db.insert(DATABASE_TABLE, null, initialValues); } //---deletes a particular title--- public boolean deleteTitle(long rowId) { return db.delete(DATABASE_TABLE, KEY_ROWID + "=" + rowId, null) > 0; } //---retrieves all the titles--- public Cursor getAllTitles() { return db.query(DATABASE_TABLE, new String[] { KEY_ROWID, KEY_ISBN, KEY_TITLE, KEY_PUBLISHER}, null, null, null, null, null); } //---retrieves a particular title--- public Cursor getTitle(long rowId) throws SQLException { Cursor mCursor = db.query(true, DATABASE_TABLE, new String[] { KEY_ROWID, KEY_ISBN, KEY_TITLE, KEY_PUBLISHER }, KEY_ROWID + "=" + rowId, null, null, null, null, null); if (mCursor != null) { mCursor.moveToFirst(); } return mCursor; } //---updates a title--- public boolean updateTitle(long rowId, String isbn, String title, String publisher) { ContentValues args = new ContentValues(); args.put(KEY_ISBN, isbn); args.put(KEY_TITLE, title); args.put(KEY_PUBLISHER, publisher); return db.update(DATABASE_TABLE, args, KEY_ROWID + "=" + rowId, null) > 0; } } DatabaseActivity.java package com.example.database1; import android.os.Bundle; import android.app.Activity; import android.database.Cursor; import android.view.Menu; import
android.widget.Toast; public class DatabaseActivity extends Activity { @Override public void onCreate(Bundle savedInstanceState) { super.onCreate(savedInstanceState); setContentView(R.layout.activity_database); DBAdapter db=new DBAdapter(this); db.open(); } } activity_database.xml: <LinearLayout xmlns:android="http://schemas.android.com/apk/res/android" xmlns:tools="http://schemas.android.com/tools" android:layout_width="match_parent" android:layout_height="match_parent"> <EditText android:id="@+id/edit1" android:layout_width="wrap_content" android:layout_height="wrap_content" /> <EditText android:id="@+id/edit2" android:layout_width="wrap_content" android:layout_height="wrap_content" /> <EditText android:id="@+id/edit3" android:layout_width="wrap_content" android:layout_height="wrap_content" /> <Button android:id="@+id/submit" android:layout_width="wrap_content" android:layout_height="wrap_content" /> </LinearLayout>

    Read the article

  • How do I do this Database Model in Django?

    - by alex
    Django currently does not support the "Point" datatype in MySQL. That's why I created my own. class PointField(models.Field): def db_type(self): return 'Point' class Tag(models.Model): user = models.ForeignKey(User) utm = PointField() As you can see, this works, and syncdb creates the model fine. However, my current code calculates a length between two Points using raw SQL. cursor.execute("SELECT user_id FROM life_tag WHERE\ (GLength(LineStringFromWKB(LineString(asbinary(utm), asbinary(PointFromWKB(point(%s, %s)))))) < 55)... This says: Select where the length between the given point and the table point is less than 55. How can I do this with Django instead of RAW SQL? I don't want to do cursors and SELECT statements anymore. How can I modify the models.py in order to do this?

    Read the article

  • How to put large text data (~20 MB) into a SQL CE 3.5 database?

    - by Anindya Chatterjee
    I am using the following query to insert some large text data: internal static string InsertStorageItem = "insert into Storage(FolderName, MessageId, MessageDate, StorageData) values ('{0}', '{1}', '{2}', @StorageData)"; and the code I am using to execute this query is as follows: string content = "very very large data"; string query = string.Format(InsertStorageItem, "Inbox", "AXOGTRR1445/DSDS587444WEE", "4/19/2010 11:11:03 AM"); var command = new SqlCeCommand(query, _sqlConnection); var paramData = command.Parameters.Add("@StorageData", System.Data.SqlDbType.NText); paramData.Value = content; paramData.SourceColumn = "StorageData"; command.ExecuteNonQuery(); But at the last line I am getting the following error: System.Data.SqlServerCe.SqlCeException was unhandled by user code Message=The data was truncated while converting from one data type to another. [ Name of function(if known) = ] Source=SQL Server Compact ADO.NET Data Provider HResult=-2147467259 NativeError=25920 StackTrace: at System.Data.SqlServerCe.SqlCeCommand.ProcessResults(Int32 hr) at System.Data.SqlServerCe.SqlCeCommand.ExecuteCommandText(IntPtr& pCursor, Boolean& isBaseTableCursor) at System.Data.SqlServerCe.SqlCeCommand.ExecuteCommand(CommandBehavior behavior, String method, ResultSetOptions options) at System.Data.SqlServerCe.SqlCeCommand.ExecuteNonQuery() at Chithi.Client.Exchange.ExchangeClient.SaveItem(Item item, Folder parentFolder) at Chithi.Client.Exchange.ExchangeClient.DownloadNewMails(Folder folder) at Chithi.Client.Exchange.ExchangeClient.SynchronizeParentChildFolder(WellKnownFolder wellknownFolder, Folder parentFolder) at Chithi.Client.Exchange.ExchangeClient.SynchronizeFolders() at Chithi.Client.Exchange.ExchangeClient.WorkerThreadDoWork(Object sender, DoWorkEventArgs e) at System.ComponentModel.BackgroundWorker.OnDoWork(DoWorkEventArgs e) at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument) InnerException: Now my question is: how am I supposed to insert such large data into a SQL CE database? Regards, Anindya Chatterjee http://abstractclass.org

    Read the article
