Search Results

Search found 31483 results on 1260 pages for 'database migration'. Page 444 of 1260.

  • Trouble with Berkeley DB JE Base API Secondary Databases and Sequences

    - by milosz
    I have a class Document which consists of an Id (int) and a Url (String). I would like to have a primary index on Id and a secondary index on Url. I would also like to have a sequence for Id auto-incrementation. So I create a SecondaryDatabase and then I create a Sequence. During initialisation of the Sequence I get an exception:

        Exception in thread "main" java.lang.IllegalArgumentException
            at com.sleepycat.util.UtfOps.getCharLength(UtfOps.java:137)
            at com.sleepycat.util.UtfOps.bytesToString(UtfOps.java:259)
            at com.sleepycat.bind.tuple.TupleInput.readString(TupleInput.java:152)
            at pl.edu.mimuw.zbd.berkeley.zadanie.rozwiazanie.MyDocumentBiding.entryToObject(MyDocumentBiding.java:12)
            at pl.edu.mimuw.zbd.berkeley.zadanie.rozwiazanie.MyDocumentBiding.entryToObject(MyDocumentBiding.java:1)
            at com.sleepycat.bind.tuple.TupleBinding.entryToObject(TupleBinding.java:76)
            at pl.edu.mimuw.zbd.berkeley.zadanie.rozwiazanie.UrlKeyCreator.createSecondaryKey(UrlKeyCreator.java:20)
            at com.sleepycat.je.SecondaryDatabase.updateSecondary(SecondaryDatabase.java:835)
            at com.sleepycat.je.SecondaryTrigger.databaseUpdated(SecondaryTrigger.java:42)
            at com.sleepycat.je.Database.notifyTriggers(Database.java:2004)
            at com.sleepycat.je.Cursor.putNotify(Cursor.java:1692)
            at com.sleepycat.je.Cursor.putInternal(Cursor.java:1616)
            at com.sleepycat.je.Cursor.putNoOverwrite(Cursor.java:663)
            at com.sleepycat.je.Sequence.<init>(Sequence.java:188)
            at com.sleepycat.je.Database.openSequence(Database.java:546)
            at pl.edu.mimuw.zbd.berkeley.zadanie.rozwiazanie.MyFullTextSearchEngine.init(MyFullTextSearchEngine.java:131)
            at pl.edu.mimuw.zbd.berkeley.zadanie.testy.MyFullTextSearchEngineTest.main(MyFullTextSearchEngineTest.java:18)

    It seems that during the initialisation of the sequence the secondary database is forced to update. When I debug the entryToObject method of MyDocumentBiding, the bytes it tries to convert to an object seem random. What am I doing wrong?
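
    A hedged reading of that trace, not a confirmed diagnosis: the sequence is opened against the same primary database that carries the secondary index, so the sequence's internal bookkeeping record is written through the secondary trigger and handed to UrlKeyCreator, which tries to decode it as a Document and fails. A minimal sketch of one way around this is to keep sequences in their own small database, so their records never reach a SecondaryKeyCreator (the database and key names here are illustrative):

        import com.sleepycat.je.*;

        // Assumes an already-open, transactional Environment `env`.
        DatabaseConfig seqDbConfig = new DatabaseConfig();
        seqDbConfig.setAllowCreate(true);
        seqDbConfig.setTransactional(true);
        // Dedicated database for sequences: no secondary index is attached to it,
        // so its internal record is never run through createSecondaryKey().
        Database seqDb = env.openDatabase(null, "sequences", seqDbConfig);

        SequenceConfig seqConfig = new SequenceConfig();
        seqConfig.setAllowCreate(true);
        DatabaseEntry seqKey = new DatabaseEntry("documentId".getBytes());
        Sequence docIdSeq = seqDb.openSequence(null, seqKey, seqConfig);

        long nextId = docIdSeq.get(null, 1);  // next auto-increment value for Id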

  • Updating multiple related tables in SQLite with C#

    - by PerryJ
    Just some background, sorry so long-winded. I'm using the System.Data.SQLite ADO.NET adapter to create a local SQLite database, and this will be the only process hitting the database, so I don't need to worry about concurrency. I'm building the database from various sources and don't want to build this all in memory using DataSets or DataAdapters or anything like that. I want to do this using SQL (DbCommands). I'm not very good with SQL and a complete noob in SQLite. I'm basically using SQLite as a local database / save-file structure. The database has a lot of related tables, and the data has nothing to do with people, regions or districts, but to use a simple analogy, imagine:

    - Region table with auto-increment RegionID, RegionName column and various optional columns
    - District table with auto-increment DistrictID, DistrictName, RegionID, and various optional columns
    - Person table with auto-increment PersonID, PersonName, DistrictID, and various optional columns

    So I get some data representing RegionName, DistrictName, PersonName, and other Person-related data. The Region, District and/or Person may or may not have been created at this point. Once again, not being the greatest with this, my thoughts would be something like the steps below (see the sketch after this list):

    1. Check whether the Region exists; if so, get the RegionID, else create it and get the RegionID.
    2. Check whether the District exists; if so, get the DistrictID, else create it (adding in the RegionID from above) and get the DistrictID.
    3. Check whether the Person exists; if so, get the PersonID, else create it (adding in the DistrictID from above) and get the PersonID.
    4. Update the Person with the rest of the data.

    In MS SQL Server I would create a stored procedure to handle all this. The only way I can see to do this with SQLite is a lot of commands. So I'm sure I'm not getting this. I've spent hours looking around on various sites but just don't feel like I'm going down the right road. Any suggestions would be greatly appreciated.
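
    For what it's worth, the per-table logic above maps fairly directly onto a small helper. A hedged sketch against System.Data.SQLite, using the Region step as the example (table and column names follow the analogy; District and Person repeat the pattern with the parent ID added):

        using System.Data.SQLite;

        static long GetOrCreateRegion(SQLiteConnection conn, string regionName)
        {
            // Try to find an existing row first.
            using (var find = new SQLiteCommand(
                "SELECT RegionID FROM Region WHERE RegionName = @name", conn))
            {
                find.Parameters.AddWithValue("@name", regionName);
                object id = find.ExecuteScalar();
                if (id != null) return (long)id;
            }

            // Not there: insert it and read back the generated key.
            using (var insert = new SQLiteCommand(
                "INSERT INTO Region (RegionName) VALUES (@name); " +
                "SELECT last_insert_rowid();", conn))
            {
                insert.Parameters.AddWithValue("@name", regionName);
                return (long)insert.ExecuteScalar();
            }
        }

    Wrapping the whole load in one transaction (a single BEGIN/COMMIT around all the commands) is also worth trying; in SQLite that often speeds bulk loads up by an order of magnitude.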

  • SQLConnection.Open(); throwing exception

    - by flavour404
    Hi, I'm updating an old piece of software, but in order to maintain backward compatibility I need to connect to a .mdb (Access) database. I am using the following connection but keep getting an exception -- why? I have validated the path, the database's existence etc. and that is all correct.

        string Server = "localhost";
        string Database = drive + "\\btc2\\state\\states.mdb";
        string Username = "";
        string Password = "Lhotse";
        string ConnectionString = "Data Source = " + Server + ";"
            + "Initial Catalog = " + Database + ";"
            + "User Id = '';"
            + "Password = " + Password + ";";

        SqlConnection SQLConnection = new SqlConnection();
        try
        {
            SQLConnection.ConnectionString = ConnectionString;
            SQLConnection.Open();
        }
        catch (Exception Ex)
        {
            // Try to close the connection
            if (SQLConnection != null)
                SQLConnection.Dispose();
            // can't connect
            // Stop here
            return false;
        }
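
    The exception is expected here, for what it's worth: SqlConnection only talks to SQL Server and can never open an Access .mdb file. A hedged sketch of the usual alternative, System.Data.OleDb with the Jet provider (the path and password are taken from the question; "Jet OLEDB:Database Password" is the keyword for an Access database password):

        using System.Data.OleDb;

        string mdbPath = drive + "\\btc2\\state\\states.mdb";
        string connectionString =
            "Provider=Microsoft.Jet.OLEDB.4.0;" +
            "Data Source=" + mdbPath + ";" +
            "Jet OLEDB:Database Password=Lhotse;";

        using (var connection = new OleDbConnection(connectionString))
        {
            connection.Open();  // throws OleDbException if the file or password is wrong
            // ... run OleDbCommand queries here ...
        }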

  • Are SharePoint site templates really less performant than site definitions?

    - by Jim
    So, it seems in the SharePoint blogosphere that everybody just copies and pastes the same bullet points from other blogs. One bullet point I've seen is that SharePoint site templates are less performant than site definitions because site definitions are stored on the file system. Is that true? It seems odd that site templates would be less performant. It's my understanding that all site content lives in a database, whether you use a site template or a site definition. A site template is applied once to the database, and from then on the site should not care whether the content was created using a site template or not. So, does anybody have an architectural reason why a site template would be less performant than a site definition?

    Edit: Links to the blogs that say there is a performance difference:

    From MSDN: "Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance."
    From DevX: "However, user templates in SharePoint can lead to performance problems and may not be the best approach if you're trying to create a set of reusable templates for an entire organization."
    From IT Footprint: "Because it is slow to store templates in and retrieve them from the database, site templates can result in slower performance. Templates in the database are compiled and executed every time a page is rendered."
    From Branding SharePoint: "Custom site definitions hold the following advantages over custom templates: Data is stored directly on the Web servers, so performance is typically better."

    At a minimum, I think the above articles are incomplete, and I think several are misleading based on what I know of SharePoint's architecture. I read another blog post that argued against the performance differences, but I can't find the link.

  • Application Context in Rails

    - by Sean McMains
    Rails comes with a handy session hash into which we can cram stuff to our heart's content. I would, however, like something like ASP's application context, which instead of sharing data only within a single session, will share it with all sessions in the same application. I'm writing a simple dashboard app, and would like to pull data every 5 minutes, rather than every 5 minutes for each session. I could, of course, store the cache update times in a database, but so far haven't needed to set up a database for this app, and would love to avoid that dependency if possible. So, is there any way to get (or simulate) this sort of thing? If there's no way to do it without a database, is there any kind of "fake" database engine that comes with Rails, runs in memory, but doesn't bother persisting data between restarts?
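
    One low-tech possibility, sketched below under the assumption that the dashboard runs in a single long-lived Rails process: cache the data in a class-level variable with a timestamp, which survives across requests in that process (but not across multiple app-server workers or restarts). DashboardData and fetch_fresh_data are invented names for illustration:

        # lib/dashboard_data.rb -- a minimal per-process cache sketch.
        class DashboardData
          REFRESH_INTERVAL = 5 * 60  # seconds

          @data = nil
          @fetched_at = nil

          class << self
            def fetch
              if @data.nil? || Time.now - @fetched_at > REFRESH_INTERVAL
                @data = fetch_fresh_data   # hypothetical: your expensive pull
                @fetched_at = Time.now
              end
              @data
            end

            private

            def fetch_fresh_data
              # e.g. hit the remote source the dashboard reports on
            end
          end
        end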

  • XML Schema (XSD) to Rails ActiveRecord Mapping?

    - by Incomethax
    I'm looking for a way to convert an XML Schema definition file into an ActiveRecord-modeled database. Does anyone know of a tool that happens to do this? So far the best way I've found is to first load the XSD into an RDBMS like Postgres or MySQL, and then have Rails connect and do a rake db:schema:dump. This, however, only leaves me with a database without Rails models. What would be the best way to import/load this XSD-based database into Rails?
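
    On the missing-models half of the problem, one hedged option: once rake db:schema:dump has captured the schema, the model files themselves are mostly boilerplate, so they can be generated from the table list. A sketch, assuming conventional Rails table naming (it won't recover associations or validations implied by the XSD):

        # script/generate_models.rb -- run via script/runner so ActiveRecord is loaded.
        ActiveRecord::Base.connection.tables.each do |table|
          next if table == "schema_migrations"
          class_name = table.classify            # e.g. "order_items" => "OrderItem"
          path = File.join("app", "models", "#{table.singularize}.rb")
          File.open(path, "w") do |f|
            f.puts "class #{class_name} < ActiveRecord::Base"
            f.puts "end"
          end
        end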

  • Info on type family instances

    - by yairchu
    Intro: While checking out snoyman's "persistent" library I found myself wanting ghci's (or another tool's) assistance in figuring out stuff. ghci's :info doesn't seem to work as nicely with type families and data families as it does with "plain" types:

        > :info Maybe
        data Maybe a = Nothing | Just a    -- Defined in Data.Maybe
        ...
        > :info Persist.Key Potato         -- "Key Potato" defined in example below
        data family Persist.Key val        -- Defined in Database.Persist
        ...

    (no info on the structure/identity of the actual instance)

    One can always look for the instance in the source code, but sometimes it could be hard to find, and it may be hidden in template-haskell generated code etc.

    Code example:

        {-# LANGUAGE FlexibleInstances, GeneralizedNewtypeDeriving,
                     MultiParamTypeClasses, TypeFamilies, QuasiQuotes #-}
        import qualified Database.Persist as Persist
        import Database.Persist.Sqlite as PSqlite

        PSqlite.persistSqlite [$persist|
        Potato
            name String
            isTasty Bool
            luckyNumber Int
            UniqueId name
        |]

    What's going on in the code example above is that Template Haskell is generating code for us here. All the extensions above except for QuasiQuotes are required because the generated code uses them. I found out what Persist.Key Potato is by doing:

        -- test.hs:
        test = PSqlite.persistSqlite [$persist| ...

        -- ghci:
        > :l test.hs
        > import Language.Haskell.TH
        > import Data.List
        > runQ test >>= putStrLn . unlines . filter (isInfixOf "Key Potato") . lines . pprint

    where the relevant output was:

        newtype Database.Persist.Key Potato = PotatoId Int64
        type PotatoId = Database.Persist.Key Potato

    Question: Is there an easier way to get information on instances of type families and data families, using ghci or any other tool?

  • CakePHP Test Fixtures Drop My Tables Permanently After Running A Test Case

    - by Frank
    I'm not sure what I've done wrong in my CakePHP unit test configuration. Every time I run a test case, the model tables associated with my fixtures are missing from my test database. After running an individual test case I have to re-import my database tables using phpMyAdmin. Here are the relevant files.

    This is the class I'm trying to test, comment.php. This table is dropped after the test:

        App::import('Sanitize');
        class Comment extends AppModel {
            public $name = 'Comment';
            public $actsAs = array('Tree');
            public $belongsTo = array('User' => array('fields' => array('id', 'username')));
            public $validate = array(
                'text' => array(
                    'rule' => array('between', 1, 4000),
                    'required' => 'true',
                    'allowEmpty' => 'false',
                    'message' => "You can't leave your comment text empty!")
            );

    database.php:

        class DATABASE_CONFIG {
            var $default = array(
                'driver' => 'mysql',
                'persistent' => false,
                'host' => 'project.db',
                'login' => 'projectman',
                'password' => 'projectpassword',
                'database' => 'projectdb',
                'prefix' => ''
            );
            var $test = array(
                'driver' => 'mysql',
                'persistent' => false,
                'host' => 'project.db',
                'login' => 'projectman',
                'password' => 'projectpassword',
                'database' => 'testprojectdb',
                'prefix' => ''
            );
        }

    My comment.test.php file. This is the table that keeps getting dropped:

        <?php
        App::import('Model', 'Comment');
        class CommentTestCase extends CakeTestCase {
            public $fixtures = array('app.comment', 'app.user');

            function start() {
                $this->Comment =& ClassRegistry::init('Comment');
                $this->Comment->useDbConfig = 'test_suite';
            }

    This is my comment_fixture.php class:

        <?php
        class CommentFixture extends CakeTestFixture {
            var $name = "Comment";
            var $import = 'Comment';
        }

    And just in case, here is a typical test method in the CommentTestCase class:

        function testMsgNotificationUserComment() {
            $user_id = '1';
            $submission_id = '1';
            $parent_id = $this->Comment->commentOnModel('Submission', $submission_id, '0', $user_id, "Says: A");
            $other_user_id = '2';
            $msg_id = $this->Comment->commentOnModel('Submission', $submission_id, $parent_id, $other_user_id, "Says: B");
            $expected = array(array('Comment' => array('id' => $msg_id, 'text' => "Says: B", 'submission_id' => $submission_id, 'topic_id' => '0', 'ack' => '0')));
            $result = $this->Comment->getMessages($user_id);
            $this->assertEqual($result, $expected);
        }

    I've been dealing with this for a day now and I'm starting to be put off by CakePHP's unit testing. In addition to this issue, several times now I've had data inserted into my 'default' database configuration after running tests! What's going on with my configuration?!

  • Retrieving many huge sized EPS files and converting them to JPEG in ASP.NET application

    - by Ashish Gupta
    I have many (600) EPS files (300 KB - 1 MB) in a database. In my ASP.NET application (using ASP.NET 4.0) I need to retrieve them one by one and call a web service which converts the content to a JPEG and updates the database (the JPEGContent column with the JPEG content). However, retrieving the content for the 600 of them itself takes too long, even from SQL Management Studio (5 minutes for 10 EPS contents). So I have two issues:

    1) How to get the EPS content (unfortunately, selecting only a certain number of contents at a time is not an option :-( ). Approach 1:

        foreach (var DataRow in DataTable.Rows)
        {
            // get the Id and byte[] of the EPS
            // Call the web method to convert the EPS content to JPEG, which also updates the database.
        }

    or:

        foreach (var DataRow in DataTable.Rows)
        {
            // get only the Id of the EPS
            // Hit the database to get the content of the EPS
            // Call the web method to convert the EPS content to JPEG, which also updates the database.
        }

    or any other approach?

    2) Converting EPS to JPEG using a web method for 600 contents. Of course, each call would be a long-running operation. Would the Task Parallel Library (TPL) be a better way to achieve this? Also, is doing the entire thing in a SQL CLR function a good idea?
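
    On the retrieval side, one hedged sketch: stream one row at a time with a SqlDataReader in SequentialAccess mode, so the 300 KB-1 MB blobs are never all materialized in a DataTable at once (the table/column names, connection string, and ConvertToJpeg wrapper are illustrative):

        using System.Data;
        using System.Data.SqlClient;

        string connectionString = "...";  // assumed
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT Id, EpsContent FROM EpsFiles", conn))
        {
            conn.Open();
            // SequentialAccess streams large columns instead of buffering whole rows,
            // so only one EPS is held in memory at a time.
            using (var reader = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
            {
                while (reader.Read())
                {
                    int id = reader.GetInt32(0);           // read columns in order
                    byte[] eps = (byte[])reader.GetValue(1);
                    ConvertToJpeg(id, eps);  // hypothetical call to the web method
                }
            }
        }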

  • Problem: Sorting for GridView/ObjectDataSource changes depending on page

    - by user148298
    I have a GridView tied to an ObjectDataSource using paging. The paging works fine, except that the sort order changes depending on which page of the results is being viewed. This causes items to reappear on subsequent pages, among other issues. I traced the problem to my DAL, which reads a page at a time and then sorts it. Obviously the sorting is going to change as the result set size changes. Is there an improvement to this algorithm? I would like to use a DataReader if possible:

        [System.ComponentModel.DataObjectMethod(System.ComponentModel.DataObjectMethodType.Select)]
        public static WordsCollection LoadForCriteria(string sqlCriteria, int maximumRows, int startRowIndex, string sortExpression)
        {
            // DEFAULT SORT EXPRESSION
            if (string.IsNullOrEmpty(sortExpression))
                sortExpression = "OrderBy";

            // CREATE THE DYNAMIC SQL TO LOAD OBJECT
            StringBuilder selectQuery = new StringBuilder();
            selectQuery.Append("SELECT");
            if (maximumRows > 0)
                selectQuery.Append(" TOP " + (startRowIndex + maximumRows).ToString());
            selectQuery.Append(" " + Words.GetColumnNames(string.Empty));
            selectQuery.Append(" FROM sw_Words");
            string whereClause = string.IsNullOrEmpty(sqlCriteria) ? string.Empty : " WHERE " + sqlCriteria;
            selectQuery.Append(whereClause);
            selectQuery.Append(" ORDER BY " + sortExpression);
            Database database = Token.Instance.Database;
            DbCommand selectCommand = database.GetSqlStringCommand(selectQuery.ToString());

            // EXECUTE THE COMMAND
            WordsCollection results = new WordsCollection();
            int thisIndex = 0;
            int rowCount = 0;
            using (IDataReader dr = database.ExecuteReader(selectCommand))
            {
                while (dr.Read() && ((maximumRows < 1) || (rowCount < maximumRows)))
                {
                    if (thisIndex >= startRowIndex)
                    {
                        Words varWords = new Words();
                        Words.LoadDataReader(varWords, dr);
                        results.Add(varWords);
                        rowCount++;
                    }
                    thisIndex++;
                }
                dr.Close();
            }
            return results;
        }
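
    One hedged improvement, assuming SQL Server 2005 or later: let the database do both the full sort and the paging with ROW_NUMBER(), and add the primary key as a tiebreaker so the order is deterministic. Each page is then a stable slice of one fully sorted result set (sw_Words comes from the code above; WordID as the key column is an assumption):

        -- @startRowIndex is zero-based, @maximumRows is the page size.
        -- Substitute the caller's sortExpression for OrderBy, keeping WordID as tiebreaker.
        SELECT *
        FROM (
            SELECT w.*,
                   ROW_NUMBER() OVER (ORDER BY OrderBy, WordID) AS RowNum
            FROM sw_Words AS w
            -- optional WHERE <sqlCriteria> goes here
        ) AS Paged
        WHERE RowNum > @startRowIndex
          AND RowNum <= @startRowIndex + @maximumRows
        ORDER BY RowNum;

    The DataReader loop then shrinks to a plain read of exactly one page, with no client-side skipping.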

  • How to use SQLAlchemy to dump an SQL file from query expressions to bulk-insert into a DBMS?

    - by Mahmoud Abdelkader
    Please bear with me as I explain the problem and how I tried to solve it; my question on how to improve it is at the end. I have a 100,000-line CSV file from an offline batch job and I needed to insert it into the database as its proper models. Ordinarily, if this is a fairly straightforward load, it can be trivially loaded by just munging the CSV file to fit a schema, but I had to do some external processing that requires querying, and it's just much more convenient to use SQLAlchemy to generate the data I want. The data I want here is 3 models that represent 3 pre-existing tables in the database, and each subsequent model depends on the previous model. For example:

        Model C --> Foreign Key --> Model B --> Foreign Key --> Model A

    So, the models must be inserted in the order A, B, and C. I came up with a producer/consumer approach:

    - instantiate a multiprocessing.Process which contains a thread pool of 50 persister threads that have a thread-local connection to the database
    - read a line from the file using the csv DictReader
    - enqueue the dictionary to the process, where each thread creates the appropriate models by querying the right values and each thread persists the models in the appropriate order

    This was faster than a non-threaded read/persist, but it is way slower than bulk-loading a file into the database. The job finished persisting after about 45 minutes. For fun, I decided to write it in SQL statements; that took 5 minutes. Writing the SQL statements took me a couple of hours, though. So my question is, could I have used a faster method to insert rows using SQLAlchemy? As I understand it, SQLAlchemy is not designed for bulk insert operations, so this is less than ideal. This leads to my question: is there a way to generate the SQL statements using SQLAlchemy, throw them in a file, and then just use a bulk load into the database? I know about str(model_object) but it does not show the interpolated values. I would appreciate any guidance on how to do this faster. Thanks!
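
    One hedged avenue: skip the ORM for the inserts themselves and use SQLAlchemy's table-level insert() with a list of dictionaries, which the DBAPI executes as a single executemany -- usually dramatically faster than per-object Session flushes. A sketch, with table and column names standing in for the real models:

        from sqlalchemy import create_engine, MetaData, Table

        engine = create_engine("postgresql://user:pass@localhost/mydb")  # assumed URL
        metadata = MetaData()
        # Reflect one of the three pre-existing tables (name is illustrative).
        table_a = Table("table_a", metadata, autoload=True, autoload_with=engine)

        # rows_for_a: list of dicts built from the CSV plus the query lookups.
        rows_for_a = [{"name": "x", "value": 1}, {"name": "y", "value": 2}]

        # One executemany per table, in dependency order A, B, C.
        conn = engine.connect()
        conn.execute(table_a.insert(), rows_for_a)

    Rendering fully literal SQL to a file for a bulk loader is fiddlier and version-dependent, but the executemany route alone often closes most of the gap to raw SQL.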

  • MDX performance vs. T-SQL

    - by SubPortal
    I have a database containing tables with more than 600 million records, and a set of stored procedures that perform complex search operations on the database. The performance of the stored procedures is very slow, even with suitable indexes on the tables. The design of the database is a normal relational design. I want to change the database design to be multidimensional and use MDX queries instead of the traditional T-SQL queries, but the question is: is an MDX query better than a traditional T-SQL query with regard to performance? And if yes, to what extent will that improve the performance of the queries? Thanks for any help.

  • openquery issue in SQL Server

    - by George2
    Hello everyone, I am using SQL Server 2008 (call this the source database server in this discussion), and in SSMS I have created a linked server to another SQL Server 2008 database (call this the destination database server). When I issue the statement

        select * from [linked server name].[database name].[dbo].[table name]

    an error is returned:

        Linked server "ZS" The OLE DB access interface "SQLNCLI10" returned "NON-CLUSTERED and NOT INTEGRATED "Index" ix_foo_basic_info_nf ", which is incorrect bookmark ordinal 0.

    When I issue the statement

        select * from openquery([linked server name], 'select * from [table name]')

    there are no errors. Any ideas what is wrong? Thanks in advance, George

  • Best Installation Software?

    - by Chris
    I am interested in knowing what the best software would be to build an installation package that performs the following:

    (1) Installs the client application.
    (2) Detects all SQL Server instances on the network, allowing the user to select the specific database to upgrade (which would then be upgraded using an embedded SQL script).
    (3) Installs a website on a server/location specified by the user, and configures IIS 6.0 and/or 7.0 based on settings that I specify.
    (4) Creates a simple setup.exe, allows the user to choose installation components (listed above, i.e. install the client app, SQL Server database, and/or website), and then downloads the selected components from a remote server.

    I have tried NSIS and was able to create an installation package that downloads a compressed (gzip) component from a remote server, decompresses the file, installs the components, and then removes the gzip file. So, that worked beautifully. The part where I am stuck is being able to perform the database upgrade and website install. Any suggestions would be great. Thanks. Chris

  • Peer review: Maatkit's mk-parallel-dump and mk-parallel-restore usage?

    - by Brent
    Hiya. I'm trying to make use of Maatkit as a means of dumping a database and then restoring it to another database.

    For dumps:

        mk-parallel-dump --user abc --password xyz --databases $db --base-dir /tmp/dump

    For restore:

        mk-parallel-restore --create-databases --user abc --password xyz --database devdb /tmp/dump

    My question is: is my logic and understanding correct, and would it be OK to do it like this? Kind regards, Brent

  • How do I connect to MySQL from PHP?

    - by roberto
    Hi guys. I'm working through examples from a book on PHP/MySQL development, in a Linux/Apache environment. I've set up a database and a user. I attempt to connect with this line of code:

        $db_server = mysql_connect($db_hostname, $db_username, $db_password);

    I get this error:

        Warning: mysql_connect() [function.mysql-connect]: Access denied for user 'www-data'@'localhost' (using password: YES) in /var/www/hosts/dj/connect.php on line 3
        unable to connect to database: Access denied for user 'www-data'@'localhost' (using password: YES)

    I can only guess what is happening here: I think www-data is a username for Apache. Upon the database connection, the credentials being passed in to MySQL are not those of my database user, but rather Apache's own credentials. Is that what is happening here? How do I pass in the credentials I've defined for my user?
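
    A hedged reading of that error: when mysql_connect() is handed an empty or undefined username, PHP falls back to the mysql.default_user ini setting, which under Apache typically resolves to the web server's own account ('www-data'). So the symptom usually means $db_username was unset at the point of the call. A minimal sketch with the credentials defined explicitly in the same scope (names and values are illustrative):

        <?php
        // Define the credentials in the same scope as the connect call.
        $db_hostname = 'localhost';
        $db_username = 'myuser';      // the MySQL user you created, not a system account
        $db_password = 'mypassword';

        $db_server = mysql_connect($db_hostname, $db_username, $db_password);
        if (!$db_server) {
            die('unable to connect to database: ' . mysql_error());
        }
        ?>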

  • Autocomplete and Dynamic Parameter Passing

    - by abcParsing
    The code below works fine using jQuery UI 1.8 and jQuery 1.4.2:

        $("#sid_entry_box").autocomplete({
            source: "autocomplete_sid.php?database=" + database_name,
            minLength: 4,
            delay: 1000,
            enable: true,
            cacheLength: 1
        });

    The database name is passed as a GET parameter of the PHP call. In this application, I have two databases selected by a radio button. Since jQuery loads and assigns this function when the document is loaded, the database name is whatever was checked at that moment. What I really need to pass to the PHP call is the following:

        database = $("input[name=rf_database_option]:checked").val();

    Is there an easy-to-understand way to be able to pass a dynamic DOM value?
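
    For what it's worth, jQuery UI's autocomplete also accepts a function as its source, which is evaluated on every lookup rather than once at load time. A hedged sketch along those lines (it assumes autocomplete_sid.php reads the term parameter and returns a JSON array):

        $("#sid_entry_box").autocomplete({
            minLength: 4,
            delay: 1000,
            source: function (request, response) {
                // Read the radio button at request time, not at page-load time.
                $.getJSON("autocomplete_sid.php", {
                    term: request.term,
                    database: $("input[name=rf_database_option]:checked").val()
                }, response);
            }
        });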

  • Can't connect SQL 2008 Express to Access project

    - by Gerhard
    I just installed SQL Server 2008 Express and want to create an Access project (ADP). When I get to the Microsoft SQL Server Database Wizard in Access, after clicking Next to create the database, I get this message: "The new database wizard does not work with the version of Microsoft SQL Server to which your Access project is connected. See the Microsoft Update Web site for the latest information and downloads." I can't find any solution to the problem so far. Any ideas why, and how to solve this problem?

  • How to generate two XML files from a single HL7 file and insert both into two different columns as a single record?

    - by Vivek Ratnaparkhi
    I have the Source Connector Type set to 'File Reader', which reads HL7 files, and the Destination Connector Type set to 'Database Writer'. My database table has two columns: Participant_Information and SPR_Information. I want to transform a single HL7 file into two XML documents -- one for the Participant_Information column and the other for the SPR_Information column -- and insert both as a single record into the database table. I'm able to insert one XML at a time, but I can't find a way to insert both XMLs as a single record into the database table. Any help is greatly appreciated!
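
    This sounds like a Mirth Connect channel; if so, one hedged approach is to build both XML strings in the destination's transformer, stash them in the channel map, and reference both variables from a single Database Writer INSERT. A sketch -- the variable and table names are illustrative, and buildParticipantXml/buildSprXml stand in for whatever mapping logic splits the message:

        // Transformer step (JavaScript): derive both XML payloads from the parsed HL7 message.
        var participantXml = buildParticipantXml(msg);  // hypothetical helper
        var sprXml = buildSprXml(msg);                  // hypothetical helper
        channelMap.put('participantXml', participantXml);
        channelMap.put('sprXml', sprXml);

    and then a single statement in the Database Writer using both mapped variables:

        INSERT INTO hl7_record (Participant_Information, SPR_Information)
        VALUES ('${participantXml}', '${sprXml}');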

  • Deadlock error in INSERT statement

    - by Gnanam
    We've got a web-based application. There is a time-bound database operation (INSERTs and UPDATEs) in the application which takes a long time to complete, so this particular flow has been moved into a Java thread that does not wait (block) for the whole database operation to finish. My problem is, if more than one user comes across this particular flow, I'm facing the following error thrown by PostgreSQL:

        org.postgresql.util.PSQLException: ERROR: deadlock detected
        Detail: Process 13560 waits for ShareLock on transaction 3147316424; blocked by process 13566.
        Process 13566 waits for ShareLock on transaction 3147316408; blocked by process 13560.

    The above error is consistently thrown on INSERT statements.

    Additional information:
    1) I have a PRIMARY KEY defined on this table.
    2) There are FOREIGN KEY references in this table.
    3) A separate database connection is passed to each Java thread.

    Technologies
    Web server: Tomcat v6.0.10
    Java v1.6.0 Servlet
    Database: PostgreSQL v8.2.3
    Connection management: pgpool II
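
    A hedged note on the usual mechanism behind this pattern: in PostgreSQL, inserting a row with a FOREIGN KEY takes a share lock on the referenced parent row, so two transactions that touch the same parent rows in opposite orders can deadlock exactly as in the trace above. The standard remedy is to make every transaction lock its parents in one consistent order before inserting children -- a sketch (table names are illustrative):

        -- Inside each transaction, lock parent rows in a fixed (ascending key) order
        -- before inserting children, so no two transactions wait on each other crosswise.
        BEGIN;
        SELECT id FROM parent WHERE id IN (7, 42) ORDER BY id FOR UPDATE;
        INSERT INTO child (parent_id, payload) VALUES (7, '...'), (42, '...');
        COMMIT;

    Keeping the background threads' transactions short helps too; long-lived transactions widen the window in which the crosswise wait can occur.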

  • How to return a recordset from a function

    - by Scott
    I'm building a data access layer in Excel VBA and having trouble returning a recordset. The Execute() function in my class is definitely retrieving a row from the database, but doesn't seem to be returning anything. The following function is contained in a class called DataAccessLayer. The class contains functions Connect and Disconnect which handle opening and closing the connection.

        Public Function Execute(ByVal sqlQuery As String) As ADODB.Recordset
            Set recordset = New ADODB.Recordset
            Dim recordsAffected As Long

            ' Make sure we are connected to the database.
            If Connect Then
                Set command = New ADODB.Command
                With command
                    .ActiveConnection = connection
                    .CommandText = sqlQuery
                    .CommandType = adCmdText
                End With

                ' These seem to be equivalent.
                'Set recordset = command.Execute(recordsAffected)
                recordset.Open command.Execute(recordsAffected)

                Set Execute = recordset
                recordset.ActiveConnection = Nothing
                recordset.Close
                Set command = Nothing
                Call Disconnect
            End If
            Set recordset = Nothing
        End Function

    Here's a public function that I'm using in cell A1 of my spreadsheet for testing:

        Public Function Scott_Test()
            Dim Database As New DataAccessLayer
            'Dim rs As ADODB.Recordset
            'Set rs = CreateObject("ADODB.Recordset")
            Set rs = New ADODB.Recordset

            Set rs = Database.Execute("SELECT item_desc_1 FROM imitmidx_sql WHERE item_no = '11001'")
            'rs.Open Database.Execute("SELECT item_desc_1 FROM imitmidx_sql WHERE item_no = '11001'")
            'rs.Open

            ' This never displays.
            MsgBox rs.EOF

            If Not rs.EOF Then
                ' This is displaying #VALUE! in cell A1.
                Scott_Test = rs!item_desc_1
            End If

            rs.ActiveConnection = Nothing
            Set rs = Nothing
        End Function

    What am I doing wrong?
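
    A hedged observation on the code above: Set Execute = recordset returns a reference to the same object, so the recordset.Close and Set recordset = Nothing that follow tear down the very recordset the caller receives. One sketch of a fix is to hand back a disconnected, client-side recordset and leave it open (adUseClient is what makes detaching the connection legal):

        Public Function Execute(ByVal sqlQuery As String) As ADODB.Recordset
            Dim rs As ADODB.Recordset
            If Connect Then
                Set rs = New ADODB.Recordset
                rs.CursorLocation = adUseClient      ' client-side cursor: survives disconnect
                rs.Open sqlQuery, connection, adOpenStatic, adLockReadOnly, adCmdText
                Set rs.ActiveConnection = Nothing    ' detach from the connection...
                Call Disconnect                      ' ...so the connection can be closed
                Set Execute = rs                     ' return it still open; the caller closes it
            End If
        End Function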

  • SQLiteException Unknown error

    - by -providergeoff.bruckner
    Does anyone know what this means? I'm trying to start a transaction in onActivityResult() to insert a row based on the received result.

        03-05 15:39:51.937: ERROR/Database(2387): Failure 21 (out of memory) on 0x0 when preparing 'BEGIN EXCLUSIVE;'.
        03-05 15:39:51.967: DEBUG/AndroidRuntime(2387): Shutting down VM
        03-05 15:39:51.967: WARN/dalvikvm(2387): threadid=3: thread exiting with uncaught exception (group=0x40013140)
        03-05 15:39:51.967: ERROR/AndroidRuntime(2387): Uncaught handler: thread main exiting due to uncaught exception
        03-05 15:39:52.137: ERROR/AndroidRuntime(2387): java.lang.RuntimeException: Failure delivering result ResultInfo{who=null, request=1, result=-1, data=Intent { (has extras) }} to activity {com.ozdroid/com.ozdroid.load.LoadView}: android.database.sqlite.SQLiteException: unknown error: BEGIN EXCLUSIVE;
        ...
        03-05 15:39:52.137: ERROR/AndroidRuntime(2387): Caused by: android.database.sqlite.SQLiteException: unknown error: BEGIN EXCLUSIVE;
        ...
        03-05 15:39:52.137: ERROR/AndroidRuntime(2387):     at android.database.sqlite.SQLiteDatabase.beginTransaction(SQLiteDatabase.java:434)
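
    A hedged reading: the "on 0x0" in the first line suggests the native database handle is null, i.e. the SQLiteDatabase was closed (or never opened) by the time beginTransaction() ran -- a common pattern when the database is closed in an earlier lifecycle callback and then reused in onActivityResult(). A sketch of opening a fresh writable handle at the point of use (MyDbHelper is an assumed SQLiteOpenHelper subclass, and the table/extra names are illustrative):

        @Override
        protected void onActivityResult(int requestCode, int resultCode, Intent data) {
            super.onActivityResult(requestCode, resultCode, data);
            // Open (or fetch a still-open) handle here instead of reusing a closed one.
            SQLiteDatabase db = new MyDbHelper(this).getWritableDatabase();
            db.beginTransaction();
            try {
                ContentValues values = new ContentValues();
                values.put("payload", data.getStringExtra("payload"));
                db.insert("results", null, values);
                db.setTransactionSuccessful();
            } finally {
                db.endTransaction();
                db.close();
            }
        }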

  • Access to Apache::DBI from DBI

    - by hpavc
    Is it possible to access an Apache::DBI database handle from a Perl script that isn't running under mod_perl? What I am looking for is database pooling for my Perl scripts. I have a fair number of database sources (Oracle/MySQL) and a growing number of scripts. Some ideas -- SQLRelay, using Oracle 10g XE with database links and pooling, converting all scripts to SOAP calls, etc. -- are becoming more and more viable. But if there were a mechanism for reusing Apache::DBI, I could fight this a bit. I have no non-Perl requirements, so we don't have a PHP/JDBC implementation or similar to deal with. Thanks
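
    A hedged note: Apache::DBI's "pool" is just per-process handle caching inside mod_perl workers, so there is nothing for an external script to reach into. The closest plain-DBI equivalent for a long-running script is DBI's own connect_cached, sketched below (the DSN is illustrative):

        #!/usr/bin/perl
        use strict;
        use warnings;
        use DBI;

        # connect_cached returns the same underlying handle for identical
        # connection arguments within this process -- a per-process cache,
        # much like what Apache::DBI does inside each mod_perl worker.
        my $dbh = DBI->connect_cached(
            'dbi:mysql:database=mydb;host=localhost',
            'user', 'password',
            { RaiseError => 1, AutoCommit => 1 },
        );

        my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM some_table');
        print "rows: $count\n";

    Cross-process pooling genuinely needs an external broker, which is why SQLRelay and similar proxies keep coming up for this problem.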

  • MySQL Connection Error in PHP

    - by user309381
    I have set the password for root and granted all privileges for root. Why does it say access is denied?

        Warning: mysql_query() [function.mysql-query]: Access denied for user 'SYSTEM'@'localhost' (using password: NO) in C:\wamp\www\photo_gallery\includes\database.php on line 56
        Warning: mysql_query() [function.mysql-query]: A link to the server could not be established in C:\wamp\www\photo_gallery\includes\database.php on line 56
        The Query has problemAccess denied for user 'SYSTEM'@'localhost' (using password: NO)

    Code as follows:

        <?php
        include("DB_Info.php");

        class MySQLDatabase {

            public $connection;

            function _construct() {
                $this->open_connection();
            }

            public function open_connection() {
                /* $DB_SERVER = "localhost";
                   $DB_USER = "root";
                   $DB_PASS = "";
                   $DB_NAME = "photo_gallery"; */
                $this->connection = mysql_connect($DBSERVER, $DBUSER, $DBPASS);
                if (!$this->connection) {
                    die("Database Connection Failed" . mysql_error());
                } else {
                    $db_select = mysql_select_db($DBNAME, $this->connection);
                    if (!$db_select) {
                        die("Database Selection Failed" . mysql_error());
                    }
                }
            }

            function mysql_prep($value) {
                if (get_magic_quotes_gpc()) {
                    $value = stripslashes($value);
                }
                // Quote if not a number
                if (!is_numeric($value)) {
                    $value = "'" . mysql_real_escape_string($value) . "'";
                }
                return $value;
            }

            public function close_connection() {
                if (isset($this->connection)) {
                    mysql_close($this->connection);
                    unset($this->connection);
                }
            }

            public function query($sql) {
                //$sql = "SELECT * FROM users WHERE id = 1";
                $result = mysql_query($sql);
                $this->confirm_query($result);
                //$found_user = mysql_fetch_assoc($result);
                //echo $found_user;
                return $found_user;
            }

            private function confirm_query($result) {
                if (!$result) {
                    die("The Query has problem" . mysql_error());
                }
            }
        }

        $database = new MySQLDatabase();
        ?>
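
    Two hedged observations on the code above, both consistent with the 'SYSTEM'@'localhost' (using password: NO) error, which is what mysql_query() reports when it falls back to an implicit default connection: first, PHP's magic constructor is __construct() with two underscores, so _construct() is never called and no connection is ever opened; second, even when open_connection() does run, $DBSERVER, $DBUSER and $DBPASS are local to that method -- variables set in DB_Info.php at file scope are not visible there. One sketch of a fix using constants (the names and values are illustrative):

        <?php
        // DB_Info.php -- constants are visible in every scope, unlike plain variables.
        define('DB_SERVER', 'localhost');
        define('DB_USER',   'root');
        define('DB_PASS',   'your_root_password');
        define('DB_NAME',   'photo_gallery');
        ?>

    and in the class:

        function __construct() {          // two underscores, so it actually runs
            $this->open_connection();
        }

        public function open_connection() {
            $this->connection = mysql_connect(DB_SERVER, DB_USER, DB_PASS);
            if (!$this->connection) {
                die("Database Connection Failed: " . mysql_error());
            }
            if (!mysql_select_db(DB_NAME, $this->connection)) {
                die("Database Selection Failed: " . mysql_error());
            }
        }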
