Search Results

Search found 17634 results on 706 pages for 'django multi db'.

Page 564/706

  • jQuery: how to handle empty return from getJSON

    - by Gee
    Alright, so I have a PHP script which gets results from a DB, and I'm using jQuery's getJSON to pull those results. It works perfectly, but now I want to do something when the PHP script returns no results (empty). I tried:

        $.getJSON('path/to/script', {parameter: parameter}, function(data) {
            if (data) {
                alert('Result');
            } else {
                alert('Empty');
            }
        });

    But it's no good. I've tried different things like if (data.length) but still nothing. I've noticed that if there is no returned data, the callback never fires at all. So if that's the case, how do I handle an empty return?
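
    One approach (a sketch, not from the original post): an empty response body is not valid JSON, so depending on the jQuery version the success callback either never fires or receives null. Having the PHP script always emit valid JSON (for example an empty array via json_encode) lets the callback run and test for emptiness. The URL and parameter names below are placeholders from the question.

        // Sketch: assumes the PHP script always responds with valid JSON, e.g. echo json_encode($rows);
        $.ajax({
            type: 'GET',
            url: 'path/to/script',              // placeholder URL
            data: {parameter: parameter},
            dataType: 'json',
            success: function (data) {
                // [] and {} are truthy, so test for emptiness explicitly
                if (!data || data.length === 0 || $.isEmptyObject(data)) {
                    alert('Empty');
                } else {
                    alert('Result');
                }
            },
            error: function () {
                // fires when the body is empty or cannot be parsed as JSON
                alert('Empty or invalid response');
            }
        });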

    Read the article

  • Hibernate / MySQL Bulk insert problem

    - by Marty Pitt
    I'm having trouble getting Hibernate to perform a bulk insert on MySQL. I'm using Hibernate 3.3 and MySQL 5.1. At a high level, this is what's happening:

        @Transactional
        public Set<Long> doUpdate(Project project, IRepository externalSource) {
            List<IEntity> entities = externalSource.loadEntites();
            buildEntities(entities, project);
            persistEntities(project);
        }

        public void persistEntities(Project project) {
            projectDAO.update(project);
        }

    This results in n log entries (one for every row), as follows:

        Hibernate: insert into ProjectEntity (name, parent_id, path, project_id, state, type) values (?, ?, ?, ?, ?, ?)

    I'd like to see this get batched, so the update is more performant. This routine could generate tens of thousands of rows, and a DB trip per row is a killer. Why isn't this getting batched? (It's my understanding that Hibernate is supposed to batch inserts by default where appropriate.)
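
    Two settings commonly involved (a sketch, not confirmed as this poster's fix): JDBC batching has to be enabled explicitly in the Hibernate configuration, and it is silently disabled for entities that use an IDENTITY id generator, because Hibernate needs the generated key back after every insert. The property values below are illustrative.

        <!-- hibernate.cfg.xml (or the equivalent persistence.xml / Spring properties) -->
        <property name="hibernate.jdbc.batch_size">50</property>   <!-- illustrative size -->
        <property name="hibernate.order_inserts">true</property>   <!-- group inserts per table -->
        <property name="hibernate.order_updates">true</property>
        <!-- MySQL-specific: lets the driver rewrite a batch into one multi-row INSERT -->
        <!-- jdbc:mysql://host/db?rewriteBatchedStatements=true -->

    If ProjectEntity uses an identity-style generator, switching to a table- or sequence-style generator is usually required before batching takes effect.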

    Read the article

  • How do you make your Java application memory efficient?

    - by Boune
    How do you optimize the heap usage of an application that has a lot (millions) of long-lived objects? (A big cache, loading lots of records from a DB.) Ideas so far:

    - Use the right data type
    - Avoid java.lang.String to represent other data types
    - Avoid duplicated objects
    - Use enums if the values are known in advance
    - Use object pools
    - String.intern() (good idea?)
    - Load/keep only the objects you need

    I am looking for general programming or Java-specific answers. No funky compiler switches. Edit: optimize the memory representation of a POJO that can appear millions of times in the heap. Use cases:

    - Load a huge CSV file in memory (converted into POJOs)
    - Use Hibernate to retrieve millions of records from a database

    Summary of answers:

    - Use the flyweight pattern
    - Copy on write
    - Instead of loading 10M objects with 3 properties, is it more efficient to have 3 arrays (or another data structure) of size 10M? (Could be a pain to manipulate the data, but if you are really short on memory... sketched below.)
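
    A minimal sketch of that last column-oriented idea (field names are illustrative, not from the original post): three primitive arrays replace ten million small POJOs, removing the per-instance object header and reference overhead.

        // Sketch: column-oriented storage instead of one POJO per record.
        public final class RecordColumns {
            private final int[] ids;
            private final long[] timestamps;
            private final double[] amounts;

            public RecordColumns(int capacity) {
                this.ids = new int[capacity];
                this.timestamps = new long[capacity];
                this.amounts = new double[capacity];
            }

            public void set(int row, int id, long timestamp, double amount) {
                ids[row] = id;
                timestamps[row] = timestamp;
                amounts[row] = amount;
            }

            public double amountAt(int row) {
                return amounts[row];
            }
        }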

    Read the article

  • Fluent NHibernate + multiple databases

    - by Pablote
    My project needs to handle three databases, which means three session factories. The thing is, if I do something like this with Fluent NHibernate:

        .Mappings(m => m.FluentMappings.AddFromAssembly(Assembly.GetExecutingAssembly()))

    the factories pick up all the mappings, even the ones that correspond to another database. I've seen that when using automapping you can do something like this and filter by namespace:

        .Mappings(m => m.AutoMappings.Add(AutoMap.AssemblyOf().Where(t => t.Namespace == "Storefront.Entities")))

    I haven't found anything like this for fluent mappings. Is it possible? The only solutions I can think of are to either create separate assemblies for each database's mapping classes or explicitly add each of the entities to the factory configuration. I would prefer to avoid both, if possible. Thanks.
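
    One workaround (a sketch, not from the original post): register the mapping classes for each database explicitly, which FluentMappings.Add supports; the map class and connection-string names below are hypothetical.

        // using FluentNHibernate.Cfg; using FluentNHibernate.Cfg.Db;
        // Sketch: one factory per database, each registering only its own ClassMaps.
        var storefrontFactory = Fluently.Configure()
            .Database(MsSqlConfiguration.MsSql2008.ConnectionString(storefrontConnectionString))
            .Mappings(m => m.FluentMappings
                .Add<StorefrontCatMap>()          // hypothetical map classes
                .Add<StorefrontOrderLineMap>())
            .BuildSessionFactory();

    The other route is the one the question already mentions: point AddFromAssembly at a per-database assembly so each factory only ever sees its own maps.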

    Read the article

  • Rails: how to compare datetime?

    - by fenec
    Hello, I have games in my SQLite DB with the attribute starting_date (t.date :starting_date). I would like to find all the games that have already started, so I am using these lines of code:

        Game.find :all, :conditions => "starting_date <= #{Date.today}"
        Game.find_by_sql("SELECT * FROM games WHERE (created_at < 2010-05-13)")

    The result is nil, even though I know I have games that have already started, like this one:

        #<Game id: 1, team_1_id: 2, team_2_id: 1, status: 2, team_1_points: nil, team_2_points: nil, starting_date: "2010-05-05", winner: 1, sport: "football", country: nil, league: "calcio", created_at: "2010-04-07 00:09:21", updated_at: "2010-05-13 00:57:19">

    What am I doing wrong here?
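
    A likely cause and a sketch of a fix (not from the original post): interpolating the date leaves it unquoted in the SQL, so SQLite evaluates 2010-05-13 as arithmetic rather than a date. Bound parameters quote it correctly.

        # Sketch for the Rails 2.x-style finders used in the question:
        # the array form lets ActiveRecord quote the date properly.
        Game.find(:all, :conditions => ["starting_date <= ?", Date.today])

        # Equivalent raw SQL with the date bound as a parameter:
        Game.find_by_sql(["SELECT * FROM games WHERE starting_date <= ?", Date.today])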

    Read the article

  • Problems using jQuery $.ajax to pass data

    - by iboeno
    I'm using ASP.NET and attempting to call a method with a signature of:

        [WebMethod]
        public static string GetInfo(string id) { ... }

    using the following JavaScript:

        var elementValue = $("#element").attr('id');
        var d = "{id : " + elementValue + "}";
        $.ajax({
            type: "POST",
            url: "../WebPage.aspx/GetInfo",
            data: d,
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function(msg) {
                //do this
            }
        });

    And this is not working. If instead I set elementValue = 2; it works fine. If I try to hardcode a string value for testing purposes, e.g. elementValue = "nameToLookUp";, it fails. Why is this happening, and how do I resolve it? On a side note, why is type: required to be POST instead of GET? In the end I just want to pass a string value to look up in a DB and retrieve some JSON data.
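
    A likely cause (a sketch, not confirmed as the poster's fix): the hand-built JSON string leaves the value unquoted, so numbers happen to parse but strings do not. Serializing the object produces valid JSON either way (JSON.stringify is native in current browsers; older ones need json2.js).

        // Sketch: build the payload with JSON.stringify instead of string concatenation.
        var elementValue = $("#element").attr('id');
        var d = JSON.stringify({ id: elementValue });   // yields {"id":"nameToLookUp"}

        $.ajax({
            type: "POST",   // page methods reject GET unless [ScriptMethod(UseHttpGet = true)] is set
            url: "../WebPage.aspx/GetInfo",
            data: d,
            contentType: "application/json; charset=utf-8",
            dataType: "json",
            success: function (msg) {
                // ASP.NET AJAX wraps the return value in msg.d
                console.log(msg.d);
            }
        });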

    Read the article

  • Embed a database in the .apk of a distributed application [Android]

    - by Sephy
    Hi everybody, my question is, I think, quite simple, but I don't think the answer will be... I have quite a lot of content that my application needs in order to run properly, and I'm thinking of putting all of it in a database and distributing it as an embedded database with the application on the market. The only trouble is that I have no idea how to do that. I know that I can extract a .db file from Eclipse DDMS with the content of my database, and I suppose I need to put it in the assets folder of my application, but then how do I make the application use it to regenerate the application database? If you have any link to some code or help, that would be great. Thanks
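
    A common approach (a sketch, assuming the file was placed in assets/ as prepackaged.db; the file name is hypothetical): copy the asset into the app's databases directory on first run, then open it with SQLiteDatabase as usual.

        // imports: java.io.*, android.content.Context, android.database.sqlite.SQLiteDatabase
        // Sketch: copy assets/prepackaged.db to the app's database path on first launch.
        private void copyDatabaseFromAssets(Context context) throws IOException {
            File target = context.getDatabasePath("prepackaged.db");
            if (target.exists()) {
                return;                          // already copied on a previous run
            }
            target.getParentFile().mkdirs();     // databases directory may not exist yet

            InputStream in = context.getAssets().open("prepackaged.db");
            OutputStream out = new FileOutputStream(target);
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            out.flush();
            out.close();
            in.close();
        }

        // Afterwards:
        // SQLiteDatabase db = SQLiteDatabase.openDatabase(
        //         context.getDatabasePath("prepackaged.db").getPath(), null, SQLiteDatabase.OPEN_READONLY);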

    Read the article

  • Read and write .NET Objects in SQL Database without serialization.

    - by Mohit
    Hello, I have a small query. I need to create a caching service of my own that will write and read .NET objects to and from the database. I have achieved that with the help of binary serialization, but the problem is that I have to deliberately mark my objects as [Serializable], which makes me wonder what happens if someone tries to add an object that is not marked [Serializable]. So I need to find a way to read and write objects to the database without serialization. One thought: as we all know, Session can store any object, and sessions can be stored out-of-proc in the DB. What mechanism does it use to store these objects without serializing or deserializing? Any help will be highly appreciated. Thanks. M.B
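
    A note and a sketch (not from the original post): out-of-proc session state does serialize its contents, so it does not avoid the requirement. One pragmatic guard is to check Type.IsSerializable and fall back to an XML serializer, which handles public types without any attribute but only round-trips public read/write members. The class below is hypothetical.

        // Sketch: BinaryFormatter when the type allows it, otherwise XmlSerializer as a fallback.
        // XmlSerializer needs a public parameterless constructor and only covers public members.
        using System;
        using System.IO;
        using System.Runtime.Serialization.Formatters.Binary;
        using System.Xml.Serialization;

        public static class CachePayload
        {
            public static byte[] ToBytes(object value)
            {
                using (var stream = new MemoryStream())
                {
                    if (value.GetType().IsSerializable)
                    {
                        new BinaryFormatter().Serialize(stream, value);
                    }
                    else
                    {
                        new XmlSerializer(value.GetType()).Serialize(stream, value);
                    }
                    return stream.ToArray();
                }
            }
        }

    A real implementation would also record which formatter was used so the bytes can be read back the same way.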

    Read the article

  • Tips for maximizing Nginx requests/sec?

    - by linkedlinked
    I'm building an analytics package, and project requirements state that I need to support 1 billion hits per day. Yep, "billion". In other words, no less than 12,000 hits per second sustained, and preferably some room to burst. I know I'll need multiple servers for this, but I'm trying to get maximum performance out of each node before "throwing more hardware at it".

    Right now, the hits-tracking portion is completed and well optimized. I pretty much just save the requests straight into Redis (for later processing with Hadoop). The application is Python/Django with gunicorn as the gateway. My 2GB Ubuntu 10.04 Rackspace server (not a production machine) can serve about 1,200 static files per second (benchmarked using ApacheBench against a single static asset). To compare, if I swap out the static file link with my tracking link, I still get about 600 requests per second -- I think this means my tracker is well optimized, because it's only a factor of 2 slower than serving static assets.

    However, when I benchmark with millions of hits, I notice a few things:

    - No disk usage -- this is expected, because I've turned off all Nginx logs, and my custom code doesn't do anything but save the request details into Redis.
    - Non-constant memory usage -- presumably due to Redis's memory management, my memory usage will gradually climb up and then drop back down, but it has never once been my bottleneck.
    - System load hovers around 2-4, the system is still responsive during even my heaviest benchmarks, and I can still manually view http://mysite.com/tracking/pixel with little visible delay while my (other) server performs 600 requests per second.
    - If I run a short test, say 50,000 hits (takes about 2m), I get a steady, reliable 600 requests per second. If I run a longer test (tried up to 3.5m so far), my r/s degrades to about 250.

    My questions:

    a. Does it look like I'm maxing out this server yet? Is 1,200/s static-file nginx performance comparable to what others have experienced?
    b. Are there common nginx tunings for such high-volume applications? I have worker threads set to 64, and gunicorn worker threads set to 8, but tweaking these values doesn't seem to help or harm me much.
    c. Are there any Linux-level settings that could be limiting my incoming connections?
    d. What could cause my performance to degrade to 250 r/s on long-running tests? Again, memory is not maxing out during these tests, and HDD use is nil.

    Thanks in advance, all :)
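
    Some commonly cited tunings relevant to questions b and c (a sketch; values are illustrative starting points, not measured recommendations):

        # /etc/nginx/nginx.conf (fragment)
        worker_processes  4;                 # usually ~1 per core rather than 64
        worker_rlimit_nofile  65535;
        events {
            worker_connections  4096;
            multi_accept        on;
        }
        http {
            access_log          off;         # already done in the question
            keepalive_requests  1000;        # reuse connections during benchmarks
        }

        # /etc/sysctl.conf (fragment) -- kernel limits that often cap incoming connections
        net.core.somaxconn = 4096
        net.ipv4.ip_local_port_range = 10240 65535
        net.ipv4.tcp_tw_reuse = 1
        fs.file-max = 200000

    On question d, degradation on long runs is often ephemeral-port or TIME_WAIT exhaustion on either the load generator or the server; watching netstat for piles of TIME_WAIT sockets during a long run would confirm it, and the port-range and tw_reuse settings above are the usual mitigation.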

    Read the article

  • MySQL - mysqldump --routines to only export 1 stored procedure (by name) and not every routine

    - by Joe Stein
    We have a lot of routines that come out when exporting. We often need to get these out via the CLI, make changes, and bring them back in. Yes, some of these are managed by different folks and better change control is required, but for now this is the situation. If I do:

        mysqldump --routines --no-create-info --no-data --no-create-db

    then great, I have 200 functions and I need to go through a file to find just the one (or the set) I want. Is there any way to mysqldump only the routines I want, like there is for tables?
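
    mysqldump has no per-routine flag, but the definition of a single routine can be pulled other ways (a sketch; database and routine names are placeholders):

        -- Print the CREATE statement for one routine from the mysql client
        SHOW CREATE PROCEDURE mydb.my_proc\G
        SHOW CREATE FUNCTION mydb.my_func\G

        -- Or pull the body straight out of the catalog (MySQL 5.x)
        SELECT body FROM mysql.proc WHERE db = 'mydb' AND name = 'my_proc';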

    Read the article

  • Unread email notifier, most practical approach

    - by Michael Pasqualone
    I'm in the process of writing a small php-cli script that will loop over my personal inbox and then send me an SMS via a gateway. The question I have is: as the script will launch via cron every 10 minutes, if there is an email sitting in my inbox that is not read before the next script launch, then I will receive 2 SMSes. Does anyone have any idea (pseudocode will do) what the best practice would be in PHP 5 to ensure only 1 SMS is sent? What I am currently leaning towards is storing the message ID in an SQLite DB and flagging a field for whether an SMS has been sent or not -- but I'm wondering if there is an easier way?
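
    A minimal sketch of the SQLite idea using PDO (table, file, and function names are hypothetical): INSERT OR IGNORE against a UNIQUE message ID makes the "already notified?" check a single statement.

        <?php
        // Sketch: remember which message IDs have already triggered an SMS.
        $db = new PDO('sqlite:' . __DIR__ . '/notified.sqlite');
        $db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
        $db->exec('CREATE TABLE IF NOT EXISTS notified (message_id TEXT PRIMARY KEY)');

        function alreadyNotified(PDO $db, $messageId) {
            // rowCount() is 0 when the id was already present, 1 when it was just inserted
            $stmt = $db->prepare('INSERT OR IGNORE INTO notified (message_id) VALUES (:id)');
            $stmt->execute(array(':id' => $messageId));
            return $stmt->rowCount() === 0;
        }

        foreach ($unreadMessages as $msg) {          // $unreadMessages: from the IMAP/POP loop
            if (!alreadyNotified($db, $msg['message_id'])) {
                sendSms($msg['subject']);            // hypothetical gateway call
            }
        }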

    Read the article

  • VS 2010 Entity Repository Error

    - by Steve
    In my project I have it set up so that all the tables in the DB have the property "id", and the entity objects inherit from the EntityBase class using a repository pattern. I then set the inheritance modifier for the "id" property in the dbml file's O/R designer to "Overrides".

        Public MustInherit Class EntityBase
            MustOverride Property id() As Integer
        End Class

        Public MustInherit Class RepositoryBase(Of T As EntityBase)
            Protected _Db As New DataClasses1DataContext

            Public Function GetById(ByVal Id As Integer) As T
                Return (From a In _Db.GetTable(Of T)() Where a.id = Id).SingleOrDefault
            End Function
        End Class

        Partial Public Class Entity1
            Inherits EntityBase
        End Class

        Public Class TestRepository
            Inherits RepositoryBase(Of Entity1)
        End Class

    However, the line

        Return (From a In _Db.GetTable(Of T)() Where a.id = Id).SingleOrDefault

    produces the error "Class member EntityBase.id is unmapped" when I use VS 2010 with the 4.0 framework, but I never received that error with the old one. Any help would be greatly appreciated. Thanks in advance.

    Read the article

  • Python library to detect if a file has changed between different runs?

    - by Stefano Borini
    Suppose I have a program A. I run it, and it performs some operation starting from a file foo.txt. Then A terminates. On a new run of A, it checks whether the file foo.txt has changed. If the file has changed, A runs its operation again; otherwise, it quits. Does a library function or external library for this exist? Of course it can be implemented with an MD5 hash plus a file/db containing the MD5. I want to avoid reinventing the wheel.
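
    For reference, the hand-rolled version is only a few lines of stdlib (a sketch; the state-file name is arbitrary):

        # Sketch: remember foo.txt's MD5 between runs and report whether it changed.
        import hashlib
        import json
        import os

        STATE_FILE = ".foo_state.json"   # arbitrary name

        def file_changed(path, state_file=STATE_FILE):
            digest = hashlib.md5(open(path, "rb").read()).hexdigest()
            previous = None
            if os.path.exists(state_file):
                with open(state_file) as f:
                    previous = json.load(f).get(path)
            with open(state_file, "w") as f:
                json.dump({path: digest}, f)
            return digest != previous

        if file_changed("foo.txt"):
            print("foo.txt changed since the last run")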

    Read the article

  • Problem with NHibernate and saving.

    - by Vilx-
    When I do this:

        Cat x = Session.Load<Cat>(123);
        x.Name = "fritz";
        Session.Flush();

    NHibernate detects the change and UPDATEs the DB. But, when I do this:

        Cat x = new Cat();
        Session.Save(x);
        x.Name = "fritz";
        Session.Flush();

    I get NULL for name, because that's what was there when I called Session.Save(). Why doesn't NHibernate detect the changes - or better yet, take the values for the INSERT statement at the time of Flush()?
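
    One common explanation and workaround (a sketch, not confirmed for this exact case): with an identity-style generator the INSERT is issued at Save() so the id can be obtained, and later property changes would need a second round trip; setting the state before Save keeps everything in the single INSERT.

        // Sketch: populate the entity before handing it to the session.
        using (var tx = Session.BeginTransaction())
        {
            var cat = new Cat { Name = "fritz" };
            Session.Save(cat);      // the INSERT already carries the name
            tx.Commit();            // committing flushes the session
        }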

    Read the article

  • Call to a member function query() on a non-object

    - by Randy Gonzalez
    OK, this is so weird!!! I am running PHP 5.1.6, and when I try to run the code below it gives a fatal error about an object that has not been instantiated. As soon as I un-comment the line //$cb_db = new cb_db(USER, PASSWORD, NAME, HOST); everything works, even though I have declared the $cb_db object as global within the method. Any help would be greatly appreciated.

        require_once ( ROOT_CB_CLASSES . 'db.php');
        $cb_db = new cb_db(USER, PASSWORD, NAME, HOST);

        class cb_user {
            protected function find_by_sql( $sql ) {
                global $cb_db;
                //$cb_db = new cb_db(USER, PASSWORD, NAME, HOST);
                $result_set = $cb_db->query( $sql );
                $object_array = array();
                while( $row = $cb_db->fetch_array( $result_set ) ) {
                    $object_array[] = self::instantiate( $row );
                }
                return $object_array;
            }
        }
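
    One way to sidestep the global entirely (a sketch, not from the original post): expose the connection through a static accessor. The usual reason `global` comes back empty is that the top-level assignment itself ran inside a function or different scope, so $cb_db was never actually a global. The registry class name is hypothetical; cb_db and its methods are the question's own.

        <?php
        // Sketch: a lazily created, shared cb_db instance instead of a global variable.
        class cb_registry {
            private static $db = null;

            public static function db() {
                if (self::$db === null) {
                    self::$db = new cb_db(USER, PASSWORD, NAME, HOST);
                }
                return self::$db;
            }
        }

        class cb_user {
            protected function find_by_sql( $sql ) {
                $cb_db = cb_registry::db();
                $result_set = $cb_db->query( $sql );
                $object_array = array();
                while ( $row = $cb_db->fetch_array( $result_set ) ) {
                    $object_array[] = self::instantiate( $row );
                }
                return $object_array;
            }
        }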

    Read the article

  • How easy would it be to refactor a small JSP/Servlet/JDBC project to SpringMVC/Hibernate

    - by John
    With reference to this post, I am considering starting a new web-based Java project. Since I don't know Spring/Hibernate, I was concerned that it might be a bad plan to start learning them while creating a new project, especially since it will slow down early development. One idea I had was to write a prototype using tech I do know, namely JSP/Servlets/JDBC, since I can get this running much quicker with my current knowledge. I could then throw the whole thing away and start over with Spring, etc., but I'd like to consider how easy it would be to refactor a smallish project from JSP/Servlets/JDBC to SpringMVC/Hibernate. My DB could of course be re-used, but what about other code... would I expect to save most of it once plugged into an MVC framework, or is the paradigm shift big enough that this would cause more trouble than it avoids? Please use the other question for more general advice on choosing technologies.

    Read the article

  • Query executing problem

    - by srini-r85
    Hi, I tried to execute the following query in a PHP script:

        $db_selected = mysql_select_db("lumiinc1_sndemo1", $con);
        if ($db_selected) {
            echo "database connected";
        } else {
            die ("Can\'t use db : " . mysql_error());
        }

        $sql = "INSERT INTO `markers` ( `name`, `address`, `lat`, `lng`, `id` )
                SELECT `name`, `street`, `latitude`, `longitude`, `lid`
                FROM `location`
                WHERE NOT EXISTS (
                    SELECT * FROM `markers` WHERE `location`.`lid` = `markers`.`id`
                )";
        $result = mysql_query($sql);
        if ($result) {
            echo "Query executed OK";
        } else {
            die("Invalid query: " . mysql_error());
        }

    The script does not show any error, and the query executes, but I don't get my expected result. When I try the same query in phpMyAdmin, I do get the expected result. I don't know the cause of this problem. Please, can anyone find the problem? Thanks.
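
    A quick way to narrow it down (a sketch, not from the original post): check how many rows the INSERT actually affected and which database/host the script is really talking to -- a script pointed at a different server or schema than phpMyAdmin is the most common explanation when the same statement "works there but not here".

        <?php
        // Sketch: add right after mysql_query($sql) in the script above.
        echo "Rows inserted: " . mysql_affected_rows($con) . "\n";

        $check = mysql_query("SELECT DATABASE() AS db, @@hostname AS host", $con);
        $row = mysql_fetch_assoc($check);
        echo "Connected to database {$row['db']} on {$row['host']}\n";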

    Read the article

  • [jquery] Appending to Second Last element

    - by Shishant
    Hello, this is my HTML:

        <li id='$id'>TEXT
            <ul class='indent'>
                <li id='$id'>TEXT</li>
                <li id='$id'>TEXT</li>
                <li class='formContainer'> FORM </li>
            </ul>
        </li>

    I want to append an li just before the form li, after all the other li elements. So in this example the new li will be appended between "Test141" and the APPEND input box (shown in the screenshot of the final output). The $id values are unique DB ids for each li.
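
    A sketch of one way to do it with jQuery (selectors assume the markup above; parentId and the new content are placeholders):

        // Sketch: insert a new <li> just before the .formContainer item inside one parent li.
        var newItem = $("<li>").text("new row");                 // placeholder content
        $("#" + parentId + " > ul.indent > li.formContainer").before(newItem);

        // Or, without knowing the parent id, target every indented list at once:
        $("ul.indent li.formContainer").each(function () {
            $(this).before($("<li>").text("new row"));
        });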

    Read the article

  • Why doesn't NHibernate support this syntax?

    - by ooo
    I have the following query and it's failing in NHibernate 3 LINQ with a "Non supported" exception. My DB tables are:

        VacationRequest (id, personId)
        VacationRequestDate (id, vacationRequestId)
        Person (id, FirstName, LastName)

    My entities are:

        VacationRequest (Person, IList)
        VacationRequestDate (VacationRequest, Date)

    Here is the query that is getting the exception:

        Session.Query<VacationRequestDate>()
            .Where(r => people.Contains(r.VacationRequest.Person, new PersonComparer()))
            .Fetch(r => r.VacationRequest)
            .ToList();

    Is there a better way to write this that would be supported in NHibernate? FYI, the PersonComparer just compares person.Id.
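
    One likely explanation and rewrite (a sketch, not confirmed as the accepted answer): a LINQ provider cannot translate a custom IEqualityComparer into SQL, but comparing the ids directly expresses the same condition in a translatable way. `people` is the in-memory list from the question.

        // Sketch: project the ids up front so Contains becomes a plain SQL IN clause.
        var personIds = people.Select(p => p.Id).ToList();

        var dates = Session.Query<VacationRequestDate>()
            .Where(r => personIds.Contains(r.VacationRequest.Person.Id))
            .Fetch(r => r.VacationRequest)
            .ToList();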

    Read the article

  • Best way to correct garbled data caused by false encoding

    - by ercan
    Hi all, I have a set of data that contains garbled text fields because of encoding errors during many imports/exports from one database to another. Most of the errors were caused by converting UTF-8 to ISO-8859-1. Strangely enough, the errors are not consistent: the word 'München' appears as 'MÃ¼nchen' in some places and as 'MÜnchen' in others. Is there a trick in SQL Server to correct this kind of crap? The first thing I can think of is to exploit the COLLATE clause, so that 'Ã¼' is interpreted as 'ü', but I don't know exactly how. If it isn't possible to do it at the DB level, do you know any tool that helps with a bulk correction? (Not a manual find/replace tool, but a tool that somehow guesses the garbled text and corrects it.)
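
    For the common single-conversion case (UTF-8 bytes read as ISO-8859-1/Windows-1252), a brute-force cleanup is just a set of REPLACEs over the known two-character sequences (a sketch; the table and column names are placeholders, and the mapping only covers a few examples):

        -- Sketch: undo the most frequent UTF-8-as-Latin-1 sequences in one column.
        -- Assumes an accent-sensitive collation; add COLLATE Latin1_General_BIN if needed.
        UPDATE dbo.Customers
        SET City = REPLACE(REPLACE(REPLACE(REPLACE(City,
                       N'Ã¼', N'ü'),
                       N'Ã¶', N'ö'),
                       N'Ã¤', N'ä'),
                       N'ÃŸ', N'ß')
        WHERE City LIKE N'%Ã%';

    A COLLATE trick alone cannot do the repair, since two characters ('Ã' + '¼') have to be collapsed back into one; REPLACE (or a client-side fix in a language with real encoding conversion) is what actually rewrites the data.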

    Read the article

  • Easy plugin or procedure for SQL Server Management Studio to script row inserts

    - by Patrick Karcher
    I've never been able to find a good script or plugin for SQL Server Management Studio (2005 and/or 2008) for a very common scripting need: specifying a few (or all) rows in a table and scripting their inserts. You can guess my story: I've got some configuration data in my dev DB and I need to script it for deployment to UAT and then production. I've found a few kludgy systems in the past that were more trouble than they were worth. I need something free and unobtrusive. Once I find it, I'll share it with the other 20 developers in my shop who are annoyed by this. Aren't we all annoyed by this, by the way? What is the best, easiest, free way to specify a few (or all) rows in a table and get a script of their inserts?
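
    Two no-install options worth noting (hedged, since the exact menu wording varies by SSMS version): the SSMS 2008 Generate Scripts wizard has an option to script data as INSERTs for a whole table, and for a filtered subset a throwaway SELECT can build the statements itself. The table and columns below are placeholders.

        -- Sketch: generate INSERT statements for selected rows of a small config table.
        SELECT 'INSERT INTO dbo.AppConfig (ConfigKey, ConfigValue) VALUES ('
               + '''' + REPLACE(ConfigKey, '''', '''''') + ''', '
               + '''' + REPLACE(ConfigValue, '''', '''''') + ''');'
        FROM dbo.AppConfig
        WHERE Environment = 'UAT';   -- placeholder filter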

    Read the article

  • Transfer Core Data from One Project to Another

    - by Michael
    The answer is probably a resounding 'NO', but before I start a new project from scratch, I thought I'd ask. I create many throwaway projects to test ideas and code before combining all the successful bits from the scratch projects into a final version. So I have one project with the Core Data stuff worked out, but I want to move it to a new project. My guess is that there are too many internal hooks, and dropping in the .xcdatamodel and the SQLite db is just not going to work. I'd be glad to be wrong...

    Read the article

  • Service Broker not working after database restore

    - by roryok
    I have a working Service Broker set up on a server, and we're in the process of moving to a new server, but I can't seem to get Service Broker set up on the new box. I have done the obvious (to me) things, like enabling the broker on the DB, dropping the route, services, contract, queues and even the message type and re-adding them, and setting ALTER QUEUE with STATUS ON.

        SELECT * FROM sys.service_queues

    gives me a list of the queues, including my own two, which show as activation_enabled, receive_enabled, etc. Needless to say the queues aren't working: when I drop messages into them nothing goes in and nothing comes out. Any ideas? I'm sure there's something really obvious I've missed...
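
    One thing commonly suggested after a restore (a hedged sketch, not confirmed as this poster's fix): the restored database keeps its old service broker GUID and flags, so the broker may need to be explicitly re-enabled or given a new identifier. Note that NEW_BROKER ends any existing conversations; the database name below is a placeholder.

        -- Check whether the broker is actually enabled on the restored database
        SELECT name, is_broker_enabled, service_broker_guid
        FROM sys.databases
        WHERE name = 'MyDatabase';

        -- Re-enable it, or issue a fresh broker identity for the restored copy
        ALTER DATABASE MyDatabase SET ENABLE_BROKER WITH ROLLBACK IMMEDIATE;
        -- or, if conversations are orphaned / the GUID clashes with the old server:
        ALTER DATABASE MyDatabase SET NEW_BROKER WITH ROLLBACK IMMEDIATE;

    If messages still vanish, sys.transmission_queue usually holds the undelivered messages along with a transmission_status explaining why.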

    Read the article

  • How to batch retrieve documents with mongoDB?

    - by edude05
    Hello everyone, I have an application that queries data from MongoDB using the MongoDB C# driver, something like this:

        public void main() {
            foreach (int i in listOfKey) {
                list.add(getObjectfromDB(i));
            }
        }

        public myObject getObjFromDb(int primaryKey) {
            document query = new document();
            query["primKey"] = primaryKey;
            document result = mongo["myDatabase"]["myCollection"].findOne(query);
            return parseObject(result);
        }

    On my local (development) machine, getting 100 objects this way takes less than a second. However, I recently moved the database to a server on the internet, and the same query takes about 30 seconds for the same number of objects. Furthermore, looking at the MongoDB log, it seems to open about 8-10 connections to the DB to perform this query. So what I'd like to do is query the database for an array of primary keys and get them all back at once, then do the parsing in a loop afterwards, using one connection if possible. How could I optimize my query to do so? Thanks, --Michael
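
    A sketch of the batched version using Mongo's $in operator, written against the same driver conventions the question uses (the exact Document/Find API may differ slightly between driver versions; the method name is hypothetical):

        // Sketch: fetch every document whose primKey is in the key list with a single query.
        public List<myObject> getObjectsFromDb(int[] keys)
        {
            var inClause = new Document();
            inClause["$in"] = keys;                  // { primKey: { $in: [ ... ] } }
            var query = new Document();
            query["primKey"] = inClause;

            var results = new List<myObject>();
            foreach (Document doc in mongo["myDatabase"]["myCollection"].Find(query).Documents)
            {
                results.Add(parseObject(doc));
            }
            return results;
        }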

    Read the article

  • Disable Primary Key and Re-Enable After SQL Bulk Insert

    - by Jon
    I am about to run a massive data insert into my DB. I have managed to work out how to enable and rebuild non-clustered indexes on my tables, but I also want to disable/enable primary keys. You can't disable the clustered index for the primary key, as the table is inaccessible when that is done, and my attempt to do an ALTER TABLE for constraints does not work, as I think that is only for foreign keys. Do you know of a way to disable the primary key and re-enable it after a SQL bulk insert? NOTE: This is over numerous tables, so I don't know the exact primary key specifications, e.g. name etc.
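
    For what it's worth, a clustered primary key cannot be disabled without taking the table offline, so the usual alternative is to drop the PK constraints before the load and re-add them afterwards. A sketch that generates those statements from the catalog views without knowing the constraint names (single-column keys only; composite keys would need the sys.index_columns rows combined per constraint):

        -- Sketch: build DROP / re-ADD statements for every primary key in the database.
        SELECT
            'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
                + ' DROP CONSTRAINT ' + QUOTENAME(kc.name) + ';' AS drop_stmt,
            'ALTER TABLE ' + QUOTENAME(s.name) + '.' + QUOTENAME(t.name)
                + ' ADD CONSTRAINT ' + QUOTENAME(kc.name)
                + ' PRIMARY KEY ' + CASE i.type WHEN 1 THEN 'CLUSTERED' ELSE 'NONCLUSTERED' END
                + ' (' + QUOTENAME(c.name) + ');' AS add_stmt
        FROM sys.key_constraints kc
        JOIN sys.tables t         ON t.object_id = kc.parent_object_id
        JOIN sys.schemas s        ON s.schema_id = t.schema_id
        JOIN sys.indexes i        ON i.object_id = t.object_id AND i.index_id = kc.unique_index_id
        JOIN sys.index_columns ic ON ic.object_id = i.object_id AND ic.index_id = i.index_id
        JOIN sys.columns c        ON c.object_id = ic.object_id AND c.column_id = ic.column_id
        WHERE kc.type = 'PK';

    Run the generated drop statements, bulk insert, then the add statements; be aware that dropping a clustered PK turns the table into a heap and re-adding it rebuilds the whole table, which is not always a net win.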

    Read the article
