Search Results

Search found 30858 results on 1235 pages for 'database tuning'.


  • Remote PostgreSQL - extremely slow

    - by Muffinbubble
    Hi, I have set up PostgreSQL on a VPS I own - the software that accesses the database is a program called PokerTracker. PokerTracker logs all your hands and statistics whilst you play online poker. I wanted this accessible from several different computers, so I decided to install it on my VPS, and after a few hiccups I managed to get it connecting without errors. However, the performance is dreadful. I have done tons of research on 'remote postgresql slow' etc. and have yet to find an answer, so I am hoping someone is able to help. Things to note:
    - The query I am trying to execute is very small. Connecting locally on the VPS, it runs instantly; running it remotely, it takes about 1 minute and 30 seconds.
    - The VPS is on a 100 Mbps connection and the computer I'm connecting from is on an 8 Mb line.
    - The network communication between the two is almost instant: I can connect remotely with no lag whatsoever, and I host several websites running MSSQL whose queries all run instantly, whether connected remotely or locally, so it seems specific to PostgreSQL.
    - I'm running the newest version of the software and the newest compatible version of PostgreSQL.
    - The database is new, contains hardly any data, and I've run vacuum/analyze etc. to no avail - I see no improvement.
    I don't understand how MSSQL can query almost instantly yet PostgreSQL struggles so much. I am able to telnet to port 5432 on the VPS IP with no problems, and as I say the query does execute, it just takes an extremely long time. What I do notice on the router while the query is running is that hardly any bandwidth is being used - but then again I wouldn't expect it to be for a simple query, so I am not sure whether this is the issue. I've tried connecting remotely from 3 different networks now (including different routers) but the problem remains, while connecting from another machine on the LAN is instant. I have also edited postgresql.conf to allow more memory/buffers etc., but I don't think this is the problem - what I am asking it to do is very simple and shouldn't be intensive at all. Thanks, Ricky
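    A minimal diagnostic sketch (illustrative only - it assumes the Npgsql ADO.NET driver, a made-up connection string and a made-up table, whereas PokerTracker connects through its own driver) that times connection setup separately from query execution; if nearly all of the 90 seconds goes into Open(), the culprit is connection establishment (DNS/reverse-DNS lookups, SSL negotiation, authentication) rather than the query itself:

        using System;
        using System.Diagnostics;
        using Npgsql;

        class RemoteLatencyCheck
        {
            static void Main()
            {
                // Hypothetical connection string - keyword names vary slightly by Npgsql version.
                var connString = "Host=my.vps.example;Port=5432;Database=pokertracker;Username=pt;Password=secret";

                var sw = Stopwatch.StartNew();
                using (var conn = new NpgsqlConnection(connString))
                {
                    conn.Open();
                    Console.WriteLine("Open() took {0} ms", sw.ElapsedMilliseconds);

                    sw.Restart();
                    // 'players' is a made-up table; substitute any small real query.
                    using (var cmd = new NpgsqlCommand("SELECT count(*) FROM players", conn))
                    {
                        var result = cmd.ExecuteScalar();
                        Console.WriteLine("Query took {0} ms, result = {1}", sw.ElapsedMilliseconds, result);
                    }
                }
            }
        }

    If Open() dominates, common first checks are that log_hostname is off and that pg_hba.conf does not use hostnames (both trigger reverse DNS lookups), plus the server's SSL settings; if the query itself dominates despite running instantly on the VPS, the time is going into round trips on the wire.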

    Read the article

  • How to write a DataTable to an Excel workbook on the client side

    - by senthilkumar
    Hi guys. I am currently working on a project in which we need to generate a report from a database. Since my server memory is too low, I get an 'Out of Memory' exception when I build the file server-side; and when I write directly to an Excel file by sending it with an Excel HTTP header, I am not able to create multiple sheets, which I need because my database table is huge - more than 65,536 rows. I have seen many solutions using a third-party tool, but I can't use those in my project. If anyone has already worked on this, please give me some direction. I also tried using JavaScript, but for that I would need to use a DataGrid on the server side, and in my project I am not allowed to do that.
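    One low-memory workaround (a sketch only, with invented query and page names - it produces a single CSV that Excel will open, rather than a true multi-sheet workbook) is to stream rows straight from a SqlDataReader into the response, so the full report never has to sit in server memory:

        using System;
        using System.Data.SqlClient;

        public partial class ReportExport : System.Web.UI.Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                Response.Clear();
                Response.ContentType = "text/csv";
                Response.AddHeader("Content-Disposition", "attachment; filename=report.csv");

                using (var conn = new SqlConnection("...your connection string..."))
                using (var cmd = new SqlCommand("SELECT Col1, Col2, Col3 FROM BigReportTable", conn))
                {
                    conn.Open();
                    using (var reader = cmd.ExecuteReader())
                    {
                        int rowCount = 0;
                        while (reader.Read())
                        {
                            // Naive CSV quoting - fine for a sketch, not for arbitrary data.
                            Response.Write(string.Format("\"{0}\",\"{1}\",\"{2}\"\r\n",
                                reader[0], reader[1], reader[2]));

                            if (++rowCount % 1000 == 0)
                                Response.Flush();   // push the buffer so memory use stays flat
                        }
                    }
                }
                Response.End();
            }
        }

    CSV also sidesteps the 65,536-row limit of .xls sheets (.xlsx raised it to 1,048,576 rows), at the cost of formatting and multiple sheets.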

    Read the article

  • Seeking tutorial: introduction to ODBC with Delphi

    - by mawg
    I have a lot of embedded C/C++/Ada experience and an outdated smattering of Delphi, plus some database stuff. Now I have to implement an app in Delphi which can manipulate MySQL, Oracle, and maybe MS Access. In short, I need ODBC. I need to programmatically create a database, define its structure and populate its contents, then later query its existence and search it programmatically. I would prefer not to use 3rd-party components unless there is a compelling reason to do so (performance ought not to be an issue for the app; it won't have much data or be run often, at least not in v1.0). Can anyone point me at a tutorial which can get me up to speed? Thanks

    Read the article

  • Longish list of allowed referrers

    - by Tim
    I want to allow hotlinking only from a list of referrers (paying customers, probably a few hundred). I am on Apache 1.3 and I do not have access to the configuration (only .htaccess). What is the fastest way to implement this? My thoughts so far:
    - PHP with a database and readfile()
    - (SSI with) Perl and a database
    - the list implemented as symlinks named after the allowed referrer, then a RewriteCond using HTTP_REFERER
    - everything in .htaccess, lots of RewriteCond's
    - everything in .htaccess, lots of SetEnvIf's
    Any better (faster) ways to do this? Thanks!

    Read the article

  • Need Help with Page Life Cycle (I think it is screwing me up)

    - by chobo2
    Hi, I have dragged an empty ASP.NET Table onto my web form and I generate all of its rows in the code-behind. The table gets filled and contains drop-down lists. When the user hits Save, I go through all the rows and update the database with the values from the drop-down lists. This all works great. However, if 2 columns each have "Present" selected, those 2 columns should no longer be shown and 2 new columns with other drop-down lists should be put in their place. This also works, except that you have to refresh the entire page for the 2 columns that should go away to actually go away. So I tried, at the end of the button click event, clearing the whole table and then regenerating it. However, when I do that, my values are no longer saved to the database, for whatever reason.

    Works (all correct values are saved), but the table shown to the user is not updated:

        if (IsPostBack == false)
        {
            // check if dummy variables exist in the db - if true, just generate the table
            // with the values in the db; if not, generate them
        }
        else
        {
            // grab the values from the database
            // generate the table with those values
        }
        btn click event
        {
            // go through all rows in the table (foreach loop)
            // update each column in the database with the cells in each row
        }

    Does not work - by the time it gets to the click event it says there are zero rows in the table, so nothing happens:

        if (IsPostBack == false)
        {
            // same code as above
        }
        // if postback is true, do nothing
        btn click event
        {
            // same code
        }

    Fails also - this last one does nothing, as for some reason it does not update anything:

        if (IsPostBack == false)
        {
            // same code as above
        }
        else
        {
            // same code as above but moved into its own method: generateTable();
        }
        btn click event
        {
            // update all rows
            // once done, clear the table's rows
            // call generateTable()
        }

    I don't understand why. So what am I doing wrong? Something in my process must be fighting the page life cycle. The code works, just not when I want the table to be updated right away.
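    A sketch of the usual fix (the helper names BuildTable and SaveRow are invented stand-ins, not the asker's code): dynamically created rows and drop-downs have to be recreated on every request - not only when IsPostBack is false - so that the controls exist again before ASP.NET loads the posted values and raises the click event; after saving, the table can then be cleared and rebuilt from the fresh database state:

        using System;
        using System.Web.UI.WebControls;

        public partial class ScheduleForm : System.Web.UI.Page
        {
            // myTable is the <asp:Table> dragged onto the page; BuildTable() and
            // SaveRow() are hypothetical helpers standing in for the existing code.

            protected void Page_Load(object sender, EventArgs e)
            {
                // Rebuild the dynamic rows on every request so the DropDownLists
                // exist before postback data is applied and events fire.
                BuildTable();
            }

            protected void btnSave_Click(object sender, EventArgs e)
            {
                // The rows rebuilt in Page_Load now carry the user's posted selections.
                foreach (TableRow row in myTable.Rows)
                    SaveRow(row);                 // write each row's drop-down value to the db

                // Rebuild from the new database state so the swapped columns
                // appear without a manual page refresh.
                myTable.Rows.Clear();
                BuildTable();
            }

            private void BuildTable() { /* read the db, add TableRow/DropDownList controls */ }
            private void SaveRow(TableRow row) { /* update the database from the row's cells */ }
        }

    If the rows are only built when IsPostBack is false, the table really does contain zero rows by the time the click handler runs, which matches the behaviour described above.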

    Read the article

  • Size SKU in Magento

    - by latvian
    Hi, How can I allow the SKU to be longer than 34 characters (for simple products) for all products? When I add a new (simple) product and enter more than 34 characters, Magento cuts it to 34 after saving. In the database, the 'sku' attributes (for quote item, order item, invoice item and shipment item) in the eav_attribute table hold varchar(255). For catalog_product_entity the attribute is varchar(64). In either case, that is more than 34 characters, so I should be able to change the SKU limit to 64 characters without making any change to the database, correct? How do I do that? I have a good understanding of the frontend Magento code, but not the admin side. Can you suggest a good tutorial on making changes on the admin side that could help me figure this out myself? Thank you, Margots

    Read the article

  • Problem with inserts of multibyte (converted to UTF-8) strings into MySQL tables with utf8_unicode_ci encoding

    - by user381595
    http://domainsoutlook.com/sandbox/keyword/?s=http://bhaskar.com is a raw example of my keyword density analyser. Every keyword shows up properly, with no problems in the Unicode conversions etc. However, when I add these words to a column of a database table, the words end up messed up. For example, on the front-end page http://domainsoutlook.com/b/site/bhaskar.com.html there is a keyword that is shown as blank but still occurs on the website 8 times (it isn't empty in the database, though). I have checked and there is no problem with mysql_real_escape_string(), because the output stays the same before and after the word goes through it. Another problem is that I want to fix my URLs for the Arabic language. They should show up as /word-{1st letter of the word}/{whole word}.html, but they show as /word-{whole word}/{1st letter of the word}.html. I really need answers for these two questions.

    Read the article

  • Syncing a table's records with a service response frequently

    - by Karthik Dheeraj
    I am requesting data from a service whose response is stored in a database. First, I have an empty table; when I make my very first request, the records from the service go into my database table. From then on, whenever I make another request, the service will give me records which may be the same as in my first response, may be new records, may be updated records, etc. My question is how to update my table with respect to the responses coming from the service from the second request onwards, so that unchanged records remain the same, new records are added, and updated records are updated. Do I need to write a stored procedure on my DB, or is there another workaround? And what would the scenario be if I used a NoSQL DB like MongoDB? Thanks in advance.
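    A generic upsert pattern (a sketch only: the table, key and record shape are invented, and it assumes the service response has already been deserialized into objects) is to try an UPDATE keyed on the service record's identifier and INSERT only when no row was affected:

        using System.Collections.Generic;
        using System.Data.SqlClient;

        class ServiceRecord            // hypothetical shape of one record from the service
        {
            public int ExternalId;
            public string Name;
            public decimal Amount;
        }

        class TableSync
        {
            public static void Sync(IEnumerable<ServiceRecord> records, string connString)
            {
                using (var conn = new SqlConnection(connString))
                {
                    conn.Open();
                    foreach (var rec in records)
                    {
                        // Try to update an existing row first (unchanged rows are simply rewritten).
                        var update = new SqlCommand(
                            "UPDATE Items SET Name = @name, Amount = @amount WHERE ExternalId = @id", conn);
                        update.Parameters.AddWithValue("@name", rec.Name);
                        update.Parameters.AddWithValue("@amount", rec.Amount);
                        update.Parameters.AddWithValue("@id", rec.ExternalId);

                        // If nothing was updated, the record is new - insert it.
                        if (update.ExecuteNonQuery() == 0)
                        {
                            var insert = new SqlCommand(
                                "INSERT INTO Items (ExternalId, Name, Amount) VALUES (@id, @name, @amount)", conn);
                            insert.Parameters.AddWithValue("@id", rec.ExternalId);
                            insert.Parameters.AddWithValue("@name", rec.Name);
                            insert.Parameters.AddWithValue("@amount", rec.Amount);
                            insert.ExecuteNonQuery();
                        }
                    }
                }
            }
        }

    Databases that support it can collapse this into one statement (MERGE in SQL Server, INSERT ... ON DUPLICATE KEY UPDATE in MySQL), and MongoDB exposes the same idea directly as an upsert option on its update operations.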

    Read the article

  • Why does Tokyo Tyrant slow down exponentially even after adjusting bnum?

    - by HenryL
    Has anyone successfully used Tokyo Cabinet / Tokyo Tyrant with large datasets? I am trying to upload a subgraph of the Wikipedia datasource. After hitting about 30 million records, I get an exponential slowdown. This occurs with both the HDB and BDB databases. I adjusted bnum to 2-4x the expected number of records for the HDB case, with only a slight speed-up. I also set xmsiz to 1 GB or so, but ultimately I still hit a wall. It seems that Tokyo Tyrant is basically an in-memory database, and after you exceed xmsiz or your RAM you get a barely usable database. Has anyone else encountered this problem before? Were you able to solve it?

    Read the article

  • SQL Azure server as unit of billing

    - by vtortola
    Hi, One of the Azure training kit presentations says:
    - Each account has zero or more logical servers
      - Provisioned via a common portal
      - Establishes a billing instrument
    - Each logical server has one or more databases
      - Contains metadata about the databases & usage
      - Unit of authentication, geo-location, billing, reporting
      - Generated DNS-based name
    - Each database has standard SQL objects
      - Users, Tables, Views, Indices, etc.
      - Unit of consistency
    So now I'm lost :D. Weren't the databases themselves the units of billing? I mean, I thought that servers were just logical containers and you were charged per number and size of databases. How are servers billed? Thanks.

    Read the article

  • Passing extended parameter into Sql 2008 connection string

    - by Pita.O
    Hi, I need to support extensive auditing capabilities for a system backed by SQL Server 2008. Since I plan to use LINQ (with no stored procs), the database would be a clean, zero-contact data repository. However, I need to record a snapshot of practically every change that happens in the db, so I thought I should use triggers. But then I need the id of the particular user (not the connection string user id) to flow through into the database. In Oracle I would have been able to set up a PROXY USER and the trigger would be able to pick that up. Last I checked, there was no proxy user concept in SQL Server. Does anyone know of an extended property I can use to flow my authenticated user name through? PS: I don't mind the impact on connection pooling (if any). Thanks. P
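    One approach that works on SQL Server 2008 (sketched here with invented names and connection string) is to stamp the session with SET CONTEXT_INFO right after opening the connection; an audit trigger can then read CONTEXT_INFO() and record the application user alongside the change:

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Text;

        static class AuditedConnection
        {
            // Opens a connection and stamps it with the application user's name so
            // that triggers can recover it via CONTEXT_INFO().
            public static SqlConnection Open(string connString, string appUserName)
            {
                var conn = new SqlConnection(connString);
                conn.Open();

                using (var cmd = new SqlCommand("SET CONTEXT_INFO @user", conn))
                {
                    // CONTEXT_INFO is a 128-byte varbinary slot attached to the session.
                    byte[] bytes = Encoding.Unicode.GetBytes(appUserName);
                    Array.Resize(ref bytes, 128);
                    cmd.Parameters.Add("@user", SqlDbType.VarBinary, 128).Value = bytes;
                    cmd.ExecuteNonQuery();
                }
                return conn;
            }
        }

        /* Inside the audit trigger (T-SQL), the value set above is readable as:
             DECLARE @user nvarchar(64);
             SET @user = CONVERT(nvarchar(64), CONTEXT_INFO());
           which can then be written into the audit row. */

    Later versions (SQL Server 2016 onwards) added SESSION_CONTEXT for exactly this purpose, but on 2008 CONTEXT_INFO is the usual vehicle.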

    Read the article

  • PHP getting 'too many connections' error from MySQL

    - by uzioriluzan
    Hello everyone, I am using MySQL and PHP with 2 application servers and 1 database server. As the number of users has grown (around 1000 by now), I'm getting the following error: SQLSTATE[08004] [1040] Too many connections. The parameter "max_connections" is set to "1000" in my.cnf and "mysql.max_persistent" is set to -1 in php.ini. There are at most 1500 Apache processes running at a time, since the "MaxClients" Apache parameter is equal to 750 and we have 2 application servers. Should I raise "max_connections" to 1500 as indicated here? Or should I set "mysql.max_persistent" to 750 (we use PDO with persistent connections for performance reasons, since the database server is not the same machine as the application servers)? Or should I try something else? Thanks in advance!

    Read the article

  • DateTime in SqlServer VisualStudio C#

    - by menacheb
    Hi, I have a database in my project with a table named 'ProcessData' and columns named 'Start_At' (type: DateTime) and 'End_At' (type: DateTime). When I enter a new record into this table, it stores the data in the format 'YYYY/MM/DD HH:mm', when I actually want it in the format 'YYYY/MM/DD HH:mm:ss' (the seconds don't appear). Does anyone know why, and what I should do in order to fix this? Here is the code I am using:

        con = new SqlConnection("....");
        String startAt = "20100413 11:05:28";
        String endAt = "20100414 11:05:28";
        ...
        con.Open();  // open the connection, in order to get access to the database
        SqlCommand command = new SqlCommand("insert into ProcessData (Start_At, End_At) values('" + startAt + "','" + endAt + "')", con);
        command.ExecuteNonQuery();  // execute the 'insert' query.
        con.Close();

    Many thanks
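    A parameterized variant (a sketch only, reusing the table and column names above) passes the values as real DateTime objects instead of concatenated strings, so nothing is lost to string formatting and the statement is also safe from SQL injection:

        using System;
        using System.Data;
        using System.Data.SqlClient;

        class InsertProcessData
        {
            static void Main()
            {
                DateTime startAt = new DateTime(2010, 4, 13, 11, 5, 28);
                DateTime endAt   = new DateTime(2010, 4, 14, 11, 5, 28);

                using (var con = new SqlConnection("...."))
                using (var command = new SqlCommand(
                           "insert into ProcessData (Start_At, End_At) values (@start, @end)", con))
                {
                    // Typed parameters: the seconds never round-trip through a string.
                    command.Parameters.Add("@start", SqlDbType.DateTime).Value = startAt;
                    command.Parameters.Add("@end", SqlDbType.DateTime).Value = endAt;

                    con.Open();
                    command.ExecuteNonQuery();
                    con.Close();
                }
            }
        }

    If the seconds still look missing after this, it is usually the tool displaying the column (or a smalldatetime column, which only keeps minute precision) rather than the stored value.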

    Read the article

  • MS Access questions - Scalability / indexing / transactions

    - by oo
    A few questions on MS Access databases. Size: are there limits to the size of an Access database? The reason I ask is that we have an Access database with a few simple tables; the size of the db is about 1 GB, and when I run a query on it, it takes over 10 minutes. With proper indexing, should MS Access be able to handle this, or are there fundamental limitations in the technology? This is MS Access XP. Also, does MS Access support db transactions, commit and rollback?
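    On the transaction question: the Jet engine behind Access does support commit and rollback, at least when driven through ADO.NET. A minimal sketch (the file path and table are invented, and it assumes the classic Jet OLE DB provider used by Access XP-era .mdb files):

        using System;
        using System.Data.OleDb;

        class AccessTransactionDemo
        {
            static void Main()
            {
                var connString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\mydb.mdb";

                using (var conn = new OleDbConnection(connString))
                {
                    conn.Open();
                    OleDbTransaction tx = conn.BeginTransaction();
                    try
                    {
                        var cmd = new OleDbCommand(
                            "UPDATE Accounts SET Balance = Balance - 100 WHERE Id = 1", conn, tx);
                        cmd.ExecuteNonQuery();

                        cmd.CommandText = "UPDATE Accounts SET Balance = Balance + 100 WHERE Id = 2";
                        cmd.ExecuteNonQuery();

                        tx.Commit();      // both updates become visible together
                    }
                    catch
                    {
                        tx.Rollback();    // neither update is applied
                        throw;
                    }
                }
            }
        }

    On size: the .mdb format used by Access XP is capped at 2 GB per file, so a 1 GB database is within limits, but at that scale indexing and query design matter a great deal.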

    Read the article

  • Is Amazon SQS the right choice here? Rails performance issue.

    - by ole_berlin
    I'm close to releasing a rails app with the common networking features (messaging, wall, etc.). I want to use some kind of background processing (most likely Bj) for off-loading tasks from the request/response cycle. This would happen when users invite friends via email to join and for email notifications. I'm not sure if I should just drop these invites and notifications in my Database, using a model and then just process it with a worker process every x minutes or if I should go for Amazon SQS, storing the messages and invites there and let my worker retrieve it from Amazon SQS for processing (sending the invites / notifications). The Amazon approach would get load off my Database but I guess it is slower to retrieve messages from there. What do you think?

    Read the article

  • Django: default=datetime.now() problem

    - by Shamanu4
    Hello. I have this db model:

        from datetime import datetime

        class TermPayment(models.Model):
            dev_session = models.ForeignKey(DeviceSession, related_name='payments')
            user_session = models.ForeignKey(UserSession, related_name='payment')
            date = models.DateTimeField(default=datetime.now(), blank=True)
            sum = models.FloatField(default=0)
            cnt = models.IntegerField(default=0)

            class Meta:
                db_table = 'term_payments'
                ordering = ['-date']

    and here a new instance is added:

        # ...
        tp = TermPayment()
        tp.dev_session = self.conn.session  # device session hash
        tp.user_session = self.session  # user session hash
        tp.sum = sum
        tp.cnt = cnt
        tp.save()

    But I have a problem: all records in the database have the same value in the date field - the date of the first payment. After a server restart, one record gets a new date and the others get the same date as that first one. It looks like some data cache is being used, but I can't find where. Database: MySQL 5.1.25, Django v1.1.1.

    Read the article

  • Showing current value in DB in a drop down box

    - by user195257
    Hello, this is my code which populates a drop-down menu, and it is all working perfectly. But when editing a database record, I want the first value in the drop-down to be what is currently in the database - how would I do this?

        <li class="odd"><label class="field-title">Background <em>*</em>:</label>
        <label><select class="txtbox-middle" name="background" />
        <?php
            $bgResult = mysql_query("SELECT * FROM `backgrounds`");
            while($bgRow = mysql_fetch_array($bgResult)){
                echo '<option value="'.$bgRow['name'].'">'.$bgRow['name'].'</option>';
            }
        ?>
        </select></li>

    Read the article

  • Reflection & Parameters in C#

    - by Jim
    Hello, I'm writing an application that runs "things" on a schedule. The idea is that the database contains assembly and method information and also the parameter values. The timer will come along, reflect the method to be run, add the parameters and then execute the method. Everything is fine except for the parameters. So, let's say the method accepts an enum of CustomerType, where CustomerType has the two values CustomerType.Master and CustomerType.Associate. EDIT: I don't know the type of parameter that will be passed in; an enum is just used as an example. END OF EDIT. We want to run method "X" and pass in the parameter "CustomerType.Master". In the database there will be a varchar entry of "CustomerType.Master". How do I convert the string "CustomerType.Master" into a value of type CustomerType with the value "Master", generically? Thanks in advance, Jim
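    A minimal sketch of one generic way to do the conversion (it takes the target type from the method's own signature via reflection and splits the stored "CustomerType.Master"-style string on its last dot; the class and method names are invented):

        using System;
        using System.Reflection;

        static class ParameterBinder
        {
            // Converts a stored string (e.g. "CustomerType.Master" or "42") into a
            // value of the parameter type declared by the target method.
            public static object ConvertArgument(ParameterInfo parameter, string stored)
            {
                Type targetType = parameter.ParameterType;

                if (targetType.IsEnum)
                {
                    // "CustomerType.Master" -> "Master"; a bare "Master" also works.
                    int lastDot = stored.LastIndexOf('.');
                    string name = lastDot >= 0 ? stored.Substring(lastDot + 1) : stored;
                    return Enum.Parse(targetType, name);
                }

                // Fall back to the framework's standard conversions (int, double, DateTime, ...).
                return Convert.ChangeType(stored, targetType);
            }

            public static object Run(object target, MethodInfo method, string[] storedValues)
            {
                ParameterInfo[] parameters = method.GetParameters();
                object[] args = new object[parameters.Length];
                for (int i = 0; i < parameters.Length; i++)
                    args[i] = ConvertArgument(parameters[i], storedValues[i]);

                return method.Invoke(target, args);
            }
        }

    If the parameter type has to come from the database instead of the method signature, Type.GetType("MyApp.CustomerType, MyAssembly") can resolve it first and the same Enum.Parse / Convert.ChangeType logic applies.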

    Read the article

  • JsonStore.insert() causes exception in ExtJS

    - by kalan
    I have an EditorGridPanel with a toolbar button to add new records. Everything works fine except in one scenario. When I try to insert a record which already exists in the database, the server sends back: {"success":false,"message":"already exists","data":{}} but the grid creates a new row marked with a red triangle. If after that I try to insert a new record (even one that doesn't exist in the database), everything works fine on the server side, but I get an 'uncaught exception' in Firebug. It says: 'uncaught exception: Ext.data.DataReader: #realize was called with invalid remote-data. Please see the docs for DataReader#realize and review your DataReader configuration.' Why is that?

    Read the article
