Search Results

Search found 32492 results on 1300 pages for 'reporting database'.


  • Remote PostgreSQL - extremely slow

    - by Muffinbubble
    Hi, I have set up PostgreSQL on a VPS I own - the software that accesses the database is a program called PokerTracker. PokerTracker logs all your hands and statistics whilst playing online poker. I wanted this accessible from several different computers, so I decided to install it on my VPS, and after a few hiccups I managed to get it connecting without errors. However, the performance is dreadful. I have done tons of research on 'remote postgresql slow' etc. and am yet to find an answer, so I am hoping someone is able to help. Things to note: The query I am trying to execute is very small. Whilst connecting locally on the VPS, the query runs instantly. While running it remotely, it takes about 1 minute and 30 seconds. The VPS is on a 100 Mbps connection and the computer I'm connecting from is on an 8 Mb line. The network communication between the two is almost instant: I am able to connect remotely with no lag whatsoever, and I am hosting several websites running MSSQL whose queries all run instantly, whether connected remotely or locally, so it seems specific to PostgreSQL. I'm running the newest version of the software and the newest version of PostgreSQL compatible with it. The database is new and contains hardly any data, and I've run VACUUM/ANALYZE etc. all to no avail - I see no improvements. I don't understand how MSSQL can query almost instantly yet PostgreSQL struggles so much. I am able to telnet to port 5432 on the VPS IP with no problems, and as I say the query does execute, it just takes an extremely long time. What I do notice on the router while the query is running is that hardly any bandwidth is being used - but then again I wouldn't expect it to for a simple query, so I am not sure if this is the issue. I've tried connecting remotely from 3 different networks now (including different routers) but the problem remains. Connecting from another machine on the LAN is instant. I have also edited the postgresql.conf file to allow for more memory/buffers etc., but I don't think this is the problem - what I am asking it to do is very simple and shouldn't be intensive at all. Thanks, Ricky
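
    A quick way to isolate where the time goes is to run the same statement through a driver twice: once against localhost on the VPS and once over the remote connection. Below is a minimal sketch using Python and psycopg2; the host name, credentials and query are placeholders, not taken from the question.

      import time
      import psycopg2

      def time_query(host, sql):
          # Connection parameters are placeholders - substitute your own.
          conn = psycopg2.connect(host=host, dbname="ptracker",
                                  user="pt_user", password="secret")
          cur = conn.cursor()
          start = time.time()
          cur.execute(sql)
          rows = cur.fetchall()          # force the full result set over the wire
          elapsed = time.time() - start
          conn.close()
          return len(rows), elapsed

      SQL = "SELECT 1"                   # replace with the slow PokerTracker query

      for host in ("localhost", "vps.example.com"):
          n, secs = time_query(host, SQL)
          print("%s: %d rows in %.3f s" % (host, n, secs))

    If even a trivial query takes over a minute remotely, the cost is in connection setup or network round trips rather than in the query plan, which narrows the search considerably.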


  • Can Google App Engine be used for "check for updated" and download binary file web service?

    - by Tofrizer
    Hi, I'm a Google App Engine newbie and would be grateful for any help. I have an iPhone app which sources its data from an SQLite db stored locally on the device. I'd like to set up a Google App Engine web service which my iPhone client will talk to and check whether there is a newer version of the SQLite database it needs to download. So the iPhone client hits the web service with some kind of version number/timestamp, and if there is a newer file, App Engine notifies the client, and the client then requests the new database, which App Engine serves for download. Is it possible to set up a web service in Google App Engine to do this? Could anyone point me to any sample code / tutorials please? Many Thanks
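
    For reference, this is a plain request/response check and fits App Engine well. The sketch below shows the version-check half using the classic App Engine Python webapp framework; the URL, the 'version' parameter and the module constant holding the latest version are assumptions made purely for illustration.

      from google.appengine.ext import webapp
      from google.appengine.ext.webapp.util import run_wsgi_app

      CURRENT_DB_VERSION = 42  # hypothetical: however you track the latest sqlite build

      class CheckForUpdate(webapp.RequestHandler):
          def get(self):
              # The client calls /check?version=<its local version number>.
              client_version = int(self.request.get('version', '0'))
              self.response.headers['Content-Type'] = 'text/plain'
              if client_version < CURRENT_DB_VERSION:
                  # Tell the client where to fetch the newer file from.
                  self.response.out.write('update:/download/latest')
              else:
                  self.response.out.write('current')

      application = webapp.WSGIApplication([('/check', CheckForUpdate)], debug=True)

      def main():
          run_wsgi_app(application)

      if __name__ == '__main__':
          main()

    A second handler can then serve the file itself; App Engine limits response sizes, so a large SQLite file may need to be split or stored in the Blobstore.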


  • How do I set up a test duplicate of a Django and PostgreSQL based web application?

    - by cojadate
    Not sure if this is an excessively broad and newbie-ish question for Stack Overflow, but here goes: I paid someone else to build a web application for me and now I want to tweak certain aspects of it myself. I learn best by trial and error – changing stuff and seeing what happens. Obviously that's not a great way to treat a live site, so I need to duplicate the site on some kind of test server which I can play with without fear of the consequences. Unfortunately, the closest I've come to programming has been creating ActionScript-based websites, and I've never touched a database, so I really don't know where to start with setting up a test server. I would really appreciate any advice about where to start; I am completely ignorant and lost here. The web application is built in Python/Django using a PostgreSQL database. I use Mac OS X 10.6 if that makes any difference.
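
    For reference, the usual approach is to install Python, Django and PostgreSQL locally (all run on Mac OS X 10.6), restore a dump of the production database into a local database, and point a local checkout of the project at it through its settings. The sketch below is a hypothetical local-settings module assuming a Django 1.2+ project layout; the database name, user and password are placeholders.

      # settings_local.py - hypothetical settings module for a local test copy.
      # It reuses the production settings and swaps in a local database.
      from settings import *          # assumes the project's settings.py is importable

      DEBUG = True

      DATABASES = {
          'default': {
              'ENGINE': 'django.db.backends.postgresql_psycopg2',
              'NAME': 'myapp_test',    # local database restored from a pg_dump of production
              'USER': 'myapp',
              'PASSWORD': 'secret',
              'HOST': 'localhost',
              'PORT': '5432',
          }
      }

    With that in place, python manage.py runserver --settings=settings_local serves the copy at http://127.0.0.1:8000/, where changes can be tried without touching the live site.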


  • Need Help with Page Life Cycle (I think it is screwing me up)

    - by chobo2
    Hi, I have dragged an empty ASP.NET table onto my web form and I generate all of its rows in code-behind. The table gets filled with dropdown lists, and when the user hits Save I go through all the rows and write the selected values to the database. That all works. However, if two columns each contain "Present", those two columns should no longer be shown and two new columns with other dropdown lists should take their place. That also works, except that you have to refresh the entire page before the two old columns actually disappear. So at the end of the button click event I tried clearing the whole table and regenerating it, but when I do that my values are no longer saved to the database. This version works (all the correct values are saved), but the table shown to the user is not refreshed:

      if (IsPostBack == false)
      {
          // check whether the dummy variables exist in the db - if so, generate
          // the table with the values from the db, otherwise generate them
      }
      else
      {
          // grab the values from the database
          // generate the table with those values
      }

      // button click event:
      // go through all rows in the table (foreach loop) and, for each row,
      // update each column in the database from the cells in that row

    This version does not work - by the time the click event runs it reports zero rows in the table, so nothing happens:

      if (IsPostBack == false)
      {
          // same code as above
      }
      // if IsPostBack is true, do nothing

      // button click event: same code as above

    This version also fails - for some reason it does not update anything:

      if (IsPostBack == false)
      {
          // same code as above
      }
      else
      {
          // same code as above, moved into its own method: generateTable();
      }

      // button click event:
      // update all rows, then clear the table's rows and call generateTable()

    So what am I doing wrong? Something in my understanding of the page life cycle must be off - the code works, just not when I want the table to update right away.


  • ASP.NET Impersonation Fails On First Try But Succeeds on Second

    - by KevDog
    We are using RDLCs in an ASP.NET web application. For reasons beyond our understanding, the first call to the database server fails with the following error: An error has occurred during report processing. Cannot open database "TryParkingIt2" requested by the login. The login failed. Login failed for user 'EXTRANET\OurServerNameHere$'. Run the report again and it works. Huh? Update: Click the button the first time and it fails. Click the button again and it works. The account being impersonated is a domain account.


  • Low overhead Java Web Services container?

    - by trojanfoe
    I want to provide a Java-based web service, but I don't require the features of a full-blown J2EE application server. I would like it to start as quickly as possible, though that's not a hard requirement. The web service will handle multiple connections and require access to an Oracle database, so it will at least need a thread pool and a database connection pool. I may want to put a JSP interface onto it later to provide an internal maintenance interface. I have looked at Jetty with an Apache CXF stack, but it looks like I'll have to do a fair amount of configuration before even coding the web service - will it be worth it? Will it even work? Should I forget about the complexity and simply go with JBoss/WebLogic/etc. and put up with the bloat and extra start-up time?


  • Longish list of allowed referrers

    - by Tim
    I want to allow hotlinking only from a list of referrers (paying customers, probably a few hundred). I am on Apache 1.3 and I do not have access to the configuration (only .htaccess). What is the fastest way to implement this? My thoughts so far:

      - PHP with a database and readfile()
      - (SSI with) Perl and a database
      - the list implemented as symlinks named after the allowed referrer, then a RewriteCond using HTTP_REFERER
      - everything in .htaccess, lots of RewriteCond's
      - everything in .htaccess, lots of SetEnvIf's

    Any better (faster) ways to do this? Thanks!


  • Gaining application/module context from a symfony task

    - by Martin Chatterton
    I have written a reporting suite, and one specific report builds a CSV file. Serving this file via a browser on demand isn't an issue, but I need to build the CSV file nightly and email round a link so it can be downloaded. Essentially, I need to replace a specific action with a symfony task run via cron. So how do I gain application/module context from a symfony task? And secondly, how would I invoke the SwiftMailer library from a symfony task? I'm using symfony 1.4.4 and PHP 5.2.13. Thanks in advance for your help.


  • How do you deploy your SharePoint solutions?

    - by Lars Mæhlum
    I am now planning the deployment of a SharePoint solution into a production environment. I have read about some tools that promise an easy way to automate this process, but nothing seems to fit my scenario. In the testing phase I have used SharePoint Designer to copy site content between the different development and testing servers, but this process is manual and seems a bit unnecessary. The site is made up of SharePoint web part pages with custom web parts, and a lot of Reporting Services report definitions. So, is there any good advice out there in this vast land of geeks on how to most efficiently create and deploy a SharePoint site for a multiple-deployment scenario? Edit: Just to clarify, I need to deploy several "SharePoint Sites" into an existing site collection. Since SharePoint likes to have its sites in the SharePoint content database, just putting the files into IIS is not an option at this time.


  • How to write a DataTable to an Excel workbook on the client side

    - by senthilkumar
    Hi guys, I am currently working on a project in which we need to generate a report from a database. Since my server memory is low, I get an 'Out of Memory' exception when I build the report server-side, and when I write the data directly as an Excel file (by sending it with Excel HTTP headers) I am not able to create multiple sheets, since my database table is huge - more than 65,536 rows. I have seen many solutions using a third-party tool, but I can't use those in my project. If anyone has already worked on this, please give me some direction. I also tried using JavaScript, but for that I would need a DataGrid on the server side, which I am not allowed to use in this project.


  • SKU size in Magento

    - by latvian
    Hi, How can I allow the SKU to be longer than 34 characters (for simple products) for all products? When I add a new simple product and enter more than 34 characters, Magento cuts the SKU to 34 after saving. In the database, the 'sku' attributes (for quote item, order item, invoice item and shipment item) in the eav_attribute table hold varchar(255), and for catalog_product_entity the column is varchar(64). In either case that is more than 34 characters, so I should be able to extend the SKU to 64 characters without changing the database, correct? How do I do that? I have a good understanding of the frontend Magento code, but not the admin side. Can you suggest a good tutorial on making changes on the admin side that could help me figure this out myself? Thank you, Margots


  • Seeking tutorial: introduction to ODBC with Delphi

    - by mawg
    I have a lot of embedded C/C++/Ada experience and an outdated smattering of Delphi plus some database stuff. Now I have to implement an app in Delphi which can manipulate MySQL, Oracle, and maybe MS Access. In short, I need ODBC. I need to programmatically create a database, define its structure and populate its contents, then later query its existence and search it programmatically. I would prefer not to use 3rd-party components unless there is a compelling reason to do so (performance ought not to be an issue for the app; it won't have much data or be run often, at least not in v1.0). Can anyone point me at a tutorial which can get me up to speed? Thanks


  • Problem with inserts of multibyte (converted to UTF-8) strings into MySQL tables with utf8_unicode_ci collation

    - by user381595
    http://domainsoutlook.com/sandbox/keyword/?s=http://bhaskar.com is a raw example of my keyword density analyser. Every keyword shows up properly, with no problems in the Unicode conversion. However, when I add these words to a column of a database table, they end up garbled. For example, on the front-end page http://domainsoutlook.com/b/site/bhaskar.com.html there is a keyword that is shown as blank but still occurs on the website 8 times (it isn't empty in the database, though). I have checked, and there is no problem with mysql_real_escape_string: the output stays the same before and after the word is passed through it. A second problem is that I want to fix my URLs for the Arabic language. They should show up as /word-{1st letter of the word}/{whole word}.html, but they show up as /word-{whole word}/{1st letter of the word}.html. I really need answers to these two questions.


  • PHP: blank pages after submitting a form + signal Segmentation fault (11)

    - by Ole Media
    A few days ago I updated my MacBook Pro to Snow Leopard, and since then some PHP pages are not showing. This is what happens: I created a PHP form, and when going to 'http://localhost/webform.php' I can see the form just fine. Then, once I submit the form, I just get a blank page. I enabled error and warning reporting in php.ini to make sure I'm not missing something, but I still don't get anything, just the blank page. I then checked the Apache log files, and what I notice is that every time I submit the form the following line shows up in the logs: [Wed Apr 07 21:40:28 2010] [notice] child pid 70223 exit signal Segmentation fault (11) I'm clueless on this one. Any ideas on how to fix it?


  • Why does Tokyo Tyrant slow down exponentially even after adjusting bnum?

    - by HenryL
    Has anyone successfully used Tokyo Cabinet / Tokyo Tyrant with large datasets? I am trying to upload a subgraph of the Wikipedia data source. After hitting about 30 million records, I get an exponential slowdown. This occurs with both the HDB and BDB databases. I adjusted bnum to 2-4x the expected number of records for the HDB case, with only a slight speed-up. I also set xmsiz to 1 GB or so, but ultimately I still hit a wall. It seems that Tokyo Tyrant is basically an in-memory database, and after you exceed xmsiz or your RAM you get a barely usable database. Has anyone else encountered this problem before? Were you able to solve it?


  • Syncing table records with a service response frequently

    - by Karthik Dheeraj
    I am requesting data from a service whose response is stored in a database. At first I have an empty table; when I make my very first request, the records from the service are inserted into the table. From then on, whenever I make another request, the service returns records which may be the same as in the first response, may be new, or may have been updated. My question is how to update my table with respect to the responses from the second request onwards, so that unchanged records remain the same, new records are added, and updated records are updated. Do I need to write a stored procedure on my database, or is there another workaround? And how would the scenario look if I used a NoSQL database such as MongoDB? Thanks in advance.
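
    What is being described is a per-record "upsert": look each incoming record up by its key, insert it if it is new, update it if it changed, and leave it alone otherwise. A minimal sketch of that pattern, written in Python with sqlite3 purely for illustration (the tasks table, its key column and its fields are hypothetical):

      import sqlite3

      def sync_records(conn, incoming):
          """Upsert a list of {'id': ..., 'name': ..., 'hours': ...} dicts."""
          cur = conn.cursor()
          for rec in incoming:
              cur.execute("SELECT name, hours FROM tasks WHERE id = ?", (rec['id'],))
              row = cur.fetchone()
              if row is None:
                  # New record: insert it.
                  cur.execute("INSERT INTO tasks (id, name, hours) VALUES (?, ?, ?)",
                              (rec['id'], rec['name'], rec['hours']))
              elif row != (rec['name'], rec['hours']):
                  # Existing record with different values: update it.
                  cur.execute("UPDATE tasks SET name = ?, hours = ? WHERE id = ?",
                              (rec['name'], rec['hours'], rec['id']))
              # Otherwise the record is unchanged and nothing needs to happen.
          conn.commit()

    Many databases can collapse the lookup and write into one statement (for example MySQL's INSERT ... ON DUPLICATE KEY UPDATE, or MongoDB updates with the upsert flag), so a stored procedure is not strictly required.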


  • Rails: rollback updates when a task fails

    - by ash34
    Hi, I have the following generate_report method, called from a rake task. It takes as input a hash containing the hours each user reported on each task, and writes the data out as a .csv report.

      desc "Task reporting"
      task :report, [:inp_dt] => [:environment] do |t, args|
        h = select_data(args.inp_dt)   # select_data not shown here
        generate_report(h)
      end

      def generate_report(h)
        out_dir = File.dirname(__FILE__) + '/../../output'
        myfile  = "#{out_dir}/monthly_#{Date.today.strftime("%m%d%Y")}.csv"
        writer  = CSV.open(myfile, 'w')
        h.each do |user, tasks|
          tasks.each do |key, val|
            writer << val
          end
        end
        writer.close
      end

    where h = {:BILL=>{:PROJA=>["CYR", "00876", "2", 24], :PROJB=>["EPR", "00876", "2", 16]}, :JANE=>{:PROJA=>["TRB", "049576", "2", 16]}}. I would like to set/update a 'processed' flag for each reported transaction, commit those updates only when the file has been written correctly, and roll them back if the task fails. How can I accomplish this? thanks, ash
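
    The general shape of the answer is to mark the rows inside a database transaction and commit only after the file has been written, letting any exception roll the updates back; in a Rails task that would be an ActiveRecord transaction block. The sketch below shows the same commit-on-success / rollback-on-failure control flow in Python with sqlite3, purely to illustrate the idea; the reports table and its processed column are made-up names.

      import csv
      import sqlite3

      def write_report(conn, rows, path):
          """Write the CSV and flag rows as processed in one transaction.

          `rows` is assumed to be a list of (row_id, values) pairs.
          """
          cur = conn.cursor()
          try:
              with open(path, 'w', newline='') as f:
                  writer = csv.writer(f)
                  for row_id, values in rows:
                      writer.writerow(values)
                      cur.execute("UPDATE reports SET processed = 1 WHERE id = ?",
                                  (row_id,))
              conn.commit()          # file written in full: keep the processed flags
          except Exception:
              conn.rollback()        # any failure: discard the flag updates
              raise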


  • SQL Azure server as unit of billing

    - by vtortola
    Hi, One of the Azure training kit presentations says:

      - Each account has zero or more logical servers
        - Provisioned via a common portal
        - Establishes a billing instrument
      - Each logical server has one or more databases
        - Contains metadata about database & usage
        - Unit of authentication, geo-location, billing, reporting
        - Generated DNS-based name
      - Each database has standard SQL objects
        - Users, Tables, Views, Indices, etc.
        - Unit of consistency

    So now I'm lost :D. Weren't the databases themselves the units of billing? I mean, I thought that servers were just logical containers and you were charged per number and size of databases. How are servers billed? Thanks.


  • DateTime in SQL Server / Visual Studio C#

    - by menacheb
    Hi, I have a database in my project with a table named 'ProcessData' and columns named 'Start_At' (type: DateTime) and 'End_At' (type: DateTime). When I insert a new record into this table, the data is stored in the format 'YYYY/MM/DD HH:mm', when I actually want it in the format 'YYYY/MM/DD HH:mm:ss' (the seconds don't appear). Does anyone know why, and what I should do to fix this? Here is the code I am using:

      con = new SqlConnection("....");
      String startAt = "20100413 11:05:28";
      String endAt = "20100414 11:05:28";
      ...
      con.Open();  // open the connection, in order to get access to the database
      SqlCommand command = new SqlCommand("insert into ProcessData (Start_At, End_At) values('" + startAt + "','" + endAt + "')", con);
      command.ExecuteNonQuery();  // execute the 'insert' query
      con.Close();

    Many thanks

