Search Results

Search found 17233 results on 690 pages for 'download speed'.

  • Trying to speed up a SQLite UNION query

    - by user142683
    I have the below SQLite code:

        SELECT x.t,
               CASE WHEN S.Status = 'A' AND M.Nomorebets = 0
                    THEN S.PriceText ELSE '-' END AS Show_Price
        FROM sb_Market M
        LEFT OUTER JOIN (
            select 2010 t union select 2020 t union select 2030 t union
            select 2040 t union select 2050 t union select 2060 t union
            select 2070 t
        ) as x
        LEFT OUTER JOIN sb_Selection S
            ON S.MeetingId = M.MeetingId
           AND S.EventId = M.EventId
           AND S.MarketId = M.MarketId
           AND x.t = S.team
        WHERE M.meetingid = 8051
          AND M.eventid = 3
          AND M.Name = 'Correct Score'

    With the current interface restrictions, I have to use the above code to ensure that if one selection is missing, a '-' appears. A sample feed would be something like the following:

        SelectionId  Name    Team  Status  PriceText
        ============================================
        1            Barney  2010  A       10
        2            Jim     2020  A       5
        3            Matt    2030  A       6
        4            John    2040  A       8
        5            Paul    2050  A       15/2
        6            Frank   2060  S       10/11
        7            Tom     2070  A       15

    Is using the above SQL code the quickest and most efficient approach? Please advise of anything that could help. Messages with updates would be preferable.

    Read the article

  • Sending multipart response for downloads in Zend Framework

    - by takeshin
    I'm sending files in an action helper for downloads (in parts if needed) like this:

        ...
        $response->sendHeaders();

        $chunksize = 1 * (1024 * 1024);
        $bytesSent = 0;

        if ($httpRange) {
            fseek($file, $range);
        }

        while (!feof($file) && (!connection_aborted() && ($bytesSent < $newLength))) {
            $buffer = fread($file, $chunksize);
            // $response->appendBody($buffer); // this would be better
            print($buffer);
            flush();
            $bytesSent += strlen($buffer);
        }
        fclose($file);

    I suspect that a better way would be to make use of the $response object instead of print. What is the recommended way to send big response objects using Zend Framework?

    Read the article

  • Improve DrawingVisual rendering speed

    - by Michael Hao
    I created my own FrameworkElement, overriding VisualChildrenCount { get; } and GetVisualChild(int index) to return my own DrawingVisual collection, and I have also overridden OnRender. I add 20-50 DrawingVisuals to this FrameworkElement, and every DrawingVisual has about 2000 line segments. The logical values of these points are between 0 and 60000. When I zoom in to 1:1, the FrameworkElement's height becomes 60000 and the rendering time is 15 minutes!! How do I improve the rendering performance?
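
    One common way to cut DrawingVisual rendering cost in WPF is to collapse the thousands of per-segment drawing calls into a single frozen StreamGeometry drawn with a frozen Pen; frozen Freezables skip change tracking and render much more cheaply. The sketch below is not the poster's code and assumes each visual's segment endpoints are available as a non-empty Point list:

        using System.Collections.Generic;
        using System.Windows;
        using System.Windows.Media;

        static class FastLineRendering
        {
            // Builds one DrawingVisual from a polyline using a single frozen
            // StreamGeometry instead of thousands of individual line segments.
            public static DrawingVisual BuildVisual(IList<Point> points)
            {
                var geometry = new StreamGeometry();
                using (StreamGeometryContext ctx = geometry.Open())
                {
                    ctx.BeginFigure(points[0], false, false);   // not filled, not closed
                    ctx.PolyLineTo(points, true, false);        // stroked, no smooth join
                }
                geometry.Freeze();                              // read-only => cheaper to render

                var pen = new Pen(Brushes.Black, 1.0);
                pen.Freeze();

                var visual = new DrawingVisual();
                using (DrawingContext dc = visual.RenderOpen())
                {
                    dc.DrawGeometry(null, pen, geometry);
                }
                return visual;
            }
        }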

    Read the article

  • Slow insert speed in PostgreSQL memory tablespace

    - by Prashant
    Hi, I have a requirement where I need to store records at a rate of 10,000 records/sec into a database (with indexing on a few fields). The number of columns in one record is 25, and I am doing a batch insert of 100,000 records in one transaction block. To improve the insertion rate, I changed the tablespace from disk to RAM. With that I am able to achieve only 5,000 inserts per second. I have also done the following tuning in the Postgres config:

        Indexes : no
        fsync   : false
        logging : disabled

    Other information:

        Tablespace : RAM
        Number of columns in one row : 25 (mostly integers)
        CPU : 4 core, 2.5 GHz
        RAM : 48 GB

    I am wondering why a single insert query is taking around 0.2 msec on average when the database is not writing anything to disk (as I am using a RAM-based tablespace). Is there something I am doing wrong? Help appreciated. Prashant

    Read the article

  • FtpWebRequest Download File

    - by pm_2
    The following code is intended to retrieve a file via FTP. However, I'm getting an error with it.

        serverPath = "ftp://x.x.x.x/tmp/myfile.txt";

        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(serverPath);
        request.KeepAlive = true;
        request.UsePassive = true;
        request.UseBinary = true;
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = new NetworkCredential(username, password);

        // Read the file from the server & write to destination
        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse()) // Error here
        using (Stream responseStream = response.GetResponseStream())
        using (StreamReader reader = new StreamReader(responseStream))
        using (StreamWriter destination = new StreamWriter(destinationFile))
        {
            destination.Write(reader.ReadToEnd());
            destination.Flush();
        }

    The error is:

        The remote server returned an error: (550) File unavailable (e.g., file not found, no access)

    The file definitely does exist on the remote machine and I am able to perform this FTP manually (i.e. I have permissions). Can anyone tell me why I might be getting this error?
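
    For what it's worth, a 550 on GetResponse() is often a path problem rather than a permissions one: many servers resolve the path in the FTP URI relative to the account's login directory, so "/tmp/myfile.txt" may not point where an interactive FTP session suggests. A hedged debugging sketch (placeholder host and credentials) that lists what the server actually exposes at that path:

        using System;
        using System.IO;
        using System.Net;

        class FtpPathCheck
        {
            static void Main()
            {
                // List the directory the URI points at to see the server's view of it.
                var request = (FtpWebRequest)WebRequest.Create("ftp://x.x.x.x/tmp/");
                request.Method = WebRequestMethods.Ftp.ListDirectoryDetails;
                request.Credentials = new NetworkCredential("username", "password");

                using (var response = (FtpWebResponse)request.GetResponse())
                using (var reader = new StreamReader(response.GetResponseStream()))
                {
                    Console.WriteLine(reader.ReadToEnd());            // entries the server sees
                    Console.WriteLine("Status: " + response.StatusDescription);
                }
            }
        }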

    Read the article

  • jQuery pre document ready event

    - by Luke Duddridge
    Hi, I have an unordered list of 179 hyperlinks with thumbnail images that I am trying to apply a jQuery lightbox tool to. The problem I have is that the jQuery isn't firing until the images have finished downloading. Each image is around 23K, so on their own not so big, but as a group this equates to around 4MB. There is a delay in IE (the main browser used by clients) of a good 5 seconds before the page has completely downloaded every thumbnail and then allows the jQuery to fire. I have tried putting the jQuery document ready event in various places with no success, and have only been able to put a band-aid on it by setting the CSS on the ul to hide with display:none, then applying .show() after the lightbox has been applied. I was hoping there is a way to make the jQuery scripts fire before all the content has downloaded? Cheers

    Read the article

  • When sending headers to download a PDF, Safari appends .html

    - by alex
    Here are the request and response headers:

        http://www.example.com/get/pdf

        GET /~get/pdf HTTP/1.1
        Host: www.example.com
        User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.6; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3
        Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
        Accept-Language: en-us,en;q=0.5
        Accept-Encoding: gzip,deflate
        Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
        Keep-Alive: 115
        Connection: keep-alive
        Referer: http://www.example.com
        Cookie: etc

        HTTP/1.1 200 OK
        Date: Thu, 29 Apr 2010 02:20:43 GMT
        Server: Apache/2.2.14 (Unix) mod_ssl/2.2.14 OpenSSL/0.9.8i DAV/2 mod_auth_passthrough/2.1 mod_bwlimited/1.4 FrontPage/5.0.2.2635
        X-Powered-By: Me
        Expires: Thu, 19 Nov 1981 08:52:00 GMT
        Pragma: no-cache
        Cache-Control: private
        Content-Disposition: attachment; filename="File #1.pdf"
        Content-Length: 18776
        Keep-Alive: timeout=5, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=utf-8

    Basically, the response headers are sent by DOMPDF's stream() method. In Firefox, the file is prompted as File #1.pdf. However, in Safari, the file is saved as File #1.pdf.html. Does anyone know why Safari is appending the .html extension to the filename?

    Read the article

  • Downloading RGoogleDocs for R (it fails)

    - by Tal Galili
    Hi all, I am trying:

        install.packages("RGoogleDocs", repos = "http://www.omegahat.org/R")

    as suggested here, but it doesn't work. I ended up manually downloading the file from here. What other ways are there for me to get to the file directly? Thanks, Tal

    Read the article

  • Cassandra random read speed

    - by Jody Powlette
    We're still evaluating Cassandra for our data store. As a very simple test, I inserted a value for 4 columns into the Keyspace1/Standard1 column family on my local machine, amounting to about 100 bytes of data. Then I read it back as fast as I could by row key. I can read it back at 160,000/second. Great.

    Then I put in a million similar records, all with keys in the form of X.Y where X is in (1..10) and Y is in (1..100,000), and I queried for a random record. Performance fell to 26,000 queries per second. This is still well above the number of queries we need to support (about 1,500/sec).

    Finally I put ten million records in, from 1.1 up through 10.1000000, and randomly queried for one of the 10 million records. Performance is abysmal at 60 queries per second and my disk is thrashing around like crazy. I also verified that if I ask for a subset of the data, say the 1,000 records between 3,000,000 and 3,001,000, it returns slowly at first and then as they cache, it speeds right up to 20,000 queries per second and my disk stops going crazy.

    I've read all over that people are storing billions of records in Cassandra and fetching them at 5-6k per second, but I can't get anywhere near that with only 10mil records. Any idea what I'm doing wrong? Is there some setting I need to change from the defaults? I'm on an overclocked Core i7 box with 6 gigs of RAM, so I don't think it's the machine.

    Here's my code to fetch records, which I'm spawning into 8 threads to ask for one value from one column via row key:

        ColumnPath cp = new ColumnPath();
        cp.Column_family = "Standard1";
        cp.Column = utf8Encoding.GetBytes("site");
        string key = (1 + sRand.Next(9)) + "." + (1 + sRand.Next(1000000));
        ColumnOrSuperColumn logline = client.get("Keyspace1", key, cp, ConsistencyLevel.ONE);

    Thanks for any insights

    Read the article

  • C# Lambda Expression Speed

    - by Nathan
    I have not used many lambda expressions before, and I ran into a case where I thought I could make slick use of one. I have a custom list of ~19,000 records and I need to find out if a record exists or not in the list, so instead of writing a bunch of loops or using LINQ to go through the list I decided to try this:

        for (int i = MinX; i <= MaxX; ++i)
        {
            tempY = MinY;
            while (tempY <= MaxY)
            {
                bool exists = myList.Exists(item => item.XCoord == i && item.YCoord == tempY);
                ++tempY;
            }
        }

    The only problem is it takes ~9-11 seconds to execute. Am I doing something wrong, or is this just a case where I shouldn't be using an expression like this? Thanks.
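
    One thing worth noting: List&lt;T&gt;.Exists scans the whole 19,000-item list for every (i, tempY) pair, so the nested loops do O(width * height * n) work regardless of how the predicate is written. A possible alternative (not the poster's code, reusing myList/MinX/MaxX/MinY/MaxY from the question and assuming XCoord/YCoord are ints) is to hash the coordinates once so each lookup is O(1):

        using System.Collections.Generic;

        // Build the lookup set once.
        var coords = new HashSet<long>();
        foreach (var item in myList)
        {
            // Pack the two 32-bit coordinates into one 64-bit key.
            coords.Add(((long)item.XCoord << 32) | (uint)item.YCoord);
        }

        for (int i = MinX; i <= MaxX; ++i)
        {
            for (int tempY = MinY; tempY <= MaxY; ++tempY)
            {
                bool exists = coords.Contains(((long)i << 32) | (uint)tempY);
                // ... use exists ...
            }
        }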

    Read the article

  • Silverlight file download for COM Interop

    - by rip
    Is the following possible in Silverlight when a button is clicked?

        1. An Excel template is downloaded from a remote server and saved to the local machine
        2. An instance of the template is then opened on the client
        3. A macro is then executed within the new Excel document

    I can do everything apart from saving the template to the local machine. I can save this in isolated storage but then I don't know where this is when trying to open it from the Excel COM interop code. Has anyone any ideas, or is this not possible?
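
    One route that may be worth checking (an assumption on my part, not something from the question): Silverlight 4 running out-of-browser with elevated trust can write to the user's Documents folder and drive Excel through AutomationFactory, so the saved path is known when the workbook is opened; none of this works from a normal sandboxed in-browser app. A rough sketch with a placeholder URL and macro name:

        using System;
        using System.IO;
        using System.Net;
        using System.Runtime.InteropServices.Automation;

        public static class TemplateDownloader
        {
            public static void DownloadAndRun()
            {
                var client = new WebClient();
                client.OpenReadCompleted += (s, e) =>
                {
                    // Save the template to My Documents so there is a real file path.
                    string path = Path.Combine(
                        Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments),
                        "template.xlsm");

                    using (var file = File.Create(path))
                    {
                        e.Result.CopyTo(file);
                    }

                    if (AutomationFactory.IsAvailable)
                    {
                        dynamic excel = AutomationFactory.CreateObject("Excel.Application");
                        excel.Visible = true;
                        excel.Workbooks.Open(path);
                        excel.Run("MyMacro");   // placeholder macro name
                    }
                };

                client.OpenReadAsync(new Uri("http://example.com/template.xlsm"));
            }
        }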

    Read the article

  • Speed up compilation with Mockito on Android

    - by pbreault
    I am currently developing an Android app in Eclipse using:

        - One project for the app
        - One project for the tests (instrumentation and POJO tests)

    In the test project, I am importing the Mockito library for standard POJO testing. However, when I import the library, the compilation time skyrockets from 1 second to about 30 seconds in Eclipse. The cause seems to be that the whole library is converted each time. So basically, each time I make a modification that I want to test, I have to wait 30 seconds. The only workarounds that I have found so far would be:

        - Disable "Build Automatically"
        - Create a project that includes only POJO tests and put Mockito only there
        - Use another library that compiles faster (e.g. EasyMock)

    Any other suggestion?

    Read the article

  • SQL: Speed Improvement - Cluttered union query

    - by vol7ron
        SELECT *
        FROM (
            SELECT a.user_id, a.f_name, a.l_name, b.user_id, b.f_name, b.l_name
            FROM current_tbl a
            INNER JOIN import_tbl b ON ( a.user_id = b.user_id )
            UNION
            SELECT a.user_id, a.f_name, a.l_name, b.user_id, b.f_name, b.l_name
            FROM current_tbl a
            INNER JOIN import_tbl b ON (     lower(a.f_name) = lower(b.f_name)
                                         AND lower(a.l_name) = lower(b.l_name) )
        ) foo
        --
        UNION
        --
        SELECT a.user_id, a.f_name, a.l_name, '', '', ''
        FROM current_tbl a
        WHERE a.user_id NOT IN (
            select user_id from (
                SELECT a.user_id, a.f_name, a.l_name, b.user_id, b.f_name, b.l_name
                FROM current_tbl a
                INNER JOIN import_tbl b ON ( a.user_id = b.user_id )
                UNION
                SELECT a.user_id, a.f_name, a.l_name, b.user_id, b.f_name, b.l_name
                FROM current_tbl a
                INNER JOIN import_tbl b ON (     lower(a.f_name) = lower(b.f_name)
                                             AND lower(a.l_name) = lower(b.l_name) )
            ) bar
        )
        ORDER BY user_id

    Example of table population:

        current_tbl:
        -------------------------------
         user_id |  f_name  |  l_name
        ---------+----------+----------
         A1      | Adam     | Acorn
         A2      | Beth     | Berry
         A3      | Calv     | Chard
                 |          |

        import_tbl:
        -------------------------------
         user_id |  f_name  |  l_name
        ---------+----------+----------
         A1      | Adam     | Acorn
         A2      | Beth     | Butcher   <- last name different
                 |          |

        Expected output:
        ----------------------------------------------------------------
         user_id1 | f_name1 | l_name1 | user_id2 | f_name2 | l_name2
        ----------+---------+---------+----------+---------+----------
         A1       | Adam    | Acorn   | A1       | Adam    | Acorn
         A2       | Beth    | Berry   | A2       | Beth    | Butcher
         A3       | Calv    | Chard   |          |         |

    Doing this method gets rid of conditions where the row would be:

         A2       | Beth    | Berry   | A2       | Beth    | Butcher

    but it keeps the A3 row. I hope this makes sense and I haven't overly simplified it. This is a continuation question from my other question. The succession of these improvements has dropped the query down from ~32000ms to where it's at now, ~1200ms - quite an improvement. I suspect I can optimize further by using UNION ALL in the subquery and of course the usual index optimizations, but I'm looking for the best SQL optimization. FYI, this particular case is for PostgreSQL.

    Read the article

  • Progressive MP4 video issues in Flash - video stops rendering

    - by Conor
    I'm currently working on a Flash project that has an intro video that plays before heading into the main app. This video is an H.264 .mp4, 1550x540, and around 10MB. The problem that's currently driving me insane is that when I test it, occasionally the video will begin playing and then suddenly stop rendering the video frames, leaving the audio playing in the background with nothing on screen. Once the file has played through fully (based on listening to the audio), my playback complete event fires like it should, but I can't find any info on people having similar issues. Attached is a trace of the .mp4 metadata in case that helps.

        videoframerate : 24
        audiochannels : 2
        audiocodecid : mp4a
        audiosamplerate : 48000
        trackinfo:
          0:
            length : 608000
            timescale : 24000
            language : eng
            sampledescription:
              0:
                sampletype : avc1
          1:
            length : 1218560
            timescale : 48000
            language : eng
            sampledescription:
              0:
                sampletype : mp4a
        duration : 25.386666666666667
        width : 1540
        videocodecid : avc1
        seekpoints:
          0:  time : 0       offset : 13964
          1:  time : 0.333   offset : 16893
          2:  time : 0.667   offset : 34212
          ...
          73: time : 24.333  offset : 9770329
          74: time : 24.667  offset : 9845709
          75: time : 25      offset : 9895215
        moovposition : 32
        height : 540
        avcprofile : 77
        avclevel : 51
        aacaot : 2

    This has been driving me absolutely insane... any help would be much appreciated!

    Read the article

  • How can I change the text in a <span></span> element using jQuery?

    - by Eric Reynolds
    I have a span element as follows: <span id="download">Download</span>. This element is controlled by a few radio buttons. Basically, what I want to do is have 1 button to download the item selected by the radio buttons, but I am looking to make it a little more "flashy" by changing the text inside the <span> to say more specifically what they are downloading. The span is the download button, and I have it animated so that the span calls slideUp(), then should change the text, then return with slideDown(). Here is the code I am using that does not want to work:

        $("input[name=method]").change(function() {
            if ($("input[name=method]").val() == 'installer') {
                $('#download').slideUp(500);
                $('#download').removeClass("downloadRequest").removeClass("styling").css({"cursor":"default"});
                $('#download').text("Download");
                $('#download').addClass("downloadRequest").addClass("styling").css({"cursor":"pointer"});
                $('#download').slideDown(500);
            } else if ($("input[name=method]").val() == 'url') {
                $('#download').slideUp(500);
                $('#download').removeClass("downloadRequest").removeClass("styling").css({"cursor":"default"});
                $('#download').text("Download From Vendor Website");
                $('#download').addClass("styling").addClass("downloadRequest").css({"cursor":"pointer"});
                $('#download').slideDown(500);
            }
        });

    I changed the code a bit to be more readable, so I know it doesn't use the shorthand that jQuery so eloquently allows. Everything in the code works, with the exception of the changing of the text inside the span. I'm sure it's a simple solution that I am just overlooking. Any help is appreciated, Eric R.

    Read the article

  • Speed up SQL Server Fulltext Index through Text Duplication of Non-Indexed Columns

    - by Alex
    1) I have the text fields FirstName, LastName, and City. They are fulltext indexed.
    2) I also have the FK int fields AuthorId and EditorId, not fulltext indexed.

    A search on FirstName = 'abc' AND AuthorId = 1 will first search the entire fulltext index for 'abc', and then narrow the result set for AuthorId = 1. This is bad because it is a huge waste of resources, as the fulltext search will be performed on many records that won't be applicable. Unfortunately, to my knowledge, this can't be turned around (narrow by AuthorId first and then fulltext-search the subset that matches) because the FTS process is separate from SQL Server.

    Now my proposed solution, which I seek feedback on: does it make sense to create another computed column, included in the fulltext index, which identifies the author as text (e.g. AUTHORONE)? That way I could get rid of the AuthorId restriction and instead make it part of my fulltext search (a search for 'abc' would become 'abc' and 'AUTHORONE' - all executed as part of the fulltext search). Is this a good idea or not? Why?

    Read the article

  • Make clients download InstallShield prerequisites from the Internet

    - by LioKig
    Hi everyone, my InstallShield project uses custom prerequisites to install the .NET Framework 4.0 Client Profile and the Microsoft Sync Framework 2.0 client package. I want to let clients download the .NET Framework and Sync Framework directly from the Internet so that our installer stays small, but I can't see a way to do this. If you could give some advice or an example, it would be most appreciated. Cheers

    Read the article

  • OutOfMemoryException when I read a 500 MB file with FileStream

    - by Alhambra Eidos
    Hi all, I'm using FileStream to read a big file (~500 MB) and I get an OutOfMemoryException. Are there any solutions for this? My code is:

        using (var fs3 = new FileStream(filePath2, FileMode.Open, FileAccess.Read))
        {
            byte[] b2 = ReadFully(fs3, 1024);
        }

        public static byte[] ReadFully(Stream stream, int initialLength)
        {
            // If we've been passed an unhelpful initial length, just
            // use 32K.
            if (initialLength < 1)
            {
                initialLength = 32768;
            }

            byte[] buffer = new byte[initialLength];
            int read = 0;
            int chunk;
            while ((chunk = stream.Read(buffer, read, buffer.Length - read)) > 0)
            {
                read += chunk;

                // If we've reached the end of our buffer, check to see if there's
                // any more information
                if (read == buffer.Length)
                {
                    int nextByte = stream.ReadByte();

                    // End of stream? If so, we're done
                    if (nextByte == -1)
                    {
                        return buffer;
                    }

                    // Nope. Resize the buffer, put in the byte we've just
                    // read, and continue
                    byte[] newBuffer = new byte[buffer.Length * 2];
                    Array.Copy(buffer, newBuffer, buffer.Length);
                    newBuffer[read] = (byte)nextByte;
                    buffer = newBuffer;
                    read++;
                }
            }

            // Buffer is now too big. Shrink it.
            byte[] ret = new byte[read];
            Array.Copy(buffer, ret, read);
            return ret;
        }

    Thanks in advance,
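
    A hedged alternative (not from the original post): rather than pulling the whole 500 MB file into one byte[] (which needs a single contiguous allocation and is repeatedly doubled by ReadFully), process the file in fixed-size chunks. ProcessChunk below is a placeholder for whatever is actually done with the data:

        using System;
        using System.IO;

        static class ChunkedReader
        {
            public static void ReadInChunks(string path)
            {
                byte[] buffer = new byte[64 * 1024];            // 64 KB working buffer
                using (var fs = new FileStream(path, FileMode.Open, FileAccess.Read))
                {
                    int read;
                    while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        ProcessChunk(buffer, read);             // handle 'read' bytes of this chunk
                    }
                }
            }

            static void ProcessChunk(byte[] data, int count)
            {
                // placeholder for the real work
            }
        }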

    Read the article

  • Improving the speed of PHP

    - by cast01
    I'm currently working on a website in PHP, and I'm wondering what the best practices/methods are to reduce the time requests take. I've built the site in a modular way, so a page consists of a number of modules, and each of these needs to request information. For example, I have a cart module that (if a cart is set) will fetch the cart with the id stored in a session variable from the database and return its contents. I have another module that lists categories, and this needs to fetch the categories from the database. My system is built with models, and each model might also make a request; for example, a category model will make a request to get the products in that category.

    Read the article

  • ColdFusion speed cost of FileExists

    - by davidosomething
    I want to, on every page:

        - check if a file exists
        - include that file if TRUE

    i.e.:

        <cfset variables.includes.header = ExpandPath("_inc_header.cfm")>
        <cfif FileExists(variables.includes.header)>
            <cfinclude template = "#variables.includes.header#">
        </cfif>

    Is this a good idea?

    Read the article
