Search Results

Search found 44076 results on 1764 pages for 'large text'.

  • Large ResultSet on a PostgreSQL query

    - by tuler
    I'm running a query against a table in a PostgreSQL database. The database is on a remote machine. The table has around 30 sub-tables that use PostgreSQL's partitioning capability. The query returns a large result set, around 1.8 million rows. In my code I use Spring's JDBC support, specifically the JdbcTemplate.query method, but my RowCallbackHandler is not being called. My best guess is that the PostgreSQL JDBC driver (I use version 8.3-603.jdbc4) is accumulating the entire result in memory before calling my code. I thought the fetchSize configuration would control this, but I tried it and nothing changed, even though I set it up as the PostgreSQL manual recommends. This query worked fine when I used Oracle XE, but I'm trying to migrate to PostgreSQL because of the partitioning feature, which is not available in Oracle XE. My environment: PostgreSQL 8.3, Windows Server 2008 Enterprise 64-bit, JRE 1.6 64-bit, Spring 2.5.6, PostgreSQL JDBC driver 8.3-603.
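
    For background on why fetchSize can appear to do nothing: the PostgreSQL JDBC driver only streams rows through a server-side cursor when autocommit is off, the statement is forward-only, and a positive fetch size is set; otherwise it buffers the entire result before the first callback runs. A minimal raw-JDBC sketch of those conditions (the table name and row handling are placeholders):

        Connection conn = dataSource.getConnection();
        try {
            conn.setAutoCommit(false);   // without this the driver buffers everything
            Statement st = conn.createStatement(
                    ResultSet.TYPE_FORWARD_ONLY, ResultSet.CONCUR_READ_ONLY);
            st.setFetchSize(1000);       // rows per round trip to the server
            ResultSet rs = st.executeQuery("SELECT * FROM partitioned_table");
            while (rs.next()) {
                // handle one row at a time; memory stays bounded
            }
            rs.close();
            st.close();
            conn.commit();
        } finally {
            conn.close();
        }

    With Spring, the equivalent is calling setFetchSize(1000) on the JdbcTemplate and running the query inside a transaction, since the transaction is what turns autocommit off.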

  • Selecting an item from a very large list

    - by Ben Fulton
    Suppose I have a list of a couple of thousand organizations and a user needs to be able to select one of them. The list is too large to populate in a dropdown at page load, and the user often knows what they want but it's not the first part of the organization name. That is, they know "Collections" but not that the precise name of the organization is "Department of Collections". So the user will need/want to type in some information. It's easy enough to use an autocompleting textbox of some kind, but I don't want to allow the user to type in random text - they have to choose one of the organizations explicitly. What's the best solution?
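
    A common pattern, sketched here in SQL Server-flavoured SQL against a hypothetical organizations(id, name) table: have the autocomplete endpoint do a substring match rather than a prefix match, cap the number of suggestions, and have the form submit the selected row's id rather than the typed text, so an explicit choice is enforced server-side.

        -- substring match, so typing "Collections" finds
        -- "Department of Collections"; TOP keeps the payload small
        SELECT TOP 20 id, name
        FROM organizations
        WHERE name LIKE '%' + @term + '%'
        ORDER BY name;

    Any typed text that never resolved to an id simply fails validation when the form is posted.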

  • Large number of Google Map Markers and IE6?

    - by Sivakanesh
    I'm working on an application that generates a large number of Google Maps markers (2,000 - 7,000) via JSON. I'm also using MarkerCluster. It works quickly in Chrome and Firefox, but IE6 takes a few minutes and then crashes the first time I try to zoom in. I'm not doing anything more than adding the markers to a map using jQuery and the GMaps API. So I looked at the following URL of the regular Google Maps site: http://maps.google.co.uk/maps?f=q&source=s_q&hl=en&q=hotel&sll=53.182996,-2.581787&sspn=1.494529,4.927368&ie=UTF8&split=1&rq=1&ev=p&hq=hotel&hnear=&ll=53.123702,-2.730103&spn=1.496594,4.927368&t=h&z=8 It shows a lot of tiny markers (~1000) and works fine in IE6. Do you have any ideas why this works while the markers added via the API struggle? Thanks

  • C#: Efficiently search a large string for occurrences of other strings

    - by Jon
    Hi, I'm using C# to continuously search for multiple string "keywords" within large strings, which are >= 4 KB. This code loops constantly, and sleeps aren't cutting down CPU usage enough while maintaining a reasonable speed. The bottleneck is the keyword-matching method. I've found a few possibilities, and all of them give similar efficiency. 1) http://tomasp.net/articles/ahocorasick.aspx - I do not have enough keywords for this to be the most efficient algorithm. 2) Regex, using an instance-level, compiled regex. - Provides more functionality than I require, and not quite enough efficiency. 3) String.IndexOf. - I would need to do a "smart" version of this for it to provide enough efficiency; looping through each keyword and calling IndexOf doesn't cut it. Does anyone know of any algorithms or methods that I can use to attain my goal?
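
    One middle ground worth benchmarking before hand-rolling Aho-Corasick (a sketch, not the asker's code): collapse all keywords into a single compiled alternation, so each 4 KB block is scanned once in total rather than once per keyword.

        using System;
        using System.Text.RegularExpressions;

        class KeywordScanner
        {
            private readonly Regex _pattern;

            public KeywordScanner(string[] keywords)
            {
                // Escape each keyword, then join with '|' so one compiled
                // pattern matches any of them in a single pass.
                string[] escaped = Array.ConvertAll(
                    keywords, new Converter<string, string>(Regex.Escape));
                _pattern = new Regex(string.Join("|", escaped), RegexOptions.Compiled);
            }

            public void Scan(string block)
            {
                for (Match m = _pattern.Match(block); m.Success; m = m.NextMatch())
                    Console.WriteLine("'{0}' at index {1}", m.Value, m.Index);
            }
        }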

  • IE Problem: Jagged Scrolling and Dragging Inside Large Viewport

    - by br4inwash3r
    My site is a single-page website with a very large "canvas" size, and to navigate around the site I'm using the jQuery scrollTo and Dragscrollable plugins. In IE 7 & 8 the scrolling/dragging movement is very jagged. At first I thought it was my script or some other plugin causing this, but after I stripped everything down it's still the same. I've tried a few tips I've found around here, but none is really working for me. I know I should be asking this question of the plugin developers; I did. I just thought maybe you guys have some other solution for this issue. Here's the URL to the demo site: http://satuhati.com/bare/template/ I really appreciate any help you can give :) Thanks.

  • Faster forking of large processes on Linux?

    - by timday
    What's the fastest, best way on modern Linux of achieving the same effect as a fork-execve combo from a large process? My problem is that the forking process is ~500 MB big, and a simple benchmarking test achieves only about 50 forks/s from that process (cf. ~1600 forks/s from a minimally sized process), which is too slow for the intended application. Some googling turns up vfork as having been invented as the solution to this problem... but also warnings not to use it. Modern Linux seems to have acquired the related clone and posix_spawn calls; are these likely to help? What's the modern replacement for vfork? I'm using 64-bit Debian Lenny on an i7 (the project could move to Squeeze if posix_spawn would help).
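
    To the posix_spawn question: yes, this is exactly its use case. Modern glibc implements posix_spawn with vfork-like semantics (CLONE_VM | CLONE_VFORK), so the parent's ~500 MB of page tables is never copied; on older glibc the POSIX_SPAWN_USEVFORK attribute flag was needed to get that behaviour. A minimal sketch, with /bin/true standing in for the real child:

        #include <spawn.h>
        #include <stdio.h>
        #include <sys/wait.h>

        extern char **environ;

        int main(void)
        {
            pid_t pid;
            char *const argv[] = { "/bin/true", NULL };

            /* no fork(): the parent's address space is shared, not
             * duplicated, until the child calls execve */
            int rc = posix_spawn(&pid, "/bin/true", NULL, NULL, argv, environ);
            if (rc != 0) {
                fprintf(stderr, "posix_spawn failed: %d\n", rc);
                return 1;
            }
            waitpid(pid, NULL, 0);
            return 0;
        }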

  • Speeding up inner joins between a large table and a small table

    - by Zaid
    This may be a silly question, but it may shed some light on how joins work internally. Let's say I have a large table L and a small table S (100K rows vs. 100 rows). Would there be any difference in terms of speed between the following two options?

        -- Option 1
        SELECT *
        FROM L INNER JOIN S
            ON L.id = S.id;

        -- Option 2
        SELECT *
        FROM S INNER JOIN L
            ON L.id = S.id;

    Notice that the only difference is the order in which the tables are joined. I realize performance may vary between different SQL implementations; if so, how would MySQL compare to Access?
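
    For what it's worth, an INNER JOIN is commutative and a cost-based optimizer picks the join order itself, so in MySQL the written order normally makes no difference; the plans can simply be compared. A sketch, assuming the id columns are indexed:

        EXPLAIN SELECT * FROM L INNER JOIN S ON L.id = S.id;
        EXPLAIN SELECT * FROM S INNER JOIN L ON L.id = S.id;
        -- MySQL typically shows the same plan for both: drive from the
        -- 100-row table S and probe L through its index on id.

    Whether Access's much simpler engine is equally order-insensitive is worth testing rather than assuming.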

  • Write large PDFs with Java sequentially

    - by Benjamin Muschko
    I am looking for a Java library that lets you write large PDFs sequentially with a minimal amount of memory. Most of the libraries I have looked at have to build up the document in memory before you can actually write it, and the problem I have to deal with is OutOfMemoryErrors. It would be great if I could flush the writer programmatically whenever needed, e.g. for each page. Does anyone have any recommendations? I need something with a license along the lines of the LGPL (so not the GPL or the Affero GPL that iText uses).
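
    One library worth evaluating is Apache PDFBox: its Apache 2.0 license satisfies an LGPL-style requirement, unlike modern iText. Below is a minimal page-by-page writing sketch against the PDFBox 2.x API; note that this alone does not guarantee bounded memory (the page tree is kept until save), so the memory profile for very large documents still has to be verified.

        import org.apache.pdfbox.pdmodel.PDDocument;
        import org.apache.pdfbox.pdmodel.PDPage;
        import org.apache.pdfbox.pdmodel.PDPageContentStream;
        import org.apache.pdfbox.pdmodel.font.PDType1Font;

        public class SequentialPdf {
            public static void main(String[] args) throws Exception {
                PDDocument doc = new PDDocument();
                for (int i = 0; i < 1000; i++) {
                    PDPage page = new PDPage();
                    doc.addPage(page);
                    // each page's content stream is closed as soon as
                    // the page is finished
                    PDPageContentStream content = new PDPageContentStream(doc, page);
                    content.beginText();
                    content.setFont(PDType1Font.HELVETICA, 12);
                    content.newLineAtOffset(50, 700);
                    content.showText("Page " + (i + 1));
                    content.endText();
                    content.close();
                }
                doc.save("large.pdf");
                doc.close();
            }
        }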

  • Problem with large Canvas in Silverlight

    - by Fury
    Hi, I am developing (using Silverlight 3) an application that creates a kind of timeline and places objects on it. For this purpose I need a really large Canvas (up to 2,000,000 pixels wide) with long lines on it, but whenever I create a Canvas even 40,000 pixels wide it behaves very strangely, randomly disappearing. I have found a post with a description of exactly the same problem on the Silverlight forums, and another one here on Stack Overflow. It seems to be a known problem since Silverlight 2, but I can't find any good workaround. Does anybody know of such a workaround, or can anyone check whether it is still an issue in Silverlight 4? Thanks in advance.
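
    The workaround generally suggested for huge-surface problems like this is to keep the 2,000,000-pixel coordinate space in your own data and let the Canvas itself stay small: position only the items near the current viewport, at coordinates relative to the viewport, so no element ever actually sits at x = 1,900,000. A rough sketch of that idea, where TimelineItem is a hypothetical record holding an element and its virtual position:

        private void LayoutViewport(Canvas canvas, List<TimelineItem> items,
                                    double viewportLeft, double viewportWidth)
        {
            canvas.Children.Clear();
            foreach (TimelineItem item in items)
            {
                // skip anything entirely outside the viewport
                if (item.VirtualX + item.Width < viewportLeft) continue;
                if (item.VirtualX > viewportLeft + viewportWidth) continue;
                // place visible items relative to the viewport
                Canvas.SetLeft(item.Element, item.VirtualX - viewportLeft);
                canvas.Children.Add(item.Element);
            }
        }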

  • How do you pronounce large hex numbers?

    - by warrenm
    This question might be subjective, but I'm hoping there's some consensus that I just don't know about. Short hex numbers are relatively easy to spell out (e.g., 0xC4A might be "cee-four-ay"). Hex numbers ending with a multiple of three zeros are likewise pretty easy (e.g., 0xC000 might be "cee-thousand"). But is there a concise way to pronounce 0xFFFF0000 or 0xCA000000? Magic numbers like 0xDEADBEEF are popular for their pronounceability, but I'm mostly asking about large-ish, round numbers that seem like they should have a more concise pronunciation.

  • Technical reasons for not having large background images in websites

    - by kees-kist
    Most websites tend to have either a solid color as the background or a small image that is repeated. Why aren't more websites using a large image (such as a photo) as the background? I can think of the following reasons: 1) Problems with different screen resolutions: too small, and gaps start to appear on the left and/or right side at higher resolutions; too big, and lower resolutions show only part of the image. 2) Bandwidth, although this is unlikely to be a problem for most websites. Are there any other reasons why such backgrounds are not used more often?

  • Upload large files (1 GB+) - ASP.NET

    - by Ramya Raj
    I need to upload large files of at least 1 GB in size. I am using ASP.NET, C#, and IIS 5.1 as my development platform. I am using HIF.PostedFile.InputStream.Read(fileBytes, 0, HIF.PostedFile.ContentLength) before calling File.WriteAllBytes(filePath, fileByteArray) (it doesn't get that far, but throws a System.OutOfMemoryException). Currently I have set the httpRuntime to executionTimeout="999999" maxRequestLength="2097151" (that's 2 GB!) useFullyQualifiedRedirectUrl="true" minFreeThreads="8" minLocalRequestFreeThreads="4" appRequestQueueLimit="5000" enableVersionHeader="true" requestLengthDiskThreshold="8192". I have also set maxAllowedContentLength="2097151" (I guess that's only for IIS 7), and I have changed the IIS connection timeout to 999,999 seconds too. I am unable to upload files of even 4578 KB (Ajaz-Uploader.zip).
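
    Separately from the config limits, reading ContentLength bytes into one array is itself what throws the OutOfMemoryException: a 1 GB byte[] simply cannot be allocated reliably. A sketch of copying the posted stream to disk in small chunks instead (the method name and 64 KB buffer size are illustrative):

        void SaveUpload(HttpPostedFile posted, string filePath)
        {
            byte[] buffer = new byte[64 * 1024];   // constant memory use
            using (System.IO.Stream input = posted.InputStream)
            using (System.IO.FileStream output = System.IO.File.Create(filePath))
            {
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                    output.Write(buffer, 0, read);
            }
        }

    ASP.NET may still buffer the request itself before handing it over (which is what requestLengthDiskThreshold moderates in .NET 2.0), so a chunked-upload component remains worth considering at these sizes.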

  • C# Importing Large Volume of Data from CSV to Database

    - by guazz
    What's the most efficient method to load a large volume of data from CSV (3 million+ rows) into a database? The data needs to be formatted (e.g. the name column needs to be split into first name and last name, etc.), and I need to do this as efficiently as possible, i.e. with time constraints. I am siding with the option of reading, transforming, and loading the data row by row using a C# application. Is this ideal? If not, what are my options? Should I use multithreading?
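
    For SQL Server the usual answer is SqlBulkCopy: read and transform the CSV line by line, collect rows into batches, and bulk-insert each batch, which is far faster than row-by-row INSERTs. A sketch with a deliberately naive split (no quoted-comma handling); the People table and its columns are placeholders:

        using System.Data;
        using System.Data.SqlClient;
        using System.IO;

        static void Import(string csvPath, string connectionString)
        {
            DataTable batch = new DataTable();
            batch.Columns.Add("FirstName", typeof(string));
            batch.Columns.Add("LastName", typeof(string));

            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            using (StreamReader reader = new StreamReader(csvPath))
            {
                conn.Open();
                bulk.DestinationTableName = "People";
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    string[] fields = line.Split(',');
                    string[] name = fields[0].Split(' ');   // naive name transform
                    batch.Rows.Add(name[0], name.Length > 1 ? name[1] : "");
                    if (batch.Rows.Count == 10000)
                    {
                        bulk.WriteToServer(batch);   // one bulk round trip per batch
                        batch.Clear();
                    }
                }
                if (batch.Rows.Count > 0) bulk.WriteToServer(batch);
            }
        }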

  • Large file download for a Rails project

    - by Horace Ho
    One client project will go online two months from now. One of the changed requirements is to support large file downloads (10 to 15 MB per RAW camera file, with 1000 to 5000 file downloads expected per day) worldwide for their customers. The process will be: there is an upload screen (via Paperclip) that saves to the Rails app's local public folder; an hourly task uploads the files to web storage (S3?); the download URL is then updated from the Paperclip URL to the web URL. Questions: is there a gem/plug-in for this purpose? If not, is there a gem/plug-in for S3 to recommend? Questions about the storage provider: is S3 recommended, or is there another service to recommend? The baseline is: the client's web server does not and will not have the bandwidth to handle the downloads. Thanks
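
    On the first question: Paperclip can write straight to S3 itself (via the aws-s3 gem), which would remove both the hourly copy task and the URL rewrite. A sketch, with the model, bucket, and path purely illustrative:

        class RawFile < ActiveRecord::Base
          has_attached_file :upload,
            :storage        => :s3,
            :s3_credentials => "#{RAILS_ROOT}/config/s3.yml",
            :bucket         => 'my-raw-files',
            :path           => ':attachment/:id/:style/:filename'
          # upload.url then points at S3, so downloads never touch the
          # Rails server's bandwidth
        end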

  • Unable to upload large files to Google Docs

    - by Preeti
    Hi, I am uploading a document to Google Docs as:

        DocumentsService myService = new DocumentsService("");
        myService.setUserCredentials("[email protected]", password);
        DocumentEntry newEntry = myService.UploadDocument(@"C:\Sample.txt", "Sample.txt");

    But when I try to upload a file of 3 MB it results in an exception: An unhandled exception of type 'Google.GData.Client.GDataRequestException' occurred in Google.GData.Client.dll. Additional information: Execution of request failed: http://docs.google.com/feeds/documents/private/full How can I upload large files to Google Docs? I am using version 2 of the Google API. Thanks

  • Storage for large gridded datasets

    - by nullglob
    I am looking for a good storage format for large, gridded datasets. The application is meteorology, and we would prefer a format that is common within this field (to help exchange data with others). I don't need to deal with special data structures, and there should be a Fortran API. I am currently considering HDF5, GRIB2 and NetCDF4. How do these formats compare in terms of data compression? What are their main limitations? How steep is the learning curve? Are there any other storage formats worth investigating? I have not found a great deal of material outlining the differences and pros/cons of these formats (there is one relevant SO thread, and a presentation comparing GRIB and NetCDF).

  • POST data disappearing on large file upload

    - by DfKimera
    I'm having issues with a file-upload utility in my PHP application. When sending large files (9 MB+) through the form, I get very odd behaviour: the POST data I've included in the form disappears, including the file information. I've already increased all the PHP limits I could (time limit, max input time, post max size, memory limit, and upload max filesize) and I still can't get the proper behaviour. I've tried replacing the regular HTTP form with a Flash-based solution (SWFUpload, www.swfupload.org); still the same behaviour. I've tried multiple files of similar sizes and it's definitely not an issue with a particular file. I've debugged the POST vars sent using Firebug, and the correct variables are still there in the header, together with the file. What could be going on here?
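
    One cause that matches these symptoms exactly: when the request body exceeds PHP's post_max_size, PHP discards it silently, so $_POST and $_FILES arrive empty even though the browser (and Firebug) sent everything. A small sketch to confirm whether that is what's happening:

        <?php
        // a non-empty body plus empty superglobals is the signature of
        // a request that blew past post_max_size
        $contentLength = isset($_SERVER['CONTENT_LENGTH'])
            ? (int) $_SERVER['CONTENT_LENGTH'] : 0;

        if ($contentLength > 0 && empty($_POST) && empty($_FILES)) {
            die('Upload exceeded post_max_size (' . ini_get('post_max_size') . ')');
        }

    Note that post_max_size has to exceed the file size plus all the other form fields, and it cannot be raised at runtime with ini_set; it must be set in the ini or per-directory config.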

  • How do I include extremely long literals in C++ source?

    - by BillyONeal
    Hello everyone :) I've got a bit of a problem. Essentially, I need to store a large list of whitelisted entries inside my program, and I'd like to include such a list directly -- I don't want to have to distribute other libraries and such, and I don't want to embed the strings into a Win32 resource, for a bunch of reasons I don't want to go into right now. I simply included my big whitelist in my .cpp file, and was presented with this error: 1>ServicesWhitelist.cpp(2807): fatal error C1091: compiler limit: string exceeds 65535 bytes in length The string itself is about twice the limit VC++ allows. What's the best way to include such a large literal in a program?
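
    One workaround, sketched below with obviously placeholder data: split the whitelist into several literals, each under the 65,535-byte ceiling, and store them as separate elements of an array. The C1091 limit applies to each individual string (including pieces concatenated into one literal), not to the array as a whole.

        // each element is an independent literal, so each only has to
        // stay under the 65,535-byte compiler limit
        static const char* const kWhitelistChunks[] = {
            "entry-0001;entry-0002;entry-0003",   // first chunk, < 64 KB
            "entry-5001;entry-5002;entry-5003",   // second chunk, < 64 KB
        };
        static const size_t kChunkCount =
            sizeof(kWhitelistChunks) / sizeof(kWhitelistChunks[0]);

    Consumers can either iterate the chunks directly or concatenate them into a single std::string once at startup.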

  • How to execute a large PHP script?

    - by atif089
    Well, basically I want to execute a script that may take as long as an hour or more. What I really want to do is send SMS messages to my users using a third-party API, so it's basically like I supply my script with an array of phone numbers and fire the method to send an SMS. However, assuming it takes 5 seconds to send 1 SMS and I want to send 1000 SMS messages, that is roughly 1 - 2 hours. I can't use set_time_limit() because I am on a shared host. One way to do this is to store the numbers in a session, execute each SMS, and use JavaScript to refresh that page until the end. That way I need to keep my browser open, and the execution will stop if my Internet connection is disconnected. So, is there any better way to do this? Hope I am clear enough in explaining what I want: I want to execute a large script that may take hours to run, without it timing out.
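
    The usual shared-host answer is to take the work out of the request/response cycle entirely: queue the numbers in a database table and let a cron-invoked script send a small batch per run, finishing well before any time limit. A sketch; the table, columns, and the sendSms() wrapper for the third-party API are all hypothetical:

        <?php
        // run from cron, e.g. every minute: php send_sms_batch.php
        $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

        $rows = $db->query(
            "SELECT id, phone FROM sms_queue WHERE sent = 0 LIMIT 10"
        )->fetchAll(PDO::FETCH_ASSOC);

        foreach ($rows as $row) {
            sendSms($row['phone']);   // ~5 s per message, 10 per run
            $db->exec("UPDATE sms_queue SET sent = 1 WHERE id = " . (int) $row['id']);
        }

    No browser needs to stay open, and a dropped connection cannot interrupt the sending.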

  • Rails: ActiveRecord gives out on a large single-statement insert

    - by Nik
    So I have, I think, around 36,000 records (just to be safe), a number I wouldn't think was too large for a modern SQL database like MySQL. Each record has just two attributes, so I collected them into one single insert statement:

        sql = "INSERT INTO tasks (attrib_a, attrib_b) VALUES (c1,d1),(c2,d2),(c3,d3)...(c36000,d36000);"
        ActiveRecord::Base.connection.execute sql

    which fails with:

        from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract_adapter.rb:219:in `log'
        from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:323:in `execute_without_analyzer'
        from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
        from C:/Ruby/lib/ruby/1.8/benchmark.rb:308:in `realtime'
        from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
        from (irb):53
        from C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/vendor/tzinfo-0.3.12/tzinfo/time_or_datetime.rb:242

    I don't know if the above info is enough; please do ask for anything that I didn't provide here. So any idea what this is about? THANK YOU!!!!
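
    The pasted backtrace is cut off before the actual error message, but with a single multi-megabyte INSERT the usual suspect is MySQL's max_allowed_packet. Batching the rows keeps each statement small; a sketch, where pairs is the [[c1, d1], ...] source data:

        conn = ActiveRecord::Base.connection
        pairs.each_slice(500) do |slice|
          values = slice.map { |a, b| "(#{conn.quote(a)}, #{conn.quote(b)})" }.join(',')
          # 500 rows per statement stays well under max_allowed_packet
          conn.execute("INSERT INTO tasks (attrib_a, attrib_b) VALUES #{values}")
        end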

  • Enable export to XML via HTTP on a large number of models with child relations

    - by Vasil
    I have a large number of models (120+) and I would like to let users of my application export all of the data from them in XML format. I looked at django-piston, but I would like to do this with minimal code. Basically I'd like to have something like this: GET /export/applabel/ModelName/ would stream all instances of ModelName in applabel, together with their trees of related objects. I'd like to do this without writing code for each model. What would be the best way to do this?
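
    Django's built-in serialization framework covers the flat version of this with no per-model code; the caveat is that it emits related objects as foreign-key values, not nested trees, so fully recursive export would still need extra traversal. A sketch (view name and URL wiring hypothetical, Django 1.x-era API):

        from django.core import serializers
        from django.db.models import get_model
        from django.http import Http404, HttpResponse

        def export_model(request, applabel, model_name):
            model = get_model(applabel, model_name)
            if model is None:
                raise Http404
            data = serializers.serialize("xml", model.objects.all())
            return HttpResponse(data, mimetype="application/xml")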

  • Ruby: Read large data from stdout and stderr of an external process on Windows

    - by BinaryMuse
    Greetings, all. I need to run a potentially long-running process from Ruby on Windows and subsequently capture and parse the data from the external process's standard output and error. A large amount of data can be sent to each, but I am only necessarily interested in one line at a time (not capturing and storing the whole of the output). After a bit of research, I found that the Open3 class will take care of executing the process and giving me IO objects connected to the process's standard output and error (via popen3):

        Open3.popen3("external-program.bat") do |stdin, out, err, thread|
          # Step3.profit() ?
        end

    However, I'm not sure how to continually read from both streams without blocking the program. Since calling IO#readlines on out or err when a lot of data has been sent results in a memory allocation error, I'm trying to continuously check both streams for available input, but not having much luck with any of my implementations. Thanks in advance for any advice!
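
    One approach that avoids both the readlines memory blow-up and blocking: dedicate a thread to each stream and consume a line at a time, so neither pipe's buffer fills up and stalls the child. A sketch, with the two handle_* methods as placeholders:

        require 'open3'

        Open3.popen3("external-program.bat") do |stdin, out, err, thread|
          stdin.close   # nothing is sent to the child
          readers = [
            Thread.new { out.each_line { |line| handle_stdout(line) } },
            Thread.new { err.each_line { |line| handle_stderr(line) } }
          ]
          readers.each { |t| t.join }   # both streams drained to EOF
        end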

  • slow record deletion with large ntext values

    - by asking
    I'm having trouble deleting some records, via a stored procedure, from a table in SQL Server 2008 R2 that has ntext columns. The stored procedure is timing out, and running the query directly takes a very long time. The initial query was a straight "delete from y where x = z", and I've also tried running it in batches of 1000 with transactions, but it is still slow and times out in the stored procedure. The majority of the records in the table will not be deleted each time (it's not just a one-off query but will be run at other times). The ntext columns are not used in the where clause, and I can't change the column types. Any suggestions on the quickest way to delete records with large ntext values? Thanks
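
    Two things usually matter here: an index on x, so each delete finds its rows without scanning the table, and short transactions per batch, so the ntext LOB cleanup and log growth stay bounded. For reference, the batched form as a sketch, with @z standing in for the procedure's parameter:

        DECLARE @rows INT;
        SET @rows = 1;
        WHILE @rows > 0
        BEGIN
            -- each iteration is its own short transaction
            DELETE TOP (1000) FROM y WHERE x = @z;
            SET @rows = @@ROWCOUNT;
        END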

  • How to transfer large files from desktop to server (.NET)

    - by rahulchandran
    I am writing a .NET 2.0-based desktop client that will send large files (well, largish: under 2 GB) to a server. I need to develop the server as well, and it can be on any technology. It should be secure, so an underlying SSL stream is needed. What are my options, and what obvious caveats should I be aware of? To my mind the simplest solution is to open a TCP/IP connection over SSL to the server, send n packets each of size M bytes, have the server append the chunks to the file, and finally send an EOF packet as well. Is this horrible? Will the performance suck on the server with all these disk writes? Are there any other clever options? I am limited to .NET 2.0 on the client; if I did move to a WCF client, would it buy me something magical and cool for this scenario? Thanks
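
    On the client side the chunked-send idea maps directly onto .NET 2.0's SslStream over a TcpClient; a sketch (host, port, and buffer size illustrative; certificate validation left at the defaults):

        using System.IO;
        using System.Net.Security;
        using System.Net.Sockets;

        static void Upload(string host, int port, string filePath)
        {
            using (TcpClient client = new TcpClient(host, port))
            using (SslStream ssl = new SslStream(client.GetStream()))
            {
                ssl.AuthenticateAsClient(host);   // TLS handshake + server cert check
                using (FileStream file = File.OpenRead(filePath))
                {
                    byte[] buffer = new byte[64 * 1024];
                    int read;
                    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                        ssl.Write(buffer, 0, read);   // server appends as chunks arrive
                }
            }
        }

    Sequential appends are cheap for the server's disk; the more likely bottlenecks are the TLS encryption and the network itself.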

  • Calculating very large exponents in Python

    - by miraclesoul
    Dear all, I am currently simulating my cryptographic scheme to test it. I have developed the code but I am stuck at one point. I am trying to compute g**x, where g and x are both 256-bit numbers. Python hangs at this point. I have read a lot of forums and threads, but have only come to the conclusion that Python hangs because it's hard for it to process such large numbers. Any idea how it can be done? Any two-line piece of code, any library, anything that can be done. (Also, please note that I am a new Python user and this is the first time I have programmed in it, so no complex methods... hope you understand :s)
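
    For the record, Python is not struggling with 256-bit numbers as such: plain g**x tries to materialize an integer with roughly 10**79 digits, which no machine can hold. Cryptographic schemes always reduce modulo the scheme's modulus, and the built-in three-argument pow does exactly that, efficiently (p below stands in for your modulus):

        # (g ** x) % p via fast modular exponentiation; returns instantly
        # even when g, x and p are all 256-bit numbers
        result = pow(g, x, p)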
