Search Results

Search found 65999 results on 2640 pages for 'large data volumes'.

  • Upload large files (1GB) - ASP.NET

    - by Ramya Raj
    I need to upload large files of at least 1GB. I am using ASP.NET, C# and IIS 5.1 as my development platform. I am using HIF.PostedFile.InputStream.Read(fileBytes, 0, HIF.PostedFile.ContentLength) before using File.WriteAllBytes(filePath, fileByteArray) (it never gets there, but throws a System.OutOfMemoryException).

    Currently I have set httpRuntime to executionTimeout="999999" maxRequestLength="2097151" (that's 2GB!) useFullyQualifiedRedirectUrl="true" minFreeThreads="8" minLocalRequestFreeThreads="4" appRequestQueueLimit="5000" enableVersionHeader="true" requestLengthDiskThreshold="8192". I have also set maxAllowedContentLength="2097151" (I guess that's IIS7-only), and I have changed the IIS connection timeout to 999,999 seconds too. I am still unable to upload files of even 4578KB (Ajaz-Uploader.zip).
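    The Read call above tries to hold the entire upload in one byte array, which is what exhausts memory. A minimal sketch of the streaming alternative, copying the posted file to disk in small chunks (buffer size is illustrative; HIF and filePath are the names from the question):

        using System.IO;

        const int BufferSize = 64 * 1024;           // copy in 64KB chunks
        byte[] buffer = new byte[BufferSize];

        using (Stream input = HIF.PostedFile.InputStream)
        using (FileStream output = File.Create(filePath))
        {
            int bytesRead;
            // Never hold more than one buffer's worth in memory.
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
                output.Write(buffer, 0, bytesRead);
        }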

  • Large file download for a Rails project

    - by Horace Ho
    One client project will go online two months from now. One of the changed requirements is to support large file downloads (10 to 15MB per RAW camera file, with 1000 to 5000 file downloads expected per day) worldwide for their customers. The process will be:
    1. an upload screen via Paperclip to the Rails local public folder
    2. an hourly task to upload to web storage (S3?)
    3. update the download URL from the Paperclip URL to the web URL
    Questions: is there a gem/plug-in for this purpose? If not, any gem/plug-in for S3 to recommend? About the storage provider: is S3 recommended, or is there another service you would recommend? The baseline is that the client's web server does not and will not have the bandwidth to handle the downloads. Thanks
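    For step 2, a minimal sketch of the hourly sync using the classic aws-s3 gem (bucket name, glob pattern and credential handling are illustrative); Paperclip also ships an S3 storage backend, which is worth checking before writing a sync task at all:

        require 'aws/s3'

        AWS::S3::Base.establish_connection!(
          :access_key_id     => ENV['S3_KEY'],
          :secret_access_key => ENV['S3_SECRET']
        )

        Dir.glob('public/system/raw/*').each do |path|
          AWS::S3::S3Object.store(File.basename(path), open(path), 'client-downloads')
          # ...then update the record's download URL to point at S3
        end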

  • Convert 12-hour time to 24-hour time

    - by RwardBound
    I have hourly weather data. I've seen the function examples from here: http://casoilresource.lawr.ucdavis.edu/drupal/node/991 I'm altering the code to account for airport data, which has a different URL type. Another issue with the airport weather data is that the time data is saved in 12-hour format. Here is a sample of the data:

        14 10:43 AM
        15 10:54 AM
        16 11:54 AM
        17 12:07 PM
        18 12:15 PM
        19 12:54 PM
        20 1:54 PM
        21 2:54 PM

    Here's what I attempted (I see that keying off 'PM' alone isn't careful enough, because any times between 12 and 1 PM will be off if they go through this algorithm):

        date <- Sys.Date()
        data$TimeEST <- strsplit(data$TimeEST, ' ')
        for (x in 1:35) {
          if ('AM' %in% data$TimeEST[[x]]) {
            gsub('AM', '', data$TimeEST[[x]])
            data$TimeEST[[x]] <- str_trim(data$TimeEST[[x]])
            data$TimeEST[[x]] <- str_c(date, ' ', data$TimeEST[x], ':', data$TimeEST[2])
          } else if ('PM' %in% data$TimeEST[[x]]) {
            data$TimeEST[[x]] <- gsub('PM', '', data$TimeEST[[x]])
            data$TimeEST[[x]] <- strsplit(data$TimeEST[[x]], ':')
            data$TimeEST[[x]][[1]][1] <- as.integer(data$TimeEST[[x]][[1]][1]) + 12
            data$TimeEST[[x]] <- str_trim(data$TimeEST[[x]][[1]])
            data$TimeEST[[x]] <- str_c(date, " ", data$TimeEST[[x]][1], ':', data$TimeEST[[x]][2])
          }
        }

    Any help?
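    For reference, a much shorter route is to let R parse the 12-hour clock directly: strptime understands %I (hour 01-12) together with %p (AM/PM), locale permitting, which also handles the 12 PM edge case. A sketch using the TimeEST column from the question:

        date <- Sys.Date()
        parsed <- strptime(paste(date, data$TimeEST), format = "%Y-%m-%d %I:%M %p")
        format(parsed, "%H:%M")   # 24-hour clock strings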

  • Unable to upload large files to Google Docs

    - by Preeti
    Hi, I am uploading a document to Google Docs as:

        DocumentsService myService = new DocumentsService("");
        myService.setUserCredentials("[email protected]", password);
        DocumentEntry newEntry = myService.UploadDocument(@"C:\Sample.txt", "Sample.txt");

    But when I try to upload a file of 3 MB it results in an exception:

        An unhandled exception of type 'Google.GData.Client.GDataRequestException' occurred in Google.GData.Client.dll
        Additional information: Execution of request failed: http://docs.google.com/feeds/documents/private/full

    How can I upload large files to Google Docs? I am using Google API ver 2. Thanks
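    A first diagnostic step, sketched under the assumption that the .NET GData client's GDataRequestException exposes the server reply via its ResponseString property: the response body usually names the actual limit that was hit (size cap, unsupported type, and so on).

        try
        {
            DocumentEntry newEntry = myService.UploadDocument(@"C:\Sample.txt", "Sample.txt");
        }
        catch (Google.GData.Client.GDataRequestException ex)
        {
            // The HTTP response body normally explains the rejection.
            Console.WriteLine(ex.ResponseString);
        }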

  • POST data disappearing on large file upload

    - by DfKimera
    I'm having issues with a file-uploading utility in my PHP application. When sending large files (9MB+) over the form, I get very odd behaviour: the POST data I've included in the form disappears, including the file information. I've already increased all the PHP limits I could (time limit, max input time, post max size, memory limit and upload max filesize) and I still can't get the proper behaviour. I've tried replacing the regular HTTP forms with a Flash-based solution (SWFUpload, www.swfupload.org), with the same result. I've tried multiple files of similar sizes, and it's definitely not a particular-file issue. I've debugged the POST vars sent using Firebug, and the correct variables are still there in the header, together with the file. What could be going on here?
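    One documented PHP behaviour matches this symptom exactly: when the request body exceeds post_max_size, PHP silently empties both $_POST and $_FILES. A small sketch to confirm which limits the server actually applies at runtime (some shared hosts override php.ini per vhost, so the file you edited may not be the one in effect):

        <?php
        // Dump the effective limits as PHP itself sees them.
        foreach (array('post_max_size', 'upload_max_filesize',
                       'max_input_time', 'memory_limit') as $key) {
            echo $key, ' = ', ini_get($key), "\n";
        }
        // Compare against what the browser actually sent.
        echo 'Content-Length: ', $_SERVER['CONTENT_LENGTH'], "\n";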

  • How do I include extremely long literals in C++ source?

    - by BillyONeal
    Hello everyone :) I've got a bit of a problem. Essentially, I need to store a large list of whitelisted entries inside my program, and I'd like to include such a list directly -- I don't want to have to distribute other libraries and such, and I don't want to embed the strings into a Win32 resource, for a bunch of reasons I don't want to go into right now. I simply included my big whitelist in my .cpp file, and was presented with this error:

        1>ServicesWhitelist.cpp(2807): fatal error C1091: compiler limit: string exceeds 65535 bytes in length

    The string itself is about twice the limit VC++ allows. What's the best way to include such a large literal in a program?
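    One workaround sketch: keep each individual literal well under the limit and store the pieces in an array, concatenating at startup only if a single string is really needed. Names and chunk contents here are illustrative:

        #include <string>

        // Each chunk stays safely below the 65,535-byte literal limit.
        static const char* const kWhitelistParts[] = {
            "entry0001;entry0002;entry0003;",   // ...first block of entries
            "entry5001;entry5002;entry5003;",   // ...remaining entries
        };

        std::string BuildWhitelist() {
            std::string all;
            const size_t n = sizeof(kWhitelistParts) / sizeof(kWhitelistParts[0]);
            for (size_t i = 0; i < n; ++i)
                all += kWhitelistParts[i];
            return all;
        }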

  • How to execute a large PHP script?

    - by atif089
    Well, basically I may want to execute a script that takes as much as an hour or more. What I really want to do is send SMS to my users using a third-party API. So it's basically like: I supply my script with an array of phone numbers and fire the method to send the SMS. However, assuming it takes 5 seconds to send 1 SMS and I want to send 1000 SMS, that is roughly 1-2 hours. I can't use set_time_limit() because I am on a shared host. One way to do this is to store the numbers in a session, execute each SMS, and use JavaScript to refresh the page until the end. This way I need to keep my browser open, and the execution will stop if my Internet connection is disconnected. So, is there any better way to do this? Hope I am clear enough in explaining what I want: I want to execute a large script that may take hours to execute, without it timing out.
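    The usual answer on a shared host is to move the work out of the request cycle entirely: queue the numbers, then let a cron job send a small batch per run, so no single execution ever approaches the time limit. A minimal sketch, where the table name sms_queue, the helper send_sms() and the batch size are all illustrative:

        <?php
        // Runs from cron every few minutes; each run sends one small batch.
        $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
        $rows = $pdo->query('SELECT id, phone FROM sms_queue WHERE sent = 0 LIMIT 50');
        foreach ($rows as $row) {
            send_sms($row['phone']);   // the third-party API call
            $pdo->exec('UPDATE sms_queue SET sent = 1 WHERE id = ' . (int) $row['id']);
        }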

  • Rails: large amount of data in a single insert - ActiveRecord gave out

    - by Nik
    So I have, I think, around 36,000 records (just to be safe), a number I wouldn't think was too large for a modern SQL database like MySQL. Each record has just two attributes. So I collected them into one single insert statement:

        sql = "INSERT INTO tasks (attrib_a, attrib_b) VALUES (c1,d1),(c2,d2),(c3,d3)...(c36000,d36000);"
        ActiveRecord::Base.connection.execute sql

    which fails with:

        from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/abstract_adapter.rb:219:in `log'
        from C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.3.5/lib/active_record/connection_adapters/mysql_adapter.rb:323:in `execute_without_analyzer'
        from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
        from C:/Ruby/lib/ruby/1.8/benchmark.rb:308:in `realtime'
        from c:/r/projects/vendor/plugins/rails-footnotes/lib/rails-footnotes/notes/queries_note.rb:130:in `execute'
        from (irb):53
        from C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.3.5/lib/active_support/vendor/tzinfo-0.3.12/tzinfo/time_or_datetime.rb:242

    I don't know if the above info is enough; please ask for anything that I didn't provide here. So, any idea what this is about? THANK YOU!!!!
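    A likely culprit is MySQL's max_allowed_packet: a single statement carrying 36,000 value tuples can exceed it. A hedged sketch of the usual workaround, batching the tuples (here `pairs` stands in for the 36,000 [attrib_a, attrib_b] tuples, and the batch size is illustrative):

        pairs.each_slice(1000) do |batch|
          values = batch.map { |a, b| "(#{a},#{b})" }.join(',')
          ActiveRecord::Base.connection.execute(
            "INSERT INTO tasks (attrib_a, attrib_b) VALUES #{values}"
          )
        end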

  • How to handle a large table in MySQL?

    - by Frantz Miccoli
    I have a database used to store items and properties about these items. The number of properties is extensible, so there is a join table to store each property value associated with an item:

        CREATE TABLE `item_property` (
            `property_id` int(11) NOT NULL,
            `item_id` int(11) NOT NULL,
            `value` double NOT NULL,
            PRIMARY KEY (`property_id`,`item_id`),
            KEY `item_id` (`item_id`)
        ) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;

    This database has two goals: storing (which has first priority and has to be very quick; I would like to perform many inserts, hundreds, in a few seconds), and retrieving data (selects using item_id and property_id; this is a second priority, and it can be slower, but not too much, because that would ruin my usage of the DB). Currently this table holds 1.6 billion entries, and a simple count can take up to 2 minutes... Inserting isn't fast enough to be usable. I'm using Zend_Db to access my data, and would really be happy if you don't suggest that I develop any PHP-side part. Thanks for your advice!
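    On the insert side, a sketch of the standard first step: group many rows into one statement inside a single transaction, which cuts both network round-trips and per-statement overhead (the values are illustrative):

        START TRANSACTION;
        INSERT INTO item_property (property_id, item_id, value)
        VALUES (1, 100, 0.5),
               (1, 101, 0.7),
               (2, 100, 1.2);   -- ...hundreds of tuples per statement
        COMMIT;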

  • Enable export to XML via HTTP on a large number of models with child relations

    - by Vasil
    I have a large number of models (120+) and I would like to let users of my application export all of the data from them in XML format. I looked at django-piston, but I would like to do this with minimum code. Basically I'd like to have something like this: GET /export/applabel/ModelName/ would stream all instances of ModelName in applabel, together with its tree of related objects. I'd like to do this without writing code for each model. What would be the best way to do this?
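    A minimal sketch of the generic-view half of this, using Django's built-in XML serializer (Django 1.x-era signatures; URL wiring omitted). Note that serialize() only covers each instance's own fields, so the tree of related objects would still need explicit handling, e.g. by serializing the related querysets as well:

        from django.core import serializers
        from django.db.models import get_model
        from django.http import HttpResponse

        def export(request, app_label, model_name):
            # Resolve "applabel/ModelName" from the URL to a model class.
            model = get_model(app_label, model_name)
            data = serializers.serialize('xml', model.objects.all())
            return HttpResponse(data, mimetype='application/xml')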

  • Slow record deletion with large ntext values

    - by asking
    I'm having trouble deleting some records via a stored procedure from a table in SQL Server 2008 R2 that has ntext columns. The stored proc is timing out, and running the query directly takes a very long time. The initial query was a straight "delete from y where x = z", and I've also tried running it in batches of 1000 inside transactions, but it is still slow and still times out in the stored proc. The majority of the records in the table will not be deleted each time (it's not just a once-off query; it will be run at other times too). The ntext columns are not used in the where clause, and I can't change the column types. Any suggestions on the quickest way to delete records with large ntext values? Thanks
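    For reference, a sketch of the bounded-batch pattern on SQL Server (batch size illustrative). Pairing it with an index on x matters: without one, every batch re-scans the table, and each deleted row additionally pays for deallocating its ntext LOB pages.

        -- @z is the filter value; an index on x keeps each batch cheap.
        WHILE 1 = 1
        BEGIN
            DELETE TOP (1000) FROM y WHERE x = @z;
            IF @@ROWCOUNT = 0 BREAK;
        END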

  • Calculating very large exponents in Python

    - by miraclesoul
    Dear all, currently I am simulating my cryptographic scheme to test it. I have developed the code, but I am stuck at one point. I am trying to take g**x, where g is a 256-bit number and x is a 256-bit number. Python hangs at this point. I have read a lot of forums, threads, etc., but have only come to the conclusion that Python hangs, as it's hard for it to process such large numbers. Any idea how it can be done? Any two-line piece of code, any library, anything that can be done. (Also, please note that I am a new Python user and this is the first time I have programmed in it, so no complex methods... hope you understand :s)
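    Worth noting: a plain g**x with 256-bit g and x would have on the order of 10^78 digits, so no machine can materialize it; cryptographic schemes compute it modulo some n, and Python's built-in three-argument pow does exactly that efficiently. A sketch with illustrative stand-in values:

        # Stand-ins for the scheme's real 256-bit parameters.
        g = 2**255 + 95      # base
        x = 2**255 + 61      # exponent
        n = 2**256 - 189     # modulus
        print(pow(g, x, n))  # (g ** x) % n via fast modular exponentiation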

  • Core Data and iTunes File Sharing - Move/hide the .sqlite file on app update?

    - by Eric
    I have an iPad app that uses Core Data for data storage. I would like to enable file sharing in iTunes and I don't really want the users to be able to delete or modify the .sqlite file. Can I move the file to a different, hidden directory? Alternatively, could the file be made read-only? I wouldn't mind users having access to the file as long as it couldn't be changed. I suspect there is a trivial solution that is escaping me at the moment.
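    One approach, sketched in the era's Objective-C: iTunes file sharing only exposes the app's Documents folder, so pointing the Core Data store at Library/ (which is not shared) hides it from users entirely. The store file name is illustrative, and `coordinator` stands in for the app's NSPersistentStoreCoordinator:

        // Build a store URL under Library/ instead of Documents/.
        NSString *libraryDir = [NSSearchPathForDirectoriesInDomains(
            NSLibraryDirectory, NSUserDomainMask, YES) lastObject];
        NSString *storePath = [libraryDir stringByAppendingPathComponent:@"Model.sqlite"];

        NSError *error = nil;
        [coordinator addPersistentStoreWithType:NSSQLiteStoreType
                                  configuration:nil
                                            URL:[NSURL fileURLWithPath:storePath]
                                        options:nil
                                          error:&error];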

  • Data clean up: are there libraries of common permutations that we can use? Or is there a better approach?

    - by anyaelena
    We are working on clean-up and analysis of a lot of human-entered customer data. We need to decide programmatically whether two addresses (for example) are the same, even though the data was entered with slight variations. Right now we run each address through fairly simplistic string replacement (replacing "avenue" with "ave", for example), concatenate the fields and compare the results. We are doing something similar with names. At the very least, it seems like our list of search-replace values should already exist somewhere. Or perhaps you can suggest a totally different and superior way to detect matches?
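    Alongside a replacement list, a similarity score catches variants the list misses. A sketch using difflib from Python's standard library (the threshold is illustrative and would need tuning against real data):

        from difflib import SequenceMatcher

        def same_address(a, b, threshold=0.8):
            # Cheap normalization first, then a similarity ratio in [0, 1].
            a, b = a.lower().strip(), b.lower().strip()
            return SequenceMatcher(None, a, b).ratio() >= threshold

        print(same_address("123 Main Avenue", "123 Main Ave."))  # True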

  • How to transfer large files from desktop to server (.NET)

    - by rahulchandran
    I am writing a .NET 2.0-based desktop client that will send large files (well, largish: under 2GB) to a server. I need to develop the server as well, and the server can be on any technology. It should be secure, so an underlying SSL stream is needed. What are my options? Any obvious caveats etc. I should be aware of? To my mind the simplest solution is to open a TCP/IP connection over SSL to the server, send n packets each of size M bytes, have the server append the chunks to the file, and finally send an EOF packet as well. Is this horrible? Will the perf suck on the server with all these disk writes? What other clever options are there? I am limited to .NET 2.0 on the client; if I did move to a WCF client, would it buy me something magical and cool for this scenario? Thanks
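    A minimal sketch of the client side of that plan on .NET 2.0, using SslStream over TcpClient (host, port and chunk size are illustrative; the server needs a matching read-and-append loop plus certificate setup):

        using System.IO;
        using System.Net.Security;
        using System.Net.Sockets;

        using (TcpClient client = new TcpClient("server.example.com", 8443))
        using (SslStream ssl = new SslStream(client.GetStream()))
        {
            ssl.AuthenticateAsClient("server.example.com");
            byte[] buffer = new byte[64 * 1024];
            using (FileStream file = File.OpenRead(path))
            {
                int read;
                // Stream the file; only one chunk is ever in memory.
                while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    ssl.Write(buffer, 0, read);
            }
        }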

  • A programming language for teaching data structures and algorithms with? [closed]

    - by Andreas Grech
    Possible Duplicate: Choice of programming language for learning data structures and algorithms Teachers have different opinions on what programming language they would choose to teach data structures and algorithms with. Some would prefer a lower level language such as C because it allows the student to learn more about what goes on beyond the abstractions in terms of memory allocation and deallocation and pointers and pointer arithmetic. On the other hand, others would say that they would prefer a higher level language like Java because it allows the student to learn more about the concepts of the structures and the algorithm design rather than 'waste time' and fiddle around with memory segmentation faults and all the blunders that come with languages where memory management is manual. What is your take on this issue? And also, please post any references you may know of that also discuss this argument.

  • Redirecting a large number of URLs with htaccess or PHP headers

    - by Peter
    I have undergone a major website overhaul and now have 5,000+ incoming links from search engines, external sites, bookmark services etc. that lead to dead pages or 404 errors. A lot of the pages have corresponding "permalinks" or a known replacement hierarchy/URL structure. I've started to list the main redirects with htaccess, or as physical files containing simply a header location redirect, which is clearly not sustainable! What would be the best method to list all of the old link addresses and their corresponding new addresses: htaccess, PHP headers, MySQL, a sitemap file? Or is it better to leave the links broken and wait for search engines etc. to re-index my site? Are there any implications of having a large number of redirecting files for this temporary period until the links are reset?
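    One maintainable pattern, sketched below: route misses to a single PHP script (for example via ErrorDocument 404 /redirect.php in .htaccess) and keep the old-to-new mapping in a MySQL table. Table and column names are illustrative:

        <?php
        // Look up the requested (dead) URL in a redirect map.
        $pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
        $stmt = $pdo->prepare('SELECT new_url FROM redirects WHERE old_url = ?');
        $stmt->execute(array($_SERVER['REQUEST_URI']));

        if ($row = $stmt->fetch()) {
            header('HTTP/1.1 301 Moved Permanently');   // permanent: engines update
            header('Location: ' . $row['new_url']);
        } else {
            header('HTTP/1.1 404 Not Found');
        }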

  • Paging a UIScrollView with a large PDF

    - by Fousa
    I'm trying to create a simple UIScrollView with paging, and I want to be able to scroll through a large PDF document, but this gives me some problems... I tried the following options: converting all the PDF pages to UIImages at startup, which works but is very slow to start; and manually drawing the PDF page in drawRect, but yet again this was slow... And I'd prefer not to load everything at startup, but to do it during usage. Has anyone done this recently? I can't seem to find a nice example project. Thanks! Jelle
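    A sketch of the lazy per-page approach with Core Graphics: render only the page being scrolled into view, so startup touches nothing (pageSize and pageNumber are illustrative, and a production version would cache a page or two on either side):

        CGPDFDocumentRef doc = CGPDFDocumentCreateWithURL((CFURLRef)pdfURL);
        CGPDFPageRef page = CGPDFDocumentGetPage(doc, pageNumber);  // pages are 1-based

        UIGraphicsBeginImageContext(pageSize);
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        // PDF space is flipped relative to UIKit; flip the context first.
        CGContextTranslateCTM(ctx, 0.0, pageSize.height);
        CGContextScaleCTM(ctx, 1.0, -1.0);
        CGContextConcatCTM(ctx, CGPDFPageGetDrawingTransform(page, kCGPDFMediaBox,
            CGRectMake(0, 0, pageSize.width, pageSize.height), 0, true));
        CGContextDrawPDFPage(ctx, page);
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        CGPDFDocumentRelease(doc);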

  • Unable to return large result set ORA-22814

    - by rvenugopal
    Hello all, I am encountering an issue when I try to load a large result set using a range query in Oracle 10g. When I try a smaller range (1 to 100) it works, but when I try a larger range (1 to 1000) I get the following error: "ORA-22814: attribute or element value is larger than specified in type". I have a basic UDT (PostComments_Type), and I have tried using both a VARRAY and a table type of PostComments_Type, but that hasn't made a difference. Your help is appreciated. Thanks, Venu

        PROCEDURE RangeLoad (
            floorId   IN NUMBER,
            ceilingId IN NUMBER,
            o_PostComments_LARGE_COLL_TYPE OUT PostComments_LARGE_COLL_TYPE
            -- Tried using as VARRAY and also Table type of PostComments_Type
        ) IS
        BEGIN
            SELECT PostComments_TYPE (
                PostComments_ID,
                ...
            )
            BULK COLLECT INTO o_PostComments_LARGE_COLL_TYPE  -- This is for VARRAY/Table type, so bulk operation
            FROM PostComments
            WHERE PostComments_ID BETWEEN floorId AND ceilingId;
        END RangeLoad;
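    If the collection type is a VARRAY, ORA-22814 is consistent with overflowing its declared maximum size; a nested TABLE type has no such cap. Independently of that, a sketch of chunked fetching so no single collection has to hold the whole range (chunk size illustrative, and the elided column list is carried over from the question):

        DECLARE
            CURSOR c IS
                SELECT PostComments_TYPE(PostComments_ID /*, ... */)
                FROM PostComments
                WHERE PostComments_ID BETWEEN 1 AND 1000;
            batch PostComments_LARGE_COLL_TYPE;  -- assumes a nested TABLE type
        BEGIN
            OPEN c;
            LOOP
                FETCH c BULK COLLECT INTO batch LIMIT 100;
                EXIT WHEN batch.COUNT = 0;
                -- process or hand back this chunk
            END LOOP;
            CLOSE c;
        END;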

  • Namespacing large JavaScript like jQuery

    - by frenchie
    I have a very large JavaScript file: it's over 9,000 lines. The code looks like this:

        var GlobalVar1 = "";
        var GlobalVar2 = null;
        function A() {...}
        function B(SomeParameter) {...}

    I'm using the Google compiler, and the global variables and functions get renamed a, b, c..., so there's a good chance of a collision later with some outside code. What I want to do is have my code organized like the jQuery library, where everything is accessible with $. Is there a way to namespace my code so that everything is behind a # character, for example? I'd like to call my code like this: #.GlobalVar, #.functionA(SomeParameter). How can I do this? Thanks.
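    A sketch of the usual module pattern ("#" itself isn't a legal JavaScript identifier, so a short name like MyLib stands in): everything lives inside one closure, only one global is exported, and the Closure Compiler can rename the internals freely without collisions.

        var MyLib = (function () {
            // Former globals, now private to the closure.
            var GlobalVar1 = "";
            var GlobalVar2 = null;

            function A() { /* ... */ }
            function B(SomeParameter) { /* ... */ }

            // Export only what callers need.
            return {
                functionA: A,
                functionB: B
            };
        })();

        // usage: MyLib.functionA(someValue);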

  • Transforming large XML files

    - by Chad
    I was using this extension method to transform very large XML files with an XSLT. Unfortunately, I get an OutOfMemoryException on the source.ToString() line. I realize there must be a better way; I'm just not sure what that would be?

        public static XElement Transform(this XElement source, string xslPath, XsltArgumentList arguments)
        {
            var doc = new XmlDocument();
            doc.LoadXml(source.ToString());

            var xsl = new XslCompiledTransform();
            xsl.Load(xslPath);

            using (var swDocument = new StringWriter(System.Globalization.CultureInfo.InvariantCulture))
            {
                using (var xtw = new XmlTextWriter(swDocument))
                {
                    xsl.Transform((doc.CreateNavigator()), arguments, xtw);
                    xtw.Flush();
                    return XElement.Parse(swDocument.ToString());
                }
            }
        }

    Thoughts? Solutions? Etc.
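    One sketchable improvement: XElement already exposes CreateReader(), so the transform can consume the tree directly, skipping both the giant ToString() and the XmlDocument copy; writing straight to a file keeps the output out of memory as well (the output-path parameter is an addition for illustration):

        public static void TransformToFile(this XElement source, string xslPath,
                                           XsltArgumentList arguments, string outputPath)
        {
            var xsl = new XslCompiledTransform();
            xsl.Load(xslPath);

            // Read the existing tree in place; no string round-trip.
            using (XmlReader reader = source.CreateReader())
            using (XmlWriter writer = XmlWriter.Create(outputPath))
            {
                xsl.Transform(reader, arguments, writer);
            }
        }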

  • MySQL master-slave replication on a large database table (how to sync initial data)

    - by Brian Lovett
    We have a production server and a dev server. We have found that backups are nearly impossible on the production server because of the query volume we experience. So we're looking at setting up replication with our dev server as the slave. This is ideal, because we can afford to lock the tables on that server, and additionally it will be nice to have up-to-date data for the developers. Now, the issues: the production server can't really be taken down or locked at this point, at least not easily. We have a high query volume and fairly large (30+ GB) InnoDB tables. Both servers are running all InnoDB and are both on MySQL 5.1. What can we do to sync the data initially to get replication started? I've tried a few options, but so far none have worked.
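    For all-InnoDB tables, the standard answer is a consistent-snapshot dump, sketched here (credentials and output path illustrative): --single-transaction takes the snapshot without locking the tables, and --master-data=2 records the binlog coordinates the slave should start from as a comment in the dump.

        # Run against the production master; no table locks needed for InnoDB.
        mysqldump --single-transaction --master-data=2 --all-databases \
            -u root -p > initial_sync.sql

        # Load the dump on the slave, then CHANGE MASTER TO the coordinates
        # recorded in the dump and START SLAVE.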
