Search Results

Search found 11025 results on 441 pages for 'f4r 20'.

Page 10 of 441

  • Android: OutOfMemoryError while uploading video - how best to chunk?

    - by AP257
    Hi all, I have the same problem as described here, but I will supply a few more details. While trying to upload a video in Android, I'm reading it into memory, and if the video is large I get an OutOfMemoryError. Here's my code:

        // get bytestream to upload
        videoByteArray = getBytesFromFile(cR, fileUriString);

        public static byte[] getBytesFromFile(ContentResolver cR, String fileUriString) throws IOException {
            Uri tempuri = Uri.parse(fileUriString);
            InputStream is = cR.openInputStream(tempuri);
            byte[] b3 = readBytes(is);
            is.close();
            return b3;
        }

        public static byte[] readBytes(InputStream inputStream) throws IOException {
            ByteArrayOutputStream byteBuffer = new ByteArrayOutputStream();
            // this is storage overwritten on each iteration with bytes
            int bufferSize = 1024;
            byte[] buffer = new byte[bufferSize];
            int len = 0;
            while ((len = inputStream.read(buffer)) != -1) {
                byteBuffer.write(buffer, 0, len);
            }
            return byteBuffer.toByteArray();
        }

    And here's the traceback (the error is thrown on the byteBuffer.write(buffer, 0, len) line):

        04-08 11:56:20.456: ERROR/dalvikvm-heap(6088): Out of memory on a 16775184-byte allocation.
        04-08 11:56:20.456: INFO/dalvikvm(6088): "IntentService[UploadService]" prio=5 tid=17 RUNNABLE
        04-08 11:56:20.456: INFO/dalvikvm(6088): | group="main" sCount=0 dsCount=0 s=N obj=0x449a3cf0 self=0x38d410
        04-08 11:56:20.456: INFO/dalvikvm(6088): | sysTid=6119 nice=0 sched=0/0 cgrp=default handle=4010416
        04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:~93)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at com.android.election2010.UploadService.onHandleIntent(UploadService.java:85)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Handler.dispatchMessage(Handler.java:99)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.Looper.loop(Looper.java:123)
        04-08 11:56:20.456: INFO/dalvikvm(6088): at android.os.HandlerThread.run(HandlerThread.java:60)
        04-08 11:56:20.467: WARN/dalvikvm(6088): threadid=17: thread exiting with uncaught exception (group=0x4001b180)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): Uncaught handler: thread IntentService[UploadService] exiting due to uncaught exception
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): java.lang.OutOfMemoryError
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.expand(ByteArrayOutputStream.java:93)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at java.io.ByteArrayOutputStream.write(ByteArrayOutputStream.java:218)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.readBytes(UploadService.java:199)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.getBytesFromFile(UploadService.java:182)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.doUploadinBackground(UploadService.java:118)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at com.android.election2010.UploadService.onHandleIntent(UploadService.java:85)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.app.IntentService$ServiceHandler.handleMessage(IntentService.java:30)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Handler.dispatchMessage(Handler.java:99)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.Looper.loop(Looper.java:123)
        04-08 11:56:20.467: ERROR/AndroidRuntime(6088): at android.os.HandlerThread.run(HandlerThread.java:60)
        04-08 11:56:20.496: INFO/Process(4657): Sending signal. PID: 6088 SIG: 3

    I guess that, as @DroidIn suggests, I need to upload it in chunks. But (newbie question alert) does that mean that I should make multiple PostMethod requests, and glue the file together at the server end? Or can I load the bytestream into memory in chunks, and glue it together in the Android code? If anyone could give me a clue as to the best approach, I would be very grateful.
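    One way out, sketched below strictly as my own illustration (the upload URL, class, and method names are placeholders, not from the post): instead of building videoByteArray at all, open the stream and copy it straight into the HTTP request body with HttpURLConnection.setChunkedStreamingMode, so only one small buffer is ever held in memory and nothing has to be glued back together.

        import java.io.InputStream;
        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import android.content.ContentResolver;
        import android.net.Uri;

        public class StreamingUpload {
            // hypothetical helper: streams the content at fileUriString to uploadUrl
            public static int streamUpload(ContentResolver cR, String fileUriString, String uploadUrl)
                    throws Exception {
                HttpURLConnection conn = (HttpURLConnection) new URL(uploadUrl).openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                conn.setChunkedStreamingMode(8 * 1024);  // send 8 KB at a time, never the whole file

                InputStream in = cR.openInputStream(Uri.parse(fileUriString));
                OutputStream out = conn.getOutputStream();
                byte[] buffer = new byte[8 * 1024];
                int len;
                while ((len = in.read(buffer)) != -1) {
                    out.write(buffer, 0, len);           // only this one buffer lives in memory
                }
                out.close();
                in.close();
                int status = conn.getResponseCode();     // forces the request to complete
                conn.disconnect();
                return status;
            }
        }

    If the server really does require several smaller POSTs reassembled on its side, the same read loop applies, one buffer-sized piece per request; but plain streaming is usually enough to avoid the OutOfMemoryError.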


  • "Hello World" in less than 20 bytes

    - by xelurg
    We had an interesting competition once, where everyone would write their own implementation of a hello world program. One requirement was that it should be less than 20 bytes in compiled form; the winner would be the one whose version was the smallest... What would be your solution? :) Platform: 32-bit, x86. OS: DOS, Win, GNU/Linux, *BSD. Language: Asm, C, or anything else that compiles into a binary executable (i.e. no bash scripts and stuff ;)


  • Flash game lags every 20 seconds

    - by ZoserLock
    My game uses delta time for frame-independent movement. At 250 fps it runs perfectly smoothly, but if I limit the fps to 60, the game slows down for 2-4 seconds every 20 seconds or so. Even in small programs I have this same problem. No memory is created or released; I comment out everything I can and the problem persists. Thanks, and sorry for my English.


  • Why limit WCF ServiceContracts to 10-20 OperationContracts?

    - by Gary B
    I've seen recommendations (Juval Lowy et al.) that a service contract should have "no more than 20 members... twelve is probably the practical limit". Why? It seems that if you wish to provide a service as the interface to a relatively large database (50-100 tables), you're going to go way past that in CRUD operations alone. I've worked with plenty of other services that provided hundreds of 'OperationContracts'... is there something peculiar about WCF? Is there something I'm missing here?


  • MySQL Optimization 20 gig table

    - by user169743
    I have a 20 GB table with a large number of inserts and updates daily. This table is also frequently searched. I'd like to know if the MySQL indices can become fragmented and perhaps need to be rebuilt or something similar, and I'm finding it difficult to figure out which to use: CHECK TABLE, REPAIR TABLE, or something else. Any guidance appreciated, I'm a db newb.


  • C source code to remove subset transactions from a text file

    - by user324887
    I have a file containing data as follows:

        10 20 30 40 70
        20 30 70
        30 40
        10 20
        29 70 80 90
        20 30 40
        40 45 65
        10 20 80
        45 65 20

    I want to remove all subset transactions from this file. The output file should be as follows:

        10 20 30 40 70
        29 70 80 90
        20 30 40
        40 45 65
        10 20 80

    Records like "20 30 70", "30 40", "10 20" and "45 65 20" are removed because they are subsets of other records.
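    The post asks for C; purely as a language-neutral sketch of the algorithm (all names below are mine, not from the post), the usual approach is to read each line into a set of items and drop every record that is a proper subset of some other record:

        import java.nio.file.Files;
        import java.nio.file.Paths;
        import java.util.ArrayList;
        import java.util.Arrays;
        import java.util.HashSet;
        import java.util.List;
        import java.util.Set;

        public class SubsetFilter {
            public static void main(String[] args) throws Exception {
                // each non-empty line of the input file is one transaction, e.g. "10 20 30 40 70"
                List<String> records = new ArrayList<>();
                List<Set<String>> items = new ArrayList<>();
                for (String line : Files.readAllLines(Paths.get("transactions.txt"))) {
                    if (!line.trim().isEmpty()) {
                        records.add(line.trim());
                        items.add(new HashSet<>(Arrays.asList(line.trim().split("\\s+"))));
                    }
                }
                // keep record i only if no other, larger record contains every item of i
                for (int i = 0; i < items.size(); i++) {
                    boolean isSubset = false;
                    for (int j = 0; j < items.size(); j++) {
                        if (i != j
                                && items.get(j).size() > items.get(i).size()
                                && items.get(j).containsAll(items.get(i))) {
                            isSubset = true;
                            break;
                        }
                    }
                    if (!isSubset) {
                        System.out.println(records.get(i));
                    }
                }
            }
        }

    This applies the stated rule literally and is quadratic in the number of records; a C version over large data would want the same idea plus sorting by record length, so each record is only checked against longer ones.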


  • What is the network address (x.x.x.0) used for?

    - by Shtééf
    It appears to be common practice not to use the first address in a subnet, that is the IP 192.168.0.0/24, or, for a more exotic example, 172.20.20.64/29. The ipcalc tool I frequently use follows the same practice:

        $ ipcalc -n -b 172.20.20.64/29
        Address:   172.20.20.64
        Netmask:   255.255.255.248 = 29
        Wildcard:  0.0.0.7
        =>
        Network:   172.20.20.64/29
        HostMin:   172.20.20.65
        HostMax:   172.20.20.70
        Broadcast: 172.20.20.71
        Hosts/Net: 6                     Class B, Private Internet

    But why is it that HostMin is not simply 64 in this case? The 64 address is a valid address, right? And whatever the answer, does the same apply to IPv6? Perhaps slightly related: it also appears possible to use a TCP port 0 and a UDP port 0. Are these valid or used anywhere?
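    As an aside, and purely my own illustration rather than part of the question: the ipcalc numbers fall out of plain bit arithmetic. The address whose host bits are all zero names the block itself, and the one whose host bits are all ones is the broadcast address, which is why neither is handed out to a host.

        public class SubnetMath {
            static String dotted(int ip) {
                return ((ip >>> 24) & 255) + "." + ((ip >>> 16) & 255) + "."
                     + ((ip >>> 8) & 255) + "." + (ip & 255);
            }

            public static void main(String[] args) {
                int addr = (172 << 24) | (20 << 16) | (20 << 8) | 64;   // 172.20.20.64
                int prefix = 29;
                int mask = 0xFFFFFFFF << (32 - prefix);                 // 255.255.255.248
                int network = addr & mask;                              // host bits all 0
                int broadcast = network | ~mask;                        // host bits all 1
                System.out.println("Network:   " + dotted(network));        // 172.20.20.64
                System.out.println("HostMin:   " + dotted(network + 1));    // 172.20.20.65
                System.out.println("HostMax:   " + dotted(broadcast - 1));  // 172.20.20.70
                System.out.println("Broadcast: " + dotted(broadcast));      // 172.20.20.71
            }
        }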


  • toURI method of File transforms space character into %20

    - by piero
    The toURI method of File transforms the space character into %20. On Windows XP with Java 6:

        public static void main(String[] args) {
            File f = new File("C:\\My dir\\test.txt");
            URI uri = f.toURI();
            System.out.println(f.getAbsolutePath());
            System.out.println(uri);
        }

    Output:

        C:\My dir\test.txt
        file:/C:/My%20dir/test.txt
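    That %20 is standard percent-encoding: a java.net.URI must not contain raw spaces, so toURI() escapes them. A small sketch (my own illustration, not from the post) showing that the round trip is lossless:

        import java.io.File;
        import java.net.URI;

        public class UriRoundTrip {
            public static void main(String[] args) {
                File f = new File("C:\\My dir\\test.txt");
                URI uri = f.toURI();                   // file:/C:/My%20dir/test.txt
                System.out.println(uri.getPath());     // /C:/My dir/test.txt  (decoded form)
                File back = new File(uri);             // reconstructs the original path
                System.out.println(back.equals(f));    // true
            }
        }

    So the encoding only matters when the URI is treated as a display string; converting back to a File or calling getPath() restores the space.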


  • Msg 64, Level 20, State 0, Line 0 SQL Server Error

    - by Brettski
    I am running a sproc on a SQL Server 2005 server which is resulting in the following error:

        Msg 64, Level 20, State 0, Line 0
        A transport-level error has occurred when receiving results from the server.
        (provider: TCP Provider, error: 0 - The specified network name is no longer available.)

    Once the error occurs I lose my connection to the server, but I am able to reconnect. There is nothing in the Event logs. The database is still functional and running its website fine. EDIT: This occurs every time I run this sproc, or when it is called by an application. Any suggestions on what may be causing this error?


  • SharePoint Lookup Field 20 Item JS error

    - by Chops
    Hi Everyone, I've got an issue with SharePoint Server 2007 SP1 which seems to be documented in various forms, but I haven't been able to find an answer to my seemingly simpler question. http://social.msdn.microsoft.com/Forums/en-US/sharepointcustomization/thread/040533d2-c738-4ac2-b2d6-65a1602fa2d1 Essentially, we have a form with a lookup field to another SharePoint list. The lookup field has more than 20 items, and so SharePoint changes the type/behaviour of the field as per the standard behaviour. However, with my custom skin applied, clicking on this drop down causes a JS error from within the Core.js file, as described here: http://splitnut.blogspot.com/2009/06/lookup-fields-in-moss-javascript-error.html What I haven't been able to figure out is why this is only an issue when my custom master page is applied, and not the OOTB master pages. I've been through and tried to see what might cause this, but haven't been able to track down the cause of the behaviour. Any help would be much appreciated. Many thanks.


  • Why is there a limit of 20 parameters to a Clojure function?

    - by GuyC
    Hi, there seems to be a limit to the number of parameters a Clojure function can take: when defining a function with more than 20 parameters I get an error. Obviously this can be avoided, but I was hitting this limit while porting the execution model of an existing DSL to Clojure, and I have constructs in my DSL like the following, which by macro expansion can be mapped to functions quite easily except for this limit:

        (defAlias nn1 ((element ?e1) (element ?e2)) number
          "@doc features of the elements are calculated for entry into the first neural network, the result is the score computed by the latter"
          (nn1-recall (nn1-feature00 ?e1 ?e2)
                      (nn1-feature01 ?e1 ?e2)
                      ...
                      (nn1-feature89 ?e1 ?e2)))

    This is a DSL statement to call a neural network with 90 input nodes. I can work around it, of course, but I was wondering where the limit comes from. Thanks.


  • Handling Datetime with decimal '2010-02-14 20:18:58.313000000'

    - by AaronLS
    In SQL Server I have some textual data in varchar fields that I am trying to convert to datetimes. The funny thing is that this data was at some point in a datetime field, exported to a flat file, and now I am reimporting it. The problem is that it is in the format 2010-02-14 20:18:58.313000000 and the conversion to datetime fails. I have no idea how it ended up like this when it was originally extracted from a datetime column. Basically, a table was exported to a flat file by someone else. The original table was lost. I am reimporting back from the flat file. I could just drop the decimal part, but this would be like throwing out some of the data. I'd like to maintain as much precision as possible. How can I import this data from the varchar column back into a datetime column and preserve as much accuracy as possible?
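    If the flat file is being cleaned up outside the database before re-import, here is a sketch of one possible normalization step (purely illustrative, with names of my own): trim the over-long fraction down to the three digits that a SQL Server datetime column can actually hold, so nothing real is lost.

        import java.time.LocalDateTime;
        import java.time.format.DateTimeFormatter;
        import java.time.temporal.ChronoUnit;

        public class DatetimeFix {
            public static void main(String[] args) {
                String raw = "2010-02-14 20:18:58.313000000";   // value as it appears in the flat file
                DateTimeFormatter in  = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSSSSSSSS");
                DateTimeFormatter out = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss.SSS");
                LocalDateTime t = LocalDateTime.parse(raw, in);
                // SQL Server's datetime keeps roughly 3 ms precision, so truncating to millis loses nothing real
                System.out.println(out.format(t.truncatedTo(ChronoUnit.MILLIS)));  // 2010-02-14 20:18:58.313
            }
        }

    If the target column were datetime2 instead of datetime, up to seven fractional digits could be kept and even less truncation would be needed.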


  • mysql - filtering a list against keywords, both list and keywords > 20 million records

    - by threecheeseopera
    I have two tables, both having more than 20 million records; table1 is a list of terms, and table2 is a list of keywords that may or may not appear in those terms. I need to identify the terms that contain a keyword. My current strategy is:

        SELECT table1.term, table2.keyword
        FROM table1
        INNER JOIN table2
          ON table1.term LIKE CONCAT('%', table2.keyword, '%');

    This is not working, it takes f o r e v e r. It's not the server (see notes). How might I rewrite this so that it runs in under a day? Notes: As for server optimization: both tables are myisam and have unique indexes on the matching fields; the myisam key buffer is greater than the sum of both index file sizes, and it is not even being fully taxed (key_blocks_unused is ... large); the server is a dual-xeon 2U beast with fast sas drives and 8G of ram, fine-tuned for the mysql workload.


  • Higher speed options for executing very large (20 GB) .sql file in MySQL

    - by Jonogan
    My firm was delivered a 20+ GB .sql file in response to a request for data from the gov't. I don't have many options for getting the data in a different format, so I need options for how to import it in a reasonable amount of time. I'm running it on a high-end server (Win 2008 64-bit, MySQL 5.1) using Navicat's batch execution tool. It's been running for 14 hours and shows no signs of being near completion. Does anyone know of any higher-speed options for such a transaction? Or is this what I should expect given the large file size? Thanks


  • Enumerating large (20-digit) [probable] prime numbers

    - by Paul Baker
    Given A, on the order of 10^20, I'd like to quickly obtain a list of the first few prime numbers greater than A. OK, my needs aren't quite that exact - it's alright if occasionally a composite number ends up on the list. What's the fastest way to enumerate the (probable) primes greater than A? Is there a quicker way than stepping through all of the integers greater than A (other than obvious multiples of say, 2 and 3) and performing a primality test for each of them? If not, and the only method is to test each integer, what primality test should I be using?
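    For numbers of this size, a sketch of one standard approach (my own example, not the poster's code): the JDK's BigInteger already does this kind of enumeration with a Miller-Rabin-based probable-prime search.

        import java.math.BigInteger;

        public class NextPrimes {
            public static void main(String[] args) {
                BigInteger a = new BigInteger("100000000000000000000");   // 10^20
                BigInteger p = a;
                for (int i = 0; i < 5; i++) {
                    p = p.nextProbablePrime();   // first integer > p that passes the probabilistic tests
                    System.out.println(p);
                }
            }
        }

    nextProbablePrime skips obvious composites internally and runs probabilistic primality tests, so the small chance of a composite slipping through matches the poster's stated tolerance; for a stricter check, isProbablePrime(certainty) with a higher certainty can be applied to each candidate.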


  • Rotating images with PHP reduces quality especially over about 10-20 actions

    - by Dylan Cross
    I am using the code below to rotate my JPEG images. The problem is that after around 10-20 rotations the image quality is dramatically lower, especially in areas like blue skies. My question is: how can I keep these images at the same high quality? There must be a way. I keep the original image on the server for each image uploaded, and I don't do anything to that, so if need be I guess I could always come up with some way of using that whenever possible, but I would rather not have to.

        $source = imagecreatefromjpeg($filename);
        $rotate = imagerotate($source, 90, 0);
        imagejpeg($rotate, $filename, 100);

