Search Results

Search found 602 results on 25 pages for 'chunks'.

Page 13/25

  • How to transform phrases and words into MD5 hash?

    - by brilliant
    Can anyone please explain to me how to transform a phrase like "I want to buy some milk" into MD5? I read the Wikipedia article on MD5, but the explanation given there is beyond my comprehension: "MD5 processes a variable-length message into a fixed-length output of 128 bits. The input message is broken up into chunks of 512-bit blocks (sixteen 32-bit little endian integers)". "Sixteen 32-bit little endian integers" is already hard for me; I checked the article on little endians and didn't understand a bit. However, the examples of some phrases and their MD5 hashes are very nice: MD5("The quick brown fox jumps over the lazy dog") = 9e107d9d372bb6826bd81d3542a419d6 and MD5("The quick brown fox jumps over the lazy dog.") = e4d909c290d0fb1ca068ffaddf22cbd0. Can anyone please explain to me how this MD5 algorithm works on some very simple example? And also, perhaps you know some software or code that would transform phrases into their MD5 hashes. If yes, please let me know.
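
    For reference, a minimal sketch of hashing a phrase with MD5 in Python, using the standard hashlib module (one of many tools that can do this), might look like:

        import hashlib

        def md5_of_phrase(phrase):
            # MD5 works on bytes, so the text must be encoded first.
            return hashlib.md5(phrase.encode('utf-8')).hexdigest()

        print(md5_of_phrase("The quick brown fox jumps over the lazy dog"))
        # -> 9e107d9d372bb6826bd81d3542a419d6

    The 512-bit blocks and little-endian integers from the Wikipedia description are internal to the algorithm; a caller only supplies the bytes and reads back the 128-bit digest, usually shown as 32 hex characters.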

    Read the article

  • ctypes and pointer manipulation

    - by Chris
    I am dealing with image buffers, and I want to be able to access data a few lines into my image for analysis with a C library. I have created my 8-bit pixel buffer in Python using create_string_buffer. Is there a way to get a pointer to a location within that buffer without re-creating a new buffer? My goal is to analyze and change data within that buffer in chunks, without having to do a lot of buffer creation and data copying. In this case the C library is ultimately doing all the manipulation of the buffer, so I don't actually have to change values within the buffer using Python. I just need to give my C function access to data within the buffer.
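
    As an illustration, ctypes can hand a C function a pointer into the middle of an existing buffer without copying, either as a sized array view or as a raw pointer; a small sketch (the image dimensions and the C call are made up for the example) might be:

        import ctypes

        width, height = 640, 480
        buf = ctypes.create_string_buffer(width * height)  # 8-bit pixel buffer

        # View of line 10 of the image, sharing memory with buf (no copy):
        line = 10
        offset = line * width
        line_view = (ctypes.c_ubyte * width).from_buffer(buf, offset)

        # Equivalent raw pointer, e.g. for a C function taking unsigned char *:
        line_ptr = ctypes.cast(ctypes.addressof(buf) + offset,
                               ctypes.POINTER(ctypes.c_ubyte))

        # Hypothetical C call: mylib.analyze_line(line_ptr, width)

    Both forms point at the original storage, so whatever the C library writes through them is visible in the Python-side buffer.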

    Read the article

  • Is it possible to update a SharePoint list without "ID"? : C#

    - by Pari
    Hi, I want to upload a file to SharePoint and, while uploading, set all the properties of the uploaded document at the same time. We get the ID field only after the document has been uploaded to SharePoint. Is there any other way to update the list without passing the ID field? Example:

        <Batch OnError="Continue" ListVersion="1" ViewName="270C0508-A54F-4387-8AD0-49686D685EB2">
          <Method ID="1" Cmd="Update">
            <Field Name="ID">4</Field>
            <Field Name="Field_Name">Value</Field>
          </Method>
          <Method ID="2" Cmd="Update">
            <Field Name="ID">6</Field>
            <Field Name="Field_Name">Value</Field>
          </Method>
        </Batch>

    (Referring link.) I am using SharePoint Web Services and uploading the document in chunks.

    Read the article

  • mysql - select pagination chunk that includes a given id

    - by sean smith
    ...before the pagination chunks have been determined. I know you can do this in multiple statements, but there must be a better way. My results are returned ordered by date, and I want to return the pagination chunk that contains a given id. So I could, for example, select the date of the given id and then select a chunk of results where the date is less than or greater than that date. That would work. But is there some native MySQL way of doing this sort of thing in one statement? It just seems reasonable to expect that we could ask for the X results in which a given id exists when results are ordered by date.

    Read the article

  • Android OutOfMemoryError - Loading JSON File

    - by jeremynealbrown
    The app I am working on needs to read a JSON file that may be anywhere from 1.5 to 3 MB in size. It seems to have no problem opening the file and converting the data to a string, but when it attempts to convert the string to a JSONArray, OutOfMemoryErrors are thrown. The exceptions look something like this: E/dalvikvm-heap( 5307): Out of memory on a 280-byte allocation. W/dalvikvm( 5307): Exception thrown (Ljava/lang/OutOfMemoryError;) while throwing internal exception (Ljava/lang/OutOfMemoryError;). One strange thing about this is that the crash only occurs every 2nd or 3rd time the app is run, leading me to believe that the memory consumed by the app is not being garbage collected each time the app closes. Any insight into how I might get around this issue would be greatly appreciated. I am open to the idea of loading the file in chunks, but I'm not quite sure what the best approach is for such a task. Thank you.

    Read the article

  • On Windows Mobile, how can I tell what other processes are reserving shared memory space?

    - by glutz78
    On Windows Mobile 6.1, I am using VirtualAlloc to reserve 2MB chunks, which returns an address from the large shared memory area so allocations do not count against my per-process virtual address space (doc here: http://msdn.microsoft.com/en-us/library/aa908768.aspx). However, on some devices I notice that I am not able to reserve memory after a certain point: VirtualAlloc returns NULL (GetLastError() says out of memory). The only explanation for this that I see is that another process has already reserved a bunch of memory and my process is therefore unable to. Any idea where I can find a tool to show me the shared memory region of a WM device? Thanks.

    Read the article

  • Large amounts of data in many text files - how to process?

    - by Stephen
    Hi, I have large amounts of data (a few terabytes) and accumulating... They are contained in many tab-delimited flat text files (each about 30MB). Most of the task involves reading the data and aggregating (summing/averaging + additional transformations) over observations/rows based on a series of predicate statements, and then saving the output as text, HDF5, or SQLite files, etc. I normally use R for such tasks, but I fear this may be a bit large. Some candidate solutions are to 1) write the whole thing in C (or Fortran), 2) import the files (tables) into a relational database directly and then pull off chunks in R or Python (some of the transformations are not amenable to pure SQL solutions), or 3) write the whole thing in Python. Would (3) be a bad idea? I know you can wrap C routines in Python, but in this case, since there isn't anything computationally prohibitive (e.g., optimization routines that require many iterative calculations), I think I/O may be as much of a bottleneck as the computation itself. Do you have any recommendations on further considerations or suggestions? Thanks.
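
    As an illustration of option 3, a sketch of chunk-wise aggregation over the tab-delimited files using only the Python standard library (the column names and the predicate are invented for the example) might look like:

        import csv
        import glob
        from collections import defaultdict

        sums = defaultdict(float)
        counts = defaultdict(int)

        for path in glob.glob('data/*.txt'):
            with open(path, newline='') as f:
                for row in csv.DictReader(f, delimiter='\t'):
                    # Hypothetical predicate and columns, for illustration only.
                    if float(row['value']) > 0:
                        key = row['group_id']
                        sums[key] += float(row['value'])
                        counts[key] += 1

        with open('averages.tsv', 'w', newline='') as out:
            writer = csv.writer(out, delimiter='\t')
            for key in sorted(sums):
                writer.writerow([key, sums[key] / counts[key]])

    Because each 30MB file is streamed row by row, memory use stays flat and the run is dominated by I/O, which is exactly the bottleneck anticipated in the question.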

    Read the article

  • Django file uploads - Just can't work it out

    - by phoebebright
    OK, I give up - after 5 solid hours trying to get a Django form to upload a file, I've checked out all the links on Stack Overflow and googled and googled. Why is it so hard? I just want it to work like the admin file upload. So I get that I need code like:

        if submitForm.is_valid():
            handle_uploaded_file(request.FILES['attachment'])
            obj = submitForm.save()

    and I can see my file in request.FILES['attachment'] (yes, I have enctype set), but what am I supposed to do in handle_uploaded_file? The examples all have a fixed file name, but obviously I want to upload the file to the directory I defined in the model, and I can't see where I can find that.

        def handle_uploaded_file(f):
            destination = open('fyi.xml', 'wb+')
            for chunk in f.chunks():
                destination.write(chunk)
            destination.close()

    Bet I'm going to feel really stupid when someone points out the obvious!
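
    One likely answer: if the ModelForm includes the model's FileField, binding request.FILES into the form (SubmitForm(request.POST, request.FILES)) and calling save() stores the file under the field's upload_to directory automatically, so no manual handler is needed. If a manual handler is still wanted, a sketch that streams the upload under MEDIA_ROOT (the 'attachments' subdirectory is a made-up example) could be:

        import os
        from django.conf import settings

        def handle_uploaded_file(f, subdir='attachments'):
            # Stream the uploaded file to disk in chunks under MEDIA_ROOT.
            dest_dir = os.path.join(settings.MEDIA_ROOT, subdir)
            if not os.path.exists(dest_dir):
                os.makedirs(dest_dir)
            dest_path = os.path.join(dest_dir, f.name)
            with open(dest_path, 'wb+') as destination:
                for chunk in f.chunks():
                    destination.write(chunk)
            return dest_path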

    Read the article

  • What language is to binary, as Perl is to text?

    - by ehdr
    I am looking for a scripting (or higher-level programming) language (or, e.g., modules for Python or similar languages) for effortlessly analyzing and manipulating binary data in files (e.g. core dumps), much like Perl allows manipulating text files very smoothly. Things I want to do include presenting arbitrary chunks of the data in various forms (binary, decimal, hex), converting data from one endianness to another, etc. That is, things you would normally use C or assembly for, but I'm looking for a language which allows for writing tiny pieces of code for highly specific, one-time purposes very quickly. Any suggestions?
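
    By way of illustration, Python's standard struct and binascii modules cover a lot of this ground; a sketch of the kind of one-off poking described (the file name and offsets are invented) might be:

        import struct
        import binascii

        with open('core.dump', 'rb') as f:   # hypothetical input file
            data = f.read()

        chunk = data[0x40:0x50]              # an arbitrary 16-byte slice
        print(binascii.hexlify(chunk))       # view the bytes as hex

        # Interpret the same bytes as four 32-bit integers, little- and big-endian:
        little = struct.unpack('<4I', chunk)
        big = struct.unpack('>4I', chunk)
        print(little, big)

        # Swap the endianness of the slice and splice it back into the data:
        swapped = struct.pack('>4I', *little)
        patched = data[:0x40] + swapped + data[0x50:]

    The format strings ('<' for little-endian, '>' for big-endian, 'I' for a 32-bit unsigned integer) make the endianness conversions one-liners.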

    Read the article

  • LINQ to SQL SubmitChanges() progress

    - by Emir
    I'm using LINQ to SQL to import old DBF files into MSSQL. I'm reading all rows and adding them to the database using ctx.MyTable.InsertOnSubmit(row). After the reading phase is completed I have around 100 000 pending inserts. ctx.SubmitChanges() naturally is taking a long time. Is there any way to track the progress of ctx.SubmitChanges()? Can ctx.Log somehow be used for this purpose? Update: Is it possible to use ctx.GetChangeSet().Inserts.Count and track insert statements using the Log? Dividing ctx.SubmitChanges() into smaller chunks is not working for me, because I need a transaction: all or nothing. Update 2: I've found a nice class, ActionTextWriter, which I will try to use to count inserts. http://damieng.com/blog/2008/07/30/linq-to-sql-log-to-debug-window-file-memory-or-multiple-writers

    Read the article

  • Solving iPhone/iPad out of memory issues

    - by Joonas Trussmann
    I have a strange issue: I'm scrolling through a paged UIScrollView which displays the pages of a PDF document (using Quartz 2D and CATiledLayer). When I page through, memory allocation looks fine: it goes up over the first few pages and then holds steady as the memory kept for earlier pages is obviously released. Upon hitting page x (not a particular PDF page or a certain number per se), memory usage goes from a couple of megs to 308 megs and the app crashes. So my question is: how do I best try to find what's causing this? The object alloc tool in Instruments shows the memory as simply going to malloc (in huge chunks).

    Read the article

  • SDKs for writing DVDs

    - by Matt Warren
    I need to add DVD-writing functionality to an application I'm working on. However, it needs to be able to write out files that are being grabbed "live" from a camera over a long period of time. I can't wait until all the files are captured before I start writing them to the DVD; I need to write them out in chunks as I go along. I've looked at IMAPI v2, but the main problem seems to be that you need to point it to all the files you plan to write out to disc before you start the burning process. I know it has the concept of "sessions", which means you can write to the DVD in several parts before you finally "close" it. But I was wondering if there are any other DVD-writing SDKs that allow you to keep writing files to a DVD continuously, in particular files that are only in memory. It would be more efficient if I didn't have to write the captured images out to the hard disk before they are burned to DVD. The solution needs to work under .NET on Windows XP and Vista.

    Read the article

  • Simple non-network concurrency with Twisted

    - by Rince
    Dear Pythoners, I have a problem with using Twisted for simple concurrency in Python. The problem is that I don't know how to do it, and all the online resources are about Twisted's networking abilities. So I am turning to SO gurus for some guidance. Python 2.5 is used. A simplified version of my problem runs as follows: 1) a bunch of scientific data; 2) a function that munches on the data and creates output; 3) ??? <- here enters concurrency: it takes chunks of data from 1 and feeds them to 2; 4) output from 3 is joined and stored. My guess is that the Twisted reactor can do the number three job. But how? Thanks a lot for any help and suggestions.
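
    A minimal sketch of that fan-out/join step using Twisted's thread helpers (deferToThread and gatherResults; the munch function and the chunk list are placeholders) might look something like:

        from twisted.internet import reactor
        from twisted.internet.threads import deferToThread
        from twisted.internet.defer import gatherResults

        def munch(chunk):
            # Placeholder for the real number-crunching function (step 2).
            return sum(chunk)

        chunks = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]   # step 1, already split up

        def go():
            # Step 3: run munch on each chunk in the reactor's thread pool.
            deferreds = [deferToThread(munch, c) for c in chunks]

            def store(results):
                # Step 4: join and store the output.
                print(results)
                reactor.stop()

            gatherResults(deferreds).addCallback(store)

        reactor.callWhenRunning(go)
        reactor.run()

    Note that because of the GIL this only buys real parallelism when munch spends its time in I/O or in C extensions that release the lock; for pure-Python CPU work, spreading the chunks across processes may be a better fit than Twisted.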

    Read the article

  • PHP: Coding long-running scripts when servers impose an execution time limit

    - by thomasrutter
    FastCGI servers, for example, impose an execution time limit on PHP scripts which cannot be altered using set_time_limit() in PHP. IIS does this too I believe. I wrote an import script for a PHP application that works well under mod_php but fails under FastCGI (mod_fcgid) because the script is killed after a certain number of seconds. I don't yet know of a way of detecting what your time limit is in this case, and haven't decided how I'm going to get around it. Doing it in small chunks with redirects seems like one kludge, but how? What techniques would you use when coding a long-running task such as an import or export task, where an individual PHP script may be terminated by the server after a certain number of seconds? Please assume you're creating a portable script, so you don't necessarily know whether PHP will eventually be run under mod_php, FastCGI or IIS or whether a maximum execution time is enforced at the server level.

    Read the article

  • Sorting By Multiple Conditions in Ruby

    - by viatropos
    I have a collection of Post objects and I want to be able to sort them based on these conditions: first by category (news, events, labs, portfolio, etc.), then by date, if there is a date, or by position, if a specific index was set for it. Some posts will have dates (news and events), others will have explicit positions (labs and portfolio). I want to be able to call posts.sort!, so I've overridden <=>, but am looking for the most effective way of sorting by these conditions. Below is a pseudo method:

        def <=>(other)
          # first, everything is sorted into
          # smaller chunks by category
          self.category <=> other.category
          # then, per category, by date or position
          if self.date and other.date
            self.date <=> other.date
          else
            self.position <=> other.position
          end
        end

    It seems like I'd have to actually sort two separate times, rather than cramming everything into that one method. Something like sort_by_category, then sort!. What is the most Ruby way to do this?

    Read the article

  • How can I capture multiple matches from the same Perl regex?

    - by Sho Minamimoto
    I'm trying to parse a single string and get multiple chunks of data out of the same string with the same regex conditions. I'm parsing a single HTML doc that is static (for an undisclosed reason, I can't use an HTML parser to do the job). I have an expression that looks like: $string =~ /\<img\ssrc\="(.*)"/; and I want to get the value of $1. However, in the one string there are many img tags like this, so I need something like an array returned (@1?). Is this possible?

    Read the article

  • What's wrong with this task queue setup?

    - by Peter Farmer
    I've set up this task queue implementation on a site I host for a customer. It has a cron job which runs each morning at 2am ("/admin/tasks/queue"); this queues up emails to be sent out via "/admin/tasks/email", and uses cursors so as to do the queuing in small chunks. For some reason last night /admin/tasks/queue kept getting run by this code and so sent out my whole quota of emails :/. Have I done something wrong with this code?

        class QueueUpEmail(webapp.RequestHandler):
            def post(self):
                subscribers = Subscriber.all()
                subscribers.filter("verified =", True)
                last_cursor = memcache.get('daily_email_cursor')
                if last_cursor:
                    subscribers.with_cursor(last_cursor)
                subs = subscribers.fetch(10)
                logging.debug("POST - subs count = %i" % len(subs))
                if len(subs) < 10:
                    logging.debug("POST - Less than 10 subscribers in subs")
                    # Subscribers left is less than 10, don't reschedule the task
                    for sub in subs:
                        task = taskqueue.Task(url='/admin/tasks/email', params={'email': sub.emailaddress, 'day': sub.day_no})
                        task.add("email")
                    memcache.delete('daily_email_cursor')
                else:
                    logging.debug("POST - Greater than 10 subscibers left in subs - reschedule")
                    # Subscribers is 10 or greater, reschedule
                    for sub in subs:
                        task = taskqueue.Task(url='/admin/tasks/email', params={'email': sub.emailaddress, 'day': sub.day_no})
                        task.add("email")
                    cursor = subscribers.cursor()
                    memcache.set('daily_email_cursor', cursor)
                    task = taskqueue.Task(url="/admin/tasks/queue", params={})
                    task.add("queueup")

    Read the article

  • Ruby memory leak: Gdk::PixbufLoader

    - by Reed Debaets
    So I'm beginning to wonder how leaky the gnome2 libraries for Ruby 1.8.6 are.

        #!/usr/bin/env ruby
        require 'gtk2'
        while true
          sleep 0.1
          pixbuf = Gdk::PixbufLoader.new
          pixbuf = nil
        end

    This leaks about 16kb/sec according to watch -n 1 ps -o rss -p <process id>. This is compounded if you start writing large chunks of image data to it using pixbuf.last_write img_data. Any ideas how to get around this (and the second issue)? I need to update image data within my code, but it seems like anything that ends up using a pixbuf leaks like a sieve.

    Read the article

  • Web performance testing using VS2010: testing a file download

    - by cheedep
    Hi all, I am trying out the VS 2010 testing tools for the first time. I recorded a web performance test whose actions include a file download, implemented as in the KB article at http://support.microsoft.com/kb/812406 by streaming chunks of 10000 bytes. However, my test fails at the download with "The response stream has been closed". Please help me understand why this is happening, and also suggest how you would test such a file download. My main aim was to see how the download performs in a load test over an intercontinental 350kbps connection with files of about 30-50 MB. Thanks.

    Read the article

  • Django admin interface upload failing on request data read error

    - by Jake
    Hi all, This is an updated version of an old question I asked. I've now done a lot more testing, plus the old question got hijacked. I'm getting a "request data read error" when trying to upload files through the Django admin interface. Files under about 150k work, but bigger files always fail, almost always at around 192k (that's 3 chunks) completed, sometimes at around 160k. The exception I get is below:

        File "/usr/lib/python2.4/site-packages/django/http/multipartparser.py", line 405, in read
            return self._file.read(num_bytes)
        IOError: request data read error

    I've tried Chrome and Firefox on Windows and Firefox on Mac - same results. I can upload to other sites, so I don't think it's my connection. I'm running Python 2.4, Django 1.1 and mod_wsgi on CentOS (a Media Temple DV server). Locally (Django development server) it's fine. Everything I've found on this issue says it's a mod_python issue and that changing to mod_wsgi will fix it, but I am running mod_wsgi. Can anyone help?

    Read the article

  • MySqlDataAdapter or MySqlDataReader for bulk transfer?

    - by Jeff Meatball Yang
    I'm using the MySql connector for .NET to copy data from MySql servers to SQL Server 2008. Has anyone experienced better performance using one of the following, versus the other? (1) DataAdapter, calling Fill to a DataTable in chunks of 500, or (2) DataReader.Read to a DataTable in a loop of 500. I am then using SqlBulkCopy to load the 500 DataTable rows, then continue looping until the MySql record set is completely transferred. I am primarily concerned with using a reasonable amount of memory and completing in a short amount of time. Any help would be appreciated!

    Read the article

  • Random access gzip stream

    - by jkff
    I'd like to be able to do random access into a gzipped file. I can afford to do some preprocessing on it (say, build some kind of index), provided that the result of the preprocessing is much smaller than the file itself. Any advice? My thoughts were: 1) Hack on an existing gzip implementation and serialize its decompressor state every, say, 1 megabyte of compressed data; then, to do random access, deserialize the decompressor state and read from the megabyte boundary. This seems hard, especially since I'm working with Java and I couldn't find a pure-Java gzip implementation :( 2) Re-compress the file in chunks of 1MB and do the same as above. This has the disadvantage of doubling the required disk space. 3) Write a simple parser of the gzip format that doesn't do any decompressing and only detects and indexes block boundaries (if there even are any blocks: I haven't yet read the gzip format description).
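
    For what it's worth, a small sketch of option 2 in Python (a Java version would be analogous), re-compressing into independent gzip members and keeping a tiny in-memory index, might look like:

        import bisect
        import gzip

        CHUNK = 1024 * 1024  # uncompressed bytes per independent gzip member

        def recompress_with_index(src_path, dst_path):
            # Rewrite src as concatenated gzip members; return a list of
            # (uncompressed_offset, compressed_offset) pairs as the index.
            index, pos = [], 0
            with open(src_path, 'rb') as src, open(dst_path, 'wb') as dst:
                while True:
                    block = src.read(CHUNK)
                    if not block:
                        break
                    index.append((pos, dst.tell()))
                    dst.write(gzip.compress(block))
                    pos += len(block)
            return index

        def read_range(path, index, offset, size):
            # Seek to the member containing `offset`, decompress members until
            # enough bytes are available, then slice out the requested range.
            starts = [u for u, _ in index]
            i = bisect.bisect_right(starts, offset) - 1
            skip = offset - starts[i]
            out = bytearray()
            with open(path, 'rb') as f:
                while i < len(index) and len(out) < skip + size:
                    f.seek(index[i][1])
                    out += gzip.GzipFile(fileobj=f).read(CHUNK)
                    i += 1
            return bytes(out[skip:skip + size])

    The index costs two integers per megabyte of input, so it stays negligible next to the file itself, which fits the stated constraint.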

    Read the article

  • Perl Regex Multiple Items in Single String

    - by Sho Minamimoto
    I'm trying to parse a single string and get multiple chunks of data out of the same string with the same regex conditions. I'm parsing a single HTML doc that is static (for an undisclosed reason, I can't use an HTML parser to do the job). I have an expression that looks like $string =~ /\<img\ssrc\="(.*)"/; and I want to get the value of $1. However, in the one string there are many img tags like this, so I need something like an array returned (@1?). Is this possible?

    Read the article

  • How does one use dynamic recompilation?

    - by acidzombie24
    It came to my attention that some emulators and virtual machines use dynamic recompilation. How do they do that? In C I know how to call a function in RAM using typecasting (although I've never tried it), but how does one read opcodes and generate code for them? Does the person need to have premade assembly chunks and copy/batch them together? Is the assembly written in C? If so, how do you find the length of the code? How do you account for system interrupts?

    Read the article

  • Where should the partitioning column go in the primary key on SQL Server?

    - by Bialecki
    Using SQL Server 2005 and 2008. I've got a potentially very large table (potentially hundreds of millions of rows) consisting of the following columns: CREATE TABLE ( date SMALLDATETIME, id BIGINT, value FLOAT ) which is being partitioned on column date in daily partitions. The question then is: should the primary key be on (date, id) or (value, id)? I can imagine that SQL Server is smart enough to know that it's already partitioning on date and therefore, if I'm always querying for whole chunks of days, I can have it second in the primary key. Or I can imagine that SQL Server will need that column to be first in the primary key to get the benefit of partitioning. Can anyone lend some insight into which way the table should be keyed?

    Read the article
