Search Results

Search found 3923 results on 157 pages for 'binary worrier'.

Page 35 of 157

  • How do I configure the binary log file for auditing in MySQL?

    - by Parth
    How do I use the binary log for auditing in MySQL? I want to track changes in a database using the binary log so that I can replicate those changes to another database. Please do not just give me hyperlinks to the MySQL website; please direct me to a concrete solution. EDIT: I have looked at the auditing options and created a script using triggers, but due to the Joomla DB structure it didn't work for me, so I have to move on to the binary log approach. Now I am stuck at the start: I don't understand how to set the server up as a master/slave, so can anybody guide me on how to actually initiate it via PHP?
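
    A minimal sketch of the usual starting point, assuming a MySQL 5.x server you can configure and restart: enable the binary log in my.cnf, then decode it with the stock mysqlbinlog tool. The tool is driven from Python here purely for illustration; the same command works from PHP via shell_exec. The log file path and name are assumptions, so check SHOW BINARY LOGS on your server for the real ones.

        # my.cnf (assumed location /etc/mysql/my.cnf), under [mysqld]:
        #   log-bin   = mysql-bin
        #   server-id = 1
        # Restart the server afterwards; every data change is then logged.

        import subprocess

        # mysqlbinlog prints the binary log as SQL text; filter out the
        # data-changing statements, which are the audit trail we want.
        out = subprocess.check_output(
            ["mysqlbinlog", "/var/lib/mysql/mysql-bin.000001"])
        for line in out.decode("utf-8", errors="replace").splitlines():
            if line.lstrip().upper().startswith(("INSERT", "UPDATE", "DELETE")):
                print(line)

    Full master/slave replication additionally needs a replication user and CHANGE MASTER TO on the slave, but for auditing alone, reading the log like this is usually enough.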

    Read the article

  • Is there a tool to see files created with binary serialization?

    - by Néstor Sánchez A.
    I've been serializing object graphs to and from files without problems. Everything was fine until today: a dictionary, created in a constructor and NEVER deleted, was lost (null reference) just after deserialization from a file, for the first time in more than a year of doing the same thing without trouble. So, is there a software tool that can look into binary serialization content and show a human/developer-readable version (à la Reflector) of what is stored? In other words: how can I analyze binary serialized content easily, without a binary-to-IL translation that would take months? Thanks!
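
    No dedicated viewer is named here, but a crude first look needs no special tool: .NET's BinaryFormatter embeds type and member names as readable strings in the stream, so a plain hex-plus-ASCII dump often shows whether the dictionary's field ever made it into the file. A minimal sketch in Python (the file name is made up):

        import string

        PRINTABLE = set(bytes(string.ascii_letters + string.digits +
                              string.punctuation + " ", "ascii"))

        def hexdump(path, width=16):
            """Dump offset, hex bytes and ASCII so embedded names stand out."""
            with open(path, "rb") as f:
                data = f.read()
            for off in range(0, len(data), width):
                chunk = data[off:off + width]
                hexed = " ".join("%02x" % b for b in chunk)
                text = "".join(chr(b) if b in PRINTABLE else "." for b in chunk)
                print("%08x  %-*s  %s" % (off, width * 3 - 1, hexed, text))

        hexdump("graph.bin")   # hypothetical serialized-graph file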

    Read the article

  • Haskell: encoding words as binary numbers

    - by Katja
    Hi! I need to encode words into binary numbers. IN: "BCD..." OUT: 1011... I have already written a function for encoding characters as simple numbers. IN: 'C' OUT: 3. IN: 'c' OUT: 3.

        import Data.Char (ord, chr)

        lett2num :: Char -> Int
        lett2num x
            | (ord 'A' <= ord x) && (ord x <= ord 'Z') = (ord x - ord 'A') + 1
            | (ord 'a' <= ord x) && (ord x <= ord 'z') = (ord x - ord 'a') + 1

        num2lett :: Int -> Char
        num2lett n
            | (n <= ord 'A') && (n <= ord 'Z') = chr (ord 'A' + n - 1)
            | (n <= ord 'a') && (n <= ord 'Z') = chr (ord 'A' + n - 1)

    I also wrote a function for encoding simple numbers in binary:

        num2bin :: Int -> [Int]
        num2bin 0 = []
        num2bin n
            | n >= 0    = n `mod` 2 : num2bin (n `div` 2)
            | otherwise = error "num2bin: negative argument"

    But I don't want those binary numbers to be in a list. How can I get rid of the lists? Thanks

    Read the article

  • Linux binary built for a 2.0 kernel won't execute on a 2.6.x kernel

    - by lorin
    I was installing a binary Linux application on Ubuntu 9.10 x86_64. The app shipped with an old version of gzip (1.2.4), that was compiled for a much older kernel:

        $ file gzip
        gzip: ELF 32-bit LSB executable, Intel 80386, version 1 (SYSV),
        dynamically linked (uses shared libs), for GNU/Linux 2.0.0, stripped

    I wasn't able to execute this program. If I tried, this happened:

        $ ./gzip
        -bash: ./gzip: No such file or directory

    ldd was similarly unhappy with this binary:

        $ ldd gzip
        not a dynamic executable

    This isn't a showstopper for me, since my installation has a working version of gzip I can use. But I'm curious: What's the most likely source of this problem? A corrupted file? Or a binary incompatibility due to being built for a much older {kernel,libc,...}?
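
    For what it's worth, the classic cause of that misleading "No such file or directory" is neither corruption nor the kernel version as such: the binary names an ELF interpreter (for a GNU/Linux 2.0-era build, typically the libc5 loader /lib/ld-linux.so.1) that no longer exists on a modern system, so exec fails before gzip ever runs. A sketch that pulls the requested interpreter out of a 32-bit LSB ELF file, using the standard ELF32 header offsets:

        import struct

        PT_INTERP = 3   # program-header type of the "interpreter" segment

        def elf32_interp(path):
            """Return the dynamic-loader path a 32-bit LSB ELF binary asks for."""
            with open(path, "rb") as f:
                ident = f.read(16)
                if ident[:4] != b"\x7fELF" or ident[4] != 1:  # EI_CLASS 1 = ELF32
                    raise ValueError("not a 32-bit ELF file")
                f.seek(28)                              # e_phoff lives at offset 28
                (phoff,) = struct.unpack("<I", f.read(4))
                f.seek(42)                              # e_phentsize, then e_phnum
                phentsize, phnum = struct.unpack("<HH", f.read(4))
                for i in range(phnum):
                    f.seek(phoff + i * phentsize)
                    p_type, p_offset = struct.unpack("<II", f.read(8))
                    if p_type != PT_INTERP:
                        continue
                    f.seek(8, 1)                        # skip p_vaddr and p_paddr
                    (filesz,) = struct.unpack("<I", f.read(4))
                    f.seek(p_offset)
                    return f.read(filesz).rstrip(b"\x00").decode()
            return None

        print(elf32_interp("./gzip"))   # e.g. /lib/ld-linux.so.1 on a libc5 binary

    readelf -l gzip shows the same field under "Requesting program interpreter", which is the quicker check when binutils is at hand.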

    Read the article

  • How to switch iostream from binary to text mode and vice versa?

    - by Mad Fish
    I want to read both formatted text and binary data from the same iostream. How can I do that? Why? Imagine this situation: you have different resources, and resource loaders for them that take a std::istream as a parameter. And there is a "resource source" that provides these streams. Resources can be both text and binary, and I need to handle both cases with resource loaders. Or another situation: imagine that you have an archive with resources of mixed types. How can I get a text stream from inside the binary archive stream?

    Read the article

  • How do I write raw binary data in Python?

    - by Chris B.
    I've got a Python program that stores and writes data to a file. The data is raw binary data, stored internally as str. I'm writing it out through a utf-8 codec. However, I get UnicodeDecodeError: 'charmap' codec can't decode byte 0x8d in position 25: character maps to <undefined> in the cp1252.py file. This looks to me like Python is trying to interpret the data using the default code page. But the data doesn't have a code page; that's why I'm using str, not unicode. I guess my questions are: How do I represent raw binary data in memory, in Python? And when I'm writing raw binary data out through a codec, how do I encode/decode it?
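
    The str/unicode split dates this as Python 2, where the usual answer is that raw bytes never go through a codec at all: a codec-wrapped file is for text only, and binary data belongs in a plain file opened in binary mode. A minimal sketch, assuming the payload really is arbitrary bytes:

        data = b"\x8d\x00raw bytes, not text"   # bytes literal: plain str in Python 2

        # Write: binary mode, no codec; the bytes go to disk untouched.
        with open("blob.bin", "wb") as f:
            f.write(data)

        # Read it back the same way.
        with open("blob.bin", "rb") as f:
            assert f.read() == data

        # Only genuine text should pass through a codec:
        import codecs
        with codecs.open("notes.txt", "w", encoding="utf-8") as f:
            f.write(u"text is what utf-8 is for")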

    Read the article

  • How can I convert floating point values in text to binary using Perl?

    - by YoDar
    I have a text file that looks like this:

        float a[10] = { 7.100000e+000 , 9.100000e+000 , 2.100000e+000 , 1.100000e+000 , 8.200000e+000 , 7.220000e+000 , 7.220000e+000 , 7.222000e+000 , 1.120000e+000 , 1.987600e+000 };
        unsigned int col_ind[10] = { 1 , 4 , 3 , 4 , 5 , 2 , 3 , 4 , 1 , 5 };

    Now I want to convert each array (float / unsigned int) to a different binary file, in big-endian format: one binary file for all the float values and one binary file for all the integer values. What is a simple way to do this in Perl, considering I have over 2 million elements in each array?
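
    Whatever parses the text, the heart of a Perl answer is pack with explicit big-endian templates: 'N' gives a big-endian 32-bit unsigned, and, if memory serves, newer Perls (5.10+) also accept a '>' byte-order modifier for floats ('f>', 'd>'). The same layout is sketched below in Python's struct module for illustration; the file names are invented.

        import re, struct

        text = open("arrays.txt").read()

        floats = [float(x) for x in re.findall(r"[-+]?\d\.\d+e[-+]\d+", text)]
        uints  = [int(x)   for x in re.findall(r"\b\d+\b",
                   re.search(r"col_ind\[\d+\] = \{([^}]*)\}", text).group(1))]

        # '>' = big-endian; 'f' = 32-bit IEEE float, 'I' = 32-bit unsigned int.
        with open("floats.bin", "wb") as f:
            f.write(struct.pack(">%df" % len(floats), *floats))
        with open("ints.bin", "wb") as f:
            f.write(struct.pack(">%dI" % len(uints), *uints))

    For 2 million elements, writing in chunks rather than one giant pack keeps memory flat, but the template idea is the same.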

    Read the article

  • How to encrypt and save a binary stream after serialization and read it back?

    - by Anindya Chatterjee
    I am having some problems using CryptoStream when I want to encrypt a binary stream after binary serialization and save it to a file. I am getting the following exception: System.ArgumentException : Stream was not readable. Can anybody please show me how to encrypt a binary stream, save it to a file, and deserialize it back correctly? The code is as follows:

        class Program
        {
            public static void Main(string[] args)
            {
                var b = new B { Name = "BB" };
                WriteFile<B>(@"C:\test.bin", b, true);
                var bb = ReadFile<B>(@"C:\test.bin", true);
                Console.WriteLine(b.Name == bb.Name);
                Console.ReadLine();
            }

            public static T ReadFile<T>(string file, bool decrypt)
            {
                T bObj = default(T);
                var _binaryFormatter = new BinaryFormatter();
                Stream buffer = null;
                using (var stream = new FileStream(file, FileMode.OpenOrCreate))
                {
                    if (decrypt)
                    {
                        const string strEncrypt = "*#4$%^.++q~!cfr0(_!#$@$!&#&#*&@(7cy9rn8r265&$@&*E^184t44tq2cr9o3r6329";
                        byte[] dv = { 0x12, 0x34, 0x56, 0x78, 0x90, 0xAB, 0xCD, 0xEF };
                        CryptoStream cs;
                        DESCryptoServiceProvider des = null;
                        var byKey = Encoding.UTF8.GetBytes(strEncrypt.Substring(0, 8));
                        using (des = new DESCryptoServiceProvider())
                        {
                            cs = new CryptoStream(stream, des.CreateEncryptor(byKey, dv), CryptoStreamMode.Read);
                        }
                        buffer = cs;
                    }
                    else
                        buffer = stream;
                    try
                    {
                        bObj = (T)_binaryFormatter.Deserialize(buffer);
                    }
                    catch (SerializationException ex)
                    {
                        Console.WriteLine(ex.Message);
                    }
                }
                return bObj;
            }

            public static void WriteFile<T>(string file, T bObj, bool encrypt)
            {
                var _binaryFormatter = new BinaryFormatter();
                Stream buffer;
                using (var stream = new FileStream(file, FileMode.Create))
                {
                    try
                    {
                        if (encrypt)
                        {
                            const string strEncrypt = "*#4$%^.++q~!cfr0(_!#$@$!&#&#*&@(7cy9rn8r265&$@&*E^184t44tq2cr9o3r6329";
                            byte[] dv = { 0x12, 0x34, 0x56, 0x78, 0x90, 0xAB, 0xCD, 0xEF };
                            CryptoStream cs;
                            DESCryptoServiceProvider des = null;
                            var byKey = Encoding.UTF8.GetBytes(strEncrypt.Substring(0, 8));
                            using (des = new DESCryptoServiceProvider())
                            {
                                cs = new CryptoStream(stream, des.CreateEncryptor(byKey, dv), CryptoStreamMode.Write);
                                buffer = cs;
                            }
                        }
                        else
                            buffer = stream;
                        _binaryFormatter.Serialize(buffer, bObj);
                        buffer.Flush();
                    }
                    catch (SerializationException ex)
                    {
                        Console.WriteLine(ex.Message);
                    }
                }
            }
        }

        [Serializable]
        public class B
        {
            public string Name { get; set; }
        }

    It throws the serialization exception as follows:

        The input stream is not a valid binary format. The starting contents (in bytes) are: 3F-17-2E-20-80-56-A3-2A-46-63-22-C4-49-56-22-B4-DA ...

    Read the article

  • How can I read a binary string in C#?

    - by Sergey
    There is a PHP script which sends a binary string to the client application:

        $binary_string = pack('i', count($result_matrix));
        foreach ($result_matrix as $row) {
            foreach ($row as $cell) {
                $binary_string .= pack('d', $cell);
            }
        }
        echo $binary_string;

    The Silverlight application receives $binary_string via POST. How can I parse this binary string? Or maybe there is a better way to send a matrix from PHP to Silverlight?

    Read the article

  • How do I read a binary file in C#?

    - by tomcamara
    I have a file that contains both text and a binary image: positions 0 through 30 hold the text in question, and from position 31 onward is the image in binary format. What steps do I have to follow to solve this problem? Currently I am trying to read it using a FileStream, and then I wrap the FileStream in a BinaryReader, as shown below:

        FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read);
        BinaryReader br = new BinaryReader(fs);

    From there forward, I'm lost.
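
    From that point the job is just two sized reads: BinaryReader.ReadBytes(count) returns the next count bytes, so br.ReadBytes(31) takes the text part (positions 0 through 30) and a second ReadBytes for the remaining length takes the image. The same idea in Python, for illustration; the file name and text encoding are assumptions:

        file_path = "mixed.dat"                    # hypothetical input file

        with open(file_path, "rb") as f:
            text_bytes  = f.read(31)               # positions 0..30: the text
            image_bytes = f.read()                 # position 31 onward: the image

        print(text_bytes.decode("ascii", errors="replace"))  # encoding assumed
        with open("image.bin", "wb") as out:                 # save the image part
            out.write(image_bytes)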

    Read the article

  • Command line solution for removing parts from a binary file?

    - by zsero
    I have a binary file and I would like to remove parts from it. By removing I mean deleting those parts and thus making the file's size smaller. The parts lie between two ASCII strings, so the file would look like this:

        ........ start ABCD end ..... start EFGH end ..... start IJKL end ...........

    In this file, I would like to search for the strings "start" and "end" and remove the parts between them. The way I think I can do it is to: look up all the locations of "start" and "end", calculate ranges from that, and delete those parts. Right now I am using a GUI-based hex editor with its "Search All", "Select Range" and "Delete" commands, but I am sure it would be possible to solve this with a powerful command-line hex/text editor. Do you know any solution for this problem which doesn't require a GUI for the look-up, copy & paste, select-range and delete commands, but is just a few lines of command line? I am interested in both Linux shell scripts and command-line hex editors under Windows; even Python scripts are welcome. Do you think it is possible to solve this problem with just a simple regex replace? Are there any regex-replace utilities which handle binary files well?
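
    Since Python scripts are welcome: Python's re module works directly on bytes, so a non-greedy DOTALL pattern is essentially the whole program. This sketch removes the markers together with everything between them (file names invented); if the start/end strings themselves should survive, the pattern needs lookarounds instead.

        import re

        with open("input.bin", "rb") as f:
            data = f.read()

        # b"start" ... b"end", shortest match; re.DOTALL lets "." cross
        # newlines and arbitrary byte values.
        cleaned = re.sub(rb"start.*?end", b"", data, flags=re.DOTALL)

        with open("output.bin", "wb") as f:
            f.write(cleaned)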

    Read the article

  • LightDM will not start after stopping it

    - by Sweeters
    I am running Ubuntu 11.10 "Oneiric Ocelot". While installing the NVIDIA CUDA developer drivers, I switched to a virtual terminal (Ctrl-Alt-F5) and stopped lightdm (the installation required that no X server instance be running) with sudo service lightdm stop. Re-starting lightdm with sudo service lightdm start did not work: a couple of * Starting [...] lines were displayed, but the process hung. (I do not remember at which point, but I think it was * Starting System V runlevel compatibility.) I manually rebooted my laptop, and ever since, booting seems to hang, usually around the * Starting anac(h)ronistic cron [OK] log line (though not consistently at that point). From then on, I seem to be able to interact with my system only through a tty session (Ctrl-Alt-F1).

    I've tried purging and reinstalling both lightdm and gdm, as well as selecting each as the default display manager (through sudo dpkg-reconfigure [lightdm / gdm] or by manually editing /etc/X11/default-display-manager), using both apt-get and aptitude (that shouldn't make a difference anyway), after updating the packages, but the problem persists. Some of the responses I'm getting are the following:

    - After running sudo dpkg-reconfigure lightdm (but not ... gdm) I get:

        dpkg-maintscript-helper: warning: environment variable DPKG_MAINTSCRIPT_NAME missing
        dpkg-maintscript-helper: warning: environment variable DPKG_MAINTSCRIPT_PACKAGE missing

    - After trying sudo service lightdm start or sudo start lightdm I get to see the boot loading screen again, but nothing changes. If I go back to the tty shell I see lightdm start/running, process <num>, but ps -e | grep lightdm gives no output.

    - After trying sudo service gdm start or sudo start gdm I get the gdm start/running, process <num> message, and gdm-binary is supposedly an active process, but all that happens is that the screen blinks a couple of times and nothing else.

    Other candidate solutions I found on the web included running startx, but when I try that I get an error output: [...] Fatal server error: no screens found [...]. Moreover, I made sure that lightdm-gtk-greeter is installed, but that did not help either. Please excuse my not including complete outputs/logs; I am writing this post from another computer and it's hard to copy the logs over by hand. Also, I've seen several posts about similar problems, but either there was no fix or the suggested one did not work for me. In closing: please help! I very much hope to avoid re-installing Ubuntu from scratch! :) Alex

    @mosi I did not manage to fix the NVIDIA kernel driver as per your instructions. I should perhaps mention that I'm on a Dell XPS 15 laptop with an NVIDIA Optimus graphics card, and that I have bumblebee installed (which installs NVIDIA drivers during its installation, I believe). Issuing the mentioned commands I get the following:

        ~$ uname -r
        3.0.0-12-generic
        ~$ lsmod | grep -i nvidia
        nvidia 11713772 0
        ~$ dmesg | grep -i nvidia
        [ 8.980041] nvidia: module license 'NVIDIA' taints kernel.
        [ 9.354860] nvidia 0000:01:00.0: power state changed by ACPI to D0
        [ 9.354864] nvidia 0000:01:00.0: power state changed by ACPI to D0
        [ 9.354868] nvidia 0000:01:00.0: enabling device (0006 -> 0007)
        [ 9.354873] nvidia 0000:01:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16
        [ 9.354879] nvidia 0000:01:00.0: setting latency timer to 64
        [ 9.355052] NVRM: loading NVIDIA UNIX x86_64 Kernel Module 280.13 Wed Jul 27 16:53:56 PDT 2011

    Also, running aptitude search nvidia gives me the following:

        p nvidia-173                    - NVIDIA binary Xorg driver, kernel module a
        p nvidia-173-dev                - NVIDIA binary Xorg driver development file
        p nvidia-173-updates            - NVIDIA binary Xorg driver, kernel module a
        p nvidia-173-updates-dev        - NVIDIA binary Xorg driver development file
        p nvidia-96                     - NVIDIA binary Xorg driver, kernel module a
        p nvidia-96-dev                 - NVIDIA binary Xorg driver development file
        p nvidia-96-updates             - NVIDIA binary Xorg driver, kernel module a
        p nvidia-96-updates-dev         - NVIDIA binary Xorg driver development file
        p nvidia-cg-toolkit             - Cg Toolkit - GPU Shader Authoring Language
        p nvidia-common                 - Find obsolete NVIDIA drivers
        i nvidia-current                - NVIDIA binary Xorg driver, kernel module a
        p nvidia-current-dev            - NVIDIA binary Xorg driver development file
        c nvidia-current-updates        - NVIDIA binary Xorg driver, kernel module a
        p nvidia-current-updates-dev    - NVIDIA binary Xorg driver development file
        i nvidia-settings               - Tool of configuring the NVIDIA graphics dr
        p nvidia-settings-updates       - Tool of configuring the NVIDIA graphics dr
        v nvidia-va-driver              -
        v nvidia-va-driver              -

    I've tried manually installing (sudo aptitude install <package>) the packages nvidia-common and nvidia-settings-updates, but to no avail. For example, sudo aptitude install nvidia-settings-updates returns the following log:

        Reading package lists...
        Building dependency tree...
        Reading state information...
        Reading extended state information...
        Initializing package states...
        Writing extended state information...
        No packages will be installed, upgraded, or removed.
        0 packages upgraded, 0 newly installed, 0 to remove and 83 not upgraded.
        Need to get 0 B of archives. After unpacking 0 B will be used.
        Writing extended state information...
        Reading package lists...
        Building dependency tree...
        Reading state information...
        Reading extended state information...
        Initializing package states...
        Writing extended state information...

    The same happens with the Linux headers (i.e. I cannot seem to be able to install linux-headers-3.0.0-12-generic). The output of aptitude search linux-headers is as follows:

        v linux-headers                    -
        v linux-headers                    -
        v linux-headers-2.6                -
        i linux-headers-2.6.38-11          - Header files related to Linux kernel versi
        i linux-headers-2.6.38-11-generic  - Linux kernel headers for version 2.6.38 on
        i A linux-headers-2.6.38-8         - Header files related to Linux kernel versi
        i A linux-headers-2.6.38-8-generic - Linux kernel headers for version 2.6.38 on
        v linux-headers-3                  -
        v linux-headers-3.0                -
        v linux-headers-3.0                -
        i A linux-headers-3.0.0-12         - Header files related to Linux kernel versi
        p linux-headers-3.0.0-12-generic   - Linux kernel headers for version 3.0.0 on
        p linux-headers-3.0.0-12-generic-  - Linux kernel headers for version 3.0.0 on
        p linux-headers-3.0.0-12-server    - Linux kernel headers for version 3.0.0 on
        p linux-headers-3.0.0-12-virtual   - Linux kernel headers for version 3.0.0 on
        p linux-headers-generic            - Generic Linux kernel headers
        p linux-headers-generic-pae        - Generic Linux kernel headers
        v linux-headers-lbm                -
        v linux-headers-lbm                -
        v linux-headers-lbm-2.6            -
        v linux-headers-lbm-2.6            -
        p linux-headers-lbm-3.0.0-12-gene  - Header files related to linux-backports-mo
        p linux-headers-lbm-3.0.0-12-gene  - Header files related to linux-backports-mo
        p linux-headers-lbm-3.0.0-12-serv  - Header files related to linux-backports-mo
        p linux-headers-server             - Linux kernel headers on Server Equipment.
        p linux-headers-virtual            - Linux kernel headers for virtual machines

    @heartsmagic I did try purging and reinstalling any NVIDIA driver packages, but it did not seem to make a difference. My xorg.conf file contains the following:

        # nvidia-xconfig: X configuration file generated by nvidia-xconfig
        # nvidia-xconfig: version 280.13 ([email protected]) Wed Jul 27 17:15:58 PDT 2011

        Section "ServerLayout"
            Identifier     "Layout0"
            Screen      0  "Screen0" 0 0
            InputDevice    "Keyboard0" "CoreKeyboard"
            InputDevice    "Mouse0" "CorePointer"
        EndSection

        Section "Files"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Mouse0"
            Driver         "mouse"
            Option         "Protocol" "auto"
            Option         "Device" "/dev/psaux"
            Option         "Emulate3Buttons" "no"
            Option         "ZAxisMapping" "4 5"
        EndSection

        Section "InputDevice"
            # generated from default
            Identifier     "Keyboard0"
            Driver         "kbd"
        EndSection

        Section "Monitor"
            Identifier     "Monitor0"
            VendorName     "Unknown"
            ModelName      "Unknown"
            HorizSync      28.0 - 33.0
            VertRefresh    43.0 - 72.0
            Option         "DPMS"
        EndSection

        Section "Device"
            Identifier     "Device0"
            Driver         "nvidia"
            VendorName     "NVIDIA Corporation"
        EndSection

        Section "Screen"
            Identifier     "Screen0"
            Device         "Device0"
            Monitor        "Monitor0"
            DefaultDepth   24
            SubSection     "Display"
                Depth      24
            EndSubSection
        EndSection

    Read the article

  • apt-get update getting 404 on Debian lenny

    - by JoelFan
    Here is my /etc/apt/sources.list:

        ###### Debian Main Repos
        deb http://ftp.us.debian.org/debian/ lenny main contrib non-free

        ###### Debian Update Repos
        deb http://security.debian.org/ lenny/updates main contrib non-free
        deb http://ftp.us.debian.org/debian/ lenny-proposed-updates main contrib non-free

    When I do # apt-get update, I'm getting some good lines, then:

        Err http://ftp.us.debian.org lenny/contrib Packages 404 Not Found [IP: 35.9.37.225 80]
        Err http://ftp.us.debian.org lenny/non-free Packages 404 Not Found [IP: 35.9.37.225 80]
        Err http://ftp.us.debian.org lenny-proposed-updates/main Packages 404 Not Found [IP: 35.9.37.225 80]
        Err http://ftp.us.debian.org lenny-proposed-updates/contrib Packages 404 Not Found [IP: 35.9.37.225 80]
        Err http://ftp.us.debian.org lenny-proposed-updates/non-free Packages 404 Not Found [IP: 35.9.37.225 80]
        Err http://ftp.us.debian.org lenny/main Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/main/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80]
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/contrib/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80]
        W: Failed to fetch http://security.debian.org/dists/lenny/updates/non-free/binary-i386/Packages 404 Not Found [IP: 149.20.20.6 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/contrib/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/non-free/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/main/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/contrib/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny-proposed-updates/non-free/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        W: Failed to fetch http://ftp.us.debian.org/debian/dists/lenny/main/binary-i386/Packages 404 Not Found [IP: 35.9.37.225 80]
        E: Some index files failed to download, they have been ignored, or old ones used instead.

    Now what?
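
    The likely explanation: lenny (Debian 5.0) reached end of life and was removed from the regular mirrors; archived releases live on archive.debian.org instead. A sources.list along these lines should make apt-get update work again, with the caveat that lenny no longer receives any updates:

        ###### Debian Archive Repos (lenny was moved off the live mirrors)
        deb http://archive.debian.org/debian/ lenny main contrib non-free
        deb http://archive.debian.org/debian-security/ lenny/updates main contrib non-free

    The lenny-proposed-updates line can simply be dropped. Newer apt releases may also reject the archive's expired Release files; if so, apt-get -o Acquire::Check-Valid-Until=false update gets around the check.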

    Read the article

  • What should developers know about Windows executable binary file compression?

    - by Peter Turner
    I'd never heard of this before, so shame on me, but programs like UPX can compress my files by 80%, which is totally sweet. Yet I have no idea what the disadvantages of doing this are, or even what the compressor actually does. The website linked above doesn't say anything about dynamically linking DLLs, though it does mention compressing DESCENT 2 and Netscape 4.06. It also doesn't say what the tradeoffs are, only the benefits. If there were no tradeoffs, why wouldn't my linker compress the file? If I have an environment with one executable and 20-30 DLLs, some of which are dynamically loaded and unloaded fairly arbitrarily, but not in loops (hopefully), do I take a big hit in processing time decompressing these DLLs when they're used?

    Read the article

  • Failed installing ATI X.Org binary driver (ATI Radeon HD 5400)

    - by Naveen
    The open source driver worked well with a fresh installation of Ubuntu 12.04, but it drains my laptop battery faster. So I installed the proprietary driver from jockey-gtk, and it gave me an error message at the end of the installation (see the attached jockey.log). After restarting the computer, I lost all the compositing animations: Compiz visual effects don't work, and there is a black border around Docky. Please, somebody help me! Thanks!

    Read the article

  • Store images as logical files (in the DB in binary format) or as physical files (on the server)?

    - by Michel Ayres
    Consider these study cases of image storage:

    - an image that changes only once in a while, if it changes at all (like an image for an article);
    - not just one image as above, but over 10, all linked to the same article;
    - an image that changes very often (like a banner image for a website);
    - an image that is huge.

    What is the best approach for each case? What is the "right/faster" way to do this task in each scenario?

    Read the article

  • Linux, GNU GCC, ld, version scripts and the ELF binary format -- How does it work? [closed]

    - by themoondothshine
    I'm trying to learn more about library versioning in Linux and how to put it all to work. Here's the context: I have two versions of a dynamic library which expose the same set of interfaces, say libsome1.so and libsome2.so. An application is linked against libsome1.so. This application uses libdl.so to dynamically load another module, say libmagic.so. Now libmagic.so is linked against libsome2.so. Obviously, without using linker scripts to hide symbols in libmagic.so, at run time all calls to interfaces in libsome2.so are resolved to libsome1.so. This can be confirmed by checking the value returned by libVersion() against the value of the macro LIB_VERSION. So next I tried to compile and link libmagic.so with a linker script which hides all symbols except three which are defined in libmagic.so and exported by it. This works... or at least the libVersion() and LIB_VERSION values match (and it reports version 2, not 1). However, when some data structures are serialized to disk, I noticed some corruption. If, in the application's directory, I delete libsome1.so and create a soft link in its place pointing to libsome2.so, everything works as expected and the same corruption does not happen. I can't help but think that this may be caused by some conflict in the run-time linker's resolution of symbols. I've tried many things, like trying to link libsome2.so so that all symbols are aliased to symbol@@VER_2 (which still confuses me, because the command nm -CD libsome2.so lists the symbols as symbol and not symbol@@VER_2), but nothing seems to work. What am I doing wrong?
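
    For reference, a minimal sketch of the pieces involved, with invented file and symbol names: a version script tags the exported symbols with a version node and hides the rest, and the script is applied at link time.

        /* vers.map */
        VER_2 {
            global:
                libVersion;   /* exported and tagged with node VER_2 */
            local:
                *;            /* all other symbols hidden */
        };

        # apply it when linking the shared object
        gcc -shared -fPIC -o libsome2.so some.c -Wl,--version-script=vers.map

    objdump -T libsome2.so then shows VER_2 in the version column next to libVersion; if the node is missing there, the script never took effect at link time, which would be consistent with symbols still resolving against libsome1.so.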

    Read the article

  • I need some help creating a non-binary tree (or some other data structure that will better solve my problem)

    - by EDO
    I have about ten lists of numbers and some strings. Each list has about <= 30K lines, and each line in a list has a distinct number. I need an efficient way of finding, across all the lists, the lines that share the same 'control' number (or key, for the DB folks) and comparing what is in their string parts. I am writing this in Java. I have thought about using trees, but my brain cells are about burnt out now. I need some help.
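
    A tree may be more machinery than this needs: a hash map from control number to the strings seen under that number gives the grouping in one pass. The Java analogue would be a HashMap<Integer, List<String>>; here is the idea as a Python sketch with invented sample data:

        from collections import defaultdict

        # Each input list holds (control_number, string) pairs; data is made up.
        lists = [
            [(101, "alpha"), (205, "bravo")],
            [(101, "alpha"), (205, "delta")],
            [(307, "echo"), (101, "foxtrot")],
        ]

        by_key = defaultdict(list)
        for lst in lists:
            for control, text in lst:
                by_key[control].append(text)

        # Control numbers seen more than once, whose strings actually differ:
        for control, texts in by_key.items():
            if len(texts) > 1 and len(set(texts)) > 1:
                print(control, texts)   # e.g. 101 ['alpha', 'alpha', 'foxtrot']

    With ~10 lists of 30K lines, this is a few hundred thousand constant-time map operations, well within what a single pass can handle.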

    Read the article

  • Should I move big data blobs in JSON or over a separate binary connection?

    - by Amagrammer
    QUESTION: Is it better to send large data blobs in JSON for simplicity, or to send them as binary data over a separate connection? If the former, can you offer tips on how to optimize the JSON to minimize size? If the latter, is it worth it to logically connect the JSON data to the binary data using an identifier that appears in both, e.g., as "data" : "<unique identifier>" in the JSON, with the first bytes of the data blob being <unique identifier>?

    CONTEXT: My iPhone application needs to receive JSON data over the 3G network. This means that I need to think seriously about efficiency of data transfer, as well as the load on the CPU. Most of the data transfers will be relatively small packets of text data, for which JSON is a natural format and for which there is no point in worrying much about efficiency. However, some of the most critical transfers will be big blobs of binary data: definitely at least 100 kilobytes, and possibly closer to 1 megabyte as customers accumulate a longer history with the product. (Note: I will be caching what I can on the iPhone itself, but the data still has to be transferred at least once.) It is NOT streaming data. I will probably use a third-party JSON SDK; the one I am using during development is here. Thanks
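
    One concrete number worth having for this decision: binary carried inside JSON has to be text-encoded, and Base64, the usual choice, inflates the payload by roughly a third before any compression. A quick sketch to measure it:

        import base64, json, os

        blob = os.urandom(100 * 1024)           # a 100 KB stand-in for the real data

        as_b64 = base64.b64encode(blob).decode("ascii")
        packet = json.dumps({"data": as_b64})

        print(len(blob))     # 102400 bytes raw
        print(len(packet))   # ~136 KB: roughly 4/3 the size, plus JSON quoting

    That overhead, plus the decode cost on the phone, is the usual argument for the second connection with a shared identifier, exactly as the question sketches.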

    Read the article
