Search Results

Search found 5946 results on 238 pages for 'heavy bytes'.


  • Memory leak in Qt signal and slots

    - by Ajay
    Hello, I am running valgrind on my Qt code, and even on a successful exit of the application I get the following report from valgrind:

      8,832 bytes in 92 blocks are still reachable in loss record 12 of 12
         at 0x4025390: operator new(unsigned int) (vg_replace_malloc.c:214)
      ==3339== by 0x4B75F05: QMutex::QMutex(QMutex::RecursionMode) (qmutex.cpp:123)
      ==3339== by 0x4B77602: QMutexPool::get(void const*) (qmutexpool.cpp:137)
      ==3339== by 0x4CA0EC2: signalSlotLock(QObject const*) (qobject.cpp:112)
      ==3339== by 0x4CA3939: QMetaObjectPrivate::connect(QObject const*, int, QObject const*, int, int, int*) (qobject.cpp:2900)
      ==3339== by 0x4CA5C00: QObject::connect(QObject const*, char const*, QObject const*, char const*, Qt::ConnectionType) (qobject.cpp:2599)

    I disconnect all signal connections and also delete the objects, yet the leak reported above grows as I increase the number of signal and slot connections. Can anybody help with this?

    Read the article

  • Using Active Objects and BLOBs

    - by Andrew L.
    I am in a group of people who are creating a Defect Tracking program as a project. We have been using Active Objects and have run into some issues. Currently the maximum file size for the BLOB is approx. 2 MB, but we want to be able to increase it up to 2 GB. We have looked at many sites and have not been able to find out how to increase the size. We are currently storing the BLOB as an array of bytes. Our current error says "Packet for query is too large". We don't know how to set the variable, and we don't know how to set it using AO. We are programming this in Java. Does anyone have a solution to this problem? Thanks for the help.
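
    That "Packet for query is too large" message is the one MySQL raises when a statement exceeds the server's max_allowed_packet setting, so the limit lives in the database server rather than in Active Objects. Assuming the store behind AO really is MySQL (an assumption, since the question does not say), a quick way to confirm the current limit from Java is:

      import java.sql.*;

      public class PacketLimitCheck {
          public static void main(String[] args) throws SQLException {
              // Hypothetical connection details; replace with the project's own data source.
              try (Connection con = DriverManager.getConnection(
                       "jdbc:mysql://localhost/defects", "user", "password");
                   Statement st = con.createStatement();
                   ResultSet rs = st.executeQuery("SHOW VARIABLES LIKE 'max_allowed_packet'")) {
                  while (rs.next()) {
                      // The value is in bytes; raising it means changing my.cnf (or
                      // SET GLOBAL max_allowed_packet, which needs privileges) and reconnecting.
                      System.out.println(rs.getString(1) + " = " + rs.getString(2));
                  }
              }
          }
      }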

    Read the article

  • Postgresql 8.4 reading OID style BLOBs with Hibernate

    - by peter
    I am getting this weird case when querying Postgres 8.4 for some records with Blobs (of type OID) with Hibernate. The query returns all right, but when my code tries to read the content of the BLOB with the simple code below, it gets 0 bytes back:

      public static byte[] readBlob(Blob blob) throws Exception {
          InputStream is = null;
          try {
              is = blob.getBinaryStream();
              return org.apache.commons.io.IOUtils.toByteArray(is);
          } finally {
              if (is != null) try { is.close(); } catch (Exception e) {}
          }
      }

    Funny thing is that I am getting this behavior only since I've started adding more than one such record to the table. The underlying JDBC driver is type 3 (postgresql 8.4-701). Can someone give me a hint as to how to solve this issue? Thanks, Peter
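
    Not necessarily the cause here, but worth ruling out: the PostgreSQL JDBC driver only serves large-object (OID) data inside a transaction, so reading such a Blob with auto-commit enabled can come back empty. A minimal plain-JDBC sketch (hypothetical table and column names, kept outside Hibernate for clarity):

      import java.io.InputStream;
      import java.sql.*;

      public class OidBlobRead {
          // Hypothetical table/column names, for illustration only.
          static byte[] readOidBlob(Connection con, long id) throws Exception {
              con.setAutoCommit(false); // large objects must be accessed inside a transaction
              try (PreparedStatement ps = con.prepareStatement(
                       "SELECT content FROM documents WHERE id = ?")) {
                  ps.setLong(1, id);
                  try (ResultSet rs = ps.executeQuery()) {
                      if (!rs.next()) return new byte[0];
                      try (InputStream in = rs.getBlob(1).getBinaryStream()) {
                          return in.readAllBytes(); // drain while the transaction is still open
                      }
                  }
              } finally {
                  con.commit();
              }
          }
      }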

    Read the article

  • write() causes fatal crash when file descriptor becomes invalid

    - by ckrames1234
    I'm writing an iPhone app with a web server in it. To handle a web request, I take the request and write() back the data that I want to send. When I try to download a moderately sized file (3-6 MB) it works fine, but if I cancel the download halfway through, the app crashes and leaves no trace of an error. I'm thinking that the file descriptor becomes invalid halfway through the write and causes the crash; I really don't know if this is the cause, I'm just assuming. I'm basing my web server off of this example.

      NSString *header = @"";
      NSData *data = [NSData dataWithContentsOfFile:fullPath];
      write(fd, [header UTF8String], [header length]);
      write(fd, [data bytes], [data length]);
      close(fd);

    Does anyone know how to fix this? I was thinking about chunking the data and then writing each part, but I don't think it would help.

    Read the article

  • How to use LogMath twice in the same form (sphinx4)

    - by Pradeep
    I have configured Sphinx with NetBeans and it's working fine. I trigger the recognition from a button, but when I try to run the process again afterwards I get an error saying that a LogMath instance is already present, and that the microphone cannot be opened. Can someone give me a solution? What I want to do is use speech recognition several times in the same form, until it gives the correct answer. Please help me. This is the error I get:

      Creating new instance of LogMath while another instance is already present
      10:53:27.833 SEVERE microphone Can't open microphone line with format PCM_SIGNED 16000.0 Hz, 16 bit, mono, 2 bytes/frame, big-endian not supported.
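
    A common cause of the "LogMath instance already present" error is rebuilding the whole sphinx4 ConfigurationManager on every button press. The sketch below follows the standard sphinx4 HelloWorld pattern, allocating the recognizer and microphone once and reusing them across attempts; the config file name, component names, and stop condition are assumptions rather than details taken from the question:

      import edu.cmu.sphinx.frontend.util.Microphone;
      import edu.cmu.sphinx.recognizer.Recognizer;
      import edu.cmu.sphinx.result.Result;
      import edu.cmu.sphinx.util.props.ConfigurationManager;

      public class SpeechLoop {
          public static void main(String[] args) {
              // Build the configuration (and with it LogMath, Recognizer, Microphone) exactly once.
              ConfigurationManager cm = new ConfigurationManager(
                      SpeechLoop.class.getResource("helloworld.config.xml"));
              Recognizer recognizer = (Recognizer) cm.lookup("recognizer");
              recognizer.allocate();
              Microphone microphone = (Microphone) cm.lookup("microphone");

              String heard = "";
              while (!heard.equals("stop")) {          // keep listening until the expected answer arrives
                  if (!microphone.startRecording()) {
                      System.err.println("Cannot open the microphone line");
                      break;
                  }
                  Result result = recognizer.recognize();
                  heard = (result != null) ? result.getBestFinalResultNoFiller() : "";
                  System.out.println("Heard: " + heard);
                  microphone.stopRecording();          // release the line between attempts
              }
              recognizer.deallocate();
          }
      }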

    Read the article

  • Safe non-tamperable URL component in Perl using symmetric encryption?

    - by Randal Schwartz
    OK, I'm probably just having a bad Monday, but I have the following need and I'm seeing lots of partial solutions but I'm sure I'm not the first person to need this, so I'm wondering if I'm missing the obvious. $client has 50 to 500 bytes worth of binary data that must be inserted into the middle of a URL and roundtrip to their customer's browser. Since it's part of the URL, we're up against the 1K "theoretical" limit of a GET URL. Also, $client doesn't want their customer decoding the data, or tampering with it without detection. $client would also prefer not to store anything server-side, so this must be completely standalone. Must be Perl code, and fast, in both encoding and decoding. I think the last step can be base64. But what are the steps for encryption and hashing that make the most sense?
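
    The question asks for Perl, but the shape of the scheme is language-neutral: symmetric authenticated encryption (so any tampering fails at decrypt time) followed by URL-safe Base64. Below is a sketch of that pipeline, written in Java purely to make the steps concrete; the key, IV size, and tag length are illustrative assumptions, not a statement about which Perl modules to use:

      import java.nio.charset.StandardCharsets;
      import java.security.SecureRandom;
      import java.util.Arrays;
      import java.util.Base64;
      import javax.crypto.Cipher;
      import javax.crypto.spec.GCMParameterSpec;
      import javax.crypto.spec.SecretKeySpec;

      public class UrlToken {
          // Server-side secret key (16 bytes for AES-128); never sent to the browser.
          static final byte[] KEY = "0123456789abcdef".getBytes(StandardCharsets.US_ASCII);

          static String encode(byte[] payload) throws Exception {
              byte[] iv = new byte[12];
              new SecureRandom().nextBytes(iv);
              Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
              c.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(KEY, "AES"), new GCMParameterSpec(128, iv));
              byte[] ct = c.doFinal(payload);                       // ciphertext plus 16-byte auth tag
              byte[] token = new byte[iv.length + ct.length];
              System.arraycopy(iv, 0, token, 0, iv.length);
              System.arraycopy(ct, 0, token, iv.length, ct.length);
              return Base64.getUrlEncoder().withoutPadding().encodeToString(token);
          }

          static byte[] decode(String token) throws Exception {
              byte[] raw = Base64.getUrlDecoder().decode(token);
              Cipher c = Cipher.getInstance("AES/GCM/NoPadding");
              c.init(Cipher.DECRYPT_MODE, new SecretKeySpec(KEY, "AES"),
                     new GCMParameterSpec(128, Arrays.copyOf(raw, 12)));
              return c.doFinal(raw, 12, raw.length - 12);           // throws if the token was tampered with
          }
      }

    For 500 bytes of payload the token comes out around 700 URL characters (12-byte IV plus 16-byte tag, then the 4/3 Base64 expansion), which stays under the 1K budget mentioned above.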

    Read the article

  • LNK1106 with big binary resource

    - by E Dominique
    I have a rather huge .dat-file (896MB) included as a BIN resource in my project. Now I get a LNK1106 link error ("fatal error LNK1106: invalid file or disk full: cannot seek to 0x382A3920".) I use Visual Studio 2005 under Windows XP, and have tried on a 4GB RAM machine with high Virtual Memory settings and lots of disk space. I have tried a number of different optimization flags, but to no avail. Does anyone have a clue? EDIT: I have narrowed it down to a specific size of the compiled resource. If the .res file is 544078588 bytes (about 518.9MB) or larger, the error occurs. If it is smaller it works just fine. Still no solution, though...

    Read the article

  • java: how to compress data into a String and uncompress data from the String

    - by Guillaume
    I want to put some compressed data into a remote repository. To put data on this repository I can only use a method that takes the name of the resource and its content as a String (like "data.txt" + "hello world"). The repository mocks a filesystem but is not one, so I can not use File directly. I want to be able to do the following:

    - the client sends a file 'data.txt' to the server
    - the server compresses 'data.txt' into data.zip
    - the server sends the content of data.zip to the repository
    - the repository stores data.zip
    - the client downloads data.zip from the repository and is able to open it with its favorite zip tool

    I have tried lots of compression examples found on the web, but each time I send the data to the repository my resulting zip file is corrupted. Here is a sample class, using the Zip*Stream classes, that emulates the repository and showcases my problem. The created zip file is fine, but after its 'serialization' it gets corrupted. (The sample class uses Jakarta commons-io.) Many thanks for your help.

      package zip;

      import java.io.File;
      import java.io.FileInputStream;
      import java.io.FileOutputStream;
      import java.io.IOException;
      import java.io.InputStream;
      import java.util.zip.ZipEntry;
      import java.util.zip.ZipInputStream;
      import java.util.zip.ZipOutputStream;

      import org.apache.commons.io.FileUtils;

      /**
       * Date: May 19, 2010 - 6:13:07 PM
       *
       * @author Guillaume AME.
       */
      public class ZipMe {

          public static void addOrUpdate(File zipFile, File... files) throws IOException {
              File tempFile = File.createTempFile(zipFile.getName(), null);
              // delete it, otherwise you cannot rename your existing zip to it.
              tempFile.delete();
              boolean renameOk = zipFile.renameTo(tempFile);
              if (!renameOk) {
                  throw new RuntimeException("could not rename the file " + zipFile.getAbsolutePath()
                          + " to " + tempFile.getAbsolutePath());
              }
              byte[] buf = new byte[1024];
              ZipInputStream zin = new ZipInputStream(new FileInputStream(tempFile));
              ZipOutputStream out = new ZipOutputStream(new FileOutputStream(zipFile));
              ZipEntry entry = zin.getNextEntry();
              while (entry != null) {
                  String name = entry.getName();
                  boolean notInFiles = true;
                  for (File f : files) {
                      if (f.getName().equals(name)) {
                          notInFiles = false;
                          break;
                      }
                  }
                  if (notInFiles) {
                      // Add ZIP entry to output stream.
                      out.putNextEntry(new ZipEntry(name));
                      // Transfer bytes from the ZIP file to the output file
                      int len;
                      while ((len = zin.read(buf)) > 0) {
                          out.write(buf, 0, len);
                      }
                  }
                  entry = zin.getNextEntry();
              }
              // Close the streams
              zin.close();
              // Compress the files
              if (files != null) {
                  for (File file : files) {
                      InputStream in = new FileInputStream(file);
                      // Add ZIP entry to output stream.
                      out.putNextEntry(new ZipEntry(file.getName()));
                      // Transfer bytes from the file to the ZIP file
                      int len;
                      while ((len = in.read(buf)) > 0) {
                          out.write(buf, 0, len);
                      }
                      // Complete the entry
                      out.closeEntry();
                      in.close();
                  }
                  // Complete the ZIP file
              }
              tempFile.delete();
              out.close();
          }

          public static void main(String[] args) throws IOException {
              final String zipArchivePath = "c:/temp/archive.zip";
              final String tempFilePath = "c:/temp/data.txt";
              final String resultZipFile = "c:/temp/resultingArchive.zip";

              File zipArchive = new File(zipArchivePath);
              FileUtils.touch(zipArchive);
              File tempFile = new File(tempFilePath);
              FileUtils.writeStringToFile(tempFile, "hello world");
              addOrUpdate(zipArchive, tempFile);
              // archive.zip exists and contains a compressed data.txt that can be read using winrar

              // now simulate writing of the zip into an in-memory cache
              String archiveText = FileUtils.readFileToString(zipArchive);
              FileUtils.writeStringToFile(new File(resultZipFile), archiveText);
              // resultingArchive.zip exists, contains a compressed data.txt, but it can not
              // be read using winrar: CRC failed in data.txt. The file is corrupt
          }
      }
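
    The corruption comes from the String round trip at the end of main: readFileToString decodes the zip's bytes with the platform character set, which is lossy for arbitrary binary data, and writeStringToFile cannot restore what was lost. If the repository API really only accepts a String, one workaround is to Base64-encode the bytes; a sketch using commons-codec and commons-io, with illustrative paths:

      import java.io.File;

      import org.apache.commons.codec.binary.Base64;
      import org.apache.commons.io.FileUtils;

      public class ZipAsString {
          public static void main(String[] args) throws Exception {
              byte[] zipBytes = FileUtils.readFileToByteArray(new File("c:/temp/archive.zip"));

              // Store this text in the repository instead of the raw zip content.
              String encoded = Base64.encodeBase64String(zipBytes);

              // When the client downloads it, decode back to the original bytes.
              byte[] decoded = Base64.decodeBase64(encoded);
              FileUtils.writeByteArrayToFile(new File("c:/temp/resultingArchive.zip"), decoded);
              // The result is byte-for-byte identical to the original archive and opens in any zip tool.
          }
      }

    The price is a 4/3 size increase in the repository; if the repository tolerates arbitrary characters, ISO-8859-1 (which maps bytes 0-255 one-to-one) is a tighter alternative, but Base64 is the safer default.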

    Read the article

  • What is the best way to read the uploaded files from Request.Files, StreamReader or BinaryReader or

    - by ramesh.nagul
    I have a form where the user can upload multiple files. I am using MVC 2.0, and in my controller I need to call a web service (a common import interface) that requires the files to be passed in as byte[]. .NET exposes Request.Files as an HttpFileCollectionBase, and I access each file handle using HttpPostedFile or HttpPostedFileBase, which provides access to the Stream member. What is the best way for me to read the bytes from the stream? BinaryReader? StreamReader? BufferedStream?

    Read the article

  • gcc does not generate debugger info when using -g, -ggdb, -g3, or -ggdb3

    - by CJJ
    I'm using GCC 4.4.1 and GDB 7.0-ubuntu on Ubuntu 9.10. However, GCC won't generate debugger info when using any of the following switches: -g, -g3, -ggdb, or -ggdb3. So when I run the program with GDB, it's as if no debugger information was generated. I have created very simple test source files in a new, empty folder. Here is one example:

      #include <stdlib.h>
      #include <stdio.h>
      #include <string.h>   // for strcpy

      int main (int argc, char **argv)
      {
          char msg[4];                  // allocate 4 bytes on the stack
          strcpy (msg, "hello world"); // overflow
          printf ("%s\n", msg);
          return 0;
      }

    Read the article

  • .NET connecting with RDP

    - by user311130
    I want to open a remote desktop connection to a specified server/username from C#. I have found http://www.codeproject.com/KB/cs/RemoteDesktop_CSharpNET.aspx, where an AxMSTSCLib DLL should be referenced by the solution. I don't want to download this DLL from just anywhere, as I'm not sure I can trust it. However, the article also says: "After research on the web I found that I have to create new AxMSTSCLib and MSTSCLib DLLs. So I did." How do I "create" this new AxMSTSCLib? Another link, http://bytes.com/topic/c-sharp/answers/517024-remote-desktop-connection-c, doesn't use this DLL but runs a script instead; that code throws a SecurityException, so I cannot use it.

    Read the article

  • String Field Sizes for unicode database fields using different data access components

    - by Serg
    mjustin, in his question 1 and question 2, says that the TWideStringField.Size property for UTF8 fields in Delphi 2009 dbExpress is 4 times larger than the logical field size (the maximum number of characters in the field). I'm inclined to consider this a dbExpress bug. This is what the Delphi 2009 Help says:

      The interpretation of Size depends on the data type. The meaning of Size for data types that use it is given in the following table. For all other data types, Size is not used and its value is always 0.
      ftString - Size is the maximum number of characters in the string.

    I am using FibPlus 6.9.9 and it follows the above documentation: the string field size is the maximum number of characters, not bytes. So the question also implies a follow-up: are dbExpress drivers in Delphi 2009 unusable for Unicode databases?

    Read the article

  • OpenAL causing leaks in my iPhone game. Help appreciated

    - by AptoTech
    Hi, I am integrating OpenAL in my iPhone game from code I found in this post, but the compiler gave me an error on this line of code:

      unsigned char *outData = malloc(fileSize);

    so I changed it to this:

      unsigned char *outData = (unsigned char *) malloc(fileSize);

    This got rid of the compiler errors, but seems to have thrown up two leaks:

      Malloc 32 Bytes 0x505cb40 AudioToolbox SimAggregateDevice::CreateAggregateDevice(__CFString const*, __CFString const*, unsigned long&)
      NSCFDictionary 0x505be30 64 AudioToolbox SimAggregateDevice::CreateAggregateDevice(__CFString const*, __CFString const*, unsigned long&)

    Is this due to me changing the unsigned char line? I would be very grateful if someone could help me remove these leaks.

    Read the article

  • Make compiler copy characters using movsd

    - by Suma
    I would like to copy a relatively short sequence of memory (less than 1 KB, typically 2-200 bytes) in a time-critical function. The best code for this on the CPU side seems to be rep movsd. However, I somehow cannot make my compiler generate this code. I hoped (and I vaguely remember seeing so) that using memcpy would do this via a compiler built-in intrinsic, but based on disassembly and debugging it seems the compiler is using a call to the memcpy/memmove library implementation instead. I also hoped the compiler might be smart enough to recognize the following loop and use rep movsd on its own, but it seems it does not.

      char *dst;
      const char *src;
      // ...
      for (int r = size; --r >= 0; )
          *dst++ = *src++;

    Is there some way to make the Visual Studio compiler generate a rep movsd sequence other than using inline assembly?

    Read the article

  • C# int to byte[]

    - by Petoj
    If I need to convert an int to byte[] I could use BitConverter.GetBytes(). But what if I should follow this:

      An XDR signed integer is a 32-bit datum that encodes an integer in the range [-2147483648,2147483647]. The integer is represented in two's complement notation. The most and least significant bytes are 0 and 3, respectively. Integers are declared as follows:

    Taken from RFC 1014, section 3.2. What method should I use if there is no built-in method to do this? How would it look if you wrote your own? I don't understand the text 100%, so I can't implement it on my own.
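
    The quoted text just says the integer is written big-endian (network byte order): the most significant byte first, in two's complement. As a language-neutral illustration of that byte layout (shown in Java here, since the layout rather than any particular API is the point):

      import java.nio.ByteBuffer;
      import java.nio.ByteOrder;

      public class XdrInt {
          // Encode an int as the 4 big-endian bytes XDR expects (most significant byte first).
          static byte[] toXdr(int value) {
              return ByteBuffer.allocate(4).order(ByteOrder.BIG_ENDIAN).putInt(value).array();
          }

          public static void main(String[] args) {
              byte[] b = toXdr(1);
              // Prints 0 0 0 1: byte 0 is the most significant, byte 3 the least significant.
              System.out.printf("%d %d %d %d%n", b[0], b[1], b[2], b[3]);
          }
      }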

    Read the article

  • Perl TDS character sets

    - by skiphoppy
    I'm using the FreeTDS driver with DBD::Sybase, connecting to an MS SQL Server. When I query certain values of certain records, I get this error:

      DBD::Sybase::st fetchrow_arrayref failed: OpenClient message: LAYER = (0) ORIGIN = (0) SEVERITY = (9) NUMBER = (99)
      Server , database
      Message String: WARNING! Some character(s) could not be converted into client's character set. Unconverted bytes were changed to question marks ('?').

    This seems to happen for records that contain special Windows character-set characters, such as curly quotes, copied and pasted from people's Outlook and Word messages. Unfortunately, I do not have any control over this database; sanitizing the input on the way in is obviously the way to go, but is not available to me. What FreeTDS settings do I need to change to be able to successfully query these records?

    Read the article

  • Type casting needed for byte = byte - byte?

    - by Vaccano
    I have the following code:

      foreach (byte b in bytes)
      {
          byte inv = byte.MaxValue - b;
          // Add the new value to a list....
      }

    When I do this I get the following error:

      Cannot implicitly convert type 'int' to 'byte'. An explicit conversion exists (are you missing a cast?)

    Each part of this statement is a byte. Why does C# want to convert byte.MaxValue - b to an int? Shouldn't you be able to do this somehow without casting? (i.e. I don't want to have to do this: byte inv = (byte)(byte.MaxValue - b);)
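
    This is not specific to C#: the arithmetic operators are not defined for operands narrower than int, so both bytes are promoted and the subtraction produces an int, which then has to be cast back down. Java has the same rule, which gives a compact illustration of the behavior in question:

      public class BytePromotion {
          public static void main(String[] args) {
              byte b = 10;
              // byte inv = Byte.MAX_VALUE - b;        // does not compile: the subtraction yields an int
              byte inv = (byte) (Byte.MAX_VALUE - b);  // arithmetic happens in int, then is narrowed back
              System.out.println(inv);                 // 117
          }
      }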

    Read the article

  • OOP Design Question - Where/When do you Validate properties?

    - by JW
    I have read a few books on OOP (DDD, PoEAA, Gang of Four) and none of them seem to cover the topic of validation; it always seems to be assumed that data is valid. I gather from the answers to this post (http://stackoverflow.com/questions/1651964/oop-design-question-validating-properties) that a client should only attempt to set a valid property value on a domain object. This person has asked a similar question that remains unanswered: http://bytes.com/topic/php/answers/789086-php-oop-setters-getters-data-validation#post3136182. So how do you ensure it is valid? Do you have a 'validator method' alongside every getter and setter?

      isValidName()
      setName()
      getName()

    I seem to be missing some key basic knowledge about OOP data validation - can you point me to a book that covers this topic in detail? That is, covering different types of validation, invariants, handling feedback, whether to use exceptions or not, etc.
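
    One common approach (a sketch of a single option, not a survey of what the books recommend) is to validate inside the setter itself, so the object can never hold an invalid value and callers get immediate feedback through an exception:

      public class Customer {
          private String name;

          public void setName(String name) {
              // Validate at the point of mutation, so no instance can ever carry an invalid name.
              if (name == null || name.trim().isEmpty()) {
                  throw new IllegalArgumentException("name must be a non-empty string");
              }
              this.name = name;
          }

          public String getName() {
              return name;
          }
      }

    A separate isValidName() helper is still useful when the caller wants to pre-check user input without triggering the exception.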

    Read the article

  • problem using base64 encoder and InputStreamReader

    - by karoberts
    I have some CLOB columns in a database that I need to put Base64-encoded binary files into. These files can be large, so I need to stream them; I can't read the whole thing in at once. I'm using org.apache.commons.codec.binary.Base64InputStream to do the encoding, and I'm running into a problem. My code is essentially this:

      FileInputStream fis = new FileInputStream(file);
      Base64InputStream b64is = new Base64InputStream(fis, true, -1, null);
      InputStreamReader reader = new InputStreamReader(b64is);
      preparedStatement.setCharacterStream(1, reader);

    When I run the above code, I get one of these during the execution of the update:

      java.io.IOException: Underlying input stream returned zero bytes

    It is thrown deep in the InputStreamReader code. Why would this not work? It seems to me like the reader would attempt to read from the Base64 stream, which would read from the file stream, and everything should be happy.
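
    One way to sidestep the Reader entirely is to hand the encoded stream straight to the driver: Base64 output is pure ASCII, so setAsciiStream is a natural fit. A sketch of that idea, assuming the driver in use supports the length-less JDBC 4.0 overload:

      import java.io.FileInputStream;
      import java.io.InputStream;
      import java.sql.PreparedStatement;

      import org.apache.commons.codec.binary.Base64InputStream;

      public class ClobWriter {
          // Bind a Base64-encoded file to a CLOB parameter without going through a Reader.
          static void bindEncodedFile(PreparedStatement ps, String path) throws Exception {
              InputStream encoded = new Base64InputStream(new FileInputStream(path), true, -1, null);
              ps.setAsciiStream(1, encoded); // Base64 text is plain ASCII
          }
      }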

    Read the article

  • fastest (low latency) method for Inter Process Communication between Java and C/C++

    - by Bastien
    Hello, I have a Java app connecting through a TCP socket to a "server" developed in C/C++. Both the app and the server run on the same machine, a Solaris box (but we're considering migrating to Linux eventually). The type of data exchanged is simple messages (login, login ACK, then the client asks for something, the server replies); each message is around 300 bytes long. Currently we're using sockets, and all is OK, however I'm looking for a faster way to exchange data (lower latency) using IPC methods. I've been researching the net and came up with references to the following technologies: shared memory, pipes, and queues. But I couldn't find a proper analysis of their respective performance, nor how to implement them in both Java and C/C++ (so that they can talk to each other), except maybe pipes, which I could imagine how to do. Can anyone comment on the performance and feasibility of each method in this context? Any pointer/link to useful implementation information? Thanks for your help
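
    For the shared-memory option, the usual Java-side building block is a memory-mapped file, which the C/C++ process can open with mmap() on the same path. A minimal sketch of the Java writer; the file name, region size, and framing are illustrative, and a real exchange still needs a synchronization handshake (a flag word, semaphore, or busy-wait on a sequence number):

      import java.io.RandomAccessFile;
      import java.nio.MappedByteBuffer;
      import java.nio.channels.FileChannel;

      public class SharedMemoryWriter {
          public static void main(String[] args) throws Exception {
              try (RandomAccessFile file = new RandomAccessFile("/tmp/ipc-demo", "rw");
                   FileChannel channel = file.getChannel()) {
                  // Map a 4 KB region; the native peer maps the same file with mmap().
                  MappedByteBuffer buf = channel.map(FileChannel.MapMode.READ_WRITE, 0, 4096);
                  byte[] message = "login".getBytes();
                  buf.putInt(message.length);   // simple length-prefixed framing
                  buf.put(message);
              }
          }
      }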

    Read the article

  • exchanging 2 memory positions

    - by Jordi
    I am working with OpenCV and Qt. OpenCV uses BGR while Qt uses RGB, so I have to swap those 2 bytes for very big images. Is there a better way of doing the following? I can not think of anything faster, but it looks so simple and lame...

      int width = iplImage->width;
      int height = iplImage->height;
      uchar *iplImagePtr = (uchar *) iplImage->imageData;
      uchar buf;
      int limit = height * width;
      for (int y = 0; y < limit; ++y) {
          buf = iplImagePtr[2];
          iplImagePtr[2] = iplImagePtr[0];
          iplImagePtr[0] = buf;
          iplImagePtr += 3;
      }
      QImage img((uchar *) iplImage->imageData, width, height, QImage::Format_RGB888);

    Read the article

  • How to print a local XML file through NSData?

    - by senthilmuthu
    Hi, I want to take some XML data from a local path and parse it, but with the following code NSLog prints content that differs from the XML file. How can I get the exact XML data, to check whether it contains the correct XML or not? Any help please? When I parse it, it returns nothing. I have saved the file as .xml and copied it to the local resource folder.

      NSString *xmlFilePath = [[[NSBundle mainBundle] resourcePath] stringByAppendingPathComponent:@"samp.xml"];
      NSString *xmlFileContents = [NSString stringWithContentsOfFile:xmlFilePath];
      NSData *data = [NSData dataWithBytes:[xmlFileContents UTF8String] length:[xmlFileContents lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];
      NSString *content = [[NSString alloc] initWithBytes:[data bytes] length:[data length] encoding:NSUTF8StringEncoding];
      NSLog(@"%@", content);

    Read the article

  • binary file to string

    - by andrew
    I'm trying to read a binary file (for example an executable) into a string, then write it back:

      FileStream fs = new FileStream("C:\\tvin.exe", FileMode.Open);
      BinaryReader br = new BinaryReader(fs);
      byte[] bin = br.ReadBytes(Convert.ToInt32(fs.Length));
      System.Text.Encoding enc = System.Text.Encoding.ASCII;
      string myString = enc.GetString(bin);
      fs.Close();
      br.Close();

      System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
      byte[] rebin = encoding.GetBytes(myString);
      FileStream fs2 = new FileStream("C:\\tvout.exe", FileMode.Create);
      BinaryWriter bw = new BinaryWriter(fs2);
      bw.Write(rebin);
      fs2.Close();
      bw.Close();

    This does not work (the result has exactly the same size in bytes but can't run). If I do bw.Write(bin) the result is OK, but I must save it to a string.

    Read the article

  • FileInputStream negative skip

    - by Peter Štibraný
    I'm trying to find out more about the history of the FileInputStream.skip(negative) operation. According to the InputStream documentation: "If n is negative, no bytes are skipped." It seems that the implementation of FileInputStream from Sun used to throw an IOException instead, which is now also documented in the Javadoc: "If n is negative, an IOException is thrown, even though the skip method of the InputStream superclass does nothing in this case." I just tried that, and found that FileInputStream.skip(-10) did in fact return -10! It didn't throw an exception, it didn't even return 0; it returned -10. (I've tried with Java 1.5.0_22 from Sun, and Java 1.6.0_18 from Sun.) Is this a known bug? Why hasn't it been fixed, or why is the documentation kept the way it is? Can someone point me to some discussion about this issue? I can't find anything.
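
    A quick way to reproduce the observation on whatever JDK is at hand (the return value of the negative skip is exactly the part that has varied between versions and is left loosely specified):

      import java.io.File;
      import java.io.FileInputStream;
      import java.io.FileOutputStream;

      public class NegativeSkip {
          public static void main(String[] args) throws Exception {
              File f = File.createTempFile("skip", ".bin");
              try (FileOutputStream out = new FileOutputStream(f)) {
                  out.write(new byte[100]);          // 100 bytes of data to move around in
              }
              try (FileInputStream in = new FileInputStream(f)) {
                  in.skip(50);                       // move forward first so a backward skip is possible
                  long skipped = in.skip(-10);       // the behavior under discussion
                  System.out.println("skip(-10) returned " + skipped);
              }
              f.delete();
          }
      }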

    Read the article

  • Flash CS5 font is largest part of the SWF

    - by dev.e.loper
    I'm transferring a project from CS4 to CS5 and (without any changes) my SWF file gets to be 10 times bigger. It was 7 KB and now it's 77 KB. I generated a size report and it looks like the font is taking up most of the space. I haven't changed settings. I'm not sure why the font is taking up so much space. Is there a way around this? Here is my size report:

      Font Name           Bytes    Characters
      -----------------   ------   ----------
      _sans                   12
      MilkyWell              317    .blsu
      Calibri-Bold Bold    75960    %.0123456789

    As you can see, Calibri-Bold is taking up 75 KB and I only have 12 characters in it.

    Read the article
