Search Results

Search found 3768 results on 151 pages for 'lite byte'.

  • I get an error when trying to write a response stream to a file

    - by MemphisDeveloper
    I am trying to test a REST web service, but when I do a POST and try to retrieve and save the response stream to a file, I get an exception saying "Stream was not readable." What am I doing wrong?

        Public Sub PostAndRead()
            Dim flReader As FileStream = New FileStream("~\testRequest.xml", FileMode.Open, FileAccess.Read)
            Dim flWriter As FileStream = New FileStream("~\testResponse.xml", FileMode.Create, FileAccess.Write)
            Dim address As Uri = New Uri(restAddress)
            Dim req As HttpWebRequest = DirectCast(WebRequest.Create(address), HttpWebRequest)
            req.Method = "POST"
            req.ContentLength = flReader.Length
            req.AllowWriteStreamBuffering = True
            Dim reqStream As Stream = req.GetRequestStream()

            ' Get data from upload file to inData
            Dim inData(flReader.Length) As Byte
            flReader.Read(inData, 0, flReader.Length)

            ' Put data into request stream
            reqStream.Write(inData, 0, flReader.Length)
            flReader.Close()
            reqStream.Close()

            ' Post Response
            req.GetResponse()

            ' Save results in a file
            Copy(req.GetRequestStream(), flWriter)
        End Sub
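
    The "Stream was not readable" exception comes from the last line: req.GetRequestStream() returns a write-only stream, so it cannot be copied to a file. The readable stream is the one returned by GetResponse().GetResponseStream(). A minimal sketch of the intended flow, in C# (C# is used for every sketch on this page; the VB.NET translation is direct), with a placeholder address and paths:

        using System;
        using System.IO;
        using System.Net;

        // Post testRequest.xml and save the reply to testResponse.xml.
        // The service address and file paths are placeholders, not from the post.
        HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://example.com/service");
        req.Method = "POST";

        using (FileStream reader = File.OpenRead("testRequest.xml"))
        using (Stream reqStream = req.GetRequestStream())
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = reader.Read(buffer, 0, buffer.Length)) > 0)
                reqStream.Write(buffer, 0, read);
        }

        // The readable stream is the *response* stream, not the request stream.
        using (WebResponse resp = req.GetResponse())
        using (Stream respStream = resp.GetResponseStream())
        using (FileStream writer = File.Create("testResponse.xml"))
        {
            byte[] buffer = new byte[8192];
            int read;
            while ((read = respStream.Read(buffer, 0, buffer.Length)) > 0)
                writer.Write(buffer, 0, read);
        }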

  • memory usage in C# (.NET) app is very high until I call System.GC.Collect()

    - by Chris Gray
    I've written an app that spins up a few threads, each of which reads several MB of memory. Each thread then connects to the Internet and uploads the data. This occurs thousands of times, and each upload takes some time. I'm seeing a situation (verified with windbg/sos and !dumpheap) where the Byte[] instances are not getting collected automatically, causing 100-150 MB of memory to be reported in Task Manager. If I call System.GC.Collect() I see a huge drop in memory, a drop of over 100 MB. I don't like calling System.GC.Collect(), and my PC has tons of free memory; however, if anyone looks at Task Manager they're going to be concerned, thinking my app is leaking horribly. Tips?
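
    Byte arrays over roughly 85 KB land on the Large Object Heap, which is only swept during full gen-2 collections, so a working set that stays high until an explicit GC.Collect() usually indicates lazy collection rather than a leak. One common mitigation is to allocate each worker's buffer once and reuse it across uploads; a sketch with hypothetical helpers (ReadChunkInto and Upload are illustrative, not from the post):

        // One buffer per worker thread, allocated once and reused for every
        // upload, so the LOH doesn't fill up with short-lived multi-MB arrays.
        byte[] buffer = new byte[4 * 1024 * 1024];

        while (moreWorkToDo)
        {
            int length = ReadChunkInto(buffer); // hypothetical: fills the buffer, returns bytes read
            Upload(buffer, length);             // hypothetical: uploads the first 'length' bytes
        }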

  • Grouping a generic list via LINQ in VB.NET

    - by CD Smith
    I need to take a collection and group it via LINQ, but all the examples I've seen fail in some manner or other with some syntax difference that I can't quite lick. My collection:

        Dim a As New List(Of ProcessAlert)
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Alert", 2))
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Document", 2))
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Note", 2))
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Alert", 1))
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Document", 1))
        a.Add(New ProcessAlert("0000112367", "[email protected]", "Note", 1))
        Return a

    I need to turn this collection into something simple that gives this final outcome:

        "[email protected]", "Alert, Document, Note"
        "[email protected]", "Alert, Document, Note"

    Here's the definition of the ProcessAlert class:

        Public Class ProcessAlert
            Public LoanNumber As String
            Public EmailAddress As String
            Public AlertType As String
            Public AlertMethodID As Byte
        End Class

    Thanks in advance, CD
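
    For reference, a sketch of the grouping in C# method syntax (the VB.NET translation is mechanical); Distinct() collapses the rows that differ only by AlertMethodID so each alert type appears once, matching the desired output:

        using System;
        using System.Linq;

        // Given a List<ProcessAlert> named alerts, as built above:
        var alertsByEmail = alerts
            .GroupBy(p => p.EmailAddress)
            .Select(g => new
            {
                EmailAddress = g.Key,
                // Distinct() removes the duplicates that differ only by AlertMethodID.
                AlertTypes = string.Join(", ", g.Select(p => p.AlertType).Distinct().ToArray())
            });

        foreach (var entry in alertsByEmail)
            Console.WriteLine("\"{0}\", \"{1}\"", entry.EmailAddress, entry.AlertTypes);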

  • How do you use Java 1.6 Annotation Processing to perform compile time weaving?

    - by Steve
    I have created an annotation, applied it to a DTO and written a Java 1.6 style annotation processor. I can see how to have the annotation processor write a new source file, which isn't what I want to do; I cannot see or find out how to have it modify the existing class (ideally just modify the byte code). The modification is actually fairly trivial: all I want the processor to do is to insert a new getter and setter, where the name comes from the value of the annotation being processed. My annotation processor looks like this:

        @SupportedSourceVersion(SourceVersion.RELEASE_6)
        @SupportedAnnotationTypes({ "com.kn.salog.annotation.AggregateField" })
        public class SalogDTOAnnotationProcessor extends AbstractProcessor {
            @Override
            public boolean process(final Set<? extends TypeElement> annotations, final RoundEnvironment roundEnv) {
                // do some stuff
            }
        }

  • What is the best way to interoperably serialize a message?

    - by iwein
    I'm considering message serialization support for spring-integration. This would be useful for various wire-level transports to implement guaranteed delivery, but also to allow interoperability with other messaging systems (e.g. through AMQP). The fundamental problem that arises is that a message containing Java objects in its payload and headers should be converted to a byte[] and/or written to a stream. Java's own serialization is clearly not going to cut it because it is not interoperable. My preference would be to create an interface that allows the user to implement the needed logic for all Objects that take part in serialization. Is this a sensible idea, and what would the interface look like? Is there a standard interoperable way to serialize Objects that would make sense in this context?

  • [Java] Implement an RSA algorithm

    - by Robin Monjo
    Hello everyone. I want to implement an RSA algorithm to encrypt an image (byte[]). To generate my two keys I used this piece of code:

        KeyPairGenerator keygen = KeyPairGenerator.getInstance("RSA");
        keygen.initialize(512);
        keyPair = keygen.generateKeyPair();

    Once the public and private keys are generated, I would like to show them to the user so he can distribute the public key and use the private key to decode. How can I get those keys back? Using keygen.getPrivateKey() and keygen.getPublicKey() gives me all the information of the RSA algorithm, not only the keys I need. Thanks

  • format specifier for short integer

    - by cateof
    I'm not using the format specifiers in C correctly. A few lines of code:

        int main() {
            char dest[] = "stack";
            unsigned short val = 500;
            char c = 'a';
            char* final = (char*) malloc(strlen(dest) + 6);
            snprintf(final, strlen(dest) + 6, "%c%c%hd%c%c%s", c, c, val, c, c, dest);
            printf("%s\n", final);
            return 0;
        }

    I want my executable to print aa500aastack and not aa500aasta. Why am I losing 2 bytes? What is the correct format specifier for an unsigned short integer? Thanks.

  • Any difference in compiler behavior for each of these snippets?

    - by HotHead
    Please consider the following code:

        // 1.
        uint16 a = 0x0001;
        if (a < 0x0002) {
            // do something
        }

        // 2.
        uint16 a = 0x0001;
        if (a < uint16(0x0002)) {
            // do something
        }

        // 3.
        uint16 a = 0x0001;
        if (a < static_cast<uint16>(0x0002)) {
            // do something
        }

        // 4.
        uint16 a = 0x0001;
        uint16 b = 0x0002;
        if (a < b) {
            // do something
        }

    What does the compiler do in the background, and what is the best (and correct) way to do the above testing? P.S. Sorry, but I couldn't find a better title :) EDIT: the values 0x0001 and 0x0002 are only an example; there could be any 2-byte value instead. Thank you in advance!

  • Login method customization using GINA [closed]

    - by netseng
    DUPLICATE: http://stackoverflow.com/questions/523912/login-method-customization-using-gina

    Hi all. I know it's not easy to find a master of GINA, but my question is really about interprocess communication (IPC). I wrote my custom GINA in unmanaged C++ and included in it a method that checks the validity of a fingerprint for the user trying to log in. This function will call a method in a running Windows service written in C#. The code follows. In the GINA (unmanaged C++):

        if (Fingerprint.Validate(userName, finerprintTemplate)) {
            // perform login
        }

    In the Windows service (C#):

        public class Fingerprint {
            public static bool Validate(string userName, byte[] finerprintTemplate) {
                // Perform some code to validate fingerprintTemplate against userName
                // and return the result
            }
        }

    Does anyone know how to do such communication between the GINA and the Windows service, or simply between a C++-written service and a C#-written service? Thanks
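
    One workable channel for this native-to-managed IPC is a named pipe: the unmanaged GINA opens the pipe with CreateFile/WriteFile/ReadFile, and the managed service listens. A minimal sketch of the service side, assuming .NET 3.5's System.IO.Pipes is available; the pipe name and wire format here are illustrative, not from the post:

        using System;
        using System.IO;
        using System.IO.Pipes;
        using System.Text;

        // Service-side loop; the GINA connects to \\.\pipe\FingerprintPipe.
        using (var server = new NamedPipeServerStream("FingerprintPipe"))
        {
            server.WaitForConnection();
            var reader = new BinaryReader(server);

            int nameLength = reader.ReadInt32();                // int32 length, then UTF-8 name bytes
            string userName = Encoding.UTF8.GetString(reader.ReadBytes(nameLength));
            int templateLength = reader.ReadInt32();            // int32 length, then template bytes
            byte[] template = reader.ReadBytes(templateLength);

            bool ok = Fingerprint.Validate(userName, template); // the service method from the post
            server.WriteByte(ok ? (byte)1 : (byte)0);           // single-byte verdict back to the GINA
        }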

  • Is there any "standard" htonl-like function for 64 bits integers in C++ ?

    - by ereOn
    Hi, I'm working on an implementation of the memcache protocol which, at some points, uses 64-bit integer values. These values must be stored in network byte order. I wish there were some uint64_t htonll(uint64_t value) function to do the conversion, but unfortunately, if it exists, I couldn't find it. So I have 1 or 2 questions: Is there any portable (Windows, Linux, AIX) standard function to do this? If there is no such function, how would you implement it? I have in mind a basic implementation, but I don't know how to check the endianness at compile time to make the code portable. So your help is more than welcome here ;) Thank you.
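
    For what it's worth, the swap itself is mechanical once you know the host is little-endian. Below is a sketch in C# (kept in C# for consistency with the other examples here; a C++ version is a line-for-line translation, and .NET exposes the same operation as IPAddress.HostToNetworkOrder(long)):

        using System;

        static class ByteOrder
        {
            // Swap a 64-bit value into network (big-endian) order when the
            // host is little-endian; pass it through unchanged otherwise.
            public static ulong HostToNetwork(ulong value)
            {
                if (!BitConverter.IsLittleEndian)
                    return value; // big-endian host: already network order

                return ((value & 0x00000000000000FFUL) << 56)
                     | ((value & 0x000000000000FF00UL) << 40)
                     | ((value & 0x0000000000FF0000UL) << 24)
                     | ((value & 0x00000000FF000000UL) << 8)
                     | ((value & 0x000000FF00000000UL) >> 8)
                     | ((value & 0x0000FF0000000000UL) >> 24)
                     | ((value & 0x00FF000000000000UL) >> 40)
                     | ((value & 0xFF00000000000000UL) >> 56);
            }
        }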

  • Worst side effects from char signedness (explanation of signedness effects on chars and casts)

    - by JustSmith
    I frequently work with libraries that use char when working with bytes in C++. The alternative is to define a "Byte" as unsigned char, but that's not the standard they decided to use. I frequently pass bytes from C# into the C++ DLLs and cast them to char to work with the library. When casting ints to chars, or chars to other simple types, what are some of the side effects that can occur? Specifically, when has this broken code that you have worked on, and how did you find out it was because of char signedness? Luckily I haven't run into this in my code; I used a signed-char casting trick back in an embedded systems class in school. I'm looking to better understand the issue, since I feel it is relevant to the work I am doing.

  • C# socket blocking behavior

    - by Gearoid Murphy
    My situation is this: I have a C# TCP socket through which I receive structured messages consisting of a 3-byte header and a variable-size payload. The TCP data is routed through a network of tunnels and is occasionally susceptible to fragmentation. The solution to this is to perform a blocking read of 3 bytes for the header and a blocking read of N bytes for the variable-size payload (the value of N is in the header). The problem I'm experiencing is that occasionally the blocking receive operation returns a partial packet. That is, it reads a volume of bytes less than the number I explicitly set in the receive call. After some debugging, it appears that the number of bytes it returns is equal to the number of bytes in the Available property of the socket before the receive op. This behavior is contrary to my expectation. If the socket is blocking and I explicitly set the number of bytes to receive, shouldn't the socket block until it receives those bytes? Any help, pointers, etc. would be much appreciated.
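
    This is documented behavior rather than a bug: a blocking Receive waits only until some data is available, and the size argument is an upper bound, not a demand. The usual fix is a loop that keeps reading until the expected count has arrived; a sketch (ReceiveExactly is a hypothetical helper, not from the post):

        using System.Net.Sockets;

        static class SocketIO
        {
            // Read exactly 'count' bytes into 'buffer', looping over partial reads.
            public static void ReceiveExactly(Socket socket, byte[] buffer, int count)
            {
                int offset = 0;
                while (offset < count)
                {
                    int read = socket.Receive(buffer, offset, count - offset, SocketFlags.None);
                    if (read == 0)
                        throw new SocketException((int)SocketError.ConnectionReset); // peer closed early
                    offset += read;
                }
            }
        }

    With that helper, the framing becomes: receive exactly 3 header bytes, parse N, then receive exactly N payload bytes.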

  • Problems calling web services through an HTTPS connection

    - by shivaji123
    I have made a BlackBerry application that takes a username & password along with a URL link, which is the link to a server; here I am calling some web services, but the connection is over HTTPS. So when I enter the username, password & URL link & hit the login button, it basically calls a web service, but then the application keeps connecting to the web service forever, & after some time I get an error message, something like "unreported exception: the application is not responding", & then the application crashes out. Also, I am using the kSOAP client library. This is the piece of code:

        synchronized (this) {
            try {
                _httpconn = (HttpConnection) Connector.open(url, Connector.READ_WRITE);
                _httpconn.setRequestMethod(HttpConnection.POST);
                _httpconn.setRequestProperty("SOAPAction", Constants.EXIST_STR);
                _httpconn.setRequestProperty("Content-Type", "text/soap+xml");
                _httpconn.setRequestProperty("User-Agent", "kSOAP/1.0");
                String clen = Integer.toBinaryString(input.length());
                _httpconn.setRequestProperty("Content-Length", clen);

                _out = _httpconn.openDataOutputStream();
                _out.write(input.getBytes());
                _out.flush(); // may or may not be needed.

                int rc = _httpconn.getResponseCode();
                if (rc == HttpConnection.HTTP_OK) {
                    isComplete = true;
                    _in = _httpconn.openInputStream();
                    msg = new StringBuffer();
                    byte[] data = new byte[1024];
                    int len = 0;
                    int size = 0;
                    while (-1 != (len = _in.read(data))) {
                        msg.append(new String(data, 0, len));
                        size += len;
                    }
                    responsData = msg.toString();
                    System.out.println("----------- responsData " + responsData);
                }
                if (responsData != null)
                    isSuccessful = true;
                stop();
            } catch (InterruptedIOException interrIO) {
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Network connection hasn't succeeded. Please try again later.");
                    }
                });
                isComplete = true;
                System.out.println(interrIO);
                stop();
            } catch (IOException interrIO) {
                System.out.println("----------- IO EXCEPTION --------- " + interrIO);
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Network connection hasn't succeeded. Please try again later.");
                    }
                });
                isComplete = true;
                System.out.println(interrIO);
                stop();
            } catch (Exception e) {
                System.out.println(e);
                UiApplication.getUiApplication().invokeLater(new Runnable() {
                    public void run() {
                        Status.show("Unable to connect to the internet at this time. Please try again later.");
                    }
                });
                isComplete = true;
                stop();
            } finally {
                try {
                    if (_httpconn != null) { _httpconn.close(); _httpconn = null; }
                    if (_in != null) { _in.close(); _in = null; }
                    if (_out != null) { _out.close(); _out = null; }
                } catch (Exception e) {
                    System.out.println(e);
                    UiApplication.getUiApplication().invokeLater(new Runnable() {
                        public void run() {
                            Status.show("Unable to connect to the internet at this time. Please try again later.");
                        }
                    });
                }
            }
        }

    Can anybody help me out? Thanks in advance.

  • SHA-256 hashing gives wrong result in Android

    - by user642966
    I am trying to hash 12345 using 1111 as the salt with SHA-256, and the answer I get is 010def5ed854d162aa19309479f3ca44dc7563232ff072d1c87bd85943d0e930, which is not the same as the value returned by this site: http://hash.online-convert.com/sha256-generator. Here's the code snippet:

        public String getHashValue(String entity, String salt) {
            byte[] hashValue = null;
            try {
                MessageDigest digest = MessageDigest.getInstance("SHA-256");
                digest.update(entity.getBytes("UTF-8"));
                digest.update(salt.getBytes("UTF-8"));
                hashValue = digest.digest();
            } catch (NoSuchAlgorithmException e) {
                Log.i(TAG, "Exception " + e.getMessage());
            } catch (UnsupportedEncodingException e) {
                // TODO Auto-generated catch block
                e.printStackTrace();
            }
            return BasicUtil.byteArrayToHexString(hashValue);
        }

    I have verified my printing method with a sample from SO and the result is fine. Can someone tell me what's wrong here? And just to clarify: when I hash the same value & salt in iOS code, the returned value is the same as the value given by the converting site.
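
    Note that successive digest.update() calls hash the plain concatenation of their inputs, so the Java code above computes SHA-256("123451111"), while an online generator hashes exactly the single string typed into it. One way to compare like with like is to hash the explicit concatenation and print the hex; a quick C# probe (C# for consistency with the other sketches here):

        using System;
        using System.Security.Cryptography;
        using System.Text;

        class Program
        {
            static void Main()
            {
                using (var sha = SHA256.Create())
                {
                    // Equivalent of digest.update("12345"); digest.update("1111");
                    byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes("12345" + "1111"));
                    Console.WriteLine(BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant());
                }
            }
        }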

  • How does Hibernate detect dirty state of an entity object?

    - by ???'Lenik
    Is it using some kind of byte-code modification of the original classes? Or does Hibernate get the dirty state by comparing the given object with a previously persisted version? I'm having a problem with the hashCode() and equals() methods for complicated objects. I feel it would be very slow to compute a hash code if the object has collection members, and cyclic references are also a problem. If Hibernate doesn't use hashCode()/equals() to check the dirty state, I guess I should not use equals()/hashCode() for the entity object (as opposed to a value object), but I'm also afraid that the identity operator (==) is not enough. So, the questions are: How does Hibernate know if a property of an object has changed? Do you suggest overriding the hashCode()/equals() methods for complicated objects? What if they contain cyclic references? And also, would hashCode()/equals() based only on the id field be enough?

  • Marshal managed string[] to unmanaged char**

    - by Vince
    This is my C++ struct (multi-byte character set):

        typedef struct hookCONFIG {
            int threadId;
            HWND destination;
            const char** gameApps;
            const char** profilePaths;
        } HOOKCONFIG;

    And the .NET struct:

        [StructLayout(LayoutKind.Sequential, CharSet = CharSet.Auto)]
        public struct HOOKCONFIG {
            public int threadId;
            public IntPtr destination;

            // MarshalAs?
            public string[] gameApps;

            // MarshalAs?
            public string[] profilePaths;
        }

    My problem is: how do I marshal the string arrays? When I access the struct member "profilePaths" in C++ I get an error like this:

        An unhandled exception of type 'System.AccessViolationException' occurred in App.exe
        Additional information: Attempted to read or write protected memory. This is often an indication that other memory is corrupt.

        MessageBox(0, cfg.profilePaths[0], "Title", MB_OK); // error

    ... Orz
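
    The interop marshaler cannot project a managed string[] as a variable-length char** field inside a struct, so one workaround is to declare both fields as IntPtr and build the native array by hand. A hedged sketch (the helper name is illustrative):

        using System;
        using System.Runtime.InteropServices;

        static class NativeStrings
        {
            // Build an unmanaged ANSI char** from a managed string[].
            // The caller must later free each string pointer and the array
            // itself with Marshal.FreeHGlobal.
            public static IntPtr ToCharPP(string[] values)
            {
                IntPtr array = Marshal.AllocHGlobal(IntPtr.Size * values.Length);
                for (int i = 0; i < values.Length; i++)
                {
                    IntPtr str = Marshal.StringToHGlobalAnsi(values[i]);
                    Marshal.WriteIntPtr(array, i * IntPtr.Size, str);
                }
                return array;
            }
        }

    The struct then declares gameApps and profilePaths as IntPtr, which the C++ side sees as an ordinary char**.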

  • How to calculate the correct image size in an output PDF using iTextSharp?

    - by MK
    I'm trying to add an image to a PDF using iTextSharp; regardless of the image size, it always appears at a larger size inside the PDF. The image I add is 624x500 pixels (DPI: 72). And here is how I created the document:

        Document document = new Document();
        System.IO.MemoryStream stream = new MemoryStream();
        PdfWriter writer = PdfWriter.GetInstance(document, stream);
        document.Open();
        System.Drawing.Image pngImage = System.Drawing.Image.FromFile("test.png");
        Image pdfImage = Image.GetInstance(pngImage, System.Drawing.Imaging.ImageFormat.Png);
        document.Add(pdfImage);
        document.Close();
        byte[] buffer = stream.GetBuffer();
        FileStream fs = new FileStream("test.pdf", FileMode.Create);
        fs.Write(buffer, 0, buffer.Length);
        fs.Close();

    Any idea why, or how to calculate the correct size?
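
    A likely culprit is units: iText measures pages in points (72 per inch) while the bitmap is measured in pixels, so a 624 px wide image is placed as 624 pt, wider than a 595 pt A4 page. Scaling the image into the page's usable area is one way out; a hedged sketch using iTextSharp's scaling helpers:

        // iText works in points (72/inch); scale the bitmap to the page's usable area.
        Image pdfImage = Image.GetInstance(pngImage, System.Drawing.Imaging.ImageFormat.Png);
        pdfImage.ScaleToFit(document.PageSize.Width - document.LeftMargin - document.RightMargin,
                            document.PageSize.Height - document.TopMargin - document.BottomMargin);
        document.Add(pdfImage);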

  • C# web request with POST encoding question

    - by rlandster
    On the MSDN site there is an example of some C# code that shows how to make a web request with POSTed data. Here is an excerpt of that code:

        WebRequest request = WebRequest.Create("http://www.contoso.com/PostAccepter.aspx");
        request.Method = "POST";
        string postData = "This is a test that posts this string to a Web server.";
        byte[] byteArray = Encoding.UTF8.GetBytes(postData); // (*)
        request.ContentType = "application/x-www-form-urlencoded";
        request.ContentLength = byteArray.Length;
        Stream dataStream = request.GetRequestStream();
        dataStream.Write(byteArray, 0, byteArray.Length);
        dataStream.Close();
        WebResponse response = request.GetResponse();
        ...more...

    The line marked (*) is the line that puzzles me. Shouldn't the data be encoded using the UrlEncode function rather than UTF-8? Isn't that what application/x-www-form-urlencoded implies?
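
    The two steps sit at different layers, so they are not alternatives: UrlEncode escapes reserved characters within each form field, and the escaped text still has to be turned into bytes, for which UTF-8 is a reasonable choice. To honor the declared content type, each name and value should be escaped individually; a hedged sketch (the field names are made up):

        using System.Net;
        using System.Text;
        using System.Web; // HttpUtility lives here

        // Escape each value, assemble name=value pairs, then byte-encode the result.
        string postData = "title=" + HttpUtility.UrlEncode("C# & encodings")
                        + "&body=" + HttpUtility.UrlEncode("This is a test.");
        byte[] byteArray = Encoding.UTF8.GetBytes(postData);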

  • About enumerations in Delphi and C++ in 64-bit environments

    - by sum1stolemyname
    I recently had to work around the different default sizes used for enumerations in Delphi and C++, since I have to use a C++ DLL from a Delphi application. One function call returns an array of structs (or records in Delphi), the first element of which is an enum. To make this work, I use packed records (or aligned(1) structs). However, since Delphi selects the size of an enum variable dynamically by default and uses the smallest data type possible (it was a byte in my case), but C++ uses an int for enums, my data was not interpreted correctly. Delphi offers a compiler switch to work around this, so the declaration of the enum becomes:

        {$Z4}
        TTypeofLight = (
            V3d_AMBIENT,
            V3d_DIRECTIONAL,
            V3d_POSITIONAL,
            V3d_SPOT
        );
        {$Z1}

    My questions are: What will become of my structs when they are compiled on/for a 64-bit environment? Does the default C++ int grow to 8 bytes? Are there other memory alignment / data type size modifications (other than pointers)?

  • C#: how to update a WinForm from a thread?

    - by JackN
    A C# thread (Read()) causes a System.NotSupportedException when it tries to update a WinForm based on received content. The full error message is:

        Read() System.NotSupportedException: An error message cannot be displayed because an optional resource assembly containing it cannot be found
           at Microsoft.AGL.Common.MISC.HandelAr()
           at System.Windows.Forms.ProgressBar._SetInfo()
           at System.Windows.Forms.ProgressBar.set_Value()
           at ...ProcessStatus()
           at ...Read()

    The build/target environment is Microsoft.NET\SDK\CompactFramework\v2.0\WindowsCE. Is the problem writing to the ProgressBar from a thread? If so, what is the correct C#/WinForms method to update a ProgressBar from a thread? In this application the Read() thread is continuous: it is started when the application starts and runs forever.

        void ProcessStatus(byte[] status)
        {
            Status.Speed = status[5];
            var Speed = Status.Speed / GEAR_RATIO;
            Status.Speed = (int) Speed;
            progressBarSpeed.Value = Status.Speed;
            ...
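
    WinForms controls, on the Compact Framework as elsewhere, may only be touched from the thread that created them, so setting progressBarSpeed.Value from Read() is exactly the problem. The standard remedy is to marshal the assignment through Control.Invoke; a minimal sketch, assuming CF 2.0's Invoke(Delegate, object[]) overload:

        // Inside the form that owns progressBarSpeed:
        delegate void SetSpeedHandler(int speed);

        void UpdateSpeed(int speed)
        {
            if (progressBarSpeed.InvokeRequired)
            {
                // Called from the Read() thread: re-run this method on the UI thread.
                progressBarSpeed.Invoke(new SetSpeedHandler(UpdateSpeed), new object[] { speed });
            }
            else
            {
                progressBarSpeed.Value = speed;
            }
        }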

  • Create zip files with Arabic characters

    - by fatiDev
    I have the following situation: I have to modify existing files and return a zip containing the modified files; I'm in a web application context. What I've done up to now is:

        // Modify the existing file with the POI library
        FileInputStream inpoi = new FileInputStream("file_path");
        POIFSFileSystem fs = new POIFSFileSystem(inpoi);
        HWPFDocument doc = new HWPFDocument(fs);
        Range r = doc.getRange();
        r.replaceText("<nomPrenom>", "test");
        byte[] b = doc.getDataStream();

        // Create the zip file and copy the modified files into it
        ZipOutputStream out = new ZipOutputStream(new FileOutputStream("my.zip"));
        out.putNextEntry(new ZipEntry("file"));
        for (int j = 0; j < b.length; j++) {
            out.write(b[j]);
        }

    The created zipped file can't be read correctly with Word, given that the original file is written in Arabic.

  • Why is conversion from UTF-8 to ISO-8859-1 not the same in Windows and Linux?

    - by user1895307
    I have the following code to convert from UTF-8 to ISO-8859-1, packaged in a jar file; when I execute this jar in Windows I get one result, and in CentOS I get another. Might anyone know why?

        public static void main(String[] args) {
            try {
                String x = "Ä, ä, É, é, Ö, ö, Ü, ü, ß, «, »";
                Charset utf8charset = Charset.forName("UTF-8");
                Charset iso88591charset = Charset.forName("ISO-8859-1");
                ByteBuffer inputBuffer = ByteBuffer.wrap(x.getBytes());
                CharBuffer data = utf8charset.decode(inputBuffer);
                ByteBuffer outputBuffer = iso88591charset.encode(data);
                byte[] outputData = outputBuffer.array();
                String z = new String(outputData);
                System.out.println(z);
            } catch (Exception e) {
                System.out.println(e.getMessage());
            }
        }

    In Windows, java -jar test.jar test.txt creates a file containing:

        Ä, ä, É, é, Ö, ö, Ü, ü, ß, «, »

    but in CentOS I get:

        ??, ä, ??, é, ??, ö, ??, ü, ??, «, »

    Help please!

  • String encoding for Twitter

    - by Miguel Ribeiro
    I'm developing a program that sends tweets. I have this piece of code:

        StringBuilder sb = new StringBuilder("Recomendo ");
        sb.append(lblName.getText());
        sb.append(" no canal " + lblCanal.getText());
        sb.append(" no dia " + date[2] + "/" + date[1] + "/" + date[0]);
        sb.append(" às " + time[0] + "h" + time[1]);
        byte[] defaultStrBytes = sb.toString().getBytes("ISO-8859-1");
        String encodedString = new String(defaultStrBytes, "UTF-8");

    But when I send the tweet I get the "?" symbol or other strange characters because of accents like "à". I've also tried with only:

        String encodedString = new String(sb.toString().getBytes(), "UTF-8"); // also tried with ISO-8859-1

    but the problem remains...

  • Should I convert overlong UTF-8 strings to their shortest normal form?

    - by Grant McLean
    I've just been reworking my Encoding::FixLatin Perl module to handle overlong UTF-8 byte sequences and convert them to the shortest normal form. My question is quite simply: is this a bad idea? A number of sources (including this RFC) suggest that any overlong UTF-8 should be treated as an error and rejected. They caution against "naive implementations" and leave me with the impression that these things are inherently unsafe. Since the whole purpose of my module is to clean up messy data files with mixed encodings and convert them to nice clean UTF-8, this seems like just one more thing I can clean up so the application layer doesn't have to deal with it. My code does not concern itself with any semantic meaning the resulting characters might have; it simply converts them into a normalised form. Am I missing something? Is there a hidden danger I haven't considered?

  • Inserting into a bitstream

    - by evilertoaster
    I'm looking for a way to efficiently insert bits into a bitstream and have it 'overflow', padding with 0's. So for example, if you had a byte array with two bytes, 231 and 109 (11100111 01101101), and did BitInsert(byteArray, 4, 00), it would insert two bits at bit offset 4, making 11100001 11011011 01000000 (225, 219, 64). It would be OK even if the method only allowed 1-bit insertions, e.g. BitInsert(byteArray, 4, true) or BitInsert(byteArray, 4, false). I have one method of doing it, but it has to walk the stream bit by bit with a bitmask, so I'm wondering if there's a simpler approach... Answers in assembly or a C derivative would be appreciated.
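
    Instead of walking bit by bit, you can copy the unaffected leading bytes, stitch the boundary byte together with masks, and then shift each remaining byte right while carrying its low bit forward. A sketch of the single-bit variant in C# (a C derivative, as requested); it assumes MSB-first bit numbering and pos inside the input, and always grows the output by one zero-padded byte:

        using System;

        static class BitStream
        {
            // Insert a single bit at bit offset pos (MSB-first, pos < input.Length * 8),
            // shifting everything after it right by one bit.
            public static byte[] BitInsert(byte[] input, int pos, bool bit)
            {
                int byteIndex = pos / 8, bitIndex = pos % 8;
                byte[] output = new byte[input.Length + 1];

                // Bytes before the insertion point are unchanged.
                Array.Copy(input, output, byteIndex);

                // Boundary byte: kept high bits | inserted bit | shifted low bits.
                int b = input[byteIndex];
                int high = b & (0xFF << (8 - bitIndex)) & 0xFF;
                int ins = bit ? 0x80 >> bitIndex : 0;
                int low = (b >> 1) & (0xFF >> (bitIndex + 1));
                output[byteIndex] = (byte)(high | ins | low);

                // Every following byte shifts right by one, carrying the LSB along.
                int carry = b & 1;
                for (int k = byteIndex + 1; k < input.Length; k++)
                {
                    output[k] = (byte)((carry << 7) | (input[k] >> 1));
                    carry = input[k] & 1;
                }
                output[input.Length] = (byte)(carry << 7); // overflow byte, zero-padded
                return output;
            }
        }

    Applying it twice, BitInsert(BitInsert(data, 4, false), 5, false), reproduces the example above as 225, 219, 64 plus one trailing zero byte of padding.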
