Search Results

Search found 3512 results on 141 pages for 'circular buffer'.

Page 45/141 | < Previous Page | 41 42 43 44 45 46 47 48 49 50 51 52  | Next Page >

  • BN_hex2bn magically segfaults in OpenSSL

    - by xunil154
    Greetings, this is my first post on Stack Overflow, and I'm sorry if it's a bit long. I'm trying to build a handshake protocol for my own project and am having issues with the server converting the client's RSA public key to a BIGNUM. It works in my client code, but the server segfaults when attempting to convert the hex value of the client's public RSA key to a BIGNUM. I have already checked that there is no garbage before or after the RSA data, and have looked online, but I'm stuck.

    Header segment:

        typedef struct KEYS {
            RSA  *serv;
            char *serv_pub;
            int   pub_size;
            RSA  *clnt;
        } KEYS;

        KEYS keys;

    Initializing function:

        // Generates and validates the server's key
        /* code for generating the server RSA left out; it's working */
        // Set client exponent
        keys.clnt = 0;
        keys.clnt = RSA_new();
        BN_dec2bn(&keys.clnt->e, RSA_E_S);  // RSA_E_S contains the public exponent

    Problem code (in Network::server_handshake):

        // An encrypted message is received from the network and decrypted
        // into 'buffer' (1024 bytes long)
        cout << "Assigning client's RSA" << endl;
        // I have verified that 'buffer' contains the proper key
        if (BN_hex2bn(&keys.clnt->n, buffer) < 0) {
            Error("ERROR reading server RSA");
        }
        cout << "client's RSA has been assigned" << endl;

    The program segfaults at BN_hex2bn(&keys.clnt->n, buffer) with the following valgrind output:

        Invalid read of size 8
           at 0x50DBF9F: BN_hex2bn (in /usr/lib/libcrypto.so.0.9.8)
           by 0x40F23E: Network::server_handshake() (Network.cpp:177)
           by 0x40EF42: Network::startNet() (Network.cpp:126)
           by 0x403C38: main (server.cpp:51)
        Address 0x20 is not stack'd, malloc'd or (recently) free'd
        Process terminating with default action of signal 11 (SIGSEGV)
        Access not within mapped region at address 0x20
           at 0x50DBF9F: BN_hex2bn (in /usr/lib/libcrypto.so.0.9.8)

    And I don't know why: I'm using the exact same code in the client program, and it works just fine. Any input is greatly appreciated!
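
    A hedged reading of the valgrind trace, not a confirmed fix: address 0x20 looks like a small member offset dereferenced off a NULL pointer, which is exactly what happens if keys.clnt was never assigned by RSA_new() in the server process (the expression &keys.clnt->n then evaluates to a near-zero address that BN_hex2bn reads from). A defensive sketch; note also that BN_hex2bn() returns 0 on failure, so the < 0 test can never fire:

        if (keys.clnt == NULL) {            /* was the init function ever run here? */
            keys.clnt = RSA_new();
            if (keys.clnt == NULL)
                Error("RSA_new failed");
        }
        if (BN_hex2bn(&keys.clnt->n, buffer) == 0) {   /* 0, not < 0, signals failure */
            Error("ERROR reading client RSA modulus");
        }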

    Read the article

  • How to capture a live stream from Windows Media Server 2008

    - by Hummad Hassan
    I want to capture a live stream from Windows Media Server to the file system on my PC. I have tried it against my own media server with the following code, but when I checked the output file, I found something unexpected in it.

        FileStream fs = null;
        try
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://mywmsserver/test");
            CookieContainer ci = new CookieContainer(1000);
            req.Timeout = 60000;
            req.Method = "Get";
            req.KeepAlive = true;
            req.MaximumAutomaticRedirections = 99;
            req.UseDefaultCredentials = true;
            req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3";
            req.ReadWriteTimeout = 90000000;
            req.CookieContainer = ci;
            //req.MediaType = "video/x-ms-asf";
            req.AllowWriteStreamBuffering = true;

            HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
            Stream resps = resp.GetResponseStream();
            fs = new FileStream("d:\\dump.wmv", FileMode.Create, FileAccess.ReadWrite);
            byte[] buffer = new byte[1024];
            int bytesRead = 0;
            while ((bytesRead = resps.Read(buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, bytesRead);
            }
        }
        catch (Exception ex) { }
        finally
        {
            if (fs != null)
                fs.Close();
        }

    Read the article

  • How to inherit from a non-prototype object

    - by Andres Jaan Tack
    The node-binary binary parser builds its object with the following pattern:

        exports.parse = function parse (buffer) {
          var self = {...};

          self.tap = function (cb) {...};
          self.into = function (key, cb) {...};
          ...
          return self;
        };

    How do I inherit my own, enlightened parser from this? Is this pattern designed intentionally to make inheritance awkward? My only successful attempt so far at inheriting all the methods of binary.parse(<something>) is to use _.extend, as in:

        var clever_parser = function (buffer) {
          if (this instanceof clever_parser) {
            this.parser = binary.parse(buffer);  // I guess this is super.constructor(...)
            _.extend(this.parser, this);         // Really?
            return this.parser;
          } else {
            return new clever_parser(buffer);
          }
        }

    This has failed my smell test, and that of others. Is there anything about this that makes it dangerous?
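
    One hedged alternative (a sketch, not the library's sanctioned route): since binary.parse() is a closure-based factory rather than a prototype, composition reads more naturally than inheritance — wrap the returned object and bolt the extra methods onto it, delegating the rest. cleverTap below is a hypothetical extension method:

        var binary = require('binary');

        function cleverParser(buffer) {
          var parser = binary.parse(buffer);   // the "super" instance
          parser.cleverTap = function (cb) {   // hypothetical added behaviour
            // ... enlightened logic here, then delegate:
            return parser.tap(cb);
          };
          return parser;
        }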

    Read the article

  • How to capture a live stream from Windows Media Server 2008 using C#/.NET

    - by Hummad Hassan
    I want to capture a live stream from Windows Media Server to the file system on my PC. I have tried it against my own media server with the following code, but when I checked the output file, I found this in it. Please help me with this. Thanks.

        [Reference]
        Ref1=http://mywindowsmediaserver/test?MSWMExt=.asf
        Ref2=http://mywindowsmediaserver/test?MSWMExt=.asf

    The code:

        FileStream fs = null;
        try
        {
            HttpWebRequest req = (HttpWebRequest)WebRequest.Create("http://mywmsserver/test");
            CookieContainer ci = new CookieContainer(1000);
            req.Timeout = 60000;
            req.Method = "Get";
            req.KeepAlive = true;
            req.MaximumAutomaticRedirections = 99;
            req.UseDefaultCredentials = true;
            req.UserAgent = "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3";
            req.ReadWriteTimeout = 90000000;
            req.CookieContainer = ci;
            //req.MediaType = "video/x-ms-asf";
            req.AllowWriteStreamBuffering = true;

            HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
            Stream resps = resp.GetResponseStream();
            fs = new FileStream("d:\\dump.wmv", FileMode.Create, FileAccess.ReadWrite);
            byte[] buffer = new byte[1024];
            int bytesRead = 0;
            while ((bytesRead = resps.Read(buffer, 0, buffer.Length)) > 0)
            {
                fs.Write(buffer, 0, bytesRead);
            }
        }
        catch (Exception ex) { }
        finally
        {
            if (fs != null)
                fs.Close();
        }
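
    A hedged diagnosis: that [Reference] text is an ASF redirector, meaning the server answered the plain HTTP GET with a pointer to the stream rather than the stream itself. Windows Media HTTP streaming (MS-WMSP) expects player-style request headers, so something along these lines may coax the actual stream out — though what arrives is still MMS-framed ASF and may need unwrapping before it plays as a .wmv:

        HttpWebRequest req = (HttpWebRequest)WebRequest.Create(
            "http://mywindowsmediaserver/test?MSWMExt=.asf");
        req.UserAgent = "NSPlayer/12.00.7600.16385";       // identify as a player
        req.Headers.Add("Pragma", "no-cache,rate=1.000000,stream-time=0");
        req.Headers.Add("Pragma", "xPlayStrm=1");          // "start streaming now"
        HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
        // ...copy resp.GetResponseStream() to disk as before...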

    Read the article

  • How to send binary data within an XML string

    - by daemonkid
    I want to send a binary file to a .NET C# component in the following XML format:

        <BinaryFileString fileType='pdf'>
          <!-- binary file data string here -->
        </BinaryFileString>

    In the component that is called, I will take the binary string received within the BinaryFileString tag and convert it into a file of the type given by the fileType='' attribute. The file type could be doc/pdf/xls/rtf. I have the code in the calling application to get the bytes from the file to be sent. How do I prepare them to be sent with XML tags wrapped around them? I want the application to send the component a string, not a byte stream, because there is no way to tell the file type [pdf/doc/xls] by just looking at the byte stream; hence the XML string with the fileType attribute. Any ideas on this? The method for extracting the bytes is below:

        FileStream fs = new FileStream(_filePath, FileMode.Open, FileAccess.Read);
        using (Stream input = fs)
        {
            byte[] buffer = new byte[8192];
            int bytesRead;
            while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
            {
            }
        }
        return buffer;

    Thanks.
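
    A conventional answer, though not the only one: Base64-encode the bytes, which turns the payload into XML-safe text that round-trips exactly, and let the fileType attribute carry the format as planned. A sketch — it also sidesteps the quirk above where the read loop overwrites buffer and would return only the final chunk (innerText below stands for the element text extracted on the receiving side):

        byte[] fileBytes = File.ReadAllBytes(_filePath);   // whole file, no chunk loop
        string xml = "<BinaryFileString fileType='pdf'>"
                   + Convert.ToBase64String(fileBytes)
                   + "</BinaryFileString>";

        // Receiving component: decode the element text back to bytes.
        byte[] data = Convert.FromBase64String(innerText);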

    Read the article

  • Where does a process's memory space start, and where does it end?

    - by nhaa123
    Hi, I'm trying to dump the memory of my application where the variables lie. Here's the function:

        void MyDump(const void *m, unsigned int n)
        {
            const unsigned char *p = reinterpret_cast<const unsigned char *>(m);
            char buffer[16];
            unsigned int mod = 0;

            for (unsigned int i = 0; i < n; ++i, ++mod) {
                if (mod % 16 == 0) {
                    mod = 0;
                    std::cout << " | ";
                    for (unsigned short j = 0; j < 16; ++j) {
                        switch (buffer[j]) {
                            case 0xa: case 0xb: case 0xd: case 0xe: case 0xf:
                                std::cout << " ";
                                break;
                            default:
                                std::cout << buffer[j];
                        }
                    }
                    std::cout << "\n0x" << std::setfill('0') << std::setw(8)
                              << std::hex << (long)i << " | ";
                }
                buffer[i % 16] = p[i];
                std::cout << std::setw(2) << std::hex
                          << static_cast<unsigned int>(p[i]) << " ";
                if (i % 4 == 0 && i != 1)
                    std::cout << " ";
            }
        }

    Now, how can I know the address at which my process's memory space starts, where all the variables are stored? And how do I know how long the area is? For instance:

        MyDump(0x0000 /* <-- Starts from here? */, 0x1000 /* <-- This much? */);

    Best regards, nhaa123
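
    There is no single contiguous range to pass: a process's address space is a patchwork of mapped and unmapped regions, so the dump has to walk them. A hedged sketch, assuming Windows, that enumerates committed readable regions with VirtualQuery and feeds each one to MyDump:

        #include <windows.h>

        void DumpAllRegions()
        {
            SYSTEM_INFO si;
            GetSystemInfo(&si);

            MEMORY_BASIC_INFORMATION mbi;
            BYTE *p = static_cast<BYTE *>(si.lpMinimumApplicationAddress);
            while (p < static_cast<BYTE *>(si.lpMaximumApplicationAddress)) {
                if (VirtualQuery(p, &mbi, sizeof(mbi)) == 0)
                    break;                               // query failed; stop walking
                bool readable = mbi.State == MEM_COMMIT
                             && !(mbi.Protect & PAGE_GUARD)
                             && (mbi.Protect & (PAGE_READONLY | PAGE_READWRITE));
                if (readable)
                    MyDump(mbi.BaseAddress, static_cast<unsigned int>(mbi.RegionSize));
                p = static_cast<BYTE *>(mbi.BaseAddress) + mbi.RegionSize;
            }
        }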

    Read the article

  • exec() in BeanShell macro causes jEdit to hang when it returns non-zero exit code

    - by rossmeissl
    I have a jEdit BeanShell macro that runs my Markdown files through Maruku when I save them:

        if (buffer.getMode().toString().equals("markdown")) {
            cmd = "C:\\Ruby\\bin\\maruku.bat -o "
                  + buffer.getDirectory()
                  + buffer.getName().replaceAll("markdown$", "html")
                  + " " + buffer.getPath();
            exec(cmd);
        }

    This works great when the Markdown file is valid. But if I've made a mistake, jEdit just waits around forever for the exec() call to "succeed," which it never will. When this happens, I have to kill jEdit's javaw.exe process and run Maruku manually from the command line to discover the error, e.g.:

        E:\bp\plan\supply_chain>maruku business_plan.markdown
         ___________________________________________________________________________
        | Maruku tells you:
        +---------------------------------------------------------------------------
        | Could not find ref_id = "17" for md_link(["17"],"17")
        | Available refs are []
        +---------------------------------------------------------------------------
        !C:/Ruby/lib/ruby/gems/1.8/gems/maruku-0.6.0/lib/maruku/errors_management.rb:49:in `maruku_error'
        !C:/Ruby/lib/ruby/gems/1.8/gems/maruku-0.6.0/lib/maruku/output/to_html.rb:716:in `to_html_link'
        !C:/Ruby/lib/ruby/gems/1.8/gems/maruku-0.6.0/lib/maruku/output/to_html.rb:970:in `send'
        !C:/Ruby/lib/ruby/gems/1.8/gems/maruku-0.6.0/lib/maruku/output/to_html.rb:970:in `array_to_html'
        !C:/Ruby/lib/ruby/gems/1.8/gems/maruku-0.6.0/lib/maruku/output/to_html.rb:961:in `each'
        \___________________________________________________________________________
        Not creating a link for ref_id = "17".

    Then I restart jEdit, fix the error, and re-save the file, at which point the macro succeeds. How can I make my macro more resilient, so that it either dies helpfully (displaying Maruku's error output) or, at the very least, dies silently so I don't have to kill jEdit?
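
    A hedged sketch of a more resilient macro body: drive the process through Runtime.exec() instead of the exec() convenience, drain stderr (an undrained error pipe is a classic way for a child process to stall forever), and check the exit code, surfacing Maruku's complaint through jEdit's Macros.message():

        Process p = Runtime.getRuntime().exec(cmd);
        BufferedReader err = new BufferedReader(
                new InputStreamReader(p.getErrorStream()));
        StringBuffer msg = new StringBuffer();
        String line;
        while ((line = err.readLine()) != null)
            msg.append(line).append("\n");
        int code = p.waitFor();
        if (code != 0)
            Macros.message(view, "Maruku failed:\n" + msg);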

    Read the article

  • Facebook app request in Java not working

    - by Arpit Solanki
    I am trying to send a Facebook app request to a user with the code below, but it throws an IOException with HTTP status code 400 when run, and I don't see any app request being sent to the user.

        StringBuffer buffer = new StringBuffer();
        buffer.append("access_token").append('=').append(this.app_access_token);
        buffer.append('&').append("message=").append("sent an app request!");
        String content = buffer.toString();
        try {
            URLConnection connection =
                new URL("https://graph.facebook.com/me/apprequests").openConnection();
            connection.setDoOutput(true);
            connection.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
            connection.setRequestProperty("Content-Length", Integer.toString(content.length()));

            DataOutputStream outs = new DataOutputStream(connection.getOutputStream());
            outs.writeBytes(content);
            outs.flush();
            outs.close();

            BufferedReader in = new BufferedReader(
                new InputStreamReader(connection.getInputStream()));
            String inputLine;
            while ((inputLine = in.readLine()) != null) {
                System.out.println(inputLine);
            }
            in.close();
        } catch (Exception e) {
            System.out.println(e);
        }
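
    Two hedged suspicions rather than a confirmed fix: an app access token carries no user context, so /me/apprequests has nobody to resolve to (the Graph API wants an explicit user id for app-to-user requests), and the message value is posted with raw spaces instead of being form-encoded. A sketch with both addressed; the user id is a placeholder:

        String userId = "1234567890";   // hypothetical recipient id
        String content = "access_token=" + this.app_access_token
                       + "&message=" + URLEncoder.encode("sent an app request!", "UTF-8");
        URLConnection connection = new URL(
            "https://graph.facebook.com/" + userId + "/apprequests").openConnection();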

    Read the article

  • Java equivalent of the VB Request.InputStream

    - by Android Addict
    I have a web service that I am rewriting from VB to a Java servlet. On the client side, I set the body entity like this:

        StringEntity stringEntity = new StringEntity(xml, HTTP.UTF_8);
        stringEntity.setContentType("application/xml");
        httppost.setEntity(stringEntity);

    In the VB web service, I read this data with:

        Dim objReader As System.IO.StreamReader
        objReader = New System.IO.StreamReader(Request.InputStream)
        Dim strXML As String = objReader.ReadToEnd

    and this works great. I am looking for the equivalent in Java. I have tried this:

        ServletInputStream dataStream = req.getInputStream();
        byte[] data = new byte[dataStream.toString().length()];
        dataStream.read(data);

    but all it gets me is an unintelligible string: data = [B@68514fec. Please advise.

    Edit: per the answers, I have tried:

        ServletInputStream dataStream = req.getInputStream();
        ByteArrayOutputStream buffer = new ByteArrayOutputStream();
        int r;
        byte[] data = new byte[1024 * 1024];
        while ((r = dataStream.read(data, 0, data.length)) != -1) {
            buffer.write(data, 0, r);
        }
        buffer.flush();
        byte[] data2 = buffer.toByteArray();
        System.out.println("DATA = " + Arrays.toString(data2));

    which yields DATA = []. And when I try System.out.println("DATA = " + data2.toString()); I get DATA = [B@15282c7f. So what am I doing wrong? As stated earlier, the same call to my VB service gives me the XML that I pass in.
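
    Two separate things seem tangled here, offered as a hedged reading: [B@15282c7f is just Java's default toString() for a byte array (type tag plus identity hash), not corrupted data, and DATA = [] says the stream was already empty when read — which happens if something earlier in the request's life (a filter, or a getParameter() call) consumed the body. The StreamReader.ReadToEnd equivalent is roughly:

        BufferedReader reader = new BufferedReader(
                new InputStreamReader(req.getInputStream(), "UTF-8"));
        StringBuilder xml = new StringBuilder();
        String line;
        while ((line = reader.readLine()) != null)
            xml.append(line).append('\n');
        System.out.println("DATA = " + xml);   // readable text, not [B@...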

    Read the article

  • Bluetooth in Java Mobile: Handling connections that go out of range

    - by Albus Dumbledore
    I am trying to implement a server-client connection over SPP. After initializing the server, I start a thread that first listens for clients and then receives data from them. It looks like this:

        public final void run() {
            while (alive) {
                try {
                    /*
                     * Await client connection
                     */
                    System.out.println("Awaiting client connection...");
                    client = server.acceptAndOpen();

                    /*
                     * Start receiving data
                     */
                    int read;
                    byte[] buffer = new byte[128];
                    DataInputStream receive = client.openDataInputStream();
                    try {
                        while ((read = receive.read(buffer)) > 0) {
                            System.out.println("[Received]: " + new String(buffer, 0, read));
                            if (!alive) {
                                return;
                            }
                        }
                    } finally {
                        System.out.println("Closing connection...");
                        receive.close();
                    }
                } catch (IOException e) {
                    e.printStackTrace();
                }
            }
        }

    It's working fine, and I am able to receive messages. What's troubling me is how the thread would eventually die when a device goes out of range. Firstly, the call to receive.read(buffer) blocks, so the thread waits until it receives any data; if the device goes out of range, it never gets the chance to check whether it has been interrupted in the meantime. Secondly, it would never close the connection, i.e. the server would not accept the device once it comes back into range. Thanks! Any ideas would be highly appreciated! Merry Christmas!
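
    A hedged pattern rather than a guarantee: a read() blocked on a device that has wandered off usually only returns once the connection is torn down from another thread, so expose a stop method that closes the StreamConnection. The blocked read then fails with an IOException, the catch runs, and the loop can re-enter acceptAndOpen() for when the device comes back into range:

        public void shutdown() {
            alive = false;
            try {
                if (client != null)
                    client.close();   // unblocks receive.read(...) with IOException
            } catch (IOException e) {
                // already closed; nothing to do
            }
        }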

    Read the article

  • OpenGLES - Rendering a background image only once and not wiping it

    - by chaosbeaker
    Hello, first time asking a question here, but I've been watching others' answers for a while. My question is about improving the performance of my program. Currently I'm wiping the view framebuffer on each pass through my program, then rendering the background image first, followed by the rest of my scene. I was wondering how I would go about rendering the background image once, and only wiping and re-rendering the rest of the scene. I tried using a separate buffer, but I'm not sure how to present this new buffer to the render buffer.

        // Set the current EAGLContext and bind to the framebuffer. This will direct all OGL
        // commands to the framebuffer and the associated renderbuffer attachment, which is
        // where our scene will be rendered.
        [EAGLContext setCurrentContext:context];
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);

        // Define the viewport. Changing the settings for the viewport can allow you to scale
        // the viewport as well as change the dimensions etc., so I'm setting it for each
        // frame in case we want to change it.
        glViewport(0, 0, screenBounds.size.width, screenBounds.size.height);

        // Clear the screen. If we are going to draw a background image then this clear is
        // not necessary, as drawing the background image will destroy the previous image.
        glClearColor(0.0f, 1.0f, 0.0f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);

        // Set up how the images are to be blended when rendered. This could be changed at
        // different points during your render process if you wanted to apply different effects.
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        switch (currentViewInt) {
            case 1: {
                [background render:CGPointMake(240, 0) fromTopLeftBottomRightCenter:@"Bottom"];
                // Other rendering code
            }
        }

        // Bind to the renderbuffer and then present this image to the current context
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, viewRenderbuffer);
        [context presentRenderbuffer:GL_RENDERBUFFER_OES];

    Hopefully by solving this I'll also be able to implement another buffer just for rendering particles, as I can set them to always use a black background as their alpha source. Any help is greatly appreciated.
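
    A hedged sketch of the usual approach, assuming the GLES 1.1 OES framebuffer extension this code already uses: an offscreen buffer can't be presented directly, but the background can be rendered once into a texture-backed FBO, and that texture then drawn as the first step of every frame, which is much cheaper than re-rendering the background scene. width and height are placeholders for the view's renderbuffer size:

        GLuint bgFBO, bgTex;
        glGenFramebuffersOES(1, &bgFBO);
        glGenTextures(1, &bgTex);
        glBindTexture(GL_TEXTURE_2D, bgTex);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, NULL);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, bgFBO);
        glFramebufferTexture2DOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                  GL_TEXTURE_2D, bgTex, 0);
        // ...draw the background here, once...
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, viewFramebuffer);  // back to screen

    A particle pass could work the same way: a second texture-backed FBO cleared to black, composited over the scene with the blend mode of choice.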

    Read the article

  • Clone existing structs with different alignment in Visual C++

    - by Crend King
    Is there a way to clone an existing struct with different member alignment in Visual C++? Here is the background: I use a 3rd-party library, which uses several structs. To fill up the structs, I pass the address of the struct instances to some functions. Unfortunately, the functions return an unaligned buffer, so the data of some members always comes out wrong. /Zp is out of the question, since it breaks other parts of the program. I know #pragma pack modifies the alignment of the structs that follow it, but I would like to avoid copying the struct definitions into my code, since the definitions in the library might change in the future.

    Sample code — test.h:

        struct am_aligned {
            BYTE data1[10];
            ULONG data2;
        };

    test.cpp:

        #include "test.h"

        // typedef alignment(1) struct am_aligned am_unaligned;

        int APIENTRY wWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                              LPTSTR lpCmdLine, int nCmdShow)
        {
            char buffer[20] = {};
            for (int i = 0; i < sizeof(unaligned_struct); i++) {
                buffer[i] = i;
            }
            am_aligned instance = *(am_aligned*) buffer;
            return 0;
        }

    Consider am_aligned to be defined in the library header file; am_unaligned is my custom declaration, effective only in test.cpp. The commented line does not work, of course. instance.data2 is 0x0f0e0d0c, while 0x0d0c0b0a is desired. Thanks for help!
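
    A hedged trick that avoids copying the definition: re-include the library header inside a packing pragma, wrapped in a namespace so the packed clone gets a distinct name. It reuses the library's own header text, so future definition changes are picked up automatically — but it does assume the header tolerates being included inside a namespace and that no include guard blocks the second inclusion:

        #pragma pack(push, 1)
        namespace unaligned {
            #include "test.h"     // unaligned::am_aligned, packed to 1 byte
        }
        #pragma pack(pop)

        // usage: reinterpret the raw buffer through the packed clone
        unaligned::am_aligned instance = *(unaligned::am_aligned*) buffer;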

    Read the article

  • Bluetooth txt file size increases during transfer

    - by cheesebunz
    Hi everyone, I am able to transfer files from one mobile device to another. But when the sender sends a text file of 8 bytes, it arrives at the receiving end as a 256-byte txt file, and when I open it, its contents are my info plus a lot of square boxes. Here is my code from the sender:

        string fileName = @"SendTest.txt";
        System.Uri uri = new Uri("obex://" + selectedAddr + "/" + System.IO.Path.GetFileName(fileName));
        ObexWebRequest request = new ObexWebRequest(uri);
        Stream requestStream = request.GetRequestStream();
        FileStream fs = File.OpenRead(fileName);

        byte[] buffer = new byte[1024];
        int readBytes = 1;
        while (readBytes != 0)
        {
            readBytes = fs.Read(buffer, 0, buffer.Length);
            requestStream.Write(buffer, 0, readBytes);
        }
        requestStream.Close();

        ObexWebResponse response = (ObexWebResponse)request.GetResponse();
        MessageBox.Show(response.StatusCode.ToString());
        response.Close();

    Does anyone know how I can solve this?
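
    A hedged guess only, since padding to exactly 256 bytes smells like a whole buffer being flushed as-is: the request never declares how many bytes the OBEX PUT carries. Assuming the 32feet.NET ObexWebRequest honours the ContentLength member it inherits from WebRequest, declaring the true size before writing may stop the payload from being rounded up:

        FileStream fs = File.OpenRead(fileName);
        request.ContentLength = fs.Length;   // declare the real payload size
        // ...then copy fs into the request stream as before...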

    Read the article

  • jQuery AJAX POST of a JPEG image to a .NET web service results in a corrupted image

    - by sosergio
    I have a PhoneGap jQuery app that opens the camera and takes a picture. I then POST this picture to a .NET web service that I've coded. I can't use PhoneGap's FileTransfer because it isn't supported on Bada OS, which is a requirement. I believe I've successfully loaded the image via the PhoneGap FileSystem API and attached it to an $.ajax type:post; I even receive it on the .NET side, but when .NET saves the image to the server, the image turns out corrupted. It seems to me that the two sides of the communication have different data types. Has anyone experience with this? Any help will be appreciated. This is my code:

        // PHONEGAP CAMERA ACCESS (summed up)
        navigator.camera.getPicture(onGetPictureSuccess, onGetPictureFail,
            { quality: 50, destinationType: Camera.DestinationType.FILE_URI });
        window.resolveLocalFileSystemURI(imageURI, onResolveFileSystemURISuccess,
            onResolveFileSystemURIError);
        fileEntry.file(gotFileSuccess, gotFileError);
        new FileReader().readAsDataURL(file);

        // UPLOAD FILE
        function onDataReadSuccess(evt) {
            var image_data = evt.target.result;
            var filename = unique_id();
            var filext = "jpg";
            $.ajax({
                type: 'POST',
                url: SERVICE_BASE_URL + "/fotos/" + filename + "?ext=" + filext,
                cache: false,
                timeout: 100000,
                processData: false,
                data: image_data,
                contentType: 'image/jpeg',
                success: function (data) {
                    console.log("Data Uploaded with success. Message: " + data);
                    $.mobile.hidePageLoadingMsg();
                    $.mobile.changePage("ok.html");
                }
            });
        }

    On my .NET web service, this is the method that gets invoked:

        public string FotoSave(string filename, string extension, Stream fileContent)
        {
            string filePath = HttpContext.Current.Server.MapPath("~/foto_data/") + "\\" + filename;
            FileStream writeStream = new FileStream(filePath, FileMode.OpenOrCreate, FileAccess.Write);
            int Length = 256;
            Byte[] buffer = new Byte[Length];
            int bytesRead = fileContent.Read(buffer, 0, Length);
            // write the required bytes
            while (bytesRead > 0)
            {
                writeStream.Write(buffer, 0, bytesRead);
                bytesRead = fileContent.Read(buffer, 0, Length);
            }
            fileContent.Close();
            writeStream.Close();
        }
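
    A hedged diagnosis that fits the symptom exactly: readAsDataURL() hands back a Base64 data URL ("data:image/jpeg;base64,..."), so the service is writing Base64 text to disk under a .jpg name — the two sides really do have different data types. Decoding on the server is one fix (re-reading with readAsArrayBuffer, where the platform supports it, is the other). A sketch of the server-side decode:

        using (StreamReader sr = new StreamReader(fileContent))
        {
            string dataUrl = sr.ReadToEnd();         // "data:image/jpeg;base64,...."
            int comma = dataUrl.IndexOf(',');        // strip the data-URL header
            byte[] jpeg = Convert.FromBase64String(
                comma >= 0 ? dataUrl.Substring(comma + 1) : dataUrl);
            File.WriteAllBytes(filePath, jpeg);
        }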

    Read the article

  • C++ file input/output

    - by Myx
    Hi: I am trying to read from a file using fgets and sscanf. Each line of the file holds characters which I wish to put into a vector. So far, I have the following:

        FILE *fp;
        fp = fopen(filename, "r");
        if (!fp) {
            fprintf(stderr, "Unable to open file %s\n", filename);
            return 0;
        }

        // Read file
        int line_count = 0;
        char buffer[1024];
        while (fgets(buffer, 1023, fp)) {
            // Increment line counter
            line_count++;
            char *bufferp = buffer;
            ...
            while (*bufferp != '\n') {
                char *tmp;
                if (sscanf(bufferp, "%c", tmp) != 1) {
                    fprintf(stderr, "Syntax error reading axiom on "
                            "line %d in file %s\n", line_count, filename);
                    return 0;
                }
                axiom.push_back(tmp);
                printf("put %s in axiom vector\n", axiom[axiom.size()-1]);
                // increment buffer pointer
                bufferp++;
            }
        }

    My axiom vector is defined as vector<char *> axiom;. When I run my program, I get a seg fault at the sscanf call. Any suggestions on what I'm doing wrong?
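
    The fault is visible in the snippet itself: %c stores a character through the supplied pointer, and tmp is an uninitialized char*, so sscanf writes to a garbage address. Store actual chars rather than pointers (a vector<char *> of pointers into a reused line buffer would go stale anyway). A sketch of the inner loop:

        char tmp;                                   // a real char, not a pointer
        if (sscanf(bufferp, "%c", &tmp) != 1) {     // pass its address
            fprintf(stderr, "Syntax error reading axiom on "
                    "line %d in file %s\n", line_count, filename);
            return 0;
        }
        axiom.push_back(tmp);                       // with std::vector<char> axiom;
        printf("put %c in axiom vector\n", axiom.back());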

    Read the article

  • Modify existing struct alignment in Visual C++

    - by Crend King
    Is there a way to modify the member alignment of an existing struct in Visual C++? Here is the background: I use an 3rd-party library, which uses several structs. To fill up the structs, I pass the address of the struct instance to some functions. Unfortunately, the functions only returns unaligned buffer, so that data of some members are always wrong. /Zp is out of choice, since it breaks the other parts of the program. I know #pragma pack modifies the alignment of the following struct, but I do not want to copy the structs into my code, for the definitions in the library might change in the future. Sample code: test.h: struct am_aligned { BYTE data1[10]; ULONG data2; }; test.cpp: include "test.h" // typedef alignment(1) struct am_aligned am_unaligned int APIENTRY wWinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance, LPTSTR lpCmdLine, int nCmdShow) { char buffer[20] = {}; for (int i = 0; i < sizeof(unaligned_struct); i++) { buffer[i] = i; } am_aligned instance = *(am_aligned*) buffer; return 0; } instance.data2 is 0x0f0e0d0c, while 0x0d0c0b0a is desired. The commented line does not work of course. Thanks for help!

    Read the article

  • Low Level Console Input

    - by Soulseekah
    I'm trying to send commands to the input of a cmd.exe application using the low-level read/write console functions. I have no trouble reading the text (scraping) using the ReadConsole...() and WriteConsole() functions after having attached to the process's console, but I've not figured out how to write, for example, "dir" and have the console interpret it as a sent command. Here's a bit of my code:

        CreateProcess(NULL, "cmd.exe", NULL, NULL, FALSE, CREATE_NEW_CONSOLE, NULL, NULL, &si, &pi);
        AttachConsole(pi.dwProcessId);
        strcpy(buffer, "dir");
        WriteConsole(GetStdHandle(STD_INPUT_HANDLE), buffer, strlen(buffer), &charRead, NULL);

    The STARTUPINFO attributes of the process are all set to zero except, of course, the .cb attribute. Nothing changes on the screen; however, I'm getting an Error 6: Invalid Handle returned from WriteConsole on STD_INPUT_HANDLE. If I write to STD_OUTPUT_HANDLE, I do get "dir" written on the screen, but of course nothing happens. I'm guessing SetConsoleMode() might be of help, but I've tried many mode combinations and nothing helped. I've also created a quick console application that waits for input (scanf()) and echoes back whatever comes in; that didn't work either. I've also tried typing into the scanf() prompt and then peeking into the input buffer using PeekConsoleInput(); it returns 0, but the INPUT_RECORD array is empty. I'm aware that there is another way around this, using WriteConsoleInput() to directly inject INPUT_RECORD structured events into the console, but this seems way too long — I'd have to send each keypress into it. I hope the question is clear. Please let me know if you need any further information. Thanks for your help.
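
    Two hedged notes: WriteConsole is documented for screen-buffer (output) handles only, which squares with the Error 6, and GetStdHandle in the parent does not return the attached console's input — opening CONIN$ after AttachConsole does. The per-keypress route is also shorter than it sounds; a sketch with a hypothetical helper:

        /* Push one line into the attached console's input buffer. */
        void SendLine(const char *s)
        {
            HANDLE hIn = CreateFile("CONIN$", GENERIC_READ | GENERIC_WRITE,
                                    FILE_SHARE_READ | FILE_SHARE_WRITE,
                                    NULL, OPEN_EXISTING, 0, NULL);
            for (;; ++s) {
                char c = *s ? *s : '\r';            /* finish with Enter */
                INPUT_RECORD rec = {0};
                DWORD written;
                rec.EventType = KEY_EVENT;
                rec.Event.KeyEvent.bKeyDown = TRUE;
                rec.Event.KeyEvent.wRepeatCount = 1;
                rec.Event.KeyEvent.uChar.AsciiChar = c;
                WriteConsoleInput(hIn, &rec, 1, &written);
                if (!*s) break;
            }
            CloseHandle(hIn);
        }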

    Read the article

  • Problem with kCFSocketReadCallBack

    - by zp26
    Hello. I have a problem with my program. I created a socket with kCFSocketReadCallBack. My intention was to call AcceptCallback only when a string arrives on the socket. Instead, my program does not just accept the connection: it always goes into startReceive, stops there, and sometimes crashes. Can anybody help? Thanks.

        readSocket = CFSocketCreateWithNative(NULL, fd, kCFSocketReadCallBack,
                                              AcceptCallback, &context);

        static void AcceptCallback(CFSocketRef s, CFSocketCallBackType type,
                                   CFDataRef address, const void *data, void *info)
        // Called by CFSocket when someone connects to our listening socket.
        // This implementation just bounces the request up to Objective-C.
        {
            ServerVistaController *obj;
        #pragma unused(address)
            // assert(address == NULL);
            assert(data != NULL);

            obj = (ServerVistaController *) info;
            assert(obj != nil);

        #pragma unused(s)
            assert(s == obj->listeningSocket);

            if (type & kCFSocketAcceptCallBack) {
                [obj acceptConnection:*(int *)data];
            }
            if (type & kCFSocketAcceptCallBack) {
                [obj startReceive:*(int *)data];
            }
        }

        - (void)startReceive:(int)fd
        {
            CFReadStreamRef readStream = NULL;
            CFIndex bytes;
            UInt8 buffer[MAXLENGTH];

            CFStreamCreatePairWithSocket(kCFAllocatorDefault, fd, &readStream, NULL);
            if (!readStream) {
                close(fd);
                [self updateLabel:@"No readStream"];
            }
            CFReadStreamOpen(readStream);
            [self updateLabel:@"OpenStream"];

            bytes = CFReadStreamRead(readStream, buffer, sizeof(buffer));
            if (bytes < 0) {
                [self updateLabel:(NSString *)buffer];
                close(fd);
            }
            CFReadStreamClose(readStream);
        }
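
    A hedged reading of the mismatch: the socket was created asking only for kCFSocketReadCallBack, yet the callback tests kCFSocketAcceptCallBack twice, so for a listening socket the callback fires on raw readability and the data argument is not the connected socket handle the code assumes. With an accept callback, data points at the new connection's native handle. A sketch of the intended wiring:

        // Ask CFSocket to accept for us; the callback then receives the
        // native handle of each new connection in 'data'.
        readSocket = CFSocketCreateWithNative(NULL, fd,
                                              kCFSocketAcceptCallBack,
                                              AcceptCallback, &context);

        // In AcceptCallback:
        if (type & kCFSocketAcceptCallBack) {
            CFSocketNativeHandle child = *(CFSocketNativeHandle *)data;
            [obj acceptConnection:child];
            [obj startReceive:child];    // read from the child, not the listener
        }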

    Read the article

  • How to limit TCP writes to a particular size and then block until the data is read

    - by ustulation
    {Qt 4.7.0, VS 2010} I have a server written in Qt and a third-party client executable. The Qt-based server uses the QTcpServer and QTcpSocket facilities (non-blocking). Going through articles on TCP, I understand the following: the original TCP specification made the negotiable window size a 16-bit value, for a maximum of 65535 bytes, but implementations often use the RFC window-scale extension, which allows the sliding window size to be scaled by bit-shifting up to a maximum of 1 gigabyte. This is implementation-defined, so the receiver and sender can end up with very different window sizes; the server places no limit of its own, since it uses Qt's facilities without hardcoding any window size.

    The client first asks for all the information it can, based on the previous messages from the server, before handling the new (accumulating) incoming messages. So at some point the server receives a lot of messages, each asking for several MB of data. The server processes these and puts the results into its send buffer. The client, however, is unable to handle the messages at the same pace, and its receive buffer seems to be far smaller (65535 bytes, maybe) than the sender's transmit window. Messages thus accumulate at the sender's end; TCP writes on the sender would block once the sender's buffer filled up too, but that never happens, as the sender's buffer is much larger. This manifests as growing memory consumption at the sender's end.

    To prevent this from happening, I used the Qt socket's waitForBytesWritten() with the timeout set to -1 for an infinite waiting period. From the behaviour I see, this blocks the thread writing TCP data until the data has actually been consumed by the receiver's window (which happens once earlier messages have been processed by the client at the application level). This has made memory consumption at the server end almost negligible.

    Is there a better alternative to this (in Qt) if I want to restrict memory consumption at the server end to, say, x MB? Also, please point out if any of my understanding is incorrect.
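
    One hedged alternative that keeps the event loop non-blocking: hold an application-level queue per connection and only push data into the socket while QTcpSocket::bytesToWrite() sits under a chosen high-water mark, letting the bytesWritten() signal drain the backlog. pending below is a hypothetical per-socket queue:

        static const qint64 kHighWater = 8 * 1024 * 1024;   // example "x MB" cap

        void Server::trySend(QTcpSocket *sock)
        {
            // pending: hypothetical QHash<QTcpSocket*, QList<QByteArray> >
            while (!pending[sock].isEmpty()
                   && sock->bytesToWrite() < kHighWater) {
                sock->write(pending[sock].takeFirst());
            }
            // bytesWritten(qint64) is connected back to trySend(), so the
            // queue drains as the OS-level buffer empties.
        }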

    Read the article

  • How to back up using the backup APIs in C++

    - by user1603185
    I am writing an application to back up specified files, using the backup API calls, i.e. CreateFile, BackupRead, and WriteFile. I am getting "access violation reading location" errors. I have attached the code below.

        #include <windows.h>

        int main()
        {
            HANDLE hInput, hOutput;

            // m_filename is a variable holding the file path to read from
            hInput = CreateFile(L"C:\\Key.txt", GENERIC_READ, 0, NULL,
                                OPEN_EXISTING, FILE_FLAG_BACKUP_SEMANTICS, NULL);

            // strLocation contains the path of the file I want to create.
            hOutput = CreateFile(L"C:\\tmp\\", GENERIC_WRITE, NULL, NULL,
                                 CREATE_ALWAYS, NULL, NULL);

            DWORD dwBytesToRead = 1024 * 1024 * 10;
            BYTE *buffer = new BYTE[dwBytesToRead];
            BOOL bReadSuccess = false, bWriteSuccess = false;
            DWORD dwBytesRead, dwBytesWritten;
            LPVOID lpContext;

            // Now comes the important bit:
            do {
                bReadSuccess = BackupRead(hInput, buffer, sizeof(BYTE) * dwBytesToRead,
                                          &dwBytesRead, false, true, &lpContext);
                bWriteSuccess = WriteFile(hOutput, buffer, sizeof(BYTE) * dwBytesRead,
                                          &dwBytesWritten, NULL);
            } while (dwBytesRead == dwBytesToRead);

            return 0;
        }

    Can anyone suggest how to use these APIs? Thanks.
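
    A hedged diagnosis plus a sketch: BackupRead requires the context pointer to be NULL before the first call (it allocates and tracks state through it), so the uninitialized lpContext here gets treated as a stale context and dereferenced, which matches the access violation. Two further suspects: the output path C:\tmp\ names a directory, so that CreateFile likely fails, and the loop never checks the return codes. A corrected shape:

        LPVOID lpContext = NULL;          // must start out NULL for BackupRead
        DWORD dwBytesRead = 0, dwBytesWritten = 0;

        for (;;) {
            if (!BackupRead(hInput, buffer, dwBytesToRead,
                            &dwBytesRead, FALSE, TRUE, &lpContext))
                break;                    // inspect GetLastError() here
            if (dwBytesRead == 0)
                break;                    // end of input
            if (!WriteFile(hOutput, buffer, dwBytesRead, &dwBytesWritten, NULL))
                break;
        }
        // final call releases the context (bAbort = TRUE)
        BackupRead(hInput, NULL, 0, &dwBytesRead, TRUE, FALSE, &lpContext);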

    Read the article

  • Strange compiler complaint when using similar code

    - by Jason
    For a project, I have to ask the user for a file name, which I'm reading character by character using getchar. From main, I call the function:

        char *coursename = introPrint(); // print the usage directions and get the first bit of input

    That function is defined as:

        char *introPrint() {
            int size = 20;
            int c;
            int length = 0;
            char buffer[size];

            // instructions printout, cut for brevity

            // get coursename from user and return it
            while ((c = getchar()) != EOF && (c != '\n')) {
                buffer[length++] = c;
                if (length == size - 1)
                    break;
            }
            buffer[length] = 0;
            return buffer;
        }

    This is basically identical to code I wrote to ask the user for input, replace the character echo with asterisks, and then print out the results. Here, though, I'm getting a "function returns address of local variable" warning for the return statement. So why am I getting no warnings from the other program, while this code triggers one?
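
    The warning is legitimate: buffer lives on introPrint's stack and is dead the moment the function returns, leaving the caller with a dangling pointer (the other program presumably returned a heap, static, or caller-owned buffer, which is why it stayed quiet). A hedged sketch of the heap-allocating variant:

        char *introPrint(void) {         /* needs <stdlib.h> for malloc */
            int size = 20;
            int c, length = 0;
            char *buffer = malloc(size); /* outlives the call */
            if (!buffer)
                return NULL;
            while ((c = getchar()) != EOF && c != '\n') {
                buffer[length++] = c;
                if (length == size - 1)
                    break;
            }
            buffer[length] = 0;
            return buffer;               /* caller is expected to free() */
        }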

    Read the article

  • GetLongPathName Undeclared

    - by iwizardpro
    When I try to compile my code with the function GetLongPathName(), the compiler tells me the function is undeclared. I have already read the MSDN documentation at http://msdn.microsoft.com/en-us/library/aa364980%28VS.85%29.aspx, but even though I included the header files it lists, I still get the undeclared-function error. Which header file(s) am I supposed to include when using this function?

        #include <windows.h>
        #include <winbase.h>

        #define DLLEXPORT extern "C" __declspec(dllexport)

        DLLEXPORT char* file_get_long(char* path_original)
        {
            long length = 0;
            TCHAR* buffer = NULL;

            if (!path_original) {
                return "-10";
            }

            length = GetLongPathName(path_original, NULL, 0);
            if (length == 0) {
                return "-10";
            }

            buffer = new TCHAR[length];

            length = GetLongPathName(path_original, buffer, length);
            if (length == 0) {
                return "-10";
            }

            return buffer;
        }

    And, if it makes a difference, I am currently compiling with Dev-C++ on Windows Vista 64-bit.
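
    A hedged explanation that fits the Dev-C++ detail: in the older MinGW w32api headers that Dev-C++ ships, GetLongPathName sits behind a Windows-version guard, so it is only declared if the target version is announced before <windows.h> is pulled in (<winbase.h> comes along with it automatically):

        // Declare the minimum Windows version *before* the Windows headers.
        #define WINVER       0x0500
        #define _WIN32_WINNT 0x0500
        #include <windows.h>

    Separately, if the project ever defines UNICODE, the char*/TCHAR* mixing in the snippet becomes the next compile error; calling GetLongPathNameA explicitly sidesteps that.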

    Read the article

  • Java: Inputting text from a file using split

    - by 00PS
    I am inputting an adjacency list for a graph. There are three columns of data (vertex, destination, edge weight) separated by a single space. Here is my implementation so far:

        FileInputStream in = new FileInputStream("input1.txt");
        Scanner s = new Scanner(in);
        String buffer;
        String[] line = null;

        while (s.hasNext()) {
            buffer = s.nextLine();
            line = buffer.split("\\s+");

            g.add(line[0]);
            System.out.println("Added vertex " + line[0] + ".");

            g.addEdge(line[0], line[1], Integer.parseInt(line[2]));
            System.out.println("Added edge from " + line[0] + " to " + line[1]
                    + " with a weight of " + Integer.parseInt(line[2]) + ".");
        }
        System.out.println("Size of graph = " + g.size());

    Here is the output:

        Added vertex a.
        Added edge from a to b with a weight of 9.
        Exception in thread "main" java.lang.NullPointerException
                at structure5.GraphListDirected.addEdge(GraphListDirected.java:93)
                at Driver.main(Driver.java:28)

    I was under the impression that line = buffer.split("\\s+"); would return a two-dimensional array of Strings to the variable line. It seemed to work the first time but not the second. Any thoughts? I would also like some feedback on my implementation of this problem. Is there a better way? Anything to help out a novice! :)
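
    A hedged reading: split() returns a one-dimensional String[] per call (one array per line), so that part behaves as expected. The NullPointerException inside GraphListDirected.addEdge points instead at a missing endpoint — only line[0] is ever added as a vertex, so the edge to the never-added line[1] trips over a null vertex entry. A sketch, assuming structure5's Graph exposes contains():

        if (!g.contains(line[0])) g.add(line[0]);
        if (!g.contains(line[1])) g.add(line[1]);   // the missing destination vertex
        g.addEdge(line[0], line[1], Integer.parseInt(line[2]));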

    Read the article

  • How to get past HTTP authentication in PowerShell and then use a web service (Lotus/Domino)

    - by user1716616
    We have a Domino/Lotus web service that I want to use from PowerShell. The problem is that the Lotus admin has put HTTP authentication in front of the web service. How can I use this web service? Here is what I tried first: scrape the login page and get the cookie.

        $url = "http://xxxxxxx/names.nsf?Login"
        $CookieContainer = New-Object System.Net.CookieContainer
        $postData = "Username=web.services&Password=jesuisunestar"
        $buffer = [text.encoding]::ascii.getbytes($postData)

        [net.httpWebRequest] $req = [net.webRequest]::create($url)
        $req.method = "POST"
        $req.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8"
        $req.Headers.Add("Accept-Language: en-US")
        $req.Headers.Add("Accept-Encoding: gzip,deflate")
        $req.Headers.Add("Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7")
        $req.AllowAutoRedirect = $false
        $req.ContentType = "application/x-www-form-urlencoded"
        $req.ContentLength = $buffer.length
        $req.TimeOut = 50000
        $req.KeepAlive = $true
        $req.Headers.Add("Keep-Alive: 300")
        $req.CookieContainer = $CookieContainer

        $reqst = $req.getRequestStream()
        $reqst.write($buffer, 0, $buffer.length)
        $reqst.flush()
        $reqst.close()

        [net.httpWebResponse] $res = $req.getResponse()
        $resst = $res.getResponseStream()
        $sr = new-object IO.StreamReader($resst)
        $result = $sr.ReadToEnd()

    This seems to work, but now I have no idea how to use the cookie with a web service proxy. PS: I got this working with C# + Visual Studio (the class reference is auto-built and I don't understand half of it, but it lets me set .CookieContainer on the generated web service).
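
    A hedged sketch building on the C# experience the post mentions: proxies produced by New-WebServiceProxy derive from SoapHttpClientProtocol, which exposes the same CookieContainer property, so the container filled by the login POST above can be handed straight to the proxy. The URL and method name below are placeholders:

        # Reuse the authenticated session for the SOAP calls.
        $proxy = New-WebServiceProxy -Uri "http://xxxxxxx/mydb.nsf/MyService?WSDL"
        $proxy.CookieContainer = $CookieContainer   # cookies from the login above
        $proxy.SomeWebMethod()                      # hypothetical service method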

    Read the article

  • How to tune BufferedInputStream read()?

    - by technomax
    I am reading a BLOB column from an Oracle database, then writing it to a file as follows:

        public static int execute(String filename, BLOB blob) {
            int success = 1;
            try {
                File blobFile = new File(filename);
                FileOutputStream outStream = new FileOutputStream(blobFile);
                BufferedInputStream inStream = new BufferedInputStream(blob.getBinaryStream());

                int length = -1;
                int size = blob.getBufferSize();
                byte[] buffer = new byte[size];

                while ((length = inStream.read(buffer)) != -1) {
                    outStream.write(buffer, 0, length);
                    outStream.flush();
                }
                inStream.close();
                outStream.close();
            } catch (Exception e) {
                e.printStackTrace();
                System.out.println("ERROR(img_exportBlob) Unable to export:" + filename);
                success = 0;
            }
            return success;
        }

    The file size is around 3 MB, and it takes 40-50 seconds to read. It's actually 3D image data. So, is there any way to reduce this time?
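
    Two hedged tweaks worth trying before anything exotic: BufferedInputStream keeps its default 8 KB internal buffer unless told otherwise, and the flush() inside the loop defeats output buffering on every pass. Sizing both streams to the LOB's preferred buffer size and flushing once at the end cuts the number of round trips:

        int size = blob.getBufferSize();
        BufferedInputStream inStream =
                new BufferedInputStream(blob.getBinaryStream(), size);
        BufferedOutputStream outStream =
                new BufferedOutputStream(new FileOutputStream(blobFile), size);

        byte[] buffer = new byte[size];
        int length;
        while ((length = inStream.read(buffer)) != -1) {
            outStream.write(buffer, 0, length);   // no per-chunk flush
        }
        outStream.flush();                        // once, at the end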

    Read the article