Search Results

Search found 4304 results on 173 pages for 'bytes'.

  • objective-c uncompress/inflate an NSData object which contains a gzipped string

    - by user141146
    Hi, I'm using Objective-C (iPhone) to read a sqlite table that contains blob data. The blob data is actually a gzipped string. The blob data is read into memory as an NSData object. I then use a method (gzipInflate) which I grabbed from here (http://www.cocoadev.com/index.pl?NSDataCategory) -- see method below. However, the gzipInflate method is returning nil. If I step through the gzipInflate method, I can see that the status returned near the bottom of the method is -3, which I assume is some type of error (Z_STREAM_END = 1 and Z_OK = 0, but I don't know precisely what a status of -3 is). Consequently, the function returns nil even though the input NSData is 841 bytes in length. Any thoughts on what I'm doing wrong? Is there something wrong with the method? Something wrong with how I'm using the method? Any thoughts on how to test this? Thanks for any help!

        - (NSData *)gzipInflate
        {
            // NSData *compressed_data = [self.description dataUsingEncoding:NSUTF16StringEncoding];
            NSData *compressed_data = self.description;
            if ([compressed_data length] == 0) return compressed_data;

            unsigned full_length = [compressed_data length];
            unsigned half_length = [compressed_data length] / 2;

            NSMutableData *decompressed = [NSMutableData dataWithLength:full_length + half_length];
            BOOL done = NO;
            int status;

            z_stream strm;
            strm.next_in = (Bytef *)[compressed_data bytes];
            strm.avail_in = [compressed_data length];
            strm.total_out = 0;
            strm.zalloc = Z_NULL;
            strm.zfree = Z_NULL;

            if (inflateInit2(&strm, (15+32)) != Z_OK) return nil;
            while (!done) {
                // Make sure we have enough room and reset the lengths.
                if (strm.total_out >= [decompressed length])
                    [decompressed increaseLengthBy:half_length];
                strm.next_out = [decompressed mutableBytes] + strm.total_out;
                strm.avail_out = [decompressed length] - strm.total_out;

                // Inflate another chunk.
                status = inflate(&strm, Z_SYNC_FLUSH);
                NSLog(@"zstreamend = %d", Z_STREAM_END);
                if (status == Z_STREAM_END) done = YES;
                else if (status != Z_OK) break;  // status here actually is -3
            }
            if (inflateEnd(&strm) != Z_OK) return nil;

            // Set real length.
            if (done) {
                [decompressed setLength:strm.total_out];
                return [NSData dataWithData:decompressed];
            }
            else return nil;
        }
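
    For reference, zlib's status codes are defined in zlib.h: Z_STREAM_END is 1, Z_OK is 0, and -3 is Z_DATA_ERROR, which means the bytes handed to inflate were not recognized as a valid zlib/gzip stream. A minimal plain-C sketch of the same windowBits trick (15+32 enables automatic gzip/zlib header detection) follows; the single-shot call and caller-supplied output buffer are simplifications, not a drop-in replacement for the category method above.

        #include <string.h>
        #include <zlib.h>

        /* Inflate a gzip- or zlib-wrapped buffer into out. Returns the number
         * of bytes written, or a negative zlib code such as Z_DATA_ERROR (-3). */
        static int gunzip_buffer(const unsigned char *in, size_t in_len,
                                 unsigned char *out, size_t out_len)
        {
            z_stream strm;
            memset(&strm, 0, sizeof strm);    /* also zeroes zalloc/zfree/opaque */
            strm.next_in   = (Bytef *)in;
            strm.avail_in  = (uInt)in_len;
            strm.next_out  = out;
            strm.avail_out = (uInt)out_len;

            /* 15 + 32: maximum window size, auto-detect gzip or zlib header. */
            if (inflateInit2(&strm, 15 + 32) != Z_OK)
                return Z_STREAM_ERROR;

            int status = inflate(&strm, Z_FINISH);
            inflateEnd(&strm);

            if (status != Z_STREAM_END)
                return status < 0 ? status : Z_DATA_ERROR;
            return (int)strm.total_out;
        }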

  • How do I obtain a code point integer from a 1 to 4 byte UTF-8 encoded sequence in Windows?

    - by Patrick Niedzielski
    Hello, I am Patrick Niedzielski, a programmer for the Free Software 3D adventure game Humm and Strumm. I'm working on a minimal Unicode character class in C++. I currently have an array of four bytes representing a UTF-8 sequence. On GNU/Linux, I can just convert to UTF-32 with iconv(), but on Windows, I cannot do this. Is it possible to convert the array to a single code point? Thanks, Patrick
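
    If pulling in a conversion library is not an option, a 1-to-4-byte UTF-8 sequence can be decoded by hand; a rough C sketch follows (it omits validation of overlong forms, surrogates, and truncated input, which a real implementation should add). On Windows, MultiByteToWideChar(CP_UTF8, ...) is another route, though it produces UTF-16, so code points above U+FFFF come back as surrogate pairs.

        #include <stdint.h>

        /* Decode one UTF-8 sequence (1-4 bytes) into a Unicode code point.
         * Returns U+FFFD (the replacement character) on a malformed lead byte. */
        static uint32_t utf8_to_codepoint(const unsigned char *s)
        {
            if (s[0] < 0x80)                /* 0xxxxxxx: ASCII */
                return s[0];
            if ((s[0] & 0xE0) == 0xC0)      /* 110xxxxx 10xxxxxx */
                return ((uint32_t)(s[0] & 0x1F) << 6) | (s[1] & 0x3F);
            if ((s[0] & 0xF0) == 0xE0)      /* 1110xxxx 10xxxxxx 10xxxxxx */
                return ((uint32_t)(s[0] & 0x0F) << 12) |
                       ((uint32_t)(s[1] & 0x3F) << 6) |
                       (s[2] & 0x3F);
            if ((s[0] & 0xF8) == 0xF0)      /* 11110xxx + three continuation bytes */
                return ((uint32_t)(s[0] & 0x07) << 18) |
                       ((uint32_t)(s[1] & 0x3F) << 12) |
                       ((uint32_t)(s[2] & 0x3F) << 6) |
                       (s[3] & 0x3F);
            return 0xFFFD;
        }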

  • What should I do or not do to avoid the Delphi "push dword" bug?

    - by Maksee
    I found that Delphi 5 generates invalid assembly code in specific cases, and I can't work out which cases in general. The example below produces an access violation because a very strange optimization occurs. For a byte in a record or array, Delphi generates push dword [...], pop ebx, mov .., bl, which works correctly if there are data after this byte (we need at least three more to push a dword correctly), but fails if that memory is inaccessible. I emulated the strict boundaries here with the Win32 Virtual* functions. Specifically, the error occurs when the last byte of the block is accessed inside the FeedBytesToClass procedure. And if I change something, such as using a data array instead of the object property or removing the actionFlag variable, Delphi generates correct assembly instructions.

        const
          BlockSize = 4096;

        type
          TSomeClass = class
          private
            fBytes: PByteArray;
          public
            property Bytes: PByteArray read fBytes;
            constructor Create;
            destructor Destroy; override;
          end;

        constructor TSomeClass.Create;
        begin
          inherited Create;
          GetMem(fBytes, BlockSize);
        end;

        destructor TSomeClass.Destroy;
        begin
          FreeMem(fBytes);
          inherited;
        end;

        procedure FeedBytesToClass(SrcDataBytes: PByteArray; Count: integer);
        var
          j: integer;
          Ofs: integer;
          actionFlag: boolean;
          AClass: TSomeClass;
        begin
          AClass := TSomeClass.Create;
          try
            actionFlag := true;
            for j := 0 to Count - 1 do
            begin
              Ofs := j;
              if actionFlag then
              begin
                AClass.Bytes[Ofs] := SrcDataBytes[j];
              end;
            end;
          finally
            AClass.Free;
          end;
        end;

        procedure TForm31.Button1Click(Sender: TObject);
        var
          SrcDataBytes: PByteArray;
        begin
          SrcDataBytes := VirtualAlloc(Nil, BlockSize, MEM_COMMIT, PAGE_READWRITE);
          try
            if VirtualLock(SrcDataBytes, BlockSize) then
            try
              FeedBytesToClass(SrcDataBytes, BlockSize);
            finally
              VirtualUnLock(SrcDataBytes, BlockSize);
            end;
          finally
            VirtualFree(SrcDataBytes, MEM_DECOMMIT, BlockSize);
          end;
        end;

    Initially the error occurred when I accessed the RGB data of bitmap bits, but the code there is too complex, so I narrowed it down to this fragment. So the question is: what here is so specific that it makes Delphi produce the push/pop/mov optimization? I need to know this in order to avoid such side effects in general.

  • How to save an Excel file in MS SQL Server 2005

    - by Anish Mohan
    I have a table with two columns: 1. import type, 2. Excel import template. The second column, "Excel import template", should store the whole Excel file. How would I save an Excel file in the database? Can I use a binary datatype column, convert the Excel file to bytes, and save it that way? Thanks in advance!
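
    A column of type varbinary(max), available from SQL Server 2005 onwards, is the usual choice for storing a whole file. A hedged T-SQL sketch follows; the table, column, and file names are invented for illustration, and OPENROWSET(BULK ...) reads the file on the server machine, so it needs appropriate permissions there.

        CREATE TABLE ImportTemplates (
            ImportType    nvarchar(50)   NOT NULL,
            ExcelTemplate varbinary(max) NULL  -- raw bytes of the .xls file
        );

        INSERT INTO ImportTemplates (ImportType, ExcelTemplate)
        SELECT 'Products', BulkColumn
        FROM OPENROWSET(BULK N'C:\templates\products.xls', SINGLE_BLOB) AS f;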

  • LF/CR issue with RS232 in Linux

    - by Albinoswordfish
    I've been having this problem where any time I send a 0xA through an RS-232 port under Linux, the receiver interprets it as 2 bytes, 0xD and 0xA. Also, whenever I receive a 0xD, the serial port interprets it as 0xA. I've been reading that there are known issues regarding this; has anybody been able to find a solution?
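
    This is usually the tty line discipline rather than the UART itself: in canonical ("cooked") mode, the kernel maps CR to NL on input (the ICRNL flag) and NL to CR-NL on output (ONLCR). Putting the port into raw mode turns all of that off; a C sketch using termios, where the device path and baud rate are assumptions:

        #include <fcntl.h>
        #include <termios.h>
        #include <unistd.h>

        /* Open a serial port with the kernel's CR/LF translation disabled. */
        int open_serial_raw(const char *dev)   /* e.g. "/dev/ttyS0" */
        {
            int fd = open(dev, O_RDWR | O_NOCTTY);
            if (fd < 0)
                return -1;

            struct termios tio;
            if (tcgetattr(fd, &tio) < 0) {
                close(fd);
                return -1;
            }

            cfmakeraw(&tio);            /* clears ICRNL, INLCR, ONLCR, ICANON, ... */
            cfsetispeed(&tio, B9600);   /* baud rate is an assumption */
            cfsetospeed(&tio, B9600);

            if (tcsetattr(fd, TCSANOW, &tio) < 0) {
                close(fd);
                return -1;
            }
            return fd;
        }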

  • Byte = 8 bits, but why doesn't BitConverter think so

    - by Paul Farry
    Given the following information:

        Public Enum Request As Byte
            None = 0
            Identity = 1
            License = 2
        End Enum

        Protected mType As Communication.Request

        mType = Communication.Request.Identity
        Debug.Print(BitConverter.GetBytes(mType).Length.ToString)  ' prints 2

    Why does BitConverter report that mType has a length of 2? I would have thought that passing a Byte into BitConverter.GetBytes would just return the Byte. I mean, it's no big deal, because it's only sending a very small block of data across a TCP socket, but I'm just intrigued why it thinks it's 2 bytes.

  • Converting an int to a BCD byte array

    - by Lily
    I want to convert an int to a byte[2] array using BCD. The int in question will come from a DateTime representing the year and must be converted to two bytes. Is there any pre-made function that does this, or can you give me a simple way of doing it? Example:

        int year = 2010

    would output:

        byte[2] { 0x20, 0x10 };
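
    I don't know of a standard-library BCD helper, but packing two decimal digits per byte is short enough to do by hand; a C sketch for a 4-digit value such as a year:

        #include <stdint.h>

        /* Encode a value in 0..9999 as two packed-BCD bytes:
         * 2010 -> { 0x20, 0x10 }. */
        void year_to_bcd(unsigned year, uint8_t out[2])
        {
            out[0] = (uint8_t)((((year / 1000) % 10) << 4) | ((year / 100) % 10));
            out[1] = (uint8_t)((((year / 10)   % 10) << 4) | (year % 10));
        }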

  • example of a utf-8 format octet string

    - by erik
    I'm working with a function that expects a string formatted as a UTF-8 encoded octet string. Can someone give me an example of what a UTF-8 encoded octet string would look like? Put another way: if I convert 'foo' to bytes, I get 102, 111, 111. What would these char codes look like as a UTF-8 encoded octet string? Would it be "0x66 0x6f 0x6f"? Thanks
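
    For ASCII text, the UTF-8 octets are identical to the ASCII codes, so "foo" is exactly 0x66 0x6F 0x6F; the encoding only differs for code points above U+007F, which take two to four octets each. A small C illustration (the second string is U+00E9, 'é', written out as its two UTF-8 octets):

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            const char *ascii    = "foo";       /* UTF-8 == ASCII here */
            const char *accented = "\xC3\xA9";  /* U+00E9 as two octets */

            for (size_t i = 0; i < strlen(ascii); i++)
                printf("0x%02X ", (unsigned char)ascii[i]);    /* 0x66 0x6F 0x6F */
            printf("\n");

            for (size_t i = 0; i < strlen(accented); i++)
                printf("0x%02X ", (unsigned char)accented[i]); /* 0xC3 0xA9 */
            printf("\n");
            return 0;
        }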

  • How to load a Bluetooth .apk file onto an Android G1 phone from Linux?

    - by Praween k
    Hi, I want to install a Bluetooth application on my G1 device in a Linux environment. Can anybody provide me the procedure for this? Whenever I install the application, the following error is thrown:

        adb install /home/parveen/workspace/BluetoothChat/bin/BluetoothChat.apk
        337 KB/s (28084 bytes in 0.081s)
                pkg: /data/local/tmp/BluetoothChat.apk
        Failure [INSTALL_FAILED_ALREADY_EXISTS]

    Please help me out of this? Thanks in advance, Praween.
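
    INSTALL_FAILED_ALREADY_EXISTS means a package with the same name is already installed on the device; adb can likely replace it in place with the -r flag, keeping the app's data:

        adb install -r /home/parveen/workspace/BluetoothChat/bin/BluetoothChat.apk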

  • Server doesn't notice when client closes socket (.NET CF & GPRS)

    - by HansA
    Client written in .NET Compact Framework running over GPRS connection. Client connects a socket to the server. The server accepts the connection. Client sends 62 bytes of data and then closes the socket. Server never detects that the client has closed the socket and is therefore not able to know that the transfer has completed. This code works fine when run over a wireless connection. Any ideas?

  • Sharing a file from Android to Gmail or to Dropbox

    - by Calaf
    To share a simple text file, I started by copying verbatim from FileProvider's manual page:

        <application
            android:allowBackup="true"
            android:icon="@drawable/ic_launcher"
            android:label="@string/app_name"
            android:theme="@style/AppTheme" >
            <provider
                android:name="android.support.v4.content.FileProvider"
                android:authorities="com.mycorp.helloworldtxtfileprovider.MainActivity"
                android:exported="false"
                android:grantUriPermissions="true" >
                <meta-data
                    android:name="android.support.FILE_PROVIDER_PATHS"
                    android:resource="@xml/my_paths" />
            </provider>
            <activity
                android:name="com.mycorp.helloworldtxtfileprovider.MainActivity"
                ...

    Then I saved a text file and used, again nearly verbatim, the code under "Sending binary content". (Notice that this applies more accurately in this case than "Sending text content", since we are sending a file, which happens to be a text file, rather than just a string of text.) For the convenience of duplication on your side, and since the code is in any case so brief, I'm including it here in full.

        public class MainActivity extends Activity {
            @Override
            protected void onCreate(Bundle savedInstanceState) {
                super.onCreate(savedInstanceState);
                setContentView(R.layout.activity_main);

                String filename = "hellow.txt";
                String fileContents = "Hello, World!\n";
                byte[] bytes = fileContents.getBytes();
                FileOutputStream fos = null;
                try {
                    fos = this.openFileOutput(filename, MODE_PRIVATE);
                    fos.write(bytes);
                } catch (IOException e) {
                    e.printStackTrace();
                } finally {
                    try {
                        fos.close();
                    } catch (IOException e) {
                        e.printStackTrace();
                    }
                }

                File file = new File(filename);
                Intent shareIntent = new Intent();
                shareIntent.setAction(Intent.ACTION_SEND);
                shareIntent.putExtra(Intent.EXTRA_STREAM, Uri.fromFile(file));
                shareIntent.setType("application/txt");
                startActivity(Intent.createChooser(shareIntent,
                        getResources().getText(R.string.send_to)));
                file.delete();
            }
        }

    Aside from adding a value for send_to in res/values/strings.xml, the only other change I made to the generic Hello, World that Eclipse creates was to add the following in res/xml/my_paths.xml (as described on the page previously referenced):

        <paths xmlns:android="http://schemas.android.com/apk/res/android">
            <Files-path name="files" path="." />
        </paths>

    This code runs fine and shows a list of intent recipients, but sending the text file to either Dropbox or Gmail fails. Dropbox sends the notification "Uploading to Dropbox" followed by "Upload failed: my_file.txt". After "sending message..", Gmail sends "Couldn't send attachment". What is wrong?

  • C# Begin/EndReceive - how do I read large data?

    - by ryeguy
    When reading data in chunks of, say, 1024 bytes, how do I continue to read from a socket that receives a message bigger than 1024 bytes until there is no data left? Should I just use BeginReceive to read a packet's length prefix only, and then once that is retrieved, use Receive() (in the async thread) to read the rest of the packet? Or is there another way?

    Edit: I thought Jon Skeet's link had the solution, but there is a bit of a speedbump with that code. The code I used is:

        public class StateObject {
            public Socket workSocket = null;
            public const int BUFFER_SIZE = 1024;
            public byte[] buffer = new byte[BUFFER_SIZE];
            public StringBuilder sb = new StringBuilder();
        }

        public static void Read_Callback(IAsyncResult ar) {
            StateObject so = (StateObject) ar.AsyncState;
            Socket s = so.workSocket;
            int read = s.EndReceive(ar);
            if (read > 0) {
                so.sb.Append(Encoding.ASCII.GetString(so.buffer, 0, read));
                if (read == StateObject.BUFFER_SIZE) {
                    s.BeginReceive(so.buffer, 0, StateObject.BUFFER_SIZE, 0,
                                   new AsyncCallback(Async_Send_Receive.Read_Callback), so);
                    return;
                }
            }
            if (so.sb.Length > 0) {
                // All of the data has been read, so display it to the console
                string strContent;
                strContent = so.sb.ToString();
                Console.WriteLine(String.Format("Read {0} byte from socket" + "data = {1} ",
                                                strContent.Length, strContent));
            }
            s.Close();
        }

    Now this corrected version works fine most of the time, but it fails when the packet's size is a multiple of the buffer size. The reason is that if the buffer gets filled on a read, it is assumed there is more data; but then the same problem happens as before. A 2-byte buffer, for example, gets filled twice on a 4-byte packet and assumes there is more data. It then blocks because there is nothing left to read. The problem is that the receive function doesn't know when the end of the packet is.

    This got me thinking about two possible solutions: I could either have an end-of-packet delimiter, or I could read the packet header to find the length and then receive exactly that amount (as I originally suggested). There are problems with each of these, though. I don't like the idea of using a delimiter, as a user could somehow work one into a packet in an input string from the app and screw it up. It also just seems kinda sloppy to me. The length header sounds OK, but I'm planning on using protocol buffers, and I don't know the format of the data. Is there a length header? How many bytes is it? Would this be something I implement myself? Etc. What should I do?
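
    The length-prefix route is the standard fix: protocol buffer messages are not self-delimiting, so the framing is yours to define, and a fixed-size header carrying the payload length is the usual choice. A C sketch of the receive side, assuming a 4-byte big-endian length prefix (the header format is an assumption, not anything protocol buffers mandates):

        #include <stdint.h>
        #include <sys/types.h>
        #include <sys/socket.h>

        /* Read exactly len bytes, looping over short reads.
         * Returns 0 on success, -1 on error or premature close. */
        static int recv_all(int fd, void *buf, size_t len)
        {
            char *p = buf;
            while (len > 0) {
                ssize_t n = recv(fd, p, len, 0);
                if (n <= 0)          /* 0: peer closed; <0: error */
                    return -1;
                p += n;
                len -= (size_t)n;
            }
            return 0;
        }

        /* Receive one length-prefixed frame into buf (capacity cap).
         * Returns the payload length, or -1 on error or oversized frame. */
        static long recv_frame(int fd, void *buf, size_t cap)
        {
            uint8_t hdr[4];
            if (recv_all(fd, hdr, sizeof hdr) < 0)
                return -1;
            uint32_t len = ((uint32_t)hdr[0] << 24) | ((uint32_t)hdr[1] << 16) |
                           ((uint32_t)hdr[2] << 8)  |  (uint32_t)hdr[3];
            if (len > cap)
                return -1;
            if (recv_all(fd, buf, len) < 0)
                return -1;
            return (long)len;
        }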

  • How to install/replace on Android without using Eclipse

    - by Peter vdL
    A buddy sent me a later version of a .apk file. I already had the earlier version on my device. When I tried to adb install the file, I got this:

        $ adb install ../FlashLite.apk
        320 KB/s (18311 bytes in 0.055s)
                pkg: /data/local/tmp/FlashLite.apk
        Failure [INSTALL_FAILED_ALREADY_EXISTS]
        $ adb uninstall FlashLite.apk
        Failure
        $ adb uninstall /data/local/tmp/FlashLite.apk
        Failure

    How do you install/replace from the cmd line? I don't have the source, so I cannot do it from Eclipse. Thanks, Peter
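
    Two adb details likely matter here: install -r reinstalls an existing app in place, and uninstall takes the application's package name (from its manifest), not the .apk file name or path. Hedged examples, with the package name invented for illustration:

        adb install -r ../FlashLite.apk
        adb uninstall com.example.flashlite   # package name, not the .apk path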

  • How to maximize http.sys file upload performance

    - by anelson
    I'm building a tool that transfers very large streaming data sets (possibly on the order of terabytes in a single stream; routinely in the tens of gigabytes) from one server to another. The client portion of the tool will read blocks from the source disk and send them over the network. The server side will read these blocks off the network and write them to a file on the server disk.

    Right now I'm trying to decide which transport to use. Options are raw TCP and HTTP. I really, REALLY want to be able to use HTTP. The HttpListener (or WCF, if I want to go that route) makes it easy to plug in to the HTTP Server API (http.sys), and I can get things like authentication and SSL for free. The problem right now is performance.

    I wrote a simple test harness that sends 128K blocks of NULL bytes using the BeginWrite/EndWrite async I/O idiom, with async BeginRead/EndRead on the server side. I've modified this test harness so I can do this either with HTTP PUT operations via HttpWebRequest/HttpListener, or with plain old socket writes using TcpClient/TcpListener. To rule out issues with network cards or network pathways, both the client and server are on one machine and communicate over localhost.

    On my 12-core Windows 2008 R2 test server, the TCP version of this test harness can push bytes at 450MB/s, with minimal CPU usage. On the same box, the HTTP version of the test harness runs between 130MB/s and 200MB/s, depending upon how I tweak it. In both cases CPU usage is low, and the vast majority of what CPU usage there is is kernel time, so I'm pretty sure my usage of C# and the .NET runtime is not the bottleneck. The box has two 6-core Xeon X5650 processors, 24GB of single-ranked DDR3 RAM, and is used exclusively by me for my own performance testing.

    I already know about HTTP client tweaks like ServicePointManager.MaxServicePointIdleTime, ServicePointManager.DefaultConnectionLimit, ServicePointManager.Expect100Continue, and HttpWebRequest.AllowWriteStreamBuffering. Does anyone have any ideas for how I can get http.sys performance beyond 200MB/s? Has anyone seen it perform this well in any environment?

  • casting udp streams in perl

    - by user314536
    Hello. My Perl script gets a UDP response that is built out of 2 integers plus float numbers. The problem is that the UDP response is one long stream of bytes. How do I cast the stream into parameters using Perl?
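
    Perl's unpack is the usual tool for this; a template such as unpack("l l f", $data) pulls out two signed 32-bit integers followed by a native float (the exact template letters depend on the sender's field sizes and byte order). The underlying idea, shown as a C sketch for clarity, with the field layout assumed:

        #include <stdint.h>
        #include <string.h>

        /* Decode two 32-bit integers followed by a float from a raw
         * datagram. Byte order is assumed to match the sender's. */
        struct reply { int32_t a; int32_t b; float c; };

        static struct reply parse_reply(const unsigned char *buf)
        {
            struct reply r;
            memcpy(&r.a, buf + 0, sizeof r.a);
            memcpy(&r.b, buf + 4, sizeof r.b);
            memcpy(&r.c, buf + 8, sizeof r.c);
            return r;
        }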

  • read a binary file (python)

    - by beratch
    Hi, I can't read a file, and I don't understand why:

        f = open("test/test.pdf", "r")
        data = list(f.read())
        print data

    Returns: []

    I would like to open a PDF, extract every byte, and put it in a list. What's wrong with my code? :( Thanks,
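
    One thing worth checking first: the file is opened in text mode ("r"). For binary data such as a PDF the mode should likely be "rb"; on Windows, text mode translates line endings and stops reading at a 0x1A byte, which mangles or truncates binary files. The same distinction exists in C's fopen; a C sketch of reading a whole file as raw bytes:

        #include <stdio.h>
        #include <stdlib.h>

        /* Read an entire file into a malloc'd buffer ("rb" matters on
         * Windows, where text mode mangles binary data). Returns NULL
         * on failure; on success, stores the length in *out_len. */
        static unsigned char *read_file(const char *path, long *out_len)
        {
            FILE *f = fopen(path, "rb");
            if (!f)
                return NULL;

            fseek(f, 0, SEEK_END);
            long len = ftell(f);
            fseek(f, 0, SEEK_SET);

            unsigned char *buf = malloc((size_t)len);
            if (buf && fread(buf, 1, (size_t)len, f) != (size_t)len) {
                free(buf);
                buf = NULL;
            }
            fclose(f);
            if (buf)
                *out_len = len;
            return buf;
        }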

  • How to integrate Purify into Hudson CI?

    - by Martin
    Hello everybody! I have a Hudson CI system set up, and for the moment it is used for building a project and running some unit tests. My next step is to integrate the memory leak detector Purify into the build cycle. I want to start the unit tests inside Purify, and for this I have created a new batch task which runs the following command:

        purify.exe /SaveTextData MyExecutable.exe --test TestLibrary.dll --output xml

    As I read in the Purify documentation, the /SaveTextData option is used in order to run Purify not in GUI mode. If I run this command on my local workstation on the command line, it works perfectly. But when it is started by Hudson, nothing happens. Unfortunately there are no logs from Purify... Has anyone ever tried to start Purify either from Hudson or any other CI system? Thanks in advance. Best regards, Martin

    EDIT: I forgot to mention that I have Hudson running as master and slave on different computers. On the master I have configured a task which should start the unit tests within Purify on the slave. I am running the slave via JNLP.

    EDIT 18.03.2010: OK, so finally I am a bit closer to the source of the problem. I have discovered that when running my unit tests in Purify locally, the log file EngineCmdLine.log contains three commands. I am starting Purify with the following command:

        purify.exe /SaveTextData TestRunnerConsoleWD.exe --test TestDemoWD.dll

    Output of EngineCmdLine.log when starting Purify manually:

        File: D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestRunnerConsoleWD.exe
        File: C:\WINDOWS\system32\ws2_32.dll
        File: D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestDemoWD.dll

    Output when starting via Hudson:

        File: D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestRunnerConsoleWD.exe
        File: C:\WINDOWS\system32\ws2_32.dll
        File: D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestDemoWD.dll
        File: D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestDemoWD.dll

    The error output of Purify:

        Instrumenting: BtcTestDemoWD.dll  313856 bytes
        Purify: While processing file D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TESTFWWD.DLL:
        Error: Cannot replace file c:\Programme\IBM\RationalPurifyPlus\PurifyPlus\cache\BTCTESTFWWD$Purify_D_workspace_hudson_workspace_Purify_TestFW_CommonsCoreTest_Cpp_msvs9.DLL. Is it in use?
        TESTFWWD.DLL  505344 bytes
        Unable to instrument D:\workspace\hudson\workspace\Purify_TestFW_CommonsCoreTest_Cpp_msvs9\TestDemoWD.dll (0x1)

    The question is: why is Purify issuing a command for the TestDemoWD.dll library twice?

  • Unable to capture standard output of process using Boost.Process

    - by Chris Kaminski
    I am currently using Boost.Process from the Boost sandbox, and am having issues getting it to capture my standard output properly; wondering if someone can give me a second pair of eyeballs on what I might be doing wrong. I'm trying to take thumbnails out of RAW camera images using dcraw (latest version) and capture them for conversion to Qt QImages. The process launch function:

        namespace bf = ::boost::filesystem;
        namespace bp = ::boost::process;

        QImage DCRawInterface::convertRawImage(string path) {
            // command line: dcraw -e -c <srcfile> -> piped to stdout.
            if ( bf::exists( path ) ) {
                std::string exec = "bin\\dcraw.exe";
                std::vector<std::string> args;
                args.push_back("-v");
                args.push_back("-c");
                args.push_back("-e");
                args.push_back(path);

                bp::context ctx;
                ctx.stdout_behavior = bp::capture_stream();
                bp::child c = bp::launch(exec, args, ctx);
                bp::pistream &is = c.get_stdout();

                ofstream output("C:\\temp\\testcfk.jpg");
                streamcopy(is, output);
            }
            return (NULL);
        }

        inline void streamcopy(std::istream& input, std::ostream& out) {
            char buffer[4096];
            int i = 0;
            while ( !input.eof() ) {
                memset(buffer, 0, sizeof(buffer));
                int bytes = input.readsome(buffer, sizeof buffer);
                out.write(buffer, bytes);
                i++;
            }
        }

    Invoking the converter:

        DCRawInterface DcRaw;
        DcRaw.convertRawImage("test/CFK_2439.NEF");

    The goal is to simply verify that I can copy the input stream to an output file. Currently, if I comment out the following line:

        args.push_back("-c");

    then the thumbnail is written by dcraw to the source directory with a name of CFK_2439.thumb.jpg, which proves to me that the process is getting invoked with the right arguments. What's not happening is connecting to the output pipe properly. FWIW: I'm performing this test on Windows XP under Eclipse 3.5/latest MinGW (GCC 4.4).

  • Create big buffer on a pic18f with microchip c18 compiler

    - by acemtp
    Using the Microchip C18 compiler with a PIC18F, I want to create a "big" buffer of 3000 bytes in the program data space. If I put this in main() (on the stack):

        char tab[127];

    I get this error:

        Error [1300] stack frame too large

    If I make it global, I get this error:

        Error - section '.udata_main.o' can not fit the section. Section '.udata_main.o' length=0x0000007f

    How do I create a big buffer? Do you have a tutorial on how to manage big buffers on a PIC18F with C18?

  • Facing Memory Leaks in AES Encryption Method.

    - by Mubashar Ahmad
    Can anyone please identify whether there are any possible memory leaks in the following code? I have tried .NET Memory Profiler and it says "CreateEncryptor" and some other functions are leaving unmanaged memory leaks, which I have confirmed using Performance Monitor, but Dispose, Clear, and Close calls are already placed wherever possible. Please advise me accordingly; it's urgent.

        public static string Encrypt(string plainText, string key)
        {
            // Set up the encryption objects
            byte[] encryptedBytes = null;
            using (AesCryptoServiceProvider acsp = GetProvider(Encoding.UTF8.GetBytes(key)))
            {
                byte[] sourceBytes = Encoding.UTF8.GetBytes(plainText);
                using (ICryptoTransform ictE = acsp.CreateEncryptor())
                {
                    // Set up stream to contain the encryption
                    using (MemoryStream msS = new MemoryStream())
                    {
                        // Perform the encryption, storing output into the stream
                        using (CryptoStream csS = new CryptoStream(msS, ictE, CryptoStreamMode.Write))
                        {
                            csS.Write(sourceBytes, 0, sourceBytes.Length);
                            csS.FlushFinalBlock();
                            // sourceBytes are now encrypted as an array of secure bytes
                            encryptedBytes = msS.ToArray(); // .ToArray() is important, don't mess with the buffer
                            csS.Close();
                        }
                        msS.Close();
                    }
                }
                acsp.Clear();
            }
            // return the encrypted bytes as a BASE64 encoded string
            return Convert.ToBase64String(encryptedBytes);
        }

        private static AesCryptoServiceProvider GetProvider(byte[] key)
        {
            AesCryptoServiceProvider result = new AesCryptoServiceProvider();
            result.BlockSize = 128;
            result.KeySize = 256;
            result.Mode = CipherMode.CBC;
            result.Padding = PaddingMode.PKCS7;
            result.GenerateIV();
            result.IV = new byte[] { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 };

            byte[] RealKey = GetKey(key, result);
            result.Key = RealKey;
            // result.IV = RealKey;
            return result;
        }

        private static byte[] GetKey(byte[] suggestedKey, SymmetricAlgorithm p)
        {
            byte[] kRaw = suggestedKey;
            List<byte> kList = new List<byte>();
            for (int i = 0; i < p.LegalKeySizes[0].MaxSize; i += 8)
            {
                kList.Add(kRaw[(i / 8) % kRaw.Length]);
            }
            byte[] k = kList.ToArray();
            return k;
        }

  • Non ASCII char in PHP?

    - by domagoj412
    Hello, I am trying to send something to a serial port (RS-232) with PHP. I am using this class: http://www.phpclasses.org/browse/package/3679.html The problem is that I am allowed to send only 1 byte. But if I send something like "1", I am actually sending 49 (the ASCII code for 1). Instead of send("1"), I tried send(1), but that is no good, because this is an integer, which has 2 bytes. So is there a way to send a "real" char, not the ASCII equivalent?

  • How to get the page size from a MemoryStream or bytes?

    - by Nakul Chaudhary
    I have images of an ActiveReports report in a database. When I get them into bytes and convert them into a memory stream, the stream can be passed to the ActiveReports viewer. How do I then get the paper size of the pages displayed in the report? My code:

        Dim repmem As New System.IO.MemoryStream(rptBytes)
        repmem.Position = 0
        Viewer1.Document.Load(repmem)

  • SocketChannel in Java sends data, but it doesn't get to the destination application

    - by Peterson
    Hi everybody, I'm struggling to create a simple ChatServer in Java using the NIO libraries; I wonder if someone could help me. I am handling multiple clients in a single thread, using SocketChannel and Selector. The problem is: I am able to accept new connections and read their data, but when I try to send data back, the SocketChannel simply doesn't work. The write() method returns an integer that is the same size as the data I'm passing to it, but the client never receives the data. Strangely, when I close the server application, the client receives the data. It's as if the SocketChannel maintains a buffer that only gets flushed when I close the application. Here are some more details, to give you more information to help.

    I'm handling the events in this piece of code:

        private void run() throws IOException {
            ServerSocketChannel ssc = ServerSocketChannel.open();

            // Set it to non-blocking, so we can use select
            ssc.configureBlocking(false);

            // Get the Socket connected to this channel, and bind it
            // to the listening port
            this.serverSocket = ssc.socket();
            InetSocketAddress isa = new InetSocketAddress(this.port);
            serverSocket.bind(isa);

            // Create a new Selector for selecting
            this.masterSelector = Selector.open();

            // Register the ServerSocketChannel, so we can
            // listen for incoming connections
            ssc.register(masterSelector, SelectionKey.OP_ACCEPT);

            while (true) {
                // See if we've had any activity -- either an incoming
                // connection, or incoming data on an existing connection
                int num = masterSelector.select();

                // If we don't have any activity, loop around and wait again
                if (num == 0) {
                    continue;
                }

                // Get the keys corresponding to the activity that has been
                // detected, and process them one by one
                Set keys = masterSelector.selectedKeys();
                Iterator it = keys.iterator();
                while (it.hasNext()) {
                    // Get a key representing one of the bits of I/O activity
                    SelectionKey key = (SelectionKey) it.next();

                    // What kind of activity is it?
                    if ((key.readyOps() & SelectionKey.OP_ACCEPT) == SelectionKey.OP_ACCEPT) {
                        // Accept the connection
                        Socket s = serverSocket.accept();
                        System.out.println("LOG: Conexao TCP aceita de " + s.getInetAddress() + ":" + s.getPort());

                        // Make sure to make it non-blocking, so we can
                        // use a selector on it.
                        SocketChannel sc = s.getChannel();
                        sc.configureBlocking(false);

                        // Register the connection with the selector, for reading only
                        sc.register(masterSelector, SelectionKey.OP_READ);
                    } else if (key.isReadable()) {
                        // It's incoming data on a connection, so process it
                        SocketChannel sc = (SocketChannel) key.channel();

                        // Check whether the connection belongs to an existing client
                        if ((clientsMap.getClient(key)) != null) {
                            boolean closedConnection = !processIncomingClientData(key);
                            if (closedConnection) {
                                int id = clientsMap.getClient(key);
                                closeClient(id);
                            }
                        } else {
                            boolean clientAccepted = processIncomingDataFromNewClient(key);
                            if (!clientAccepted) {
                                // If the client was not accepted, its connection is simply closed
                                sc.socket().close();
                                sc.close();
                                key.cancel();
                            }
                        }
                    }
                }

                // We remove the selected keys, because we've dealt with them.
                keys.clear();
            }
        }

    This piece of code simply handles new clients that want to connect to the chat. A client makes a TCP connection to the server, and once it gets accepted, it sends data to the server following a simple text protocol, informing its id and asking to get registered. I handle this in the method processIncomingDataFromNewClient(key). I'm also keeping a map of clients and their connections in a data structure similar to a hashtable. I'm doing that because I need to recover a client id from a connection and a connection from a client id. This can be seen in: clientsMap.getClient(key).

    But the problem itself resides in the method processIncomingDataFromNewClient(key). There, I simply read the data that the client sent to me, validate it, and if it's OK, I send a message back to the client to tell it that it is connected to the chat server. Here is a similar piece of code:

        private boolean processIncomingDataFromNewClient(SelectionKey key) {
            SocketChannel sc = (SocketChannel) key.channel();
            String connectionOrigin = sc.socket().getInetAddress() + ":" + sc.socket().getPort();
            int id = 0; // id of the client
            buf.clear();
            int bytesRead = 0;
            try {
                bytesRead = sc.read(buf);
                if (bytesRead <= 0) {
                    System.out.println("Conexão fechada pelo: " + connectionOrigin);
                    return false;
                }
                System.out.println("LOG: " + bytesRead + " bytes lidos de " + connectionOrigin);
                String msg = new String(buf.array(), 0, bytesRead);
                try {
                    // Do the validation of what the client sent me here,
                    // and get the client id
                } catch (Exception e) {
                    e.printStackTrace();
                    System.out.println("LOG: Oops. Cliente não conhece o protocolo. Fechando a conexão: " + connectionOrigin);
                    System.out.println("LOG: Primeiros 10 caracteres enviados pelo cliente: " + msg);
                    return false;
                }
            } catch (IOException e) {
                System.out.println("LOG: Erro ao ler dados da conexao: " + connectionOrigin);
                System.out.println("LOG: " + e.getLocalizedMessage());
                System.out.println("LOG: Fechando a conexão...");
                return false;
            }

            // If it gets to here, the protocol is ok and we can add the client
            boolean inserted = clientsMap.addClient(key, id);
            if (!inserted) {
                System.out.println("LOG: Não foi possível adicionar o cliente. Ou ele já está conectado ou já têm clientes demais. Id: " + id);
                System.out.println("LOG: Fechando a conexão: " + connectionOrigin);
                return false;
            }
            System.out.println("LOG: Novo cliente conectado! Enviando mensagem de confirmação. Id: " + id + " Conexao: " + connectionOrigin);

            /* Here is the error */
            sendMessage(id, "Servidor pet: connection accepted");
            System.out.println("LOG: Novo cliente conectado! Id: " + id + " Conexao: " + connectionOrigin);
            return true;
        }

    And finally, the method sendMessage looks like this:

        private void sendMessage(int destId, String msg) {
            Charset charset = Charset.forName("ISO-8859-1");
            CharBuffer charBuffer = CharBuffer.wrap(msg, 0, msg.length());
            ByteBuffer bf = charset.encode(charBuffer);
            // bf.flip();
            int bytesSent = 0;
            SelectionKey key = clientsMap.getClient(destId);
            SocketChannel sc = (SocketChannel) key.channel();
            try {
                int total_bytes_sent = 0;
                while (total_bytes_sent < msg.length()) {
                    bytesSent = sc.write(bf);
                    total_bytes_sent += bytesSent;
                }
                System.out.println("LOG: Bytes enviados para o cliente " + destId + ": " + total_bytes_sent + " Tamanho da mensagem: " + msg.length());
            } catch (IOException e) {
                System.out.println("LOG: Erro ao mandar mensagem para: " + destId);
                System.out.println("LOG: " + e.getLocalizedMessage());
            }
        }

    So, what happens is that when the server sends a message, it prints something like this:

        LOG: Bytes sent to the client: 28  Size of the message: 28

    So it claims the data was sent, but the chat client keeps blocking, waiting in the recv() call; the data never gets to it. When I close the server application, though, all the data appears at the client. I wonder why.

    It is important to say that the client is written in C and the server in Java, and I'm running both on the same machine, an Ubuntu guest in VirtualBox under Windows. I have also run both under Windows hosts and under Linux hosts, and I keep getting the same strange problem. I'm sorry for the great length of this question, but I have already searched a lot of places for an answer, found a lot of tutorials and questions, including here at StackOverflow, but couldn't find a reasonable explanation. I am really not liking this Java NIO, and I saw a lot of people complaining about it too. I am thinking that if I had done this in C, it would have been a lot easier :-D

    So, if someone could help me and even discuss this behavior, it would be great! :-) Thanks everybody in advance, Péterson

  • Can I expose the SSMS 2005 Schema Change Report in SharePoint?

    - by dg
    A schema change was made on a production server that generates feeds to our partners, removing two bytes from a field, which clobbered our partners' jobs. My boss wants a notification mechanism to propagate schema changes to everyone, but instead of writing something, I'd like to get the Schema Change History report exposed on SharePoint somehow. Is that possible? Thanks very much for your help. drew
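
    The SSMS Schema Change History report is, as far as I know, just a query over SQL Server's default trace, so the same data can be pulled with T-SQL and surfaced however you like (for example via an SSRS report hosted in SharePoint). A hedged sketch; the event IDs cover object create/delete/alter:

        DECLARE @trace nvarchar(260);
        SELECT @trace = path FROM sys.traces WHERE is_default = 1;

        SELECT StartTime, HostName, LoginName, DatabaseName, ObjectName, EventClass
        FROM fn_trace_gettable(@trace, DEFAULT)
        WHERE EventClass IN (46, 47, 164)  -- Object:Created, Object:Deleted, Object:Altered
        ORDER BY StartTime DESC;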
