Search Results

Search found 1693 results on 68 pages for 'ascii'.


  • Unexpected result from printf

    - by Sandeep
    #include <stdio.h>
    int main() { printf("He %c llo", 65); }

    Output: He A llo

    #include <stdio.h>
    int main() { printf("He %c llo", 13); }

    Output: llo

    It doesn't print "He". I can understand that 65 is the ASCII value of 'A', and hence 'A' is printed in the first case, but why "llo" in the second case? Thanks

    Read the article
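
    Why the second snippet misbehaves: 13 is the ASCII carriage return ('\r'), which moves the terminal cursor back to column 0, so the trailing " llo" overwrites the "He " already printed. A minimal sketch demonstrating this (editor's illustration, not part of the original post):

        #include <stdio.h>

        int main(void) {
            printf("He %c llo\n", 65);  /* 65 is 'A': prints "He A llo"        */
            printf("He %c llo\n", 13);  /* 13 is '\r': the cursor returns to   */
                                        /* column 0 and " llo" overwrites      */
                                        /* "He ", leaving only "llo" visible   */
            return 0;
        }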

  • Ctrl-M chars when transferring files over SFTP

    - by eve
    Hi, I am sending files from a Windows system to a Unix SFTP server using the JSCAPE FTP client. However, I am experiencing the following issue: when uploading a text file from Windows to Unix, each line of the transferred file contains Ctrl-M characters. I did some searching and found that using the "ASCII" transfer mode should solve the issue, but the Ctrl-M is still appearing in the files. Can anyone throw some light on this issue? Thanks in advance

    Read the article
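
    The Ctrl-M characters are the carriage returns (0x0D) of Windows CRLF line endings. If ASCII transfer mode cannot be made to work, the usual fix on the Unix side is dos2unix; failing that, a tiny filter does the same job. A sketch in C (editor's illustration, assuming the files are plain text):

        #include <stdio.h>

        /* Copy stdin to stdout, dropping every carriage return (0x0D),
           the byte that shows up as ^M in editors on Unix. */
        int main(void) {
            int c;
            while ((c = getchar()) != EOF)
                if (c != '\r')
                    putchar(c);
            return 0;
        }

    Usage would be along the lines of: stripcr < file.dos > file.unix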

  • passing unicode string from C# exe to C++ DLL

    - by Martin
    Using this declaration in my C# exe, I try to pass a Unicode string to my C++ DLL:

        [DllImport("Test.dll", CharSet = CharSet.Unicode, CallingConvention = CallingConvention.StdCall)]
        public static extern int xSetTestString(StringBuilder xmlSettings);

    This is the function on the C++ DLL side:

        __declspec(dllexport) int xSetTestString(char* pSettingsXML);

    Before calling the function in C#, I do a MessageBox.Show(string), and it displays all characters properly. On the C++ side I do OutputDebugStringW((wchar_t*)pString);, but that shows that the non-ASCII characters were replaced by '?'.

    Read the article
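
    The mismatch in the post is between CharSet.Unicode on the C# side (UTF-16, i.e. wchar_t on Windows) and char* on the C++ side: the marshaller converts the string to ANSI to fit char*, replacing characters it cannot represent with '?'. A sketch of the C++ export changed to agree with the C# declaration (editor's illustration, assuming the DLL can be modified):

        /* Test.dll side: accept UTF-16 to match CharSet.Unicode in C#. */
        #include <windows.h>

        extern "C" __declspec(dllexport) int __stdcall
        xSetTestString(wchar_t* pSettingsXML)
        {
            OutputDebugStringW(pSettingsXML);  /* now a genuine wide string */
            return 0;
        }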

  • syscall from within GCC inline assembly

    - by guest
    Is it possible to write a single character using a syscall from within an inline assembly block? If so, how? It should look "something" like this:

        __asm__ __volatile__ (
            " movl $1,  %%edx \n\t"
            " movl $80, %%ecx \n\t"
            " movl $0,  %%ebx \n\t"
            " movl $4,  %%eax \n\t"
            " int  $0x80      \n\t"
            ::: "%eax", "%ebx", "%ecx", "%edx" );

    $80 is 'P' in ASCII, but this prints nothing. Any suggestions much appreciated!

    Read the article
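
    In the snippet above, %ecx receives the immediate value 80 rather than the address of a buffer: sys_write (%eax = 4 on 32-bit Linux) expects %ebx = file descriptor (1 for stdout), %ecx = pointer to the bytes, %edx = count. A sketch of a working version (editor's illustration, 32-bit Linux only):

        /* Write one character to stdout via int 0x80 (32-bit Linux). */
        int main(void) {
            char c = 'P';
            __asm__ __volatile__ (
                "movl $4, %%eax \n\t"   /* syscall number: sys_write */
                "movl $1, %%ebx \n\t"   /* fd: stdout                */
                "movl %0, %%ecx \n\t"   /* buf: the address of c     */
                "movl $1, %%edx \n\t"   /* count: one byte           */
                "int  $0x80     \n\t"
                :
                : "r"(&c)
                : "%eax", "%ebx", "%ecx", "%edx");
            return 0;
        }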

  • Code golf - hex to (raw) binary conversion

    - by Alnitak
    In response to this question asking about hex to (raw) binary conversion, a comment suggested that it could be solved in "5-10 lines of C, or any other language." I'm sure that for (some) scripting languages that could be achieved, and I'd like to see how. Can we prove that comment true for C, too?

    NB: this doesn't mean hex to ASCII binary - specifically, the output should be a raw octet stream corresponding to the input ASCII hex. Also, the input parser should skip/ignore whitespace.

    edit (by Brian Campbell): May I propose the following rules, for consistency? Feel free to edit or delete these if you don't think they are helpful, but since there has been some discussion of how certain cases should work, I think some clarification would be helpful.

      • The program must read from stdin and write to stdout (we could also allow reading from and writing to files passed on the command line, but I can't imagine that would be shorter in any language than stdin and stdout).
      • The program must use only packages included with your base, standard language distribution. In the case of C/C++, this means their respective standard libraries, and not POSIX.
      • The program must compile or run without any special options passed to the compiler or interpreter (so 'gcc myprog.c', 'python myprog.py', or 'ruby myprog.rb' are OK, while 'ruby -rscanf myprog.rb' is not allowed; requiring/importing modules counts against your character count).
      • The program should read integer bytes represented by pairs of adjacent hexadecimal digits (upper, lower, or mixed case), optionally separated by whitespace, and write the corresponding bytes to output. Each pair of hexadecimal digits is written with the most significant nibble first.
      • The behavior of the program on invalid input (characters besides [0-9a-fA-F \t\r\n], spaces separating the two characters of an individual byte, an odd number of hex digits in the input) is undefined; any behavior (other than actively damaging the user's computer or something) on bad input is acceptable (throwing an error, stopping output, ignoring bad characters, or treating a single character as the value of one byte are all OK).
      • The program may write no additional bytes to output.
      • Code is scored by fewest total bytes in the source file. (Or, if we wanted to be more true to the original challenge, the score would be based on the lowest number of lines of code; I would impose an 80-character limit per line in that case, since otherwise you'd get a bunch of ties for one line.)

    Read the article
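
    For reference, a non-golfed C solution under the proposed rules (editor's sketch; at roughly a dozen lines, the "5-10 lines" claim is at least in the right neighbourhood):

        #include <stdio.h>

        int main(void) {
            int c, hi = -1;                       /* hi: pending high nibble */
            while ((c = getchar()) != EOF) {
                int v;
                if (c >= '0' && c <= '9')      v = c - '0';
                else if (c >= 'a' && c <= 'f') v = c - 'a' + 10;
                else if (c >= 'A' && c <= 'F') v = c - 'A' + 10;
                else continue;                    /* skip whitespace etc.    */
                if (hi < 0) hi = v;               /* first digit of a pair   */
                else { putchar(hi << 4 | v); hi = -1; }
            }
            return 0;
        }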

  • Generate terminal graphics

    - by Werner
    Hi, which open source software options exist for generating terminal graphics? I mean, there is for instance the nice matplotlib, which can generate beautiful plots from data in PNG or similar formats. But are there similar alternatives for generating plain ASCII graphics? Thanks

    Read the article

  • Character Sets explained for Dummies!

    - by Imran
    I don't think I fully understand character sets, so I was wondering if anyone would be kind enough to explain them in layman's terms, with examples (for Dummies). I know there are utf8, latin1, ascii etc. The more answers the better, really. Thank you in advance ;-)

    Read the article
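
    A concrete example often helps (editor's note): ASCII defines 128 characters, one byte each with values 0x00-0x7F; latin1 (ISO-8859-1) keeps those and assigns the remaining single-byte values 0x80-0xFF to Western European characters; UTF-8 also keeps ASCII bytes unchanged but encodes everything beyond it as multi-byte sequences. So 'é' is one byte (0xE9) in latin1 but two bytes (0xC3 0xA9) in UTF-8:

        #include <stdio.h>

        int main(void) {
            /* The same character, e-acute, in two encodings. */
            const unsigned char latin1[] = { 0xE9 };        /* latin1: 1 byte */
            const unsigned char utf8[]   = { 0xC3, 0xA9 };  /* UTF-8: 2 bytes */
            printf("latin1: %02X\n", latin1[0]);
            printf("UTF-8:  %02X %02X\n", utf8[0], utf8[1]);
            return 0;
        }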

  • Need to convert int value to hex value

    - by SA
    Hi, I need to convert characters to their hex (ASCII) values. Refer to the ASCII table; a few examples are listed below:

        '1' = 31
        '2' = 32
        '3' = 33
        '4' = 34
        '5' = 35
        'A' = 41
        'a' = 61

    Therefore, for int test = 12345; I need to get the converted value 3132333435.

    Read the article
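
    What the post describes is printing the ASCII code of each digit character as hex ('1' is 0x31, '2' is 0x32, and so on); an int such as 12345 would first be formatted into a string, e.g. with sprintf. A sketch (editor's illustration; the helper name is made up):

        #include <stdio.h>

        /* Print the ASCII code of each character of s as two hex digits. */
        static void print_ascii_hex(const char *s) {
            for (; *s != '\0'; s++)
                printf("%02x", (unsigned char)*s);
            printf("\n");
        }

        int main(void) {
            print_ascii_hex("12345");   /* prints 3132333435 */
            return 0;
        }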

  • How to ANSI-C cast from unsigned int * to char *?

    - by user314290
    I want these two print statements to do the same thing:

        unsigned int Arraye[] = {0xffff, 0xefef, 65, 66, 67, 68, 69, 0};
        char Arrage[] = {0xffff, 0xefef, 65, 66, 67, 68, 69, 0};
        printf("%s", (char*)(2 + Arraye));
        printf("%s", (char*)(2 + Arrage));

    where Arraye is an array of unsigned int. Normally I would change the type, but the problem is that most of the array holds numbers; only this particular section should be printed as ASCII.

    Read the article
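
    A cast alone cannot work here: each unsigned int occupies (typically) four bytes, so (char*)(2 + Arraye) points at the byte representation of the int 65, which on a little-endian machine is 'A', 0, 0, 0, and printf stops at the first 0 byte. One option is to narrow the relevant ints into a temporary char buffer first (editor's sketch):

        #include <stdio.h>

        int main(void) {
            unsigned int arr[] = { 0xffff, 0xefef, 65, 66, 67, 68, 69, 0 };
            char buf[sizeof arr / sizeof arr[0]];

            /* Narrow each int to a char; ASCII-range values survive intact. */
            for (size_t i = 0; i < sizeof arr / sizeof arr[0]; i++)
                buf[i] = (char)arr[i];

            printf("%s\n", buf + 2);   /* prints ABCDE */
            return 0;
        }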

  • Effective way to avoid integer overflow when multiplying?

    - by Jonathan
    Hi, I'm working on a hash function which takes a string as input. Right now I loop over the string, and inside the loop the hash (an int variable) is multiplied by a seed value and then the ASCII code of the current character is added to the mix:

        hash = hash * seed + string[i];

    But sometimes, if the string is long enough, there is an integer overflow. What can I do to avoid it while maintaining the same hash structure? Maybe a bit operation inside the loop?

    Read the article
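
    In C and C++, overflow is undefined behaviour for signed integers but well-defined wrap-around (modulo 2^32 for a 32-bit type) for unsigned ones, and hash functions rely on exactly that wrap-around. So the usual fix is simply an unsigned accumulator. A sketch of the same loop structure (editor's illustration; the seed 31 is an arbitrary common choice):

        #include <stddef.h>

        unsigned int hash_string(const char *s) {
            unsigned int hash = 0, seed = 31;
            for (size_t i = 0; s[i] != '\0'; i++)
                hash = hash * seed + (unsigned char)s[i];  /* wraps, no UB */
            return hash;
        }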

  • Problem adding a new value to a Hashtable while it is being enumerated

    - by karthik
    Hi, I am doing some simple synchronous socket programming, in which I employ two threads: one accepts clients and puts each socket object into a collection; the other loops through the collection and sends a message to each client through its socket object. The problem is:

    1. I connect two clients to the server and start sending messages.
    2. Now I want to connect a new client. While doing this I can't update the collection: adding a new client to my Hashtable raises the exception "Collection was modified; enumeration operation may not execute."

    How do I add a new value to the Hashtable without this problem?

        private void Listen()
        {
            try
            {
                //lblStatus.Text = "Server Started Listening";
                while (true)
                {
                    Socket ReceiveSock = ServerSock.Accept();
                    ConnectedClients = new ListViewItem();
                    ConnectedClients.Text = ReceiveSock.RemoteEndPoint.ToString();
                    ConnectedClients.SubItems.Add("Connected");
                    ConnectedList.Items.Add(ConnectedClients);
                    ClientTable.Add(ReceiveSock.RemoteEndPoint.ToString(), ReceiveSock);
                }
                //lblStatus.Text = "Client Connected Successfully.";
            }
            catch (Exception ex)
            {
                MessageBox.Show(ex.Message);
            }
        }

        private void btn_receive_Click(object sender, EventArgs e)
        {
            Thread receiveThread = new Thread(new ThreadStart(Receive));
            receiveThread.IsBackground = true;
            receiveThread.Start();
        }

        private void Receive()
        {
            while (true)
            {
                byte[] Byt = new byte[2048];
                //ReceiveSock.Receive(Byt);
                lblMsg.Text = Encoding.ASCII.GetString(Byt);
            }
        }

        private void btn_Send_Click(object sender, EventArgs e)
        {
            Thread SendThread = new Thread(new ThreadStart(SendMsg));
            SendThread.IsBackground = true;
            SendThread.Start();
        }

        private void SendMsg()
        {
            while (true)
            {
                try
                {
                    foreach (object SockObj in ClientTable.Keys)
                    {
                        Socket s = (Socket)ClientTable[SockObj];
                        byte[] Tosend = Encoding.ASCII.GetBytes("FirstValue&" + GenerateRandom.Next(6, 10).ToString());
                        s.Send(Tosend);
                        Thread.Sleep(300);
                    }
                }
                catch (Exception ex)
                {
                    MessageBox.Show(ex.Message);
                }
            }
        }

    Read the article

  • Easy way to convert a string of 0's and 1's into a character? Plain C

    - by Nick
    I'm doing a steganography project where I read bytes from a PPM file and add the least significant bit of each to an array. So once 8 bytes are read in, I would have 8 bits in my array, which should equal some character in a hidden message. Is there an easy way to convert an array of 0's and 1's into an ASCII value? For example, the array char bits[] = "0,1,1,1,0,1,0,0" would equal 't'. Plain C.

    Read the article
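
    Shifting the accumulated value left by one and OR-ing in each new bit, most significant bit first, does the conversion. A sketch (editor's illustration, using an array of the bit characters '0' and '1'):

        #include <stdio.h>

        /* Build one character from eight bit characters, MSB first. */
        char bits_to_char(const char bits[8]) {
            char c = 0;
            for (int i = 0; i < 8; i++)
                c = (char)((c << 1) | (bits[i] - '0'));
            return c;
        }

        int main(void) {
            const char bits[8] = {'0','1','1','1','0','1','0','0'};
            printf("%c\n", bits_to_char(bits));   /* prints t (01110100) */
            return 0;
        }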

  • Convert wchar_t to char

    - by Yan Cheng CHEOK
    I was wondering, is it safe to do the following, given that I am pretty sure the wide char will fall within the ASCII range?

        wchar_t wide = /* something */;
        assert(wide >= 0 && wide < 256);
        char myChar = static_cast<char>(wide);

    Read the article
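
    The cast is fine when the value really is in range (though ASCII proper only covers 0-127, and wchar_t may be signed). The standard library also offers a checked conversion, wctob(), which returns EOF when the wide character has no single-byte representation. A sketch (editor's illustration):

        #include <assert.h>
        #include <stdio.h>
        #include <wchar.h>

        int main(void) {
            wchar_t wide = L'A';

            int b = wctob((wint_t)wide);  /* EOF if not representable in one byte */
            assert(b != EOF);

            char myChar = (char)b;
            printf("%c\n", myChar);
            return 0;
        }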

  • Android Soft Keyboard value using array

    - by Shubh
    Hi friends, I am using the SoftKeyboard sample from http://developer.android.com/resources/samples/SoftKeyboard/index.html. There, the ASCII values of the keys come from XML; now, in place of the XML, I want to generate these values using an array. How can I do this? Thanks in advance

    Read the article

  • calling c function from assembly

    - by void
    I'm trying to use a function written in assembly in a C project; the function is supposed to call a libc function, say printf(), but I keep getting a segmentation fault. In the .c file I have the declaration of the function, say:

        int do_shit_in_asm();

    In the .asm file I have:

        .extern printf

        .section .data
        printtext: .ascii "test"

        .section .text
        .global do_shit_in_asm
        .type do_shit_in_asm, @function

        do_shit_in_asm:
            pushl %ebp
            movl %esp, %ebp

            push printtext
            call printf

            movl %ebp, %esp
            pop %ebp
            ret

    Built with:

        as func.asm -o func.o
        gcc prog.c func.o -o prog

    Any pointers would be appreciated.

    Read the article

  • Help with \0 terminated strings in C#

    - by Joshua
    I'm using a low-level native API where I send an unsafe byte buffer pointer to get a C-string value. So it gives me:

        // using byte[255] c_str
        string s = new string(Encoding.ASCII.GetChars(c_str));
        // now s == "heresastring\0\0\0\0\0\0\0\0\0\0\0\0\0\0\0(etc)"

    So obviously I'm not doing it right. How do I get rid of the excess?

    Read the article

  • [PowerShell] Input encoding

    - by Andy
    Hi! I need to get the output of a native application under PowerShell. The problem is that the output is encoded in UTF-8 (no BOM), which PowerShell does not recognize: it just converts those funky UTF-8 bytes directly to Unicode characters as if they were ASCII. I've found that PowerShell has an $OutputEncoding variable, but it does not seem to affect input data. Good ol' iconv is of no help either, since this unwanted UTF-8-as-if-ASCII to Unicode conversion takes place before the next pipeline member acquires the data.

    Read the article

  • Invalid instruction suffix for push when assembling with gas

    - by vitaut
    When assembling a file with the GNU assembler I get the following error:

        hello.s:6: Error: invalid instruction suffix for `push'

    Here's the file that I'm trying to assemble:

        .text
        LC0:
            .ascii "Hello, world!\12\0"
        .globl _main
        _main:
            pushl %ebp
            movl %esp, %ebp
            subl $8, %esp
            andl $-16, %esp
            movl $0, %eax
            movl %eax, -4(%ebp)
            movl -4(%ebp), %eax
            call __alloca
            call ___main
            movl $LC0, (%esp)
            call _printf
            movl $0, %eax
            leave
            ret

    What is wrong here and how do I fix it?

    Read the article
