Search Results

Search found 796 results on 32 pages for 'hex'.


  • Compute hex color code for an arbitrary string

    - by user222164
    Is there a way to map an arbitrary string to a hex color code? I tried to compute a hex number for the string using its hash code; now I need to convert that hex number to six digits that fall within the hex color code range. Any suggestions?

        String[] programs = {"XYZ", "TEST1", "TEST2", "TEST3", "SDFSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS"};
        for (int i = 0; i < programs.length; i++) {
            System.out.println(programs[i] + " -- " + Integer.toHexString(programs[i].hashCode()));
        }
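
    One way to do this, sketched below in Python rather than Java, is to hash the string and keep only 24 bits, which always format as six hex digits; the helper name and the use of MD5 in place of String.hashCode() are illustrative assumptions, not taken from the question.

        import hashlib

        def string_to_hex_color(s):
            # Hash the string, keep the first three bytes, and format as RRGGBB.
            digest = hashlib.md5(s.encode("utf-8")).digest()
            return "#{:02x}{:02x}{:02x}".format(digest[0], digest[1], digest[2])

        for name in ["XYZ", "TEST1", "TEST2", "TEST3"]:
            print(name, "--", string_to_hex_color(name))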

    Read the article

  • Hex colors: Numeric representation for "transparent"?

    - by Pekka
    I am building a web CMS in which the user can choose colours for certain site elements. I would like to convert all colour values to hex to avoid any further formatting hassle ("rgb(x,y,z)" or named colours). I have found a good JS library for that. The only thing that I can't get into hex is "transparent". I need this when explicitly declaring an element as transparent, which in my experience can be different from not defining any value at all. Does anybody know whether this can be turned into some numeric form? Will I have to set up all processing instances to accept hex values or "transparent"? I can't think of any other way.
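
    One way to treat "transparent" numerically is to carry an alpha channel alongside the hex colour, so "transparent" becomes a colour whose alpha byte is zero. The sketch below is a minimal Python illustration; the function name, the 8-digit RRGGBBAA convention and the fallback behaviour are assumptions about how the CMS might normalise values, not something taken from the question.

        def normalize_color(value):
            value = value.strip().lower()
            if value == "transparent":
                # Fully transparent black: the alpha byte is 00.
                return "#00000000"
            if value.startswith("#") and len(value) == 7:
                # A plain RRGGBB value: treat it as fully opaque.
                return value + "ff"
            return value  # leave anything else to the existing JS library

        print(normalize_color("transparent"))  # #00000000
        print(normalize_color("#66CC00"))      # #66cc00ff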

    Read the article

  • Plot hex tiles with different length sides?

    - by Phil
    I'm trying to create a basic grid of hex tiles. I found some code that does just that:

        s = h / Math.cos(30 * Math.PI / 180) / 2;
        tile._x = x * s * 1.5;
        tile._y = y * h + (x % 2) * h / 2;

    However, I think it's set up for hexes whose sides are all the same length, whereas my hex has sides of different lengths: its width is 140 and its height is 80. I could completely change the code to work with my side lengths, but I was wondering whether there's a better way of doing it with the code above.
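
    For a flat-topped hex with an explicit width and height, the same layout can be written directly in those two numbers: columns step by three quarters of the width, rows step by the height, and odd columns shift down by half the height. A minimal Python sketch, assuming the 140 x 80 tiles described above; the constant and function names are mine:

        TILE_W, TILE_H = 140, 80

        def tile_position(col, row):
            # Columns overlap by a quarter of the tile width, so the horizontal
            # step is 0.75 * width; odd columns drop down by half the height.
            x = col * TILE_W * 0.75
            y = row * TILE_H + (col % 2) * TILE_H / 2
            return x, y

        for col in range(3):
            for row in range(2):
                print((col, row), tile_position(col, row))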

    Read the article

  • View numeric columns in hex - SQL Server Management Studio

    - by Jonathan
    In SQL Server Management Studio, when I run a query that outputs a numeric column (int or similar types), the values are displayed in decimal (example: 193). I want them displayed in hex (example: 0xC1). How do I do that? I found many answers on how to write converter functions, but I don't want that - I only want to change the display in SSMS. I only found this unanswered question. I use SQL2010, though I may move to SQL2012 soon.

    Read the article

  • Details in hex of the certificate in .pem openssl

    - by allenzzzxd
    Hi, I have generated mycert.pem with openssl; it contains the certificate. I converted the base64 text into hex. I wonder whether it's possible to extract the information from the hex string in C (without using the OpenSSL library) - for example, the public key, the issuer, the subject, the validity information, etc. Thanks.
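
    For reference, the PEM body is just base64-encoded DER, so the hex string in question is the hex of those decoded bytes; the issuer, subject, validity and so on then have to be parsed out of that DER (ASN.1 tag-length-value) structure, which is the hard part in plain C. A minimal Python sketch of the decoding step, assuming mycert.pem holds a single certificate:

        import base64

        with open("mycert.pem") as f:
            body = [line.strip() for line in f
                    if line.strip() and not line.startswith("-----")]
        der = base64.b64decode("".join(body))

        print(der.hex())    # the hex string the question refers to
        print(hex(der[0]))  # 0x30: the outer ASN.1 SEQUENCE tag of the certificate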

    Read the article

  • Deploy binary hex registry via GPO or PowerShell

    - by Prashanth Sundaram
    I am trying to deploy a custom registry entry that I exported from a test machine. It looks like this:

        "TextFontSimple"=hex:3c,00,00,00,1f,00,00,f8,00,00,00,40,dc,00,00,00,00,00,00,\
        00,00,00,00,ff,00,31,43,6f,75,72,69,65,72,20,4e,65,77,00,00,00,00,00,00,00,\
        00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00

    I came across THIS similar request on another site, but I couldn't make it work. Following that solution, my PowerShell command below throws the error "A parameter cannot be found that matches parameter name":

        Set-ItemProperty -Path "HKEY_CURRENT_USER\Software\Microsoft\Office\14.0\Common\MailSettings" -Name "TextFontSimple" -PropertyType Binary -Value ([byte[]] (0x3c,0x00,0x00,0x00,0x1f....0x00))

    Any ideas?

    ====EDIT=====
    The key and value already exist. When I use Get-ItemProperty:

        PSPath         : Microsoft.PowerShell.Core\Registry::HKEY_CURRENT_USER\Software\Microsoft\Office\14.0\Common\MailSettings
        PSParentPath   : Microsoft.PowerShell.Core\Registry::HKEY_CURRENT_USER\Software\Microsoft\Office\14.0\Common
        PSChildName    : MailSettings
        PSProvider     : Microsoft.PowerShell.Core\Registry
        TextFontSimple : {60, 0, 0, 0...}

    Read the article

  • How to input 64-bit hex values in octave

    - by Chris Ashton
    I'm trying to use Octave as a programmer's calculator. I want to input a 64-bit pointer, but when I do, the 64-bit value apparently gets silently truncated to 32 bits:

        octave:44> base_ptr=0x1010101020202020
        base_ptr = 538976288
        octave:45> uint64(base_ptr)
        ans = 538976288
        octave:46> printf("%lx\n", base_ptr)
        20202020

    So it seems the input value has been truncated to its low 32 bits. I would use scanf, but the docs say it should only be used internally. How can I input the full 64-bit value? Alternatively, is there some awesome free programmer's calculator out there for Windows? (I know the Windows calculator has a programmer's mode, but I would like arbitrary variable support.) I tried using my TI-89, but it also doesn't support 64-bit hex.
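
    This doesn't fix the Octave behaviour, but since the question also asks for a free programmer's calculator with variable support, a Python prompt is one option: its integers are arbitrary precision, so 64-bit hex values are never truncated. A minimal sketch:

        base_ptr = 0x1010101020202020      # arbitrary-precision int, no truncation
        print(hex(base_ptr))               # 0x1010101020202020
        print(hex(base_ptr + 0x40))        # offsets and masks work as expected
        print(hex(base_ptr & 0xFFFFFFFF))  # 0x20202020, the truncated value seen above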

    Read the article

  • iptables drop packet by hex string match

    - by Flint
    I captured this packet with tcpdump, but I'm not sure how to use the --hex-string parameter to match it. Can someone show me how to do it?

        11:18:26.614537 IP (tos 0x0, ttl 17, id 19245, offset 0, flags [DF], proto UDP (17), length 37)
            x.x.187.207.1234 > x.x.152.202.6543: [no cksum] UDP, length 9
            0x0000:  f46d 0425 b202 000a b853 22cc 0800 4500  .m.%.....S"...E.
            0x0010:  0025 4b2d 4000 1111 0442 5ebe bbcf 6701  .%K-@....B^...g.
            0x0020:  98ca 697d 6989 0011 0000 ffff ffff 5630  ..i}i.........V0
            0x0030:  3230 3300 0000 0000 0000 0000            203.........

    Read the article

  • Using sed to convert hex characters in postgresql dump file

    - by Bernt
    I am moving several databases from a PostgreSQL 8.3 server to a PostgreSQL 8.4 server. It has worked fine so far, but one database has given me some trouble. The database is listed as Unicode-encoded on the 8.3 server, but somehow a client program has managed to inject some invalid Unicode data into it. When I do a normal dump and restore using Postgres' custom format, the new server won't accept it, complaining about Unicode errors. My plan is to do a plain-text dump of the database, then use sed to replace the invalid characters with nothing (they are not needed). But how do you make sed work on hex/binary values in a file?
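
    GNU sed generally accepts \xHH escapes in patterns, but an alternative that avoids spelling out each offending byte is to re-encode the plain-text dump and simply drop anything that isn't valid UTF-8. A minimal Python sketch of that approach; the file names are placeholders, and it assumes the invalid bytes really can be discarded as stated above:

        # Read leniently (invalid UTF-8 sequences are dropped), write back clean UTF-8.
        with open("dump.sql", "r", encoding="utf-8", errors="ignore") as src, \
             open("dump.clean.sql", "w", encoding="utf-8") as dst:
            for line in src:
                dst.write(line)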

    Read the article

  • Encryption is hard: AES encryption to Hex

    - by Rob Cameron
    So, I've got an app at work that encrypts a string using ColdFusion. ColdFusion's built-in encryption helpers make it pretty simple:

        encrypt('string_to_encrypt','key','AES','HEX')

    What I'm trying to do is use Ruby to create the same encrypted string that this ColdFusion script is creating. Unfortunately, encryption is the most confusing computer science subject known to man. I found a couple of helper methods that use the openssl library and give you a really simple encryption/decryption method. Here's the resulting string:

        "\370\354D\020\357A\227\377\261G\333\314\204\361\277\250"

    which looks unicode-ish to me. I've tried several libraries to convert this to hex, but they all say it contains invalid characters. Trying to unpack it results in this:

        string = "\370\354D\020\357A\227\377\261G\333\314\204\361\277\250"
        string.unpack('U')
        ArgumentError: malformed UTF-8 character
            from (irb):19:in `unpack'
            from (irb):19

    At the end of the day it's supposed to look like this (the output of the ColdFusion encrypt method):

        F8E91A689565ED24541D2A0109F201EF

    Of course, that's assuming that all the padding, initialization vectors, salts, cipher types and a million other possible differences line up. Here's the simple script I'm using to encrypt/decrypt:

        def aes(m,k,t)
          (aes = OpenSSL::Cipher::Cipher.new('aes-256-cbc').send(m)).key = Digest::SHA256.digest(k)
          aes.update(t) << aes.final
        end

        def encrypt(key, text)
          aes(:encrypt, key, text)
        end

        def decrypt(key, text)
          aes(:decrypt, key, text)
        end

    Any help? Maybe just a simple option I can pass to OpenSSL::Cipher::Cipher that will tell it to hex-encode the final string?
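
    On the narrow hex question: the ciphertext is raw bytes, not UTF-8 text, so unpacking it as characters fails; it needs to be hex-encoded byte by byte (in Ruby that is what unpack('H*') does). Below is a minimal sketch of just that step in Python rather than Ruby, using the bytes from the irb session above; it does not attempt to reproduce ColdFusion's key, IV or padding handling:

        import binascii

        ciphertext = b"\xf8\xecD\x10\xefA\x97\xff\xb1G\xdb\xcc\x84\xf1\xbf\xa8"
        hex_form = binascii.hexlify(ciphertext).decode().upper()
        print(hex_form)                                    # F8EC4410EF4197FFB147DBCC84F1BFA8
        print(binascii.unhexlify(hex_form) == ciphertext)  # True - it round-trips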

    Read the article

  • How to convert an NSString to hex values

    - by einsteinx2
    I'd like to convert a regular NSString into an NSString with the (what I assume are) ASCII hex values and back. I need to produce the same output that the Java methods below do, but I can't seem to find a way to do it in Objective-C. I've found some examples in C and C++ but I've had a hard time working them into my code. Here are the Java methods I'm trying to reproduce:

        /**
         * Encodes the given string by using the hexadecimal representation of its UTF-8 bytes.
         *
         * @param s The string to encode.
         * @return The encoded string.
         */
        public static String utf8HexEncode(String s) {
            if (s == null) {
                return null;
            }
            byte[] utf8;
            try {
                utf8 = s.getBytes(ENCODING_UTF8);
            } catch (UnsupportedEncodingException x) {
                throw new RuntimeException(x);
            }
            return String.valueOf(Hex.encodeHex(utf8));
        }

        /**
         * Decodes the given string by using the hexadecimal representation of its UTF-8 bytes.
         *
         * @param s The string to decode.
         * @return The decoded string.
         * @throws Exception If an error occurs.
         */
        public static String utf8HexDecode(String s) throws Exception {
            if (s == null) {
                return null;
            }
            return new String(Hex.decodeHex(s.toCharArray()), ENCODING_UTF8);
        }
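
    For reference, the same round trip expressed as a minimal Python sketch; the Objective-C version still has to be built on NSString/NSData, but the behaviour to match is just "hex of the UTF-8 bytes" and back:

        def utf8_hex_encode(s):
            # Hex of the string's UTF-8 bytes, matching utf8HexEncode above.
            return s.encode("utf-8").hex()

        def utf8_hex_decode(h):
            # Inverse: hex digits -> bytes -> UTF-8 string.
            return bytes.fromhex(h).decode("utf-8")

        print(utf8_hex_encode("héllo"))         # 68c3a96c6c6f
        print(utf8_hex_decode("68c3a96c6c6f"))  # héllo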

    Read the article

  • PHP How to create real hex values from a string

    - by Piet
    Hi, I have a string with hex values that I use with sha1():

        echo sha1("\x23\x9A\xB9\xCB\x28\x2D\xAF\x66\x23\x1D\xC5\xA4\xDF\x6B\xFB\xAE\x00\x00\x00\x01");
        ab94fcedf2664edfb9b291f85d7f77f27f2f4a9d

    Now I have another string with the same value, only not as hex bytes:

        $string2 = strtoupper("239ab9cb282daf66231dc5a4df6bfbae00000001");

    I want to convert this string so that it is read as above and the sha1 value is the same as above:

        echo sha1(do_something($string2));
        ab94fcedf2664edfb9b291f85d7f77f27f2f4a9d

    Does anybody know how to convert a string to real hex values? I've tried pack, sprintf and hexdec, but nothing worked (I couldn't find a way to typecast to hex). Thanks
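
    The idea the question is after - turning the 40 hex digits into 20 raw bytes before hashing - looks like this as a minimal Python sketch; in PHP, pack('H*', $hex) plays the role that bytes.fromhex() plays here:

        import hashlib

        hex_digits = "239ab9cb282daf66231dc5a4df6bfbae00000001"
        raw = bytes.fromhex(hex_digits)       # 20 raw bytes, not 40 characters
        print(hashlib.sha1(raw).hexdigest())  # should match the digest shown above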

    Read the article

  • Hex web colours

    - by Mick
    Hi, I am displaying a colour as a hex value in PHP. Is it possible to vary the shade of a colour by subtracting a number from the hex value? What I want to do is display a vivid web-safe colour, but when it is selected I want to dull or lighten it. I know I could just use two shades of each colour, but I could have hundreds of potential colours. To be clear: #66cc00 is a bright green and #99ffcc is a very pale green. What do I subtract to get the second colour? Is there any formula? I just can't get it. Thanks for any help. Cheers
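
    There isn't a single number to subtract from the whole hex value, because the red, green and blue channels sit in different digit positions and each needs its own adjustment; lightening is usually done by blending each channel toward white (and darkening toward black). A minimal Python sketch, assuming that definition of "lighten"; the function name and factor are mine:

        def lighten(hex_color, factor=0.5):
            r, g, b = (int(hex_color[i:i + 2], 16) for i in (1, 3, 5))
            # Move each channel part of the way toward 255 (white).
            r, g, b = (round(c + (255 - c) * factor) for c in (r, g, b))
            return "#{:02x}{:02x}{:02x}".format(r, g, b)

        print(lighten("#66cc00", 0.5))  # a paler green; not exactly #99ffcc, since
                                        # that pair is not one fixed offset apart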

    Read the article

  • Python UTF-16 encoding hex representation

    - by Romeno
    I have a string in Python 2.7.2, say u"\u0638". When I write it to a file:

        f = open("J:\\111.txt", "w+")
        f.write(u"\u0638".encode('utf-16'))
        f.close()

    in hex the file looks like:

        FF FE 38 06

    When I print such a string to stdout I see '\xff\xfe8\x06'. The question: where is \x38 in the string output to stdout? In other words, why is the string output to stdout not '\xff\xfe\x38\x06'? If I write the string to the file twice:

        f = open("J:\\111.txt", "w+")
        f.write(u"\u0638".encode('utf-16'))
        f.write(u"\u0638".encode('utf-16'))
        f.close()

    the hex representation in the file contains the byte order mark (BOM) \xff\xfe twice:

        FF FE 38 06 FF FE 38 06

    I wonder what the technique is to avoid writing the BOM in UTF-16 encoded strings?
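
    Two things the output above illustrates, shown as a minimal Python 2.7 sketch: repr() displays the byte 0x38 as the printable character '8', which is why it seems to disappear, and the endian-specific UTF-16 codecs write no BOM at all:

        s = u"\u0638"
        print repr(s.encode('utf-16'))      # '\xff\xfe8\x06'  (FF FE 38 06)
        print repr(s.encode('utf-16-le'))   # '8\x06'          (38 06, no BOM)
        print repr(s.encode('utf-16-be'))   # '\x068'          (06 38, no BOM)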

    Read the article

  • How do you pronounce large hex numbers?

    - by warrenm
    This question might be subjective, but I'm hoping there's some consensus that I just don't know about. Short hex numbers are relatively easy to spell out (e.g., 0xC4A might be "cee-four-ay"). Hex numbers ending with a multiple of three zeros are likewise pretty easy (e.g., 0xC000 might be "cee-thousand"). But is there a concise way to pronounce 0xFFFF0000 or 0xCA000000? Magic numbers like 0xDEADBEEF are popular for their pronounceability, but I'm mostly asking about large-ish, round numbers that seem like they should have a more concise pronunciation.

    Read the article

  • Compressing a hex string in Ruby/Rails

    - by PreciousBodilyFluids
    I'm using MongoDB as a backend for a Rails app I'm building. Mongo, by default, generates 24-character hexadecimal ids for its records to make sharding easier, so my URLs wind up looking like:

        example.com/companies/4b3fc1400de0690bf2000001/employees/4b3ea6e30de0691552000001

    which is not very pretty. I'd like to stick to the Rails URL conventions, but also leave these ids as they are in the database. I think a happy compromise would be to compress these hex ids into shorter strings that use a larger character set, so they'd look something like:

        example.com/companies/3ewqkvr5nj/employees/9srbsjlb2r

    Then in my controller I'd reverse the compression, get the original hex id and use that to look up the record. My question is, what's the best way to convert these ids back and forth? I'd of course want them to be as short as possible, but also URL-safe and simple to convert. Thanks!
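
    One way to do the round trip is plain base conversion: parse the 24 hex digits as an integer and re-express it in a larger alphabet. A minimal Python sketch using lowercase base 36 (0-9a-z), which is URL-safe; the alphabet choice and function names are assumptions:

        ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyz"

        def shorten(hex_id):
            n = int(hex_id, 16)
            out = ""
            while n:
                n, rem = divmod(n, 36)
                out = ALPHABET[rem] + out
            return out or "0"

        def expand(short_id):
            # Back to the zero-padded 24-digit hex id stored in Mongo.
            return format(int(short_id, 36), "024x")

        short = shorten("4b3fc1400de0690bf2000001")
        print(short)                                        # noticeably shorter
        print(expand(short) == "4b3fc1400de0690bf2000001")  # True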

    Read the article

  • Inserting a hex string value into a SQL Server image field appends an extra 0

    - by rotary_engine
    I have an image field and want to insert into it from a hex string:

        insert into imageTable(imageField) values(convert(image, 0x3C3F78...))

    However, when I select it back, the value is returned with an extra 0, as 0x03C3F78... This extra 0 is causing a problem in another application; I don't want it. How do I stop the extra 0 being added? The schema is:

        CREATE TABLE [dbo].[templates](
            [templateId] [int] IDENTITY(1,1) NOT NULL,
            [templateName] [nvarchar](50) NOT NULL,
            [templateBody] [image] NOT NULL,
            [templateType] [int] NULL)

    and the query is:

        insert into templates(templateName, templateBody, templateType)
        values('I love stackoverflow', convert(image, 0x3C3F786D6C2076657273696F6E3D.......), 2)

    The actual hex string is quite large to post here.
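
    One possible cause - an assumption, since the full literal isn't shown - is that the complete hex literal has an odd number of digits, in which case it is left-padded with a zero nibble to make whole bytes, which would show up as exactly one extra leading 0. A quick Python check of the pasted literal:

        hex_literal = "3C3F786D6C2076657273696F6E3D"  # paste the full literal here
        if len(hex_literal) % 2:
            print("odd digit count - a leading zero nibble will be added")
        else:
            print("even digit count:", len(hex_literal) // 2, "bytes")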

    Read the article

  • printf not passing correct Hex Address to stack

    - by kriss
    I have a hiccup using printf. I am on Ubuntu 10.04. Basically, I have a C program that asks for some input and then prints it back; it works fine when printing something after input. I tried to insert a hex address onto the stack with the following command:

        printf "hello world!\x12\x23\x34" | ./input1

    But I don't know what the problem is. If I give only a string longer than 12 bytes, it overwrites the return address, but if I give a hex address (through printf), it doesn't overwrite the return address; instead it stores something else. Could anyone help? I can't proceed further because of this. Thanks in advance.
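
    A common cross-check here is to generate the payload from a small script and inspect the exact bytes being fed to the program, to rule out the shell's printf mangling the \x escapes. A minimal Python sketch; emit.py and ./input1 are placeholder names, and it assumes a Python 3 interpreter:

        # Run as:  python3 emit.py | ./input1
        import sys

        payload = b"hello world!\x12\x23\x34"
        # Show the byte values on stderr for eyeballing; keep stdout clean for the pipe.
        print(" ".join("%02x" % b for b in payload), file=sys.stderr)
        sys.stdout.buffer.write(payload)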

    Read the article
