Search Results

Search found 5325 results on 213 pages for 'huffman encoding'.


  • Passing HTML using JSON

    - by ActionFactory
    Hi all, I'm passing data using JSON to iPhone and iPad. One field of the data is HTML. The problem is the encoding. Here's what I get back:

        "GadgetHTML": "<strong>Hello</strong> from Catworld<br />\n<img alt=\"\" src=\"http://www.iconarchive.com/icons/fasticon/ifunny/128/dog-icon.png\" />",

    The \ escapes are killing me, and the \n does not help. Is there a good way to do this? Are there any JSON-to-HTML cleaning functions? An encoding setting? (There must be something better than stripping the backslashes manually.) Thanks
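    As a rough illustration (Python rather than the asker's iPhone code), the backslashes are just JSON string escaping and disappear once any standard JSON decoder parses the payload:

        import json

        payload = r'''{"GadgetHTML": "<strong>Hello</strong> from Catworld<br />\n<img alt=\"\" src=\"http://www.iconarchive.com/icons/fasticon/ifunny/128/dog-icon.png\" />"}'''

        # json.loads resolves the \" and \n escapes; what comes out is plain HTML.
        html = json.loads(payload)["GadgetHTML"]
        print(html)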

    Read the article

  • Forcing a mixed ISO-8859-1 and UTF-8 multi-line string into UTF-8 in Perl

    - by knorv
    Consider the following problem: a multi-line string $junk contains some lines which are encoded in UTF-8 and some in ISO-8859-1. I don't know a priori which lines are in which encoding, so heuristics will be needed. I want to turn $junk into pure UTF-8 with proper re-encoding of the ISO-8859-1 lines. Also, in the event of errors in the processing I want to provide a "best effort" result rather than throwing an error. My current attempt looks like this:

        $junk = &force_utf8($junk);

        sub force_utf8 {
            my $input  = shift;
            my $output = '';
            foreach my $line (split(/\n/, $input)) {
                if (utf8::valid($line)) {
                    utf8::decode($line);
                }
                $output .= "$line\n";
            }
            return $output;
        }

    While this appears to work, I'm certain this is not the optimal solution. How would you improve the force_utf8(...) sub?
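    For comparison, a minimal Python sketch of the same line-by-line heuristic (not the asker's Perl): try a strict UTF-8 decode first and fall back to ISO-8859-1, which accepts any byte sequence, as the best-effort path:

        def force_utf8(junk: bytes) -> bytes:
            out_lines = []
            for line in junk.split(b"\n"):
                try:
                    text = line.decode("utf-8")        # line was already valid UTF-8
                except UnicodeDecodeError:
                    text = line.decode("iso-8859-1")   # fallback: every byte maps to a character
                out_lines.append(text)
            return "\n".join(out_lines).encode("utf-8")

        mixed = "caf\u00e9 one\n".encode("utf-8") + "caf\u00e9 two".encode("iso-8859-1")
        print(force_utf8(mixed).decode("utf-8"))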

    Read the article

  • How to send HTML as a GET request parameter?

    - by Mork0075
    I would like to send an HTML string with a GET request using Apache's HttpClient, like this: http://sample.com/?html=<html><head>... This doesn't work at the moment; I think it's an encoding problem. Do you have any ideas how to do that?

        method.setQueryString(new NameValuePair[] {
            new NameValuePair("report", "<html>....")
        });
        client.executeMethod(method);

    This fails with org.apache.commons.httpclient.NoHttpResponseException: The server localhost failed to respond. If I replace "<html>" with "test.." it works fine.

    EDIT: It seems to be a problem of URL length after encoding; the server doesn't accept such long URLs. Sending the data as POST solves the problem.
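    A small Python sketch (hypothetical payload, not the asker's Java) of the point in the edit: percent-encoding markup inflates it considerably, and the resulting URL can easily exceed a server's length limit, which is why the request body of a POST is the safer place for it:

        from urllib.parse import urlencode

        html = "<html><head><title>report</title></head><body>...</body></html>"
        query = urlencode({"report": html})       # percent-encodes <, >, /, etc.
        url = "http://sample.com/?" + query       # endpoint taken from the question

        print(len(html), "characters of HTML ->", len(url), "characters of URL")
        print(url[:80] + "...")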

    Read the article

  • Strange behaviour of mb_detect_order() in PHP

    - by termopro
    I would like to detect the encoding of some text (using PHP). For that purpose I use the mb_detect_encoding() function. The problem is that the function returns different results if I change the order of possible encodings with the mb_detect_order() function. Consider the following example:

        $html = <<< STR
        ?????????????????????????????????????????????????????????????????????????????????????????????????????????
        STR;
        mb_detect_order(array('UTF-8','EUC-JP', 'SJIS', 'eucJP-win', 'SJIS-win', 'JIS', 'ISO-2022-JP','ISO-8859-1','ISO-8859-2'));
        $originalEncoding = mb_detect_encoding($str);
        die($originalEncoding); // $originalEncoding = 'UTF-8'

    However, if you change the order of encodings in mb_detect_order() the results will be different:

        mb_detect_order(array('EUC-JP','UTF-8', 'SJIS', 'eucJP-win', 'SJIS-win', 'JIS', 'ISO-2022-JP','ISO-8859-1','ISO-8859-2'));
        die($originalEncoding); // $originalEncoding = 'EUC-JP'

    So my questions are: Why is that happening? Is there a way in PHP to correctly and unambiguously detect the encoding of text?
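    This is not the asker's PHP, but a small Python sketch of why a first-match detection strategy such as mb_detect_encoding()'s is order-sensitive: the same bytes can be valid in more than one candidate encoding, so whichever candidate is listed first wins:

        data = "café".encode("utf-8")                  # bytes 63 61 66 c3 a9

        def detect(candidates, raw):
            """Return the first candidate encoding in which the bytes decode cleanly."""
            for enc in candidates:
                try:
                    raw.decode(enc)
                    return enc
                except UnicodeDecodeError:
                    continue
            return None

        print(detect(["utf-8", "iso-8859-1"], data))   # utf-8
        print(detect(["iso-8859-1", "utf-8"], data))   # iso-8859-1 -- same bytes, different verdict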

    Read the article

  • Sane(r) way to get character-encoding of CLI?

    - by danyowdee
    Hi all! I was writing a CLI tool for Mac OS X (10.5+) that has to deal with command-line arguments which are very likely to contain non-ASCII characters. For further processing, I convert these arguments using +[NSString stringWithCString:encoding:]. My problem is that I couldn't find good information on how to determine the character encoding used by the shell in which said CLI tool is running. What I came up with as a solution is the following:

        NSDictionary *environment = [[NSProcessInfo processInfo] environment];
        NSString *ianaName = [[environment objectForKey:@"LANG"] pathExtension];
        NSStringEncoding encoding = CFStringConvertEncodingToNSStringEncoding(
            CFStringConvertIANACharSetNameToEncoding((CFStringRef)ianaName)
        );
        NSString *someArgument = [NSString stringWithCString:argv[someIndex] encoding:encoding];

    I find that a little crude, however -- which makes me think that I missed something obvious... but what? Is there a saner/cleaner way of achieving essentially the same? Thanks in advance, D
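    The analogous logic in Python, shown only to illustrate the two usual routes (parsing LANG by hand versus asking the locale machinery); it is not Cocoa code, and the IANA-name conversion step has no direct equivalent here:

        import locale
        import os

        # LANG usually looks like "en_US.UTF-8"; the part after the dot names the charset,
        # which is what the pathExtension trick above extracts as well.
        lang = os.environ.get("LANG", "")
        from_lang = lang.split(".", 1)[1] if "." in lang else None

        # The standard-library route asks the C locale machinery instead of parsing LANG.
        from_locale = locale.getpreferredencoding()

        print("from LANG:  ", from_lang)
        print("from locale:", from_locale)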

    Read the article

  • jQuery AJAX Character Encoding Problem

    - by Salty
    Hi everyone, I'm currently coding a French website. There's a schedule page where a link on the side can be used to load another day's schedule: http://aquate.us/film/horaire.html (at the moment, only the links for November 13th and November 14th work). Here's the JS I'm using to do this:

        <script type="text/javascript">
        function load(y) {
            $.get(y, function(d) {
                $("#replace").html(d);
                mod();
            });
        }
        function mod() {
            $("#dates a").click(function() {
                y = $(this).attr("href");
                load(y);
                return false;
            });
        }
        mod();
        </script>

    The actual AJAX works like a charm. My problem lies with the response to the request. Because it is a French website, there are many accented letters. I'm using the ISO-8859-15 charset for that very reason. However, in the response to my AJAX request the accents become ?'s, because the character encoding seems to be changed back to UTF-8. How do I avoid this? I've already tried adding some PHP at the top of the requested documents to set the character set:

        <?php header('Content-Type: text/html; charset=ISO-8859-15'); ?>

    But that doesn't seem to work either. Any thoughts? Also, while any of you are looking here... why does the rightmost column seem to become smaller when a new page is loaded, causing the table to distort and each <li> within the <td> to wrap to the next line? Cheers
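    Not a fix, but a quick way to see which charset the server is actually declaring for these pages (XMLHttpRequest trusts the response's Content-Type header, so a mismatch there is the usual culprit). A sketch using the third-party requests library against the URL from the question:

        import requests

        # Any of the per-day fragment pages could be checked the same way.
        r = requests.get("http://aquate.us/film/horaire.html")

        print("declared by the Content-Type header:", r.encoding)
        print("guessed from the body bytes:        ", r.apparent_encoding)
        print(r.text[:200])    # decoded using the declared charset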

    Read the article

  • Emacs Lisp: how to set encoding for call-process

    - by RamyenHead
    I thought I knew how to set the coding system (or encoding): use process-coding-system-alist. Apparently, it's not working.

        ;; -*- coding: utf-8 -*-
        (require 'cl)
        (let ((process-coding-system-alist
               '("cygwin/bin/bash" . (utf-8-dos . utf-8-unix))))
          (setq my-words (list "Lilo" "?_?" "_?" "?_" "?" "Stitch")
                my-cygwin-bash "C:/cygwin/bin/bash.exe"
                my-outbuf (get-buffer-create "*my cygwin bash echo test*"))
          (with-current-buffer my-outbuf
            (goto-char (point-max))
            (loop for word in my-words
                  do (insert (concat "echo " word "\n"))
                     (call-process my-cygwin-bash nil my-outbuf nil "-c" (concat "echo " word))))
          (display-buffer my-outbuf))

    Running the above code, the output is this:

        echo Lilo
        Lilo
        echo ?_?
        /usr/bin/bash: -c: line 0: unexpected EOF while looking for matching `"'
        /usr/bin/bash: -c: line 1: syntax error: unexpected end of file
        echo _?
        /usr/bin/bash: -c: line 0: unexpected EOF while looking for matching `"'
        /usr/bin/bash: -c: line 1: syntax error: unexpected end of file
        echo ?_
        /usr/bin/bash: $'echo \346\267\205?': command not found
        echo ?
        /usr/bin/bash: -c: line 0: unexpected EOF while looking for matching `"'
        /usr/bin/bash: -c: line 1: syntax error: unexpected end of file
        echo Stitch
        Stitch

    Anything sent to cygwin in Unicode is failing (MS Windows, Korean).
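    Language aside, the underlying rule is the same everywhere: the child process only ever sees bytes, so the command line must be encoded with the coding system the receiving shell expects (which is the role process-coding-system-alist is meant to play). A tiny Python illustration with a Korean word standing in for the ones lost above:

        command = "echo 안녕"                    # a non-ASCII shell command line

        as_utf8 = command.encode("utf-8")        # b'echo \xec\x95\x88\xeb\x85\x95'
        as_euc_kr = command.encode("euc-kr")     # the same text in a legacy Korean coding

        print(as_utf8)
        print(as_euc_kr)
        # If the shell decodes UTF-8 bytes as EUC-KR (or vice versa), the word falls apart
        # into the kind of \346\267\205-style garbage shown in the bash errors above.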

    Read the article

  • BizTalk SMTP Message Part Getting XML Encoding

    - by alram
    I have an email multi-part message which I am using to send failed message routing from the MessageBox to a business user's mailbox:

        Email { Body - RawString; OriginalMessage - string };

    The original message gets set from the received message that activates the orchestration. For example, assume the original failed message comes from a flat file that failed disassembly, with the contents: Order,1,2,3,4,5,<6>,

    I set the message using:

        // XmlDocument type; this can be malformed XML, valid XML, or a flat file that fails in the disassembler.
        Email.OriginalMessage = MyUtil.XlangMsgToStringMethod(FailedMessage);

    I can then write to the event log to test what's in Email.OriginalMessage:

        // This displays the correct original message "Order,1,2,3,4,5,<6>,"
        System.Diagnostics.EventLog.WriteEntry("BizTalk Server 2006", Email.OriginalMessage, Information);

    When the email is delivered using an SMTP server and a dynamic send port, with the attachment set to the text/plain MIME type, the original message gets XML-escaped and wrapped in XML:

        <?xml version="1.0"?>
        <string>Order,1,2,3,4,5,&lt;6&gt;,</string>

    Any ideas why? The SMTP port has PassThruTransmit as the pipeline. Thanks.
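    A Python illustration (not BizTalk) of why the flat file comes out looking like that: the <?xml ...?><string> wrapper is what XML-serializing a plain string produces, and once the text lives inside an XML document its reserved characters have to be escaped:

        import xml.sax.saxutils as saxutils

        original = "Order,1,2,3,4,5,<6>,"

        escaped = saxutils.escape(original)                       # < and > become &lt; and &gt;
        wrapped = '<?xml version="1.0"?><string>%s</string>' % escaped
        print(wrapped)

        # Unescaping recovers the original flat-file content unchanged.
        print(saxutils.unescape(escaped) == original)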

    Read the article

  • Problems with utf-8 encoding in php

    - by Addsy
    Hi, another UTF-8-related problem, I believe... I am using PHP to update data in a MySQL db, then display that data elsewhere in the site. Previously I have run into UTF-8 problems where special characters are displayed as question marks when viewed in a browser, but this one seems slightly different. I have a number of records to enter that contain the è character. If I enter this directly in the db then it appears correctly on the page, so I take this to mean that UTF-8 content is being output correctly. However, when I try to update the values in the db through PHP, the è character is replaced. What appears instead is the entity sequence &Atilde;&uml;, which renders in the browser as Ã¨. I have the tables in the database set to use UTF-8. I believe this is correct because, as mentioned, if I update the db through phpMyAdmin it's all OK. Similarly, I have set the character encoding for the page, which seems to be correct. I am also running the SQL statement "SET NAMES 'utf8';" before trying to update the db. Anyone have any other ideas as to where the problem may lie? Many thanks
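    The reported symptom is the classic double-encoding pattern; a short Python demonstration (independent of the asker's PHP/MySQL setup) of how è turns into Ã¨:

        c = "è"

        raw = c.encode("utf-8")            # è is the two bytes C3 A8 in UTF-8

        # If something in the chain treats those bytes as ISO-8859-1 and re-encodes
        # them, the single character becomes the reported "Ã¨".
        mojibake = raw.decode("iso-8859-1")
        print(mojibake)                    # Ã¨
        print(mojibake.encode("utf-8"))    # b'\xc3\x83\xc2\xa8' -- the double-encoded form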

    Read the article

  • Encoding/Decoding hex packet

    - by Roberto Pulvirenti
    I want to send this hex packet:

        00 38 60 dc 00 00 04 33 30 3c 00 00 00 20 63 62
        39 62 33 61 36 37 34 64 31 36 66 32 31 39 30 64
        30 34 30 63 30 39 32 66 34 66 38 38 32 62 00 06
        35 2e 31 33 2e 31 00 00 02 3c

    so I build the string:

        string packet = "003860dc0000" + textbox1.Text + "00000020" + textbox2.Text + "0006" + textbox3.Text;

    then "convert" it to ASCII:

        conn_str = HexString2Ascii(packet);

    then I send the packet... but I receive this (the dc byte has become c3 9c, and a trailing 0a has appeared):

        00 38 60 c3 9c 00 00 04 33 30 3c 00 00 00 20 63
        62 39 62 33 61 36 37 34 64 31 36 66 32 31 39 30
        64 30 34 30 63 30 39 32 66 34 66 38 38 32 62 00
        06 35 2e 31 33 2e 31 00 00 02 3c 0a

    Why?? Thank you! P.S. the function is:

        private string HexString2Ascii(string hexString)
        {
            byte[] tmp;
            int j = 0;
            int lenght;
            lenght = hexString.Length - 2;
            tmp = new byte[(hexString.Length) / 2];
            for (int i = 0; i <= lenght; i += 2)
            {
                tmp[j] = (byte)Convert.ToChar(Int32.Parse(hexString.Substring(i, 2), System.Globalization.NumberStyles.HexNumber));
                j++;
            }
            return Encoding.GetEncoding(1252).GetString(tmp);
        }
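    A Python check (not a C# fix) of what the two differences mean: 0xDC is not ASCII, so once the packet travels as text it becomes the character 'Ü' in code page 1252, and anything that later writes that text out as UTF-8 expands it to the two bytes C3 9C; the trailing 0A is presumably a newline appended somewhere in transit:

        data = bytes.fromhex("003860dc")           # the start of the packet from the question

        as_text = data.decode("cp1252")            # 0xdc -> 'Ü' when treated as text
        print(as_text.encode("utf-8").hex(" "))    # 00 38 60 c3 9c -- the observed corruption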

    Read the article

  • Silverlight - WCF Enable Binary Encoding

    - by Villager
    Hello, I have a WCF service that is returning a lot of data. I want to compress that information, so I thought that using binary encoding would be appropriate. Currently, I have a binding set up in my web.config as follows:

        <binding name="myCustomBinding" closeTimeout="00:05:00" openTimeout="00:05:00"
                 receiveTimeout="00:05:00" sendTimeout="00:05:00">
            <binaryMessageEncoding />
            <httpTransport maxReceivedMessageSize="8388608" maxBufferSize="8388608">
                <extendedProtectionPolicy policyEnforcement="Never" />
            </httpTransport>
        </binding>

    In my ServiceReferences.ClientConfig file, I have the following binding settings:

        <binding name="CustomBinding_MyService">
            <httpTransport maxReceivedMessageSize="2147483647" maxBufferSize="2147483647">
                <extendedProtectionPolicy policyEnforcement="Never" />
            </httpTransport>
        </binding>

    Oddly, this configuration will not work. As soon as I remove the <binaryMessageEncoding /> line from the web.config, everything works fine. My question is: how do I use binary message encoding? Is there something I need to configure in my ServiceReferences.ClientConfig? Thank you

    Read the article

  • Unicode version of base64 encoding/decoding

    - by Yan Cheng CHEOK
    I am using the base64 encoding/decoding from http://www.adp-gmbh.ch/cpp/common/base64.html . It works pretty well with the following code:

        const std::string s = "I Am A Big Fat Cat";
        std::string encoded = base64_encode(reinterpret_cast<const unsigned char*>(s.c_str()), s.length());
        std::string decoded = base64_decode(encoded);
        std::cout << _T("encoded: ") << encoded << std::endl;
        std::cout << _T("decoded: ") << decoded << std::endl;

    However, when it comes to Unicode:

        namespace std {
        #ifdef _UNICODE
            typedef wstring tstring;
        #else
            typedef string tstring;
        #endif
        }

        const std::tstring s = _T("I Am A Big Fat Cat");

    how can I still make use of the above function? Merely changing the signatures to

        std::string base64_encode(unsigned TCHAR const*, unsigned int len);
        std::tstring base64_decode(std::string const& s);

    will not work correctly. (I expect base64_encode to return ASCII; hence, std::string should be used instead of std::tstring.)
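    Whatever the C++ plumbing ends up looking like, the shape of the answer is encode-then-Base64: pick a byte encoding for the wide string first (UTF-8 below; UTF-16 would mirror wchar_t on Windows) and Base64 the resulting bytes, so the Base64 text itself stays plain ASCII. A Python sketch of that round trip:

        import base64

        s = "I Am A Big Fat Cat"     # stand-in for the Unicode/tstring input

        encoded = base64.b64encode(s.encode("utf-8")).decode("ascii")   # ASCII-only result
        decoded = base64.b64decode(encoded).decode("utf-8")

        print(encoded)
        print(decoded == s)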

    Read the article

  • Streaming webcam video in Flash using MP4 encoding

    - by Herms
    One of the features of the Flash app I'm working on is the ability to stream a webcam to others. We're just using the built-in webcam support in Flash and sending it through FMS. We've had some people ask for higher quality video, but we're already using the highest quality setting we can in Flash (setting quality to 100%). My understanding is that in the newer Flash players they added support for MPEG-4 encoding for the videos. I created a simple test Flex app to try and compare the video quality of the MP4 vs FLV encodings. However, I can't seem to get MP4 to work at all. According to the Flex documentation, the only thing I need to do to use MP4 instead of FLV is prepend "mp4:" to the name of the stream when calling publish:

        Specify the stream name as a string with the prefix mp4: with or without the
        filename extension. The prefix indicates to the server that the file contains
        H.264-encoded video and AAC-encoded audio within the MPEG-4 Part 14 container
        format.

    When I try this nothing happens. I don't get any events raised on the client side, no exceptions thrown, and my logging on the server side doesn't show any streams starting. Here's the relevant code:

        // These are all defined and created within the class.
        private var nc:NetConnection;
        private var sharing:Boolean;
        private var pubStream:NetStream;
        private var format:String;
        private var streamName:String;
        private var camera:Camera;

        // called when the user clicks the start button
        private function startSharing():void {
            if (!nc.connected) {
                return;
            }
            if (sharing) {
                return;
            }
            if (pubStream == null) {
                pubStream = new NetStream(nc);
                pubStream.attachCamera(camera);
            }
            startPublish();
            sharing = true;
        }

        private function startPublish():void {
            var name:String;
            if (this.format == "mp4") {
                name = "mp4:" + streamName;
            } else {
                name = streamName;
            }
            //pubStream.publish(name, "live");
            pubStream.publish(name, "record");
        }

    Read the article

  • Invalid character in a Base-64 string when Concatenating and Url encoding a string

    - by Rob
    I'm trying to write some encryption code whose output is passed through a URL. For the sake of the issue I've excluded the actual encryption of the data and just shown the code causing the problem. I take a salt value, convert it to a byte array and then convert that to a base64 string. This string I concatenate to another base64 string (which was previously a byte array). These two base64 strings are then URL-encoded. Here's my code...

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using System.Security.Cryptography;
        using System.IO;
        using System.Web;

        class RHEncryption
        {
            private static readonly Encoding ASCII_ENCODING = new System.Text.ASCIIEncoding();
            private static readonly string SECRET_KEY = "akey";

            private static string md5(string text)
            {
                return BitConverter.ToString(new MD5CryptoServiceProvider().ComputeHash(ASCII_ENCODING.GetBytes(text))).Replace("-", "").ToLower();
            }

            public string UrlEncodedData;

            public RHEncryption()
            {
                // encryption object
                RijndaelManaged aes192 = new RijndaelManaged();
                aes192.KeySize = 192;
                aes192.BlockSize = 192;
                aes192.Mode = CipherMode.CBC;
                aes192.Key = ASCII_ENCODING.GetBytes(md5(SECRET_KEY));
                aes192.GenerateIV();

                // convert IV to base64 for sending
                string base64IV = Convert.ToBase64String(aes192.IV);

                // salt value
                string s = "maryhadalittlelamb";
                string salt = s.Substring(0, 8);

                // convert to byte array and base64 for sending
                byte[] saltBytes = ASCII_ENCODING.GetBytes(salt.TrimEnd('\0'));
                string base64Salt = Convert.ToBase64String(saltBytes);

                // url encode concatenated base64 strings
                UrlEncodedData = HttpUtility.UrlEncode(base64Salt + base64IV, ASCII_ENCODING);
            }

            public string UrlDecodedData()
            {
                // decode the url encoded string
                string s = HttpUtility.UrlDecode(UrlEncodedData, ASCII_ENCODING);

                // convert back from base64
                byte[] base64DecodedBytes = null;
                try
                {
                    base64DecodedBytes = Convert.FromBase64String(s);
                }
                catch (FormatException e)
                {
                    Console.WriteLine(e.Message.ToString());
                    Console.ReadLine();
                }
                return s;
            }
        }

    If I then call the UrlDecodedData method, I get an "Invalid character in a Base-64 string" exception. This is generated because the base64Salt variable contains an invalid character (I'm guessing a line termination) but I can't seem to strip it off.
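    One common cause of exactly this exception, sketched in Python rather than C#: standard Base64 output can contain '+' and '/', and a URL round trip that treats '+' as a space (as application/x-www-form-urlencoded decoding does) corrupts the string before it ever reaches the Base64 decoder. Percent-encoding the value strictly, or using the URL-safe Base64 alphabet, sidesteps it:

        import base64
        from urllib.parse import quote, unquote_plus

        raw = b"\xfb\xef\xbe"                           # bytes chosen so the Base64 contains '+'
        b64 = base64.b64encode(raw).decode("ascii")     # '++++'

        corrupted = unquote_plus(b64)                   # '+' decoded as spaces -> no longer valid Base64
        print(repr(corrupted))

        print(quote(b64, safe=""))                      # '%2B%2B%2B%2B' -- strictly percent-encoded
        print(base64.urlsafe_b64encode(raw).decode("ascii"))   # '----' -- URL-safe alphabet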

    Read the article

  • Which browsers handle `Content-Encoding: gzip`, and which of them have any special requirements on encoding quality?

    - by user1049847
    I am creating a "hand-made" HTTP/1.0 and HTTP/1.1 server. I recently integrated zlib, so now I can stream gzip-encoded data in and out. I wonder: which major browsers (living ones: IE6-IE10, Chrome, FF, etc.) send Accept-Encoding: deflate, gzip, ... and so can handle Content-Encoding: gzip today? Which of them send any quality expectations (q-values)? Which of them can send a gzip-encoded POST request or multipart/form-data to my server?
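    Independent of which browsers send what, a hand-made server has to parse whatever Accept-Encoding value arrives, including q-values, before deciding to gzip the response. A minimal Python sketch of that parsing step (the header value is invented for the example):

        def parse_accept_encoding(header: str) -> dict:
            """Turn a value like 'gzip;q=0.8, deflate, *;q=0' into {coding: quality}."""
            prefs = {}
            for part in header.split(","):
                part = part.strip()
                if not part:
                    continue
                coding, _, params = part.partition(";")
                q = 1.0
                for p in params.split(";"):
                    p = p.strip()
                    if p.startswith("q="):
                        q = float(p[2:])
                prefs[coding.strip()] = q
            return prefs

        print(parse_accept_encoding("gzip;q=0.8, deflate, identity;q=0.5, *;q=0"))
        # {'gzip': 0.8, 'deflate': 1.0, 'identity': 0.5, '*': 0.0}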

    Read the article

  • How do you remove invalid hexadecimal characters from an XML-based data source prior to constructing an XmlReader?

    - by Oppositional
    Is there any easy/general way to clean an XML-based data source prior to using it in an XmlReader, so that I can gracefully consume XML data that is non-conformant to the hexadecimal character restrictions placed on XML?

    Note: the solution needs to handle XML data sources that use character encodings other than UTF-8, e.g. by specifying the character encoding at the XML document declaration. Not mangling the character encoding of the source while stripping invalid hexadecimal characters has been a major sticking point. The removal of invalid hexadecimal characters should only remove hexadecimal-encoded values, as you can often find href values in data that happen to contain a string that would be a string match for a hexadecimal character.

    Background: I need to consume an XML-based data source that conforms to a specific format (think Atom or RSS feeds), but want to be able to consume data sources that have been published which contain invalid hexadecimal characters per the XML specification. In .NET, if you have a Stream that represents the XML data source and then attempt to parse it using an XmlReader and/or XPathDocument, an exception is raised due to the inclusion of invalid hexadecimal characters in the XML data. My current attempt to resolve this issue is to parse the Stream as a string and use a regular expression to remove and/or replace the invalid hexadecimal characters, but I am looking for a more performant solution.
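    For the narrow character-stripping step, a Python sketch of the usual approach (it handles literal control characters only; numeric references such as &#x1F; would need a second pattern). Working on decoded text rather than raw bytes is what keeps the source's declared charset intact:

        import re

        # Characters permitted by XML 1.0: #x9, #xA, #xD, #x20-#xD7FF, #xE000-#xFFFD, #x10000-#x10FFFF.
        _INVALID_XML_CHARS = re.compile(
            "[^\u0009\u000a\u000d\u0020-\ud7ff\ue000-\ufffd\U00010000-\U0010ffff]"
        )

        def strip_invalid_xml_chars(text: str) -> str:
            """Remove code points that XML 1.0 forbids from already-decoded text."""
            return _INVALID_XML_CHARS.sub("", text)

        dirty = "ok\x00\x0b<feed>\x1f</feed>"
        print(repr(strip_invalid_xml_chars(dirty)))    # 'ok<feed></feed>'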

    Read the article

  • Freemarker - lack of a special character in the email subject template causes the email content to break

    - by freakman
    I'm fighting with a strange error. I'm using separate Freemarker templates for the mail subject and body. The mail is sent using org.springframework.mail.javamail.JavaMailSender. Only templates that contain some special Swedish character work in my application (yes, you read that right... not the other way around). If I delete it, my email content breaks; it then contains:

        MIME-Version: 1.0
        Content-Type: text/html;charset=UTF-8
        Content-Transfer-Encoding: 7bit
        .. html code here ..

    My freemarker.properties file:

        locale=sv_SE
        classic_compatible=false
        number_format=
        date_format=yyyy-MM-dd
        time_format=HH:mm
        datetime_format=yyyy-MM-dd HH:mm
        output_encoding=UTF-8
        url_escaping_charset=UTF-8
        auto_import=spring.ftl as spring
        auto_include=
        default_encoding=UTF-8
        localized_lookup=true
        strict_syntax=true
        whitespace_stripping=true
        template_update_delay=10

    I've tried to convert the subject file with the dos2unix tool. Using 'file -bi subject.ftl' shows that the encoding is us-ascii; with the special character added, it is utf-8. This is surprisingly strange to me...

    SOLUTION: use :set bomb and save the file in vim.
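    For what it's worth, the posted solution (:set bomb in vim) just prepends a UTF-8 byte order mark, so charset sniffers stop reporting the otherwise-ASCII subject template as us-ascii. The same effect in Python, with a made-up template line:

        # Python's 'utf-8-sig' codec writes the same BOM that vim's ":set bomb" adds.
        with open("subject.ftl", "w", encoding="utf-8-sig") as f:
            f.write("Din order ${orderId}\n")      # hypothetical subject template content

        with open("subject.ftl", "rb") as f:
            print(f.read(3))                       # b'\xef\xbb\xbf' -- the UTF-8 BOM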

    Read the article

  • How to read properties file in Greek using Java

    - by Subhendu Mahanta
    I am trying to read from a properties file which has keys in English and values in Greek. My code is like this:

        public class I18NSample {

            static public void main(String[] args) {
                String language;
                String country;

                if (args.length != 2) {
                    language = new String("el");
                    country = new String("GR");
                } else {
                    language = new String(args[0]);
                    country = new String(args[1]);
                }

                Locale currentLocale;
                ResourceBundle messages;

                currentLocale = new Locale(language, country);
                messages = ResourceBundle.getBundle("MessagesBundle", currentLocale,
                        new CustomClassLoader("E:\\properties"));

                System.out.println(messages.getString("greetings"));
                System.out.println(messages.getString("inquiry"));
                System.out.println(messages.getString("farewell"));
            }
        }

        import java.io.File;
        import java.net.MalformedURLException;
        import java.net.URL;

        public class CustomClassLoader extends ClassLoader {
            private String path;

            public CustomClassLoader(String path) {
                super();
                this.path = path;
            }

            @Override
            protected URL findResource(String name) {
                File f = new File(path + File.separator + name);
                try {
                    return f.toURL();
                } catch (MalformedURLException e) {
                    e.printStackTrace();
                }
                return super.findResource(name);
            }
        }

    MessagesBundle_el_GR.properties:

        greetings=??µ. ?a??et?
        farewell=ep?f. a?t??
        inquiry=t? ???e?s, t? ???ete

    I am compiling like this:

        javac -encoding UTF8 CustomClassLoader.java
        javac -encoding UTF8 I18Sample.java

    When I run this I get garbled output. If the properties file is in English, French or German it works fine. Please help. Regards, Subhendu
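    A Python illustration (not a fix for the Java code) of the most common cause of this symptom: on older JDKs, java.util.Properties reads .properties files as ISO-8859-1, so UTF-8 Greek either has to be converted to \uXXXX escapes (what the native2ascii tool does) or loaded through a reader with an explicit charset:

        greek = "Καλημέρα"                    # sample Greek text standing in for the lost values

        raw = greek.encode("utf-8")           # how an editor typically saves the .properties file

        # Decoding those bytes as ISO-8859-1 (the Properties default) yields mojibake:
        print(raw.decode("iso-8859-1"))       # two Latin-1 characters per Greek letter

        # The escaped form survives an ISO-8859-1 reader, which is why native2ascii exists:
        print(greek.encode("unicode_escape").decode("ascii"))   # \u039a\u03b1\u03bb...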

    Read the article

  • Rip DVD on Linux

    - by becomingGuru
    I want to rip a DVD and store it in a good, compressed format. What software should I use for that, and what is the best format? I want the best quality at a low file size. +1 if I don't have to install any new software (at least nothing that isn't in the repos). +1 if it can be done on the command line.

    Read the article

  • How to rip dvd and arrange files

    - by user23950
    I'm ripping a DVD with DVD Shrink. There are lots of anime episodes on it, and when I open the .vob files it seems that the episodes are not arranged neatly: one .vob file contains multiple episodes, but they are not ordered as ep 1, ep 2, ep 3, etc. Is it possible to edit those files so that the episodes are arranged in order?

    Read the article

  • Grep a strange acirc character

    - by John Hunt
    I have this character appearing in places in some files I have: Â (if you can't see it, or it looks like a question mark, it's the Acirc character, a capital A with a circumflex over it). I simply want to grep for this char and replace it with a space; however, when I do this:

        grep --color -ri Â myproject.php

    PuTTY gets very confused, as does grep. As I understand it, there's probably a way to use an escaped hex code with grep... does anyone know how?

    EDIT: The character is showing up on my web page as a weird <?>. The HTTP headers for the page specify UTF-8, as does the meta character set, and I still see the strange character. In PuTTY it appears as a space (PuTTY is also set to UTF-8). When I copy it from vim and paste into grep, it simply doesn't find it. Cheers, John
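    If grep keeps fighting back, the same cleanup can be done at the byte level; a Python sketch, offered as an alternative to grep rather than an answer to the escaping question ('Â' is U+00C2, the two bytes C3 82 in UTF-8; the file name is the one from the question):

        path = "myproject.php"

        with open(path, "rb") as f:
            data = f.read()

        target = "Â".encode("utf-8")                  # b'\xc3\x82'
        print(data.count(target), "occurrence(s) found")

        with open(path, "wb") as f:
            f.write(data.replace(target, b" "))       # replace each one with a space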

    Read the article

  • How to create 1280x720 music video with static pic

    - by monov
    I want to upload a song to YouTube and put a static 1280x720 picture as the video. I'd like the format to be one of the recommended ones. I tried Windows Movie Maker 2.6, but it only generated a 640x480 video. I also tried Windows Live Movie Maker, but it put a big black margin around my picture (and, inexplicably, produced a video with a slightly lower volume). Do you know any way to do what I need?

    Read the article

  • Why is my ogg bigger than my m4a?

    - by acidzombie24
    I am using:

        ffmpeg.exe -i 0123456789 -ab 192k out.m4a
        ffmpeg.exe -i 0123456789 -f wav - | oggenc2.exe - -r -q 6 -o out.ogg

    (0123456789 has no extension.) My m4a output is 14,608 KB while my ogg output is 19,809 KB. Why? AFAIK -q 6 is roughly 192 kbps, so the two should be about even. I could see one file being 1-3 MB bigger than the other, but 5 MB is pretty large; the m4a is almost 75% of the size of the ogg! That's a lot. Why is this?

    Read the article

  • Finding out if a FLAC or WAVPACK audio file is NOT originally encoded from a lossy source

    - by cornel
    Is there a way of checking that a so-called FLAC or WavPack audio file was originally encoded from a lossless source (WAV, CDA, APE, etc.) instead of a lossy source (MP3, AAC, ATRAC, etc.)? Say I have a lossy MP3 audio file (5.17 MB, 87% compressed from its original, source unknown). I then encode it to a lossless format, say FLAC or WavPack. The size increases (23.14 MB, 39% compressed from its original, source MP3)! The ID tags, etc., remain the same, and there's no obvious way of checking the integrity of its origin. How do I go about doing that?
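    There is no flag inside the container that records a lossy ancestor, so the usual heuristic is spectral: MP3/AAC encoders low-pass the audio (often around 16-20 kHz), and that cutoff survives a lossless re-encode. A rough Python/numpy sketch, assuming the FLAC/WavPack file has first been decoded to a 16-bit PCM WAV with an external decoder (the file name is hypothetical):

        import wave
        import numpy as np

        with wave.open("decoded.wav", "rb") as w:      # hypothetical decoded copy of the FLAC
            rate = w.getframerate()
            nchan = w.getnchannels()
            frames = w.readframes(w.getnframes())

        samples = np.frombuffer(frames, dtype=np.int16).astype(np.float64)
        samples = samples[::nchan]                     # keep one channel of the interleaved PCM

        spectrum = np.abs(np.fft.rfft(samples))
        freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

        share = spectrum[freqs > 16000].sum() / spectrum.sum()
        print("energy above 16 kHz: %.4f%%" % (100 * share))
        # A near-zero share up here is the classic fingerprint of a lossy ancestor; genuine
        # lossless rips usually keep measurable energy well past 16 kHz.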

    Read the article
