Search Results

Search found 5333 results on 214 pages for 'chunked encoding'.

Page 11/214 | < Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >

  • Eclipse Encoding MacRoman -> UTF8

    - by wishi_
    Hi! I recently created a project, organized it and well... I used my Mac with Eclipse running. Somehow it stored everything in MacRoman. The project has to be UTF8. Is there any easy way to handle the conversions?
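
    Eclipse's encoding settings (Project > Properties > Resource > Text file encoding, or Window > Preferences > General > Workspace) only change how files are read from now on; bytes already saved as MacRoman still have to be converted once. A minimal Java 8 sketch of a one-off converter, assuming a JDK that ships the x-MacRoman charset (the Sun/Oracle JDKs do); run it on a copy of the project and adjust the file filter as needed:

        import java.io.IOException;
        import java.nio.charset.Charset;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.nio.file.Paths;

        public class MacRomanToUtf8 {

            public static void main(String[] args) throws IOException {
                Path root = Paths.get(args[0]);                        // project directory to convert
                Files.walk(root)
                     .filter(p -> p.toString().endsWith(".java"))      // adjust the filter to taste
                     .forEach(MacRomanToUtf8::convert);
            }

            static void convert(Path file) {
                try {
                    // read with the old charset, rewrite the same file as UTF-8
                    String text = new String(Files.readAllBytes(file), Charset.forName("x-MacRoman"));
                    Files.write(file, text.getBytes(Charset.forName("UTF-8")));
                    System.out.println("converted " + file);
                } catch (IOException e) {
                    System.err.println("failed on " + file + ": " + e);
                }
            }
        }

    After converting, switch the project's text file encoding to UTF-8 so newly saved files stay consistent.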

    Read the article

  • req.getParameter returns values in the wrong character encoding

    - by coder247
    I'm trying to get values that include ü, é, à, etc. from a JSP using getParameter, but I get wrong values in the servlet. I checked the content type with Firebug and found Content-Type: text/html;charset=UTF-8, and the POST section in Firebug shows the correct value; but when I access it in the servlet the value is mangled and the ö does not survive. req.getCharacterEncoding() returns null. I tried setting req.setCharacterEncoding("UTF-8") at the beginning of the servlet, but it didn't help.
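
    A frequent cause of this is that the POST body gets parsed with the container default (ISO-8859-1) before req.setCharacterEncoding("UTF-8") is reached, for instance because some earlier code already called getParameter. One way to rule that out is a filter mapped to /* that sets the encoding before anything touches the parameters; a minimal sketch (note that for parameters in the URL the connector itself must also be configured, e.g. URIEncoding="UTF-8" on a Tomcat connector, with other containers having their own equivalent):

        import java.io.IOException;
        import javax.servlet.Filter;
        import javax.servlet.FilterChain;
        import javax.servlet.FilterConfig;
        import javax.servlet.ServletException;
        import javax.servlet.ServletRequest;
        import javax.servlet.ServletResponse;

        // Runs before any servlet code, so the encoding is in place before
        // the first getParameter() call parses the POST body.
        public class Utf8EncodingFilter implements Filter {

            public void init(FilterConfig config) {
            }

            public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
                    throws IOException, ServletException {
                if (req.getCharacterEncoding() == null) {
                    req.setCharacterEncoding("UTF-8");
                }
                res.setCharacterEncoding("UTF-8");
                chain.doFilter(req, res);
            }

            public void destroy() {
            }
        }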

    Read the article

  • X264 encoding using Opencv

    - by user573193
    I am working with a high-resolution camera: 4008x2672. I am writing a simple program which grabs frames from the camera and writes them to an AVI file. For such a high resolution, the only codec I found that could do the trick was x264 (suggestions welcome). I am using OpenCV for most of the image handling. As mentioned in this post http://doom10.org/index.php?topic=1019.0 , I modified the AVCodecContext members as per the ffmpeg presets for libx264 (I had to do this to avoid the broken ffmpeg default settings error). This is the output I get when I try to run the program:

        [libx264 @ 0x992d040]non-strictly-monotonic PTS
        1294846981.526675 1 0    // timestamp  camera_no  frame_no
        1294846981.621101 1 1
        1294846981.715521 1 2
        1294846981.809939 1 3
        1294846981.904360 1 4
        1294846981.998782 1 5
        1294846982.093203 1 6
        Last message repeated 7 times
        [avi @ 0x992beb0]st:0 error, non monotone timestamps -614891469123651720 >= -614891469123651720
        OpenCV Error: Unspecified error (Error while writing video frame) in icv_av_write_frame_FFMPEG,
            file /home/ajoshi/ext/OpenCV-2.2.0/modules/highgui/src/cap_ffmpeg.cpp, line 1034
        terminate called after throwing an instance of 'cv::Exception'
            what(): /home/ajoshi/ext/OpenCV-2.2.0/modules/highgui/src/cap_ffmpeg.cpp:1034: error: (-2) Error while writing video frame in function icv_av_write_frame_FFMPEG
        Aborted

    The modifications to the AVCodecContext are:

        if (codec_id == CODEC_ID_H264)
        {
            //fprintf(stderr, "Trying to parse a preset file for libx264\n");
            // Setting values manually from the medium preset
            c->me_method = 7;
            c->qcompress = 0.6;
            c->qmin = 10;
            c->qmax = 51;
            c->max_qdiff = 4;
            c->i_quant_factor = 0.71;
            c->max_b_frames = 3;
            c->b_frame_strategy = 1;
            c->me_range = 16;
            c->me_subpel_quality = 7;
            c->coder_type = 1;
            c->scenechange_threshold = 40;
            c->partitions = X264_PART_I8X8 | X264_PART_I4X4 | X264_PART_P8X8 | X264_PART_B8X8;
            c->flags = CODEC_FLAG_LOOP_FILTER;
            c->flags2 = CODEC_FLAG2_BPYRAMID | CODEC_FLAG2_MIXED_REFS | CODEC_FLAG2_WPRED | CODEC_FLAG2_8X8DCT | CODEC_FLAG2_FASTPSKIP;
            c->keyint_min = 25;
            c->refs = 3;
            c->trellis = 1;
            c->directpred = 1;
            c->weighted_p_pred = 2;
        }

    I am probably not setting the dts and pts values, which I believed ffmpeg would set for me. Any suggestions welcome. Thanks in advance.

    Read the article

  • String Encoding doesn't output all characters

    - by AndroidXTr3meN
    My client uses InputStreamReader/BufferedReader to fetch text from the Internet. However, when I save the text to a *.txt file, the text shows extra weird special symbols like 'Â'. I've tried converting the String to ASCII, but that messes up å, ä, ö, Ø, which I use. I've tried food = food.replace("Â", ""); and indexOf(), but the string won't find it, even though it's there in a hex editor. So, in summary: when I use text.setText() (Android), the output looks fine with no weird symbols, but when I save the text to *.txt I get about four 'Â'. I do not want ASCII because I use other non-ASCII characters. The 'Â' is displayed as whitespace on my Android and in Notepad. Thanks! Have a great weekend! EDIT: found:   in WordPad
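
    The stray Â is very often a non-breaking space: U+00A0 encoded as UTF-8 is the byte pair C2 A0, and any tool that assumes Latin-1/windows-1252 shows C2 as Â while A0 renders as a blank, which would also explain why it looks like whitespace in Notepad and on the device. If that is what is happening here, the cure is to read and write with explicit, matching charsets rather than the platform defaults (and to normalise U+00A0 if it is unwanted). A small sketch, assuming the feed really is UTF-8:

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.io.InputStream;
        import java.io.InputStreamReader;
        import java.io.OutputStream;
        import java.io.OutputStreamWriter;
        import java.io.Writer;

        public class SaveFetchedText {

            // Read the network stream and write the file with explicit, matching
            // charsets instead of relying on the platform defaults.
            public static void save(InputStream in, OutputStream out) throws IOException {
                BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
                Writer writer = new OutputStreamWriter(out, "UTF-8");
                int c;
                while ((c = reader.read()) != -1) {
                    if (c == '\u00A0') {
                        c = ' ';   // replace the non-breaking space that shows up as "Â "
                    }
                    writer.write(c);
                }
                writer.flush();   // caller owns and closes both streams
            }
        }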

    Read the article

  • ASP.NET 3.5 Page character encoding problem

    - by Dorian McHensie
    Hello everyone. I have a problem in my ASP.NET 3.5 application (C#): when I render characters like 'è' in my pages they come out in a very strange manner (if I'm lucky I get a ? mark in the web page). In fact Expression Web, when I open the web site, substitutes the è character with �... How can I tell ASP.NET that I want to use a particular charset, so that I can write letters like è in the HTML source without using hexadecimal codes? I tried adding a setting inside the system.web section of web.config, but nothing worked. Can anyone tell me what to do? Thanks in advance.

    Read the article

  • Reliable email encoding.

    - by Seba Illingworth
    What are the most reliable encodings for sending email? I had some problems recently with .NET's System.Net.Mail default of quoted-printable ('=0D=0A' scattered throughout the message), so I changed to iso-8859-1 for the body (set via alternate views) and 7bit for the transfer encoding (and base64 for embedded resources). Are there better choices?

    Read the article

  • [php] mysql and encoding

    - by user339865
    I moved my PHP application to a new server. I use a MySQL 5 database. When I'm updating or inserting something into the DB, every " and - sign is changed to ?. I use SET NAMES UTF8 and SET CHARACTER SET, but it doesn't work. Any ideas?

    Read the article

  • Character encoding issues?

    - by Santosh
    We have a CLOB column in the DB. When we extract this CLOB and try to display it (as plain text, not HTML), it prints some junk characters on the HTML screen. The character, when streamed directly to a file, looks like ” (not the usual double quote on a regular keyboard). One more observation: System.out.println("”".getBytes()[0]); prints -108. Why should a character's byte value be negative? Is there any way to display it correctly on an HTML screen?
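
    The numbers reported are consistent with the default platform charset being windows-1252: the curly quote ” is U+201D, which windows-1252 encodes as the single byte 0x94, and since Java bytes are signed, 0x94 prints as 148 - 256 = -108. Nothing is corrupt; it just is not the ASCII double quote. For HTML output the usual options are to serve the page as UTF-8 or to escape the character as a numeric entity. A small sketch of both points, assuming a windows-1252 default locale:

        public class SmartQuoteDemo {

            public static void main(String[] args) throws Exception {
                String s = "\u201D";   // ” is U+201D, a curly closing quote

                // windows-1252 encodes U+201D as the single byte 0x94 (148);
                // Java bytes are signed, so it prints as 148 - 256 = -108.
                byte b = s.getBytes("windows-1252")[0];
                System.out.println(b);           // -108
                System.out.println(b & 0xFF);    // 148, the unsigned value

                // For HTML output, either serve the page as UTF-8 or escape it:
                System.out.println("&#" + (int) s.charAt(0) + ";");   // &#8221;
            }
        }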

    Read the article

  • JSF - database character encoding

    - by wheelie
    Hi there, I have a Java web application using GlassFish 3, JSF 2.0 (Facelets) and JPA (EclipseLink). The problem I'm facing is that when I save entities to the database with the update() method, String data loses integrity: '?' is shown instead of some characters. The server, the pages and the database are all configured to use UTF-8. After I post form data, the next page shows the data correctly, and in debug the String property of the current entity seems to hold the correct value too (though I don't know whether the NetBeans debugger can be trusted here; it might display the value correctly even though what gets stored is wrong). Any help would be appreciated, thanks in advance! Daniel
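
    If the database behind EclipseLink is MySQL (an assumption; the post does not say), a common culprit is the JDBC connection itself negotiating Latin-1, so the '?' substitution happens on the wire even though the pages and tables are UTF-8. The fix is to put the charset on the connection; with a GlassFish connection pool the same two properties can be added to the pool's additional properties. A plain-JDBC sketch just to show the URL parameters (host, schema and credentials are hypothetical):

        import java.sql.Connection;
        import java.sql.DriverManager;

        public class Utf8ConnectionCheck {

            public static void main(String[] args) throws Exception {
                // Hypothetical host, schema and credentials; the two URL parameters are the point.
                String url = "jdbc:mysql://localhost:3306/mydb"
                           + "?useUnicode=true&characterEncoding=UTF-8";
                Connection con = DriverManager.getConnection(url, "appuser", "secret");
                try {
                    System.out.println("connected with UTF-8 transport");
                } finally {
                    con.close();
                }
            }
        }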

    Read the article

  • Webservice parameter value encoding

    - by dnolan
    We have a webservice created in WCF and presenting itself as a basicHttpBinding. One of the parameters is a string which takes an XML string. Looking at the SOAP the client generates to send to the webservice, the XML is encoded, with all < and > swapped into &lt; and &gt;. My question is, is this all that is encoded, or has the parameter been run through HtmlEncode so that other entities would be swapped also? The reason I ask is we are submitting with our client to a 3rd party webservice now and they would like to know the details on what is encoded.

    Read the article

  • PHP utf8 encoding problem

    - by shyam
    How can I encode strings in UTF-16BE format in PHP? For "Demo Message!!!" the encoded string should be '00440065006D006F0020004D00650073007300610067006'. Also, I need to encode Arabic characters to this format.
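
    The question asks for PHP, but the transformation itself is language-neutral: encode the string as UTF-16BE and print each byte as two hex digits; in PHP, bin2hex(mb_convert_encoding($s, 'UTF-16BE', 'UTF-8')) should give the same bytes in lowercase hex. A sketch of the idea in Java for illustration; it covers Arabic as well, since every BMP character is a single 16-bit code unit:

        public class Utf16BeHex {

            public static String toUtf16BeHex(String s) throws Exception {
                StringBuilder hex = new StringBuilder();
                for (byte b : s.getBytes("UTF-16BE")) {
                    hex.append(String.format("%02X", b & 0xFF));   // two hex digits per byte
                }
                return hex.toString();
            }

            public static void main(String[] args) throws Exception {
                // prints 00440065006D006F0020004D006500730073006100670065002100210021
                System.out.println(toUtf16BeHex("Demo Message!!!"));
            }
        }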

    Read the article

  • How is this website fixing the encoding?

    - by Tal Galili
    Hi all, I am trying to turn this text: ×וויר. העתיד של רשתות חברתיות והתקשורת ×©×œ× ×• into this text: ?????. ????? ?? ????? ??????? ???????? ???? Somehow this website: http://www.pixiesoft.com/flip/ can do it, and I would like to know how I might be able to do it myself (with whatever programming language or software). Just saving the file as UTF-8 won't do it. My motivation for this question is that I have a friend's exported XML file with the garbled text, which I want to turn into a corrected Hebrew text file. The XML export was originally garbled by MySQL imports and exports, but I don't have the information needed to fix it or trace back the problem. Thanks.
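
    The garbled sample looks like UTF-8 Hebrew that was decoded with a single-byte charset (× is what the UTF-8 lead byte 0xD7 becomes under windows-1252/Latin-1). If that is what happened, the repair is to reverse the wrong decode: re-encode the garbled string with the charset it was wrongly read as, then decode those bytes as UTF-8. A sketch, assuming windows-1252 was the wrong charset; the round trip is not guaranteed to be lossless for every byte, so keep the original file:

        public class MojibakeFix {

            public static String fix(String garbled) throws java.io.UnsupportedEncodingException {
                // Re-encode with the charset the text was wrongly decoded as,
                // then decode those bytes as what they really are: UTF-8.
                byte[] originalUtf8Bytes = garbled.getBytes("windows-1252");
                return new String(originalUtf8Bytes, "UTF-8");
            }
        }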

    Read the article

  • Java - Image encoding in XML

    - by Hoopla
    Hi everyone, I thought I would find a solution to this problem relatively easily, but here I am calling upon the help from ye gods to pull me out of this conundrum. So, I've got an image and I want to store it in an XML document using Java. I have previously achieved this in VisualBasic by saving the image to a stream, converting the stream to an array, and then VB's xml class was able to encode the array as a base64 string. But, after a couple of hours of scouring the net for an equivalent solution in Java, I've come back empty handed. The only success I have had has been by:

        import it.sauronsoftware.base64.*;
        import java.awt.image.BufferedImage;
        import org.w3c.dom.*;
        ...
        BufferedImage img;
        Element node;
        ...
        java.io.ByteArrayOutputStream os = new java.io.ByteArrayOutputStream();
        ImageIO.write(img, "png", os);
        byte[] array = Base64.encode(os.toByteArray());
        String ss = arrayToString(array, ",");
        node.setTextContent(ss);
        ...
        private static String arrayToString(byte[] a, String separator) {
            StringBuffer result = new StringBuffer();
            if (a.length > 0) {
                result.append(a[0]);
                for (int i = 1; i < a.length; i++) {
                    result.append(separator);
                    result.append(a[i]);
                }
            }
            return result.toString();
        }

    Which is okay I guess, but reversing the process to get it back to an image when I load the XML file has proved impossible. If anyone has a better way to encode/decode an image in an XML file, please step forward, even if it's just a link to another thread that would be fine. Cheers in advance, Hoopla.
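
    The round trip gets much easier if the element stores the Base64 text itself instead of a comma-separated list of byte values; then decoding is just Base64-to-bytes followed by ImageIO.read. A sketch using javax.xml.bind.DatatypeConverter (in the JDK since Java 6), which also avoids the external Base64 dependency; the names here are illustrative:

        import java.awt.image.BufferedImage;
        import java.io.ByteArrayInputStream;
        import java.io.ByteArrayOutputStream;
        import java.io.IOException;
        import javax.imageio.ImageIO;
        import javax.xml.bind.DatatypeConverter;
        import org.w3c.dom.Element;

        public class ImageXmlCodec {

            // encode: image -> PNG bytes -> Base64 text stored in the element
            static void write(BufferedImage img, Element node) throws IOException {
                ByteArrayOutputStream os = new ByteArrayOutputStream();
                ImageIO.write(img, "png", os);
                node.setTextContent(DatatypeConverter.printBase64Binary(os.toByteArray()));
            }

            // decode: Base64 text -> PNG bytes -> image
            static BufferedImage read(Element node) throws IOException {
                byte[] png = DatatypeConverter.parseBase64Binary(node.getTextContent());
                return ImageIO.read(new ByteArrayInputStream(png));
            }
        }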

    Read the article

  • android: getting rid of "warning: unmappable character for encoding ascii"

    - by Lo'oris
    I'm compiling using the Android tools without Eclipse; I compile by launching "ant debug" from the command line. I have found many instructions around the web about how to remove this annoying warning, but I haven't been able to make any of them work. I've tried the -D option, I've tried randomly tweaking the build.* files, I've tried exporting an environment variable... nothing. I guess some of these methods just don't work, and some others would work but I've been applying them incorrectly. Anything is possible and I can't stand it any more: any advice on how to do it?

    Read the article

  • Encoding Video on the iPhone

    - by Dave
    I'm trying to record the contents of the iPhone screen to video; in the app I'm working on, users are able to create a little animation. I'm just not sure how to encode the screen contents/animation to video. The problem with using something like ffmpeg is that it's LGPL, which can lead to licensing issues; is there a better option?

    Read the article

  • Funny characters in my db

    - by hdx
    My web app is breaking when I try to edit a certain content type, and I'm pretty sure it is because of some weird characters in my database. So when I do:

        SELECT body FROM message WHERE id = 666

    it returns:

        <p>⢠<span></span></p><p><br /></p><p><em><strong>NOTE:</strong> Please remember to use your to participate in the discussion.</em></p>

    However, when I try to count how many documents have those characters, Postgres complains:

        foo_450_prod=# SELECT COUNT(*) FROM message WHERE body LIKE '%â¢%';
        ERROR: invalid byte sequence for encoding "UTF8": 0xe2a225
        HINT: This error can also happen if the byte sequence does not match the encodi

    Does anybody know what the issue is and how I can query for those funny characters? Thanks in advance!

    Read the article

  • Git Shell in Windows: patch's default character encoding is UCS-2 Little Endian - how to change this to ANSI or UTF-8 without BOM?

    - by Sk8erPeter
    When creating a diff patch with Git Shell in Windows (when using GitHub for Windows), the character encoding of the patch will be UCS-2 Little Endian according to Notepad++ (see the screenshots below). How can I change this behavior, and force git to create patches with ANSI or UTF-8 without BOM character encoding? It causes a problem because UCS-2 Little Endian encoded patches can not be applied, I have to manually convert it to ANSI.

    Read the article

  • Win32: What is the status of chunked encoding support in WinHttpReadData?

    - by Cheeso
    The documentation for WinHttpReadData says, regarding HTTP's chunked transfer coding: Starting in Windows Vista and Windows Server 2008, WinHttp enables applications to perform chunked transfer encoding on data sent to the server. When the Transfer-Encoding header is present on the WinHttp response, WinHttpReadData strips the chunking information before giving the data to the application. Can anyone decipher this? Q1 First, this text is on the page for WinHttpReadData, which is used to ... read data within an HTTP client application, specifically the response data. So what does it mean when it says Starting in Windows Vista and Windows Server 2008, WinHttp enables applications to perform chunked transfer encoding on data sent to the server. The WinHttpReadData function isn't used with data being sent to the server. It is used when reading data from the server. Consulting the doc for the WinHttpWriteData function, which is used to send data to the server as part of an HTTP request, there is no mention of the chunked transfer capability. Q2 Supposing that I figure out just what the newish chunked transfer support amounts to, how do I get that support? It says that it is new on Vista and WS2008. What happens if I write an app that runs on WS2003, and uses WinHttpReadData and it encounters a chunked response, or WinHttpWriteData, and it wants to send a chunked request? Between the lines, is this documentation saying that I need to link against the Vista-era Windows SDK, or later, in order to get the capability to do chunked encoding? Or is it really impossible on WS2003?, in other words it is the case that the app doing chunked transfer using this library must run on the OS specified? This might read like a rant, but it's not. I truly want to know.
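
    In HTTP's chunked transfer coding, the body arrives as hex-length chunks, each followed by CRLF, and ends with a zero-length chunk; what WinHttpReadData hands the application is the payload with that framing removed. A minimal example of a chunked response on the wire (every line below ends in CRLF):

        HTTP/1.1 200 OK
        Transfer-Encoding: chunked

        1a
        abcdefghijklmnopqrstuvwxyz
        10
        1234567890abcdef
        0

    The caller reads abcdefghijklmnopqrstuvwxyz1234567890abcdef (42 bytes) and never sees the chunk sizes 1a and 10 (hex for 26 and 16).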

    Read the article

  • Visual Studio 2008 project file does not load because of an unexpected encoding change.

    - by Xenan
    In our team we have a database project in Visual Studio 2008 which is under source control in Team Foundation Server. Every two weeks or so, after one co-worker checks in, the project file won't load on the other developers' machines. The error message is: The project file could not be loaded. Data at the root level is invalid. Line 1, position 1. When I look at the project file in Notepad++, the file looks like this: ??<NUL?NULxNULmNULlNUL NULvNULeNULrNULsNULiNULoNULnNUL ... and so on (you can see <?xml version in this) whereas a normal project file looks like: <?xml version="1.0" encoding="utf-16"?> ... So probably something is wrong with the encoding of the file. This is a problem for us because it turns out to be impossible to get the file encoding correct again; the 'solution' is to throw away the project file and get the last known working version from source control. According to the file, the encoding should be UTF-16; according to Notepad++, the corrupted file is actually UTF-8. My questions are: Why is Visual Studio messing up the encoding of the project file, apparently at random times and on random machines? What should we do to prevent this? When it has happened, is there a way to restore the current file in the correct encoding instead of pulling an older version from source control? As a last note: the problem is with one single project file; none of the other project files exhibit this problem.

    Read the article

  • youtube - video upload failure - unable to convert file - encoding the video wrong?

    - by Anthony
    I am using .NET to create a video uploading application. Although it's communicating with YouTube and uploading the file, the processing of that file fails. YouTube gives me the error message, "Upload failed (unable to convert video file)." This supposedly means that "your video is in a format that our converters don't recognize..." I have made attempts with two different videos, both of which upload and process fine when I do it manually. So I suspect that my code is a.) not encoding the video properly and/or b.) not sending my API request properly. Below is how I am constructing my API PUT request and encoding the video. Any suggestions on what the error could be would be appreciated. Thanks.

    P.S. I'm not using the client library because my application will use the resumable upload feature. Thus, I am manually constructing my API requests.

    Documentation: http://code.google.com/intl/ja/apis/youtube/2.0/developers_guide_protocol_resumable_uploads.html#Uploading_the_Video_File

    Code:

        // new PUT request for sending video
        WebRequest putRequest = WebRequest.Create(uploadURL);

        // set properties
        putRequest.Method = "PUT";
        putRequest.ContentType = getMIME(file); // the MIME type of the uploaded video file

        // encode video
        byte[] videoInBytes = encodeVideo(file);

        public static byte[] encodeVideo(string video)
        {
            try
            {
                byte[] fileInBytes = File.ReadAllBytes(video);
                Console.WriteLine("\nSize of byte array containing " + video + ": " + fileInBytes.Length);
                return fileInBytes;
            }
            catch (Exception e)
            {
                Console.WriteLine("\nException: " + e.Message + "\nReturning an empty byte array");
                byte[] empty = new byte[0];
                return empty;
            }
        } // encodeVideo

        // encode custom headers in a byte array
        byte[] PUTbytes = encode(putRequest.Headers.ToString());

        public static byte[] encode(string headers)
        {
            ASCIIEncoding encoding = new ASCIIEncoding();
            byte[] bytes = encoding.GetBytes(headers);
            return bytes;
        } // encode

        // entire request contains headers + binary video data
        putRequest.ContentLength = PUTbytes.Length + videoInBytes.Length;

        // send request - correct?
        sendRequest(putRequest, PUTbytes);
        sendRequest(putRequest, videoInBytes);

        public static void sendRequest(WebRequest request, byte[] encoding)
        {
            // The GetRequestStream method returns a stream to use to send data for the HttpWebRequest.
            Stream stream = request.GetRequestStream();
            try
            {
                stream.Write(encoding, 0, encoding.Length);
            }
            catch (Exception e)
            {
                Console.WriteLine("\nException writing stream: " + e.Message);
            }
        } // sendRequest

    Read the article

  • I need help converting a C# string from one character encoding to another?

    - by Handleman
    According to Spolsky I can't call myself a developer, so there is a lot of shame behind this question... Scenario: from a C# application, I would like to take a string value from a SQL db and use it as the name of a directory. I have a secure (SSL) FTP server on which I want to set the current directory using the string value from the DB. Problem: everything is working fine until I hit a string value with a "special" character; I seem unable to encode the directory name correctly to satisfy the FTP server. The code example below:

    - uses "special" character é as an example
    - uses WinSCP as an external application for the ftps comms
    - does not show all the code required to set up the Process "_winscp"
    - sends commands to the WinSCP exe by writing to the process standard input
    - for simplicity, does not get the info from the DB, but instead simply declares a string (but I did do a .Equals to confirm that the value from the DB is the same as the declared string)
    - makes three attempts to set the current directory on the FTP server using different string encodings, all of which fail
    - makes an attempt to set the directory using a string that was created from a hand-crafted byte array, which works

        Process _winscp = new Process();
        byte[] buffer;
        string nameFromString = "Sinéad O'Connor";

        _winscp.StandardInput.WriteLine("cd \"" + nameFromString + "\"");

        buffer = Encoding.UTF8.GetBytes(nameFromString);
        _winscp.StandardInput.WriteLine("cd \"" + Encoding.UTF8.GetString(buffer) + "\"");

        buffer = Encoding.ASCII.GetBytes(nameFromString);
        _winscp.StandardInput.WriteLine("cd \"" + Encoding.ASCII.GetString(buffer) + "\"");

        byte[] nameFromBytes = new byte[] { 83, 105, 110, 130, 97, 100, 32, 79, 39, 67, 111, 110, 110, 111, 114 };
        _winscp.StandardInput.WriteLine("cd \"" + Encoding.Default.GetString(nameFromBytes) + "\"");

    The UTF8 encoding changes é to 101 (decimal) but the FTP server doesn't like it. The ASCII encoding changes é to 63 (decimal) but the FTP server doesn't like it. When I represent é as value 130 (decimal) the FTP server is happy, except I can't find a method that will do this for me (I had to manually construct the string from explicit bytes). Anyone know what I should do to my string to encode the é as 130, make the FTP server happy, and finally elevate me to level 1 developer by explaining the only single thing a developer should understand?

    Read the article

  • Tip: Replacing Html.Encode Calls With New Html Encoding Syntax

    Like the well-disciplined secure developer that you are, when you built your ASP.NET MVC 1.0 application, you remembered to call Html.Encode every time you output a value that came from user input. Didn't you? Well, in ASP.NET MVC 2 running on ASP.NET 4, those calls can be replaced with the new HTML encoding syntax (aka the code nugget). I've written a three-part series on the topic: Html Encoding Code Blocks With ASP.NET 4; Html Encoding Nuggets With ASP.NET MVC 2; Using AntiXss as the default...

    Read the article

  • Video Encoding library for C++ game

    - by Paulo Pinto
    I'm looking for a video encoding library in C++ that I can use to record game footage. It cannot be an external application like Fraps; it must be a library. Ideally the encoding can be done in real time without affecting game performance too much, although this is not a must-have requirement. Another preference is that the video file saved from the game is already compressed and ready to be used by most video players without any further processing. I realize that this might not be possible, especially for real-time encoding, so I would accept a trade-off of having to process the file later for better compression and/or a better file format. I'd like to hear about your experience integrating the library into a game if possible, and any interesting trade-offs you had to make. Some libraries support more than one file format or codec, so advice on the file format would also be appreciated.

    Read the article

< Previous Page | 7 8 9 10 11 12 13 14 15 16 17 18  | Next Page >