Search Results

Search found 25009 results on 1001 pages for 'content encoding'.


  • Cannot write log file 'ffmpeg2pass-0.log' for pass-1 encoding: Permission denied

    - by matt_tm
    Our PHP application is installed as 'root' on a Redhat5/CentOS system at /var/www/html/beta/. After disabling SELinux in order to allow these scripts to execute other programs on the system (see http://serverfault.com/questions/192951/what-permissions-are-needed-to-run-a-system-command-within-a-php-script-that-wr), the Apache error_log showed this error: Cannot write log file 'ffmpeg2pass-0.log' for pass-1 encoding: Permission denied
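
    One common way around this (a hedged sketch, not taken from the linked thread; paths and codec settings are placeholders) is to point ffmpeg's pass log at a directory the web-server user can write to via -passlogfile, instead of letting it default to the script's working directory:

        # Sketch: run a two-pass encode with the pass log in a writable temp directory,
        # so the Apache user does not need write access to the web root.
        # Paths are hypothetical; the real codec options from the original job are omitted.
        import os
        import subprocess
        import tempfile

        input_file = "/var/www/html/beta/uploads/input.avi"    # placeholder
        output_file = "/var/www/html/beta/uploads/output.mp4"  # placeholder

        log_dir = tempfile.mkdtemp(prefix="ffmpeg2pass-")
        log_prefix = os.path.join(log_dir, "ffmpeg2pass")

        # Pass 1: analysis only, discard the encoded output.
        subprocess.run(
            ["ffmpeg", "-y", "-i", input_file, "-pass", "1",
             "-passlogfile", log_prefix, "-an", "-f", "null", os.devnull],
            check=True,
        )

        # Pass 2: the real encode, reusing the same log prefix.
        subprocess.run(
            ["ffmpeg", "-y", "-i", input_file, "-pass", "2",
             "-passlogfile", log_prefix, output_file],
            check=True,
        )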

    Read the article

  • HTTP Data chunks over multiple packets?

    - by myforwik
    What is the correct way for an HTTP server to send data over multiple packets? For example, I want to transfer a file; the first packet I send is:

        HTTP/1.1 200 OK
        Content-type: application/force-download
        Content-Type: application/download
        Content-Type: application/octet-stream
        Content-Description: File Transfer
        Content-disposition: attachment; filename=test.dat
        Content-Transfer-Encoding: chunked

        400
        <first 1024 bytes here>
        400
        <next 1024 bytes here>
        400
        <next 1024 bytes here>

    Now I need to make a new packet. If I just send:

        400
        <next 1024 bytes here>

    all the clients close their connections on me and the files are cut short. What headers do I put in a second packet to continue the data stream?
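
    For reference, the framing HTTP/1.1 defines for this is enabled by the Transfer-Encoding: chunked header (not Content-Transfer-Encoding): each chunk is its size in hex, a CRLF, the data, and another CRLF, and the body ends with a zero-size chunk. The headers are sent only once, not per packet. A minimal sketch of building such a response (illustrative Python, not the asker's server code):

        # Sketch of HTTP/1.1 chunked framing: "<hex size>\r\n<data>\r\n" per chunk,
        # terminated by "0\r\n\r\n". No further headers are sent between chunks.
        def chunked_body(payload: bytes, chunk_size: int = 1024) -> bytes:
            out = bytearray()
            for i in range(0, len(payload), chunk_size):
                chunk = payload[i:i + chunk_size]
                out += f"{len(chunk):X}\r\n".encode("ascii")  # e.g. 1024 -> "400"
                out += chunk + b"\r\n"
            out += b"0\r\n\r\n"  # zero-size chunk ends the body
            return bytes(out)

        headers = (
            b"HTTP/1.1 200 OK\r\n"
            b"Content-Type: application/octet-stream\r\n"
            b"Content-Disposition: attachment; filename=test.dat\r\n"
            b"Transfer-Encoding: chunked\r\n"
            b"\r\n"
        )
        response = headers + chunked_body(b"x" * 3000)  # three chunks, then the terminator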

    Read the article

  • windows dvd maker encoding

    - by Greg Rains
    My Windows DVD Maker stops/freezes the encoding process on some movies that I have converted to AVI files. Is there an answer to this using Windows DVD Maker? Or is there another software product that is better? (using Windows 7 Professional)

    Read the article

  • Keep ASP.NET site and content separate

    - by Nelson
    I have an ASP.NET site in folder x. Currently lots of other static content gets added to folder x and gets mixed in, making it one big mess. I would like to keep the ASP.NET site and the content separate somehow. I know you can create virtual directories in IIS, but there are LOTS and even some content in the root. The content people are not technical and really need an easy way to add it. I would stick everything in a subfolder (they don't touch anything outside, I don't touch their folder), but that would change their URLs (www.example.com/something to www.example.com/content/something). I almost need a way to "merge" two folders and have them act as one. I'm guessing that is impossible since there could be file conflicts, etc. Any other ways I can achieve this?

    Read the article

  • What could cause the file command in Linux to report a text file as data?

    - by Jonah Bishop
    I have a couple of C++ source files (one .cpp and one .h) that are being reported as type data by the file command in Linux. When I run the file -bi command against these files, I'm given this output (same output for each file): application/octet-stream; charset=binary Each file is clearly plain-text (I can view them in vi). What's causing file to misreport the type of these files? Could it be some sort of Unicode thing? Both of these files were created in Windows-land (using Visual Studio 2005), but they're being compiled in Linux (it's a cross-platform application). Any ideas would be appreciated. Update: I don't see any null characters in either file. I found some extended characters in the .cpp file (in a comment block), removed them, but file still reports the same encoding. I've tried forcing the encoding in SlickEdit, but that didn't seem to have an effect. When I open the file in vim, I see a [converted] line as soon as I open the file. Perhaps I can get vim to force the encoding?
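
    One way to see what file is objecting to (an illustrative sketch, not from the thread): scan the file for control bytes, a UTF-16 byte-order mark, or byte sequences that are not valid UTF-8, which are the usual reasons a source file gets classified as data/octet-stream:

        # Sketch: report the bytes that typically make `file` fall back to "data".
        import sys

        def inspect(path: str) -> None:
            with open(path, "rb") as f:
                data = f.read()
            if data[:2] in (b"\xff\xfe", b"\xfe\xff"):
                print("UTF-16 byte-order mark at the start of the file")
            for offset, byte in enumerate(data):
                # Tab, LF and CR are normal in text; other control bytes are suspicious.
                if byte < 0x20 and byte not in (0x09, 0x0A, 0x0D):
                    print(f"control byte 0x{byte:02x} at offset {offset}")
            try:
                data.decode("utf-8")
            except UnicodeDecodeError as err:
                print(f"not valid UTF-8: {err}")

        if __name__ == "__main__":
            inspect(sys.argv[1])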

    Read the article

  • Windows Server NTFS volume list file name encodings and any illegal file names

    - by benbradley
    I'm having to deal with a Windows Server (NTFS) file server and our backup application appears to be failing with certain files. According to this https://en.wikipedia.org/wiki/NTFS#Internals NTFS apparently supports file names encoded in UTF-16 but according to their support team, our backup application only supports UTF-8. I'd like to confirm whether this is actually the problem by seeing the file name encoding for myself. The files that are failing appear to be using plain English A-Z letters and other ASCII characters. No accents or non-English letters etc. I suppose even though the letters appear to be plain A-Z the file name could still be encoded in UTF-16. Does anyone know of a utility or script that can recursively go through all files in a directory and show the encoding of the file name? Then I could try renaming to UTF-8 to see if the backup can proceed. I'm not a Windows developer so can't write this up myself. Presumably the encoding of the file name should be stored in the FS somewhere and therefore it should be possible to expose this.
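
    A possible way to check this yourself (a hedged sketch rather than an off-the-shelf utility): on Windows, Python receives NTFS names as Unicode strings, so you can walk the tree and flag any name containing non-ASCII code points or lone surrogates, which are the names most likely to trouble a UTF-8-only backup agent:

        # Sketch: flag file and directory names that contain non-ASCII characters
        # or code units that cannot be encoded as UTF-8 (e.g. lone surrogates).
        import os
        import sys

        def check_names(root: str) -> None:
            for dirpath, dirnames, filenames in os.walk(root):
                for name in dirnames + filenames:
                    full = os.path.join(dirpath, name)
                    try:
                        name.encode("utf-8")  # fails on lone surrogates
                    except UnicodeEncodeError:
                        print(f"unencodable name: {full!r}")
                        continue
                    if any(ord(ch) > 127 for ch in name):
                        print(f"non-ASCII name:   {full!r}")

        if __name__ == "__main__":
            check_names(sys.argv[1] if len(sys.argv) > 1 else ".")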

    Read the article

  • SQL Error (1064) when importing data from SQL file

    - by mejpark
    I have a MySQL database, which was originally set up with the default latin1 character set and latin1_swedish_ci collation. I was using the database like this for some time, until I noticed strange characters on my production web site, which is powered by a database exported from my development machine. At this point, I changed the default character set of the database and tables to utf8 and the collation to utf8_unicode_ci, converted the latin1 data inside each table to utf8 (using the 'convert data' option) and exported the database as a single SQL file using HeidiSQL.

    When the resulting SQL file is opened in Notepad++, several characters are rendered incorrectly. For example, en dashes (–) are displayed as – and e with accent (é) is displayed as é. I changed the encoding of the file from ANSI to UTF-8 (using the encoding menu option in Notepad++) and the offending characters are rendered correctly. I saved the new utf8-encoded SQL file and attempted to import the contents into the MySQL database on my production server. The import process fails with the following error:

        /* SQL Error (1064): You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near '?# -------------------------------------------------------- # Host: ' at line 1 */
        /* Error with snippets directory: The specified path was not found */

    The head of the SQL file:

        # --------------------------------------------------------
        # Host: 127.0.0.1
        # Server version: 5.1.33-community
        # Server OS: Win32
        # HeidiSQL version: 6.0.0.3773
        # Date/time: 2011-04-20 09:48:36
        # --------------------------------------------------------

    It chokes on the first line of the file, which is commented out. Why is this happening? I didn't have a problem loading data from SQL files until I changed the character set and collation of the database. I came up with an ugly workaround to this problem by performing the following steps:

        1. Export the database as a single SQL file using HeidiSQL
        2. Open the resulting file in Notepad++ and convert from ANSI to UTF-8 encoding
        3. Create a new empty file in Notepad++, paste in the UTF-8 text and save the file normally

    What am I missing here?
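
    The '?#' in the error message is consistent with a UTF-8 byte-order mark (the bytes EF BB BF) at the start of the dump, which Notepad++'s plain UTF-8 conversion writes and the MySQL parser does not expect; using Notepad++'s UTF-8-without-BOM option, or stripping the BOM before import, usually makes the copy-and-paste workaround unnecessary. This is an interpretation, not a confirmed diagnosis. A small sketch of the stripping step:

        # Sketch: strip a leading UTF-8 BOM (EF BB BF) from a SQL dump before import.
        import codecs
        import sys

        def strip_bom(src: str, dst: str) -> None:
            with open(src, "rb") as f:
                data = f.read()
            if data.startswith(codecs.BOM_UTF8):
                data = data[len(codecs.BOM_UTF8):]
            with open(dst, "wb") as f:
                f.write(data)

        if __name__ == "__main__":
            strip_bom(sys.argv[1], sys.argv[2])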

    Read the article

  • How can one prevent double encoding of html entities when they are allowed in the input

    - by Bob
    How can I prevent double encoding of HTML entities, or fix them programmatically? I am using the encode() function from the HTML::Entities Perl module to encode HTML entities in user input. The problem here is that we also allow users to input HTML entities directly, and these entities end up being double encoded. For example, a user may enter:

        Stackoverflow & Perl = Awesome&hellip;

    This ends up being encoded to:

        Stackoverflow &amp; Perl = Awesome&amp;hellip;

    This renders in the browser as:

        Stackoverflow & Perl = Awesome&hellip;

    We want this to render as:

        Stackoverflow & Perl = Awesome...

    Is there a way to prevent this double encoding? Or is there a module or snippet of code that can easily correct these double encoding issues? Any help is greatly appreciated!
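
    The usual fix is to decode any entities already present before encoding, so existing entities are not encoded a second time; HTML::Entities provides decode_entities for this. The same idea sketched with Python's html module, purely for illustration:

        # Sketch: decode first, then encode, so "&hellip;" is not turned into "&amp;hellip;".
        import html

        def encode_once(text: str) -> str:
            decoded = html.unescape(text)            # "&hellip;" -> "…", "&amp;" -> "&"
            return html.escape(decoded, quote=True)  # "&" -> "&amp;", "<" -> "&lt;", ...

        print(encode_once("Stackoverflow & Perl = Awesome&hellip;"))
        # -> Stackoverflow &amp; Perl = Awesome…  (the ellipsis stays as a literal character)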

    Read the article

  • Hide table row based on content of table cell

    - by timkl
    I want to make some jQuery that shows some table rows and hides others based on the content of the first table cell in each row. When I click a list item, I want jQuery to check if the first letter of the item matches the first letter in any table cell in my markup; if so, the parent table row should be shown and other rows should be hidden. This is my markup:

        <ul>
          <li>A</li>
          <li>B</li>
          <li>G</li>
        </ul>
        <table>
          <tr><td>Alpha1</td><td>Some content</td></tr>
          <tr><td>Alpha2</td><td>Some content</td></tr>
          <tr><td>Alpha3</td><td>Some content</td></tr>
          <tr><td>Beta1</td><td>Some content</td></tr>
          <tr><td>Beta2</td><td>Some content</td></tr>
          <tr><td>Beta3</td><td>Some content</td></tr>
          <tr><td>Gamma1</td><td>Some content</td></tr>
          <tr><td>Gamma2</td><td>Some content</td></tr>
          <tr><td>Gamma3</td><td>Some content</td></tr>
        </table>

    So if I press "A", this is what is rendered in the browser:

        <ul>
          <li>A</li>
          <li>B</li>
          <li>G</li>
        </ul>
        <table>
          <tr><td>Alpha1</td><td>Some content</td></tr>
          <tr><td>Alpha2</td><td>Some content</td></tr>
          <tr><td>Alpha3</td><td>Some content</td></tr>
        </table>

    I'm really new to jQuery, so any hint on how to go about a problem like this would be appreciated :)

    Read the article

  • How to use SMS content provider? Where are the docs?

    - by Mark
    Hi, I'd like to be able to read the system's SMS content provider. Basically I wanted to make an SMS messaging app, but it would only be useful if I could see past threads etc. It seems like there's a content provider for this, but I can't find documentation for it - anyone know where that is? Thanks

    -------- edit -----------

    Ok, I found a way to get the SMS inbox provider, and I just dumped all the column names in that provider. It looks like this:

        Uri uriSms = Uri.parse("content://sms/inbox");
        Cursor c = context.getContentResolver().query(uriSms, null, null, null, null);

    Column names for the above provider:

        0: _id
        1: thread_id
        2: address
        3: person
        4: date
        5: protocol
        6: read
        7: status
        8: type
        9: reply_path_present
        10: subject
        11: body
        12: service_center
        13: locked

    I'm just piecing this together from random threads I find around the net, and I'm really wondering where this is all documented (if at all). Thanks again

    Read the article

  • How Do I Set XML Content-Type in Flex 3?

    - by Laxmidi
    Hi, I've got a Flex 3 project. Flex makes an ExternalCall to some JavaScript. The JavaScript is then turned into XML. But my XML isn't parsing in IE. It works in all other browsers. I think the problem is that I haven't set the XML's content-type and IE doesn't like that. So my code looks like:

        myReturn = '<myXMLReturn>' + myReturn + '</myXMLReturn>';
        myReturn = '<?xml version="1.0" encoding="UTF-8"?>' + myReturn;
        xmlReturn = new XML(myReturn);

    How would I set:

        header('Content-type: application/xml');

    in ActionScript? As I understand it, IE requires that the Content-Type be set. Is this correct? Thank you. -Laxmidi

    Read the article

  • Printing UTF-16 strings in JSP is outputted as HTML encoding (&#xxxx)

    - by Ori Osherov
    Hello, when I try to print a UTF-16 string in JSP, specifically Hebrew, it ends up showing up as HTML encoding (&#xxxx). The problem occurs because I print an array of variables into the web page and then parse them. The variables are all UTF-16 strings, but once the servlet prints the variables, they get translated to HTML encoding. Is there any way to get rid of the encoding? Thanks in advance.

    Edit, for a bit more background: the JSP that I'm printing is not the entirety of the page. It's used, in a manner I don't quite understand, by a server app which prints the JSP's output into its built-in page. As a result, I can't, for instance, use a <meta> tag, because the <head> will have already been placed somewhere else. This isn't a frame or anything like that. It's just redirected output.

    Read the article

  • Care to be taken when serving static content (JS, CSS, Media) from different domain?

    - by Aahan Krish
    Let me try to explain by example. Say the website is hosted at example.com (NOT www.example.com). In order to serve static content cookie-free, I've chosen to use a different domain, example-static.com. Now, let's consider that my static content is currently served like this:

        http://example.com/js/script.js
        http://example.com/css/style.css
        http://example.com/media/image.jpg

    Now I create a CNAME record aliasing example-static.com to my main domain, i.e. example.com, so that the static content is served as such:

        http://example-static.com/js/script.js
        http://example-static.com/css/style.css
        http://example-static.com/media/image.jpg

    Is that all I have to do? Will all browsers execute JavaScript files and load web fonts without any security concerns? Or should I be using some .htaccess rules to modify header information and the like? PS: It would be great if you could say what rules should be added, if need be.

    Read the article

  • Generate encoding String according to creation order.

    - by Tony
    I need to generate an encoding string for each item I insert into the database. For example:

        x00001 for the first item
        x00002 for the second item
        x00003 for the third item

    The way I chose to do this is by counting the rows. Before I insert the third item, I count against the database; since there are already 2 rows, the next encoding ends with 3. But there is a problem: if I delete the second item, the fourth item will get x00003, not x00004. I could add an additional column to the table to store the next encoding, but I don't know if there are better solutions?
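
    A common alternative is to let the database hand out a monotonically increasing id and derive the display code from that id, so deleting a row never shifts the codes of later items. A sketch of the idea (SQLite is used here purely for illustration; the poster's database isn't named):

        # Sketch: derive the code from an auto-increment id instead of a row count,
        # so deleting an item never changes or reuses the codes of later items.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE item (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")

        def insert_item(name: str) -> str:
            cur = conn.execute("INSERT INTO item (name) VALUES (?)", (name,))
            return f"x{cur.lastrowid:05d}"  # id 1 -> "x00001", id 12 -> "x00012"

        print(insert_item("first"))   # x00001
        print(insert_item("second"))  # x00002
        conn.execute("DELETE FROM item WHERE id = 2")
        print(insert_item("third"))   # x00003 - codes keep increasing despite the delete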

    Read the article

  • Ad-hoc taxonomy: owning the chess set doesn't mean you decide how the little horsey moves

    - by Roger Hart
    There was one of those little laugh-or-cry moments recently when I heard an anecdote about content strategy failings at a major online retailer. The story goes a bit like this: successful company in a highly commoditized marketplace succeeds on price and largely ignores its content team. Being relatively entrepreneurial, the founders are still knocking around, and occasionally like to "take an interest". One day, they decree that clothing sold on the site can no longer be described as "unisex", because this sounds old fashioned. Sad now.

    Let me just reiterate for the folks at the back: large retailer, commoditized market place, differentiating on price. That's inherently unstable. Sooner or later, they're going to need one or both of competitive differentiation and significant optimization. I can't speak for the latter, since I'm hypothesizing off a raft of rumour, but one of the simpler paths to the former is to become - or rather acknowledge that they are - a content business. Regardless, they need highly-searchable terminology. Even in the face of tooth and claw resistance to noticing the fundamental position content occupies in driving sales (and SEO) on the web, there's a clear information problem here. Dilettante taxonomy is a disaster.

    Ok, so this is a small example, but that kind of makes it a good one. Unisex probably is the best way of describing clothing designed to suit either men or women interchangeably. It certainly takes less time to type (and read). It's established terminology, and as a single word, it's significantly better for web readability than a phrasal workaround. Something like "fits men or women" is short, but could fall foul of clause-level discard in web scanning. It's not an adjective, so for intuitive reading it's never going to be near the start of a title or description. It would also clutter up search results, and impose cognitive load in list scanning. Sorry kids, it's just worse. Even if "unisex" were an archaism (which it isn't), the only thing that would weigh against its being more usable and concise terminology would be evidence that this archaism were hurting conversions. Good luck with that.

    We once - briefly - called one of our products a "Can of worms". It was a bundle in a bug-tracking suite, and we thought it sounded terribly cool. Guess how well that sold.

    We have information and content professionals for a reason: to make sure that whatever we put in front of users is optimised to meet user and business goals. If that thinking doesn't inform style guides, taxonomy, messaging, title structure, and so forth, you might as well be finger painting.

    Read the article

  • Content Encryption Options in Oracle IRM 11g

    - by martin.abrahams
    Another of the innovations in Oracle IRM 11g is a wider choice of encryption algorithms for protecting content. The choice is now as illustrated below. As you see, three of the choices are marked as FIPS options, where FIPS refers to the Federal Information Processing Standard Publication 140-2, a U.S. government security standard for accreditation of cryptographic modules.

    Read the article

  • Why Content Should Be King of Your Business' Website

    You've no doubt heard that "content is king" on websites, but do you really know why? If your business website doesn't give visitors good, solid information, they're going to leave your site and go to the next site in their search engine results, looking for the information you didn't provide. If they find it at your competitor's site, that competitor will probably get their business.

    Read the article

  • Filtering Your Content

    - by rickramsey
    Watch it directly on YouTube.

    You can't always get what you want, but we do try to get you what you need. Use these OTN Systems Collections to see what's been published lately in your area of interest:

    - Sysadmin Collection
    - Developer Collection
    - OTN Systems Collection
    - See all collections (work in progress)

    If you prefer to use your RSS feeder, try this page: RSS Feeds for OTN Systems Content

    - Rick

    System Admin and Developer Community of OTN | OTN Garage Blog | OTN Garage on Facebook | OTN Garage on Twitter

    Read the article

  • Create Keyword Dense Content For Better SEO

    As briefly touched on in the introduction, this is important to do and be aware of, but do not make it a massive part of your efforts to achieve better search engine rankings. It basically means optimizing your content to be more keyword dense so that search engine robots will pick up your site as being relevant for a certain search phrase or keyword.

    Read the article

  • Myth Busting the Duplicate Content Myth

    Using previously published articles, such as those you find on article sites, does not mean you are going to incur the wrath of search engines and suffer the "Duplicate Content Penalty" death sentence. If you respect your readers, your guest bloggers and your article authors, then you should not worry. If you are trying to deceive readers or search engines, then you will get in trouble.

    Read the article

  • Content Management System ? Overview

    On several occasions a client would like to be able to modify the pages and add their own content. This need may arise due to a change in the profile of the services offered or to additional services. This may eve... [Author: Alan Smith - Web Design and Development - June 04, 2010]

    Read the article
