Search Results

Search found 802 results on 33 pages for 'compressed sensing'.

Page 21/33 | < Previous Page | 17 18 19 20 21 22 23 24 25 26 27 28  | Next Page >

  • Uncompressing zlib data using boost::iostreams::filtering_streambuf trouble

    - by GuitaringEgg
    I'm trying to write a small class that will load the chunk data from part of a Minecraft world file. I'm to the point where I have stored some data in a char array which was compressed with zlib and need to decompress it. I'm trying to use the boost filtering_streambuf to do this. char * rawChunk = new char[length - 1]; // Load chunk data stringstream ssRawChunk(rawChunk); boost::iostreams::filtering_istream in; in.push(boost::iostreams::zlib_decompressor()); in.push(ssRawChunk); stringstream ssOut; boost::iostreams::copy(in, ssOut); My problem is that rawChunk contains null data, so when copying data from (char*) rawChunk to (stringstream) ssRawChunk, it terminates at ~257 bytes instead of the expected length of 2154. Is there any way to use filtering_streambuf without a stringstream, to allow for null data, or is there a way to stop the stringstream from terminating on null data?
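
    One possible workaround, as a sketch (assuming Boost.Iostreams is available; decompressChunk is a made-up helper name): constructing a stringstream from a char* treats the buffer as a NUL-terminated C string, so instead hand the buffer to the filter chain with an explicit length via an array_source.

        #include <boost/iostreams/copy.hpp>
        #include <boost/iostreams/device/array.hpp>
        #include <boost/iostreams/filter/zlib.hpp>
        #include <boost/iostreams/filtering_stream.hpp>
        #include <sstream>
        #include <string>

        // Decompress a zlib buffer that may contain embedded NUL bytes.
        std::string decompressChunk(const char* rawChunk, std::size_t length)
        {
            boost::iostreams::array_source source(rawChunk, length); // length-aware, NUL-safe
            boost::iostreams::filtering_istream in;
            in.push(boost::iostreams::zlib_decompressor());
            in.push(source);

            std::stringstream ssOut;
            boost::iostreams::copy(in, ssOut);
            return ssOut.str();
        }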

    Read the article

  • Multi-part gzip file random access (in Java)

    - by toluju
    This may fall in the realm of "not really feasible" or "not really worth the effort" but here goes. I'm trying to randomly access records stored inside a multi-part gzip file. Specifically, the files I'm interested in are compressed Heretrix Arc files. (In case you aren't familiar with multi-part gzip files, the gzip spec allows multiple gzip streams to be concatenated in a single gzip file. They do not share any dictionary information, it is simple binary appending.) I'm thinking it should be possible to do this by seeking to a certain offset within the file, then scan for the gzip magic header bytes (i.e. 0x1f8b, as per the RFC), and attempt to read the gzip stream from the following bytes. The problem with this approach is that those same bytes can appear inside the actual data as well, so seeking for those bytes can lead to an invalid position to start reading a gzip stream from. Is there a better way to handle random access, given that the record offsets aren't known a priori?
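
    The scan-and-validate idea can be made workable by treating a magic-byte hit as a candidate only and confirming it with a trial decompression; a false positive inside compressed data will almost always fail to inflate. A rough sketch (GzipMemberScanner/findGzipMember are hypothetical names, and the extra 0x08 check simply requires the deflate compression method in the header):

        import java.io.ByteArrayInputStream;
        import java.io.IOException;
        import java.io.RandomAccessFile;
        import java.util.zip.GZIPInputStream;

        class GzipMemberScanner {
            // Scan forward from 'start' for a plausible gzip member header
            // (0x1f 0x8b 0x08) and confirm it by actually inflating a few bytes.
            static long findGzipMember(RandomAccessFile raf, long start) throws IOException {
                for (long pos = start; pos < raf.length() - 2; pos++) {
                    raf.seek(pos);
                    if (raf.read() == 0x1f && raf.read() == 0x8b && raf.read() == 0x08) {
                        raf.seek(pos);
                        byte[] probe = new byte[8192];
                        raf.read(probe);
                        try (GZIPInputStream in =
                                 new GZIPInputStream(new ByteArrayInputStream(probe))) {
                            in.read(new byte[512]);   // trial decompression
                            return pos;               // plausible member start
                        } catch (IOException falsePositive) {
                            // not a real header, keep scanning
                        }
                    }
                }
                return -1;
            }
        }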

    Read the article

  • Simplest solution to replace a tiny file inside an MSI?

    - by sascha
    Many of our customers have access to InstallShield, WISE or AdminStudio. Those aren't a problem. I'm hoping I can give our smaller customers, who don't have access to commercial repackaging tools, a freely available set of tools and steps to do the file replacement themselves. We only need to replace a single configuration file inside a compressed MSI; the target user can be assumed to already have Orca installed, to know how to use it to customize the Property table (to embed license details for GPO deployment), and to have generated an MST file. Disclaimer: this is very similar to another question, but both the question and the answers in that thread are not clear.

    Read the article

  • Making a DVD video with a still image and PCM 16bit audio with ffmpeg

    - by João
    I'm trying to make a small video with a still image and a sound file playing in the background to pass it to dvdauthor and create a DVD. The command I'm using is this: ffmpeg -loop_input -i image.jpg -qscale 2 -i song.flac -aspect 4:3 -target pal-dvd -acodec pcm_s16le -shortest output.mpg However, the resulting video file doesn't have sound at all (testing it on VLC Player). I don't know if I can't combine "-acodec pcm_s16le" with "-target pal-dvd" to override the latter, or if there is something else wrong with the command. If I try without the "-acodec pcm_s16le" parameter, the video and audio work; I can even create a DVD ISO with it. However, the audio stays as AC3. I wanted to include the lossless audio with the video, not a compressed one. I suppose the DVD standard allows PCM audio, am I right?

    Read the article

  • When I try to save this file it creates a folder for the entire path

    - by girish
    ZipFileToCreate = "c:\user\desktop\webservice\file.zip"; So when I try to save this file, it creates a folder for the path, like user\desktop\webservice\file\ — why is that? FileStream fs = new FileStream(ZipFileToCreate, FileMode.Open); byte[] data = new Byte[fs.Length]; BinaryReader br = new BinaryReader(fs); br.Read(data, 0, data.Length); br.Close(); Response.Clear(); Response.ContentType = "application/x-zip-compressed"; Response.AppendHeader("Content-Disposition", "filename=" + Parameter + ".zip"); DeleteOldFiles(); Response.BinaryWrite(data);

    Read the article

  • HTML Chrome Audit Specify Image Dimensions

    - by AKRamkumar
    I just started using the Chrome developer tools for some basic HTML websites and I used the audit tool. I had two identical images, one with the height and width attributes, and one without. In the Resources section, both the latency and the download time were identical. However, the Audit showed: Specify image dimensions (1) — a width and height should be specified for all images in order to speed up page display. Does this actually help? And are there any other ways to speed up page load time? This is only a splash page for the website I am building, and as such it is only HTML, no CSS or JavaScript or anything. I have already compressed the images but I want to speed up load time even more. Is there a way?
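
    It does help with perceived speed: with explicit dimensions the browser can reserve the image's layout box before the file has downloaded, so the page doesn't reflow as images arrive. A minimal sketch (the file name and the 300x200 size are placeholders):

        <!-- dimensions declared up front, so layout doesn't shift when the image arrives -->
        <img src="splash.png" width="300" height="200" alt="Splash image">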

    Read the article

  • Decompressing file with gzip produces file with no read-permissions on Windows 7

    - by Abiel
    I am attempting to decompress a .gz file using the GnuWin32 gzip program in Windows 7. I have full permissions on the compressed file, and my user account is an administrator. However, I end up not having read permissions on the decompressed file. To get read permissions I would have to manually change the permissions on it through right-clicking and selecting Properties > Security. I am able to do this exact same thing with no permission problems in Windows XP, which leads me to believe that Windows 7's user account control system is causing problems. Does anyone know what I can do to make things work as I would expect (read permission on the decompressed file) in Windows 7? Thanks.

    Read the article

  • Saving gzipped string into Sqlite3 via Rails throws unrecognized token error

    - by user141146
    Hi, I'm using rails 2.3 and I'm trying to compress (gzip) text that I'd like to save using ActiveRecord into a sqlite database. However, the compressed text isn't being saved b/c I get this type of error: SQLite3::SQLException: unrecognized token: "'x##U?#7 Any thoughts on what I can do to avoid this error? should I compress the text in some other fashion? should I save the data using some other method(s)? Relevant code is below. # compress my text require 'zlib' defl = Zlib::Deflate test_string = "<h3>some text</h3>some additional text<p>here's some more text</p>" compressed_string = defl.deflate(test_string) => "x\234\263\3110\266+\316\317MU(I\255(\261\321\207\361\022SR2K2\363\363\022s \022\005v\031\251E\251\352\305\n`\331\334\374\"\230\206\002;\000\0225\027\222" ModelClass.new(:attribute1 => compressed_string).save ActiveRecord::StatementInvalid: SQLite3::SQLException: unrecognized token: "'x#####+##MU(I#(#?####2K2#### v#E####
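
    The error comes from handing raw deflate output (arbitrary bytes, including quotes and NULs) to the SQL layer as if it were text. One sketch of a workaround, assuming the column stays a plain text column (a binary/blob column would be an alternative): Base64-encode the compressed bytes before saving and reverse both steps on the way out. ModelClass and :attribute1 are taken from the question.

        require 'zlib'
        require 'base64'

        test_string = "<h3>some text</h3>some additional text<p>here's some more text</p>"

        # store: deflate, then make the bytes SQL-safe
        encoded = Base64.encode64(Zlib::Deflate.deflate(test_string))
        ModelClass.new(:attribute1 => encoded).save

        # load: decode, then inflate back to the original text
        record   = ModelClass.last
        original = Zlib::Inflate.inflate(Base64.decode64(record.attribute1))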

    Read the article

  • Compressing three individual jpeg pics containing temporal redundancy?

    - by michael
    I am interfacing an embedded device with a camera module that returns a single JPEG-compressed frame each time I trigger it. I would like to take three successive shots (approx. 1 frame per 1/4 second) and further compress the images into a single file. The assumption here is that there is a lot of temporal redundancy, and therefore lots of room for more compression across the three frames (compared to sending three separate JPEG images). I will be implementing the solution on an embedded device in C, without any libraries and with no OS. The camera will be taking pics in an area with very little movement (no visitors or screens in the background, maybe a tree with swaying branches), so I think my assumption about redundancy is pretty solid. When the file is finally viewed on a PC/Mac, I don't mind having to write something to extract the three frames (so it can be a nonstandard kludge). So I guess the actual question is: what is the best way to compress these three images together, given that they are already in JPEG format? (It is possible to convert back to a raw image, but only if I have to...)
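
    For what it's worth, the temporal redundancy is only reachable after the frames are back in the pixel domain; delta-coding the JPEG byte streams themselves gains almost nothing. A rough sketch of the idea, assuming decoded raw frames: store frame 1 as-is, and store frames 2 and 3 as per-byte differences, which are mostly zeros for a static scene and therefore compress far better in a later RLE/entropy pass.

        #include <stddef.h>
        #include <stdint.h>

        /* Frame 1 is stored as-is; frames 2 and 3 are stored as differences
         * against the previous frame. Arithmetic wraps modulo 256, so the
         * decode is exact. */
        void delta_encode(const uint8_t *prev, const uint8_t *cur,
                          uint8_t *out, size_t n)
        {
            for (size_t i = 0; i < n; i++)
                out[i] = (uint8_t)(cur[i] - prev[i]);
        }

        void delta_decode(const uint8_t *prev, const uint8_t *delta,
                          uint8_t *out, size_t n)
        {
            for (size_t i = 0; i < n; i++)
                out[i] = (uint8_t)(prev[i] + delta[i]);
        }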

    Read the article

  • Small help in alert box by moo tools

    - by piemesons
    <script type="text/javascript" src="http://ajax.googleapis.com/ajax/libs/mootools/1.2.2/mootools-yui-compressed.js"></script> <script type="text/javascript" src="sexyalertbox.v1.2.moo.js"></script> <link rel="stylesheet" type="text/css" media="all" href="sexyalertbox.css"/> <body> <p><a href="#" onclick="Sexy.alert('<h1>hello</h1><p>Please enter a valid email address</p>');return false;">Click here</a></p> </body> </html> When we click on "Click here", an alert box appears. I want to trigger that same alert box from JavaScript, i.e. alert("Same sexy alert box type alert box"); // how to do this??
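
    Sexy.alert is an ordinary JavaScript function once sexyalertbox.v1.2.moo.js has loaded, so it can be called from script directly; a small sketch (overriding window.alert is optional and only shown so that existing alert() call sites pick up the styled box):

        // call it directly from script
        Sexy.alert('<h1>hello</h1><p>Please enter a valid email address</p>');

        // or route plain alert() calls through it
        window.alert = function (msg) {
            Sexy.alert(msg);
        };
        alert("Same sexy alert box type alert box");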

    Read the article

  • Random access gzip stream

    - by jkff
    I'd like to be able to do random access into a gzipped file. I can afford to do some preprocessing on it (say, build some kind of index), provided that the result of the preprocessing is much smaller than the file itself. Any advice? My thoughts were:
    - Hack on an existing gzip implementation and serialize its decompressor state every, say, 1 megabyte of compressed data. Then to do random access, deserialize the decompressor state and read from the megabyte boundary. This seems hard, especially since I'm working with Java and I couldn't find a pure-java gzip implementation :(
    - Re-compress the file in chunks of 1 MB and do the same as above (sketched below). This has the disadvantage of doubling the required disk space.
    - Write a simple parser of the gzip format that doesn't do any decompressing and only detects and indexes block boundaries (if there even are any blocks: I haven't yet read the gzip format description)
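
    A sketch of the second idea, assuming the data can be rewritten once (ChunkedGzipWriter and the 1 MB chunk size are placeholders): each chunk becomes an independent gzip member, and the compressed offset of every member is recorded as the index. A reader then seeks to the recorded offset for the chunk it wants and wraps the stream from that point in a new GZIPInputStream.

        import java.io.ByteArrayOutputStream;
        import java.io.File;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.io.InputStream;
        import java.util.ArrayList;
        import java.util.List;
        import java.util.zip.GZIPOutputStream;

        public class ChunkedGzipWriter {
            // Re-compress src as a sequence of independent gzip members of up to
            // 1 MB of raw input each, returning the compressed offset of each member.
            public static List<Long> recompress(InputStream src, File dst) throws IOException {
                List<Long> memberOffsets = new ArrayList<Long>();
                byte[] buf = new byte[1 << 20];
                try (FileOutputStream fos = new FileOutputStream(dst)) {
                    long written = 0;
                    int n;
                    while ((n = src.read(buf)) > 0) {
                        memberOffsets.add(written);
                        ByteArrayOutputStream member = new ByteArrayOutputStream();
                        try (GZIPOutputStream gz = new GZIPOutputStream(member)) {
                            gz.write(buf, 0, n);
                        }
                        member.writeTo(fos);
                        written += member.size();
                    }
                }
                return memberOffsets;   // persist alongside dst as the index
            }
        }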

    Read the article

  • Haxe app and Gtk-WARNING on linux servers

    - by Cambiata
    Hi! I'm trying a Haxe-compiled solution called FAR (Flash Archiver) created by Edwin Van Rijkom (http://code.google.com/p/vanrijkom-flashlibs/) which uses a command-line tool for creating compressed archives. When running the FAR tool locally on my Ubuntu laptop, everything works fine. When running remotely (terminal as root) on my Ubuntu and Debian servers, it gives the following error: Gtk-WARNING **: cannot open display: I've tried to reach Edwin about this, but no response so far. Maybe it has something to do with installation or user rights on the server? Any clue?
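
    The warning means the tool (or the GTK toolkit it links against) wants an X display, which exists in the laptop's desktop session but not in a bare SSH session on the servers. One common workaround, assuming the tool only needs a display to initialize and not to show anything, is to run it under a virtual framebuffer; the FAR invocation itself is whatever already works locally.

        # Debian/Ubuntu: install the virtual framebuffer once
        apt-get install xvfb

        # then wrap the normal FAR command
        xvfb-run -a <the same far command used locally>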

    Read the article

  • is jQuery 1.4.2 compatible with Closure Compiler?

    - by Mohammad
    According to the official release statement version 1.4 has been re-written to be compressed with Closure Compiler yet when I use the online version of closure compiler I get 130 warnings. This is the code I use. // ==ClosureCompiler== // @compilation_level ADVANCED_OPTIMIZATIONS // @output_file_name default.js // @code_url http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.js // ==/ClosureCompiler== And as far as I know you get the real benefit of Closure Compiler if you include the library with your code also, so it removes the unused functions. Yet my testing show that I can't get any further than compressing the library itself.. What am I doing wrong? Any kind of insight will be much appreciated.
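
    The warnings are largely expected: ADVANCED_OPTIMIZATIONS renames and removes publicly visible symbols unless every caller is compiled together with the library (plus externs for the DOM), which is where the warnings come from, and jQuery's heavy use of strings and dynamic property access generally defeats that kind of whole-program pruning anyway. For compressing the library (or library plus your code) safely, the simple level is the usual choice; a variant of the same request changing only the compilation level:

        // ==ClosureCompiler==
        // @compilation_level SIMPLE_OPTIMIZATIONS
        // @output_file_name default.js
        // @code_url http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.js
        // ==/ClosureCompiler==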

    Read the article

  • How to change the link URL shown in the web browser's status bar

    - by sunglim
    I have already read many articles about this issue here on SO. I just want to discuss how to do it, NOT the moral issue. For example, on the Google search page, before I click a link, the link does not show a Google URL, but after I click the link with the shift key, the URL in the status bar has changed. This means the Google page shows a 'fake' URL; Google's compressed script is too difficult to read and analyze. # edited: The second URL should work on IE8 even if I click with the ctrl key.
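
    The usual technique (and what the question describes Google doing) is to let the href hold the clean URL, which is what the status bar previews, and rewrite it only when the click actually begins; onmousedown fires for normal, shift- and ctrl-clicks alike, including on IE8. A minimal sketch with placeholder URLs:

        <a href="http://example.com/article"
           onmousedown="this.href='http://tracker.example.com/redirect?to=article';">
          Read the article
        </a>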

    Read the article

  • Permissions denied on apache rewrite module virtual host configuration

    - by sina
    All of a sudden I keep getting "Permissions denied" on apache 2 virtualhost once we moved it to its own conf file. I have tried all the suggestions I have found here but none work. Please can someone tell me what I am doing wrong? Thanks! <VirtualHost *:80> DocumentRoot "/var/www/mm" <Directory "/var/www/mm"> Options +Indexes +MultiViews +FollowSymLinks AllowOverride all Order deny,allow Allow from all AddType text/vnd.sun.j2me.app-descriptor .jad AddType application/vnd.rim.cod .cod </Directory> Alias /holdspace "/var/www/mm/holdspace" RewriteLogLevel 9 RewriteLog "/var/log/httpd/rewrite.log" RewriteEngine on # 91xx RewriteCond %{HTTP_USER_AGENT} BlackBerry.9105 RewriteRule ^/download/(.*) /holdspace/bb6-360x480/$1 [L] # 92xx RewriteCond %{HTTP_USER_AGENT} BlackBerry.9220 RewriteRule ^/download/(.*) /holdspace/bb5-320x240/$1 [L] Errors in error.log: [Wed May 28 12:44:58 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /download/eazymoney.jad denied [Wed May 28 12:44:58 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /error/HTTP_FORBIDDEN.html.var denied [Wed May 28 12:44:59 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /favicon.ico denied [Wed May 28 12:44:59 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /error/HTTP_FORBIDDEN.html.var denied [Wed May 28 12:44:59 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /favicon.ico denied [Wed May 28 12:44:59 2014] [error] [client 197.255.173.95] (13)Permission denied: access to /error/HTTP_FORBIDDEN.html.var denied Errors in rewrite.log: 197.255.173.95 - - [28/May/2014:12:46:01 +0100] [41.203.113.103/sid#7fe41704ca28][rid#7fe417123378/initial/redir#1] (3) applying pattern '^/download/(.*)' to uri '/error/HTTP_FORBIDDEN.html.var' 197.255.173.95 - - [28/May/2014:12:46:01 +0100] [41.203.113.103/sid#7fe41704ca28][rid#7fe417123378/initial/redir#1] (3) applying pattern '^/download/(.*)' to uri '/error/HTTP_FORBIDDEN.html.var' Apache Configuration file: ServerTokens Prod ServerRoot "/etc/httpd" PidFile run/httpd.pid Timeout 60 KeepAlive Off MaxKeepAliveRequests 100 KeepAliveTimeout 15 <IfModule prefork.c> StartServers 8 MinSpareServers 5 MaxSpareServers 20 ServerLimit 256 MaxClients 256 MaxRequestsPerChild 4000 </IfModule> <IfModule worker.c> StartServers 4 MaxClients 300 MinSpareThreads 25 MaxSpareThreads 75 ThreadsPerChild 25 MaxRequestsPerChild 0 </IfModule> Listen 80 LoadModule auth_basic_module modules/mod_auth_basic.so LoadModule auth_digest_module modules/mod_auth_digest.so LoadModule authn_file_module modules/mod_authn_file.so LoadModule authn_alias_module modules/mod_authn_alias.so LoadModule authn_anon_module modules/mod_authn_anon.so LoadModule authn_dbm_module modules/mod_authn_dbm.so LoadModule authn_default_module modules/mod_authn_default.so LoadModule authz_host_module modules/mod_authz_host.so LoadModule authz_user_module modules/mod_authz_user.so LoadModule authz_owner_module modules/mod_authz_owner.so LoadModule authz_groupfile_module modules/mod_authz_groupfile.so LoadModule authz_dbm_module modules/mod_authz_dbm.so LoadModule authz_default_module modules/mod_authz_default.so LoadModule ldap_module modules/mod_ldap.so LoadModule authnz_ldap_module modules/mod_authnz_ldap.so LoadModule include_module modules/mod_include.so LoadModule log_config_module modules/mod_log_config.so LoadModule logio_module modules/mod_logio.so LoadModule env_module modules/mod_env.so LoadModule ext_filter_module modules/mod_ext_filter.so 
LoadModule mime_magic_module modules/mod_mime_magic.so LoadModule expires_module modules/mod_expires.so LoadModule deflate_module modules/mod_deflate.so LoadModule headers_module modules/mod_headers.so LoadModule usertrack_module modules/mod_usertrack.so LoadModule setenvif_module modules/mod_setenvif.so LoadModule mime_module modules/mod_mime.so LoadModule dav_module modules/mod_dav.so LoadModule status_module modules/mod_status.so LoadModule autoindex_module modules/mod_autoindex.so LoadModule info_module modules/mod_info.so LoadModule dav_fs_module modules/mod_dav_fs.so LoadModule vhost_alias_module modules/mod_vhost_alias.so LoadModule negotiation_module modules/mod_negotiation.so LoadModule dir_module modules/mod_dir.so LoadModule actions_module modules/mod_actions.so LoadModule speling_module modules/mod_speling.so LoadModule userdir_module modules/mod_userdir.so LoadModule alias_module modules/mod_alias.so LoadModule substitute_module modules/mod_substitute.so LoadModule rewrite_module modules/mod_rewrite.so LoadModule proxy_module modules/mod_proxy.so LoadModule proxy_ftp_module modules/mod_proxy_ftp.so LoadModule proxy_http_module modules/mod_proxy_http.so LoadModule proxy_ajp_module modules/mod_proxy_ajp.so LoadModule proxy_connect_module modules/mod_proxy_connect.so LoadModule cache_module modules/mod_cache.so LoadModule suexec_module modules/mod_suexec.so LoadModule disk_cache_module modules/mod_disk_cache.so LoadModule cgi_module modules/mod_cgi.so LoadModule version_module modules/mod_version.so Include conf.d/*.conf User apache Group apache ServerAdmin root@localhost ServerName sv001zma002.africa.int.myorg.com UseCanonicalName Off DocumentRoot "/var/www/html" <Directory /> Options FollowSymLinks AllowOverride None </Directory> <Directory "/var/www/html"> Options FollowSymLinks AllowOverride None Order allow,deny Allow from all </Directory> <IfModule mod_userdir.c> UserDir disabled </IfModule> DirectoryIndex index.html index.html.var AccessFileName .htaccess <Files ~ "^\.ht"> Order allow,deny Deny from all Satisfy All </Files> TypesConfig /etc/mime.types DefaultType text/plain <IfModule mod_mime_magic.c> MIMEMagicFile conf/magic </IfModule> HostnameLookups Off ErrorLog logs/error_log LogLevel warn LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined LogFormat "%h %l %u %t \"%r\" %>s %b" common LogFormat "%{Referer}i -> %U" referer LogFormat "%{User-agent}i" agent CustomLog logs/access_log combined ServerSignature Off TraceEnable Off Alias /icons/ "/var/www/icons/" <Directory "/var/www/icons"> Options MultiViews FollowSymLinks AllowOverride None Order allow,deny Allow from all </Directory> <IfModule mod_dav_fs.c> DAVLockDB /var/lib/dav/lockdb </IfModule> ScriptAlias /cgi-bin/ "/var/www/cgi-bin/" <Directory "/var/www/cgi-bin"> AllowOverride None Options None Order allow,deny Allow from all </Directory> IndexOptions FancyIndexing VersionSort NameWidth=* HTMLTable Charset=UTF-8 AddIconByEncoding (CMP,/icons/compressed.gif) x-compress x-gzip AddIconByType (TXT,/icons/text.gif) text/* AddIconByType (IMG,/icons/image2.gif) image/* AddIconByType (SND,/icons/sound2.gif) audio/* AddIconByType (VID,/icons/movie.gif) video/* AddIcon /icons/binary.gif .bin .exe AddIcon /icons/binhex.gif .hqx AddIcon /icons/tar.gif .tar AddIcon /icons/world2.gif .wrl .wrl.gz .vrml .vrm .iv AddIcon /icons/compressed.gif .Z .z .tgz .gz .zip AddIcon /icons/a.gif .ps .ai .eps AddIcon /icons/layout.gif .html .shtml .htm .pdf AddIcon /icons/text.gif .txt AddIcon /icons/c.gif .c 
AddIcon /icons/p.gif .pl .py AddIcon /icons/f.gif .for AddIcon /icons/dvi.gif .dvi AddIcon /icons/uuencoded.gif .uu AddIcon /icons/script.gif .conf .sh .shar .csh .ksh .tcl AddIcon /icons/tex.gif .tex AddIcon /icons/bomb.gif core AddIcon /icons/back.gif .. AddIcon /icons/hand.right.gif README AddIcon /icons/folder.gif ^^DIRECTORY^^ AddIcon /icons/blank.gif ^^BLANKICON^^ DefaultIcon /icons/unknown.gif ReadmeName README.html HeaderName HEADER.html IndexIgnore .??* *~ *# HEADER* README* RCS CVS *,v *,t AddLanguage ca .ca AddLanguage cs .cz .cs AddLanguage da .dk AddLanguage de .de AddLanguage el .el AddLanguage en .en AddLanguage eo .eo AddLanguage es .es AddLanguage et .et AddLanguage fr .fr AddLanguage he .he AddLanguage hr .hr AddLanguage it .it AddLanguage ja .ja AddLanguage ko .ko AddLanguage ltz .ltz AddLanguage nl .nl AddLanguage nn .nn AddLanguage no .no AddLanguage pl .po AddLanguage pt .pt AddLanguage pt-BR .pt-br AddLanguage ru .ru AddLanguage sv .sv AddLanguage zh-CN .zh-cn AddLanguage zh-TW .zh-tw LanguagePriority en ca cs da de el eo es et fr he hr it ja ko ltz nl nn no pl pt pt-BR ru sv zh-CN zh-TW ForceLanguagePriority Prefer Fallback AddDefaultCharset UTF-8 AddType application/x-compress .Z AddType application/x-gzip .gz .tgz AddType application/x-x509-ca-cert .crt AddType application/x-pkcs7-crl .crl AddHandler type-map var AddType text/html .shtml AddOutputFilter INCLUDES .shtml ProxyErrorOverride On Alias /error/ "/var/www/error/" <IfModule mod_negotiation.c> <IfModule mod_include.c> <Directory "/var/www/error"> AllowOverride None Options IncludesNoExec AddOutputFilter Includes html AddHandler type-map var Order allow,deny Allow from all LanguagePriority en es de fr ForceLanguagePriority Prefer Fallback </Directory> ErrorDocument 400 /error/HTTP_BAD_REQUEST.html.var ErrorDocument 401 /error/HTTP_UNAUTHORIZED.html.var ErrorDocument 403 /error/HTTP_FORBIDDEN.html.var ErrorDocument 404 /error/HTTP_NOT_FOUND.html.var </IfModule> </IfModule> BrowserMatch "Mozilla/2" nokeepalive BrowserMatch "MSIE 4\.0b2;" nokeepalive downgrade-1.0 force-response-1.0 BrowserMatch "RealPlayer 4\.0" force-response-1.0 BrowserMatch "Java/1\.0" force-response-1.0 BrowserMatch "JDK/1\.0" force-response-1.0 BrowserMatch "Microsoft Data Access Internet Publishing Provider" redirect-carefully BrowserMatch "MS FrontPage" redirect-carefully BrowserMatch "^WebDrive" redirect-carefully BrowserMatch "^WebDAVFS/1.[0123]" redirect-carefully BrowserMatch "^gnome-vfs/1.0" redirect-carefully BrowserMatch "^XML Spy" redirect-carefully BrowserMatch "^Dreamweaver-WebDAV-SCM1" redirect-carefully ErrorDocument 400 "Bad Request"
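
    A hedged troubleshooting sketch rather than a definite diagnosis: "(13)Permission denied" at this level usually means either that the apache user cannot traverse/read the new DocumentRoot (every parent directory needs execute permission), or, on a Red Hat-family server with SELinux enforcing, that the files under the new tree carry the wrong security label (typical when content is moved rather than copied into place). Paths below are taken from the question:

        # every directory on the path needs execute permission for the apache user
        chmod -R o+rX /var/www/mm
        ls -ldZ /var/www /var/www/mm /var/www/mm/holdspace   # -Z shows the SELinux context

        # restore the default httpd content label if the files were moved into place
        restorecon -Rv /var/www/mm
        # or set it explicitly
        chcon -R -t httpd_sys_content_t /var/www/mm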

    Read the article

  • mmap() for large file I/O?

    - by Boatzart
    I'm creating a utility in C++ to be run on Linux which can convert videos to a proprietary format. The video frames are very large (up to 16 megapixels), and we need to be able to seek directly to exact frame numbers, so our file format uses libz to compress each frame individually, and append the compressed data onto a file. Once all frames are finished being written, a journal which includes meta data for each frame (including their file offsets and sizes) is written to the end of the file. I'm currently using ifstream and ofstream to do the file i/o, but I am looking to optimize as much as possible. I've heard that mmap() can increase performance in a lot of cases, and I'm wondering if mine is one of them. Our files will be in the tens to hundreds of gigabytes, and although writing will always be done sequentially, random access reads should be done in constant time. Any thoughts as to whether I should investigate this further, and if so does anyone have any tips for things to look out for? Thanks!
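
    mmap() is a reasonable fit for the read side here: with the journal's per-frame offset and size, a read is just a copy out of the mapping, and the kernel pages in only what is touched, which suits constant-time random access to a file much larger than RAM (on a 64-bit build). A bare sketch with error handling omitted; FrameEntry and the function name are placeholders, not the real file format, and in real code the mapping would be kept open across calls rather than re-created per frame.

        #include <cstring>
        #include <vector>
        #include <fcntl.h>
        #include <sys/mman.h>
        #include <sys/stat.h>
        #include <unistd.h>

        struct FrameEntry { off_t offset; size_t size; };   // from the journal

        std::vector<char> readCompressedFrame(const char* path, const FrameEntry& e)
        {
            int fd = ::open(path, O_RDONLY);
            struct stat st;
            ::fstat(fd, &st);

            // map the whole file read-only; pages are faulted in on demand
            void* base = ::mmap(nullptr, st.st_size, PROT_READ, MAP_SHARED, fd, 0);
            ::madvise(base, st.st_size, MADV_RANDOM);        // hint: skip readahead

            std::vector<char> frame(e.size);
            std::memcpy(frame.data(), static_cast<char*>(base) + e.offset, e.size);

            ::munmap(base, st.st_size);
            ::close(fd);
            return frame;
        }

    For the strictly sequential write path, ordinary buffered stream writes are usually already close to disk speed, so the interesting gains are on the random-access reads.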

    Read the article

  • Encode/compress sequence of repeating integers

    - by Alex
    Hey there! I have very long integer sequences that look like this (arbitrary length!): 0000000001110002220033333 Now I need some algorithm to convert this string into something compressed like a9b3a3c3a2d5 Which means "a 9 times, then b 3 times, then a 3 times" and so on, where "a" stands for 0, "b" for 1, "c" for 2 and "d" for 3. How would you do that? So far nothing suitable came to my mind, and I had no luck with google because I didn't really know what to search for. What is this kind of encoding / compression called? PS: I am going to do the encoding with PHP, and the decoding in JavaScript.
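
    What the question describes is run-length encoding (RLE). A small PHP sketch of the encoder (the JavaScript decoder is the mirror image; runs longer than 9 simply produce multi-digit counts such as a12, which still decode unambiguously because letters and digits alternate):

        <?php
        // Map each digit 0..3 to a letter and append the run length.
        function rle_encode($s) {
            $letters = array('0' => 'a', '1' => 'b', '2' => 'c', '3' => 'd');
            $out = '';
            $n = strlen($s);
            for ($i = 0; $i < $n; ) {
                $run = 1;
                while ($i + $run < $n && $s[$i + $run] === $s[$i]) {
                    $run++;
                }
                $out .= $letters[$s[$i]] . $run;
                $i += $run;
            }
            return $out;
        }

        echo rle_encode('0000000001110002220033333'); // a9b3a3c3a2d5
        ?>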

    Read the article

  • Does the iPhone compress images saved within my app's documents directory?

    - by Jane Sales
    We are caching images downloaded from our server. We write them to our local storage like this: NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0] ; NSString* folder = [[documentsDirectory stringByAppendingPathComponent:@"flook.images"] retain]; NSString* fileName = [folder stringByAppendingFormat:@"/%@", aBaseFilename]; BOOL writeSuccess = [anImageData writeToFile:fileName atomically:NO]; The downloaded images are always the expected size, around 45-85KB. Later, we read images from our cache like this: NSArray *paths = NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES); NSString *documentsDirectory = [paths objectAtIndex:0] ; NSString* folder = [[documentsDirectory stringByAppendingPathComponent:@"flook.images"] retain]; NSString* fileName = [folder stringByAppendingFormat:@"/%@", aBaseFilename]; image = [UIImage imageWithContentsOfFile:fileName]; Occasionally, the images returned from this cache read are much smaller because they are much more compressed - around 5-10KB. Has the OS done this to us?

    Read the article

  • Most flexible minimizer/compressor for ASP.NET MVC 2?

    - by AlexanderN
    From your experience, what's the most flexible minimizer/compressor (JS+CSS) for ASP.NET MVC you've dealt with? So far:
    - mbcompress doesn't seem to be too MVC friendly
    - weboptimizer.codeplex.com lacks documentation
    - clientdependency.codeplex.com is still in beta
    - compress2 seems like a good candidate, but I haven't tried it yet
    - mvcscriptmanager only combines and compresses JavaScript but not CSS
    By flexible I mean:
    - choose what should be compressed, minified, and combined
    - add exceptions, e.g. if debug, don't compress XYZ.JS or don't minify ABC.CSS
    - caching
    In the end, it should help offer the best YSlow score. If you know of any other assemblies out there, please list them also.

    Read the article

  • I'm using the correct content type & headers, so why is Firefox saving zip files without extensions?

    - by The_AlienCoder
    Users on my site have the option to download all the photos in an album as a zip file. The zip file is dynamically created and saved to Response.OutputStream to be detected as a file download by the user's browser. Here are the header and content type I am outputting: context.Response.AddHeader("Content-Disposition", "attachment; filename=Photos.zip"); context.Response.ContentType = "application/x-zip-compressed"; Well, everything works fine with every browser except Firefox. Although Firefox correctly detects the download as a zip file, it saves the file without the .zip extension. I thought adding this header, context.Response.AddHeader("Content-Disposition", "attachment; filename=Photos.zip");, is supposed to force Firefox to save the extension. I believe I am following the correct protocol, so why is Firefox behaving this way and how do I fix it?
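
    One commonly suggested tweak, sketched below and not guaranteed to be the cause here: quote the file name in the Content-Disposition value, since some Firefox versions mishandle unquoted names, and fall back to the generic application/octet-stream content type if the zip-specific one still misbehaves.

        // quoted filename; octet-stream as a fallback content type
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition",
            "attachment; filename=\"Photos.zip\"");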

    Read the article

  • I need a programmatic way to perform the fastest 'trans-wrap' of mov to mp4 in an iPhone/iPad application

    - by user1307877
    I want to change the container of the .mov video files that I pick using UIImagePickerController and compress via AVAssetExportSession with AVAssetExportPresetMediumQuality and shouldOptimizeForNetworkUse = YES, to an .mp4 container. I need a programmatic way/sample code to perform the fastest 'trans-wrap' in an iPhone/iPad application. I tried to set the AVAssetExportSession.outputFileType property to AVFileTypeMPEG4, but it is not supported and I got an exception. I tried to do the transform using AVAssetWriter by specifying fileType:AVFileTypeMPEG4; I actually got an .mp4 output file, but it was not a 'trans-wrap': the output file was 3x bigger than the source, and the conversion took 128 sec for a video with a 60 sec duration. I need a solution that will run quickly and will keep the file size. Please help.

    Read the article

  • Can't correctly stream x264 video to mobile via RTSP

    - by Andrew
    Hello! I'm trying to stream video compressed with x264 and NeroAAC and hinted with MP4Box. When I play it with VLC player, streaming works fine, as expected. When I try to open the URL with my HTC Hero, it switches to player mode, then starts the "loading video" animation, then after some time it shows "unable to connect to server". The sample movies provided with DSS streamed fine regardless of bitrate. I tried a few encoding options, but always with the same result. I suspected nocabac and level=11, but changing them didn't help. Are there more specific encoding options for this kind of target? Thank you!

    Read the article

  • Accept-Encoding headers being sent by browser but not received by server

    - by Daniel Jacobs
    I have been trying to debug this for weeks. All of the browsers on all of the clients on my home network are sending 'Accept-Encoding: gzip,deflate'. However, that header is somehow, somewhere being dropped before the request makes it to a web server. For example, http://www.whatsmyip.org/http_compression/ says 'No, your browser is not requesting compressed content'. I've used Fiddler to make sure that all of my browsers are indeed sending the header. I've swapped out my router. I've turned off all anti-virus software. Brighthouse/Roadrunner (the local cable ISP) says they are not doing any filtering (and I can't see why they would in this case). Any suggestions would be most welcome!

    Read the article

  • HTTP compression - can I configure a client to compress the data sent to a server?

    - by lgomide
    Hello, I'm using IIS 7 as the web server for my application. If I enable dynamic content compression on the server, will this also enable clients to send compressed data to the server, if they can? I mean, my application uses SOAP web services, and clients usually send large chunks of data to the server. The clients are written in C#/.NET. Is there any kind of configuration I can do in a web reference / service reference in order to tell them to compress the content before they send it to IIS? And do I have to do any kind of configuration in IIS in order for this to work? Thanks in advance
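
    IIS dynamic compression applies to responses; it does not make clients compress their requests, and IIS will not transparently decode a compressed request body either. Getting SOAP requests compressed therefore means doing it on the client and undoing it on the server (for example in an HttpModule or a custom SoapExtension). A hedged sketch of just the client-side mechanics with a plain HttpWebRequest; the URL and payload are placeholders:

        using System;
        using System.IO;
        using System.IO.Compression;
        using System.Net;
        using System.Text;

        // gzip the request body by hand and label it with Content-Encoding;
        // the server side must be prepared to decode this itself.
        static void PostCompressed(string url, string xmlPayload)
        {
            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Method = "POST";
            request.ContentType = "text/xml; charset=utf-8";
            request.Headers["Content-Encoding"] = "gzip";

            using (var body = request.GetRequestStream())
            using (var gzip = new GZipStream(body, CompressionMode.Compress))
            {
                byte[] bytes = Encoding.UTF8.GetBytes(xmlPayload);
                gzip.Write(bytes, 0, bytes.Length);
            }

            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine(response.StatusCode);
            }
        }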

    Read the article

  • PHP gzip of an XML file (53 MB) causes an Out of memory error

    - by ntan
    Hi, I have a 53 MB XML file that I want to gzip. The code below gzips it: $gzFile = "my.gz"; $data = IMPLODE("", FILE($filename)); $gzdata = GZENCODE($data, 9); //open gz -- 'w9' is highest compression $fp = gzopen ($gzFile, 'w9'); //loop through array and write each line into the compressed file gzwrite ($fp, $gzdata); //close the file gzclose ($fp); This causes PHP Fatal error: Out of memory (allocated 70516736) (tried to allocate 24 bytes). Anyone have any suggestions? I have already increased the memory limit in php.ini.
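
    The original holds the file as an array of lines, as one concatenated string, and as a gzencode()d string all at the same time (and the gzencode output is then compressed a second time by the gzwrite handle). A streaming sketch that avoids all of that, with placeholder file names: read the XML in small pieces and hand each piece to gzwrite() as you go, so neither the whole file nor its compressed copy ever sits in memory.

        <?php
        $srcFile = 'my.xml';
        $gzFile  = 'my.gz';

        $src = fopen($srcFile, 'rb');
        $fp  = gzopen($gzFile, 'w9');      // 'w9' = highest compression

        while (!feof($src)) {
            gzwrite($fp, fread($src, 1024 * 512));   // 512 KB at a time
        }

        gzclose($fp);
        fclose($src);
        ?>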

    Read the article
