Search Results

Search found 1246 results on 50 pages for 'backup compression'.

Page 37/50

  • MSBuild Extension Pack Zip the folders and subfolders

    - by ManojTrek
    I have to zip my folders and subfolders using MSBuild. I was looking at the MSBuild Extension Pack and tried this: <ItemGroup> <ZipFiles Include="\Test\Web\**\*.*" > <Group>Release</Group> </ZipFiles> </ItemGroup> <MSBuild.ExtensionPack.Compression.Zip TaskAction="Create" CompressFiles="@(ZipFiles)" ZipFileName="$(WorkingDir)%(ZipFiles.Group).zip"/> When I do this it just keeps adding all the files to the root of the archive instead of placing them in their subfolders within the zip file. I must be missing something; can anyone help here, please?

    Read the article

  • OpenPGP Signing

    - by singpolyma
    I'm reading RFC 4880 in an attempt to produce an implementation of a subset of OpenPGP (RSA signatures) using http://phpseclib.sourceforge.net/. I have the public-key packet and the compression, literal, and signature packets parsed out. I can extract n and e and feed them to Crypt_RSA to construct a verifier, and I tell it I'm using SHA-256. It then needs a "message" and a "signature" parameter. I get the signature data out of the signature packet, no problem. The question I have is: what is the "message"? According to section 5.2.4 it's some combination of the literal data packet(s?) (their bodies, or the whole packet?) and the "hashed" subpackets. Do I just concatenate all the data packets and the hashed subpackets together in the order they appear? (A sketch of one possible reading follows this item.)

    Read the article
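    For what it's worth, here is a sketch (in Python, purely for illustration and not tied to phpseclib) of one reading of RFC 4880 section 5.2.4 for a v4 signature over a binary document: the hash covers the literal data itself, then the hashed portion of the signature packet (version, signature type, public-key algorithm, hash algorithm, two-octet hashed-subpacket length, and the hashed subpackets), then a six-octet trailer. Treat the field layout below as a reading to verify against the RFC, not an authoritative statement.

        import hashlib
        import struct

        def v4_signature_digest(literal_data, sig_type, pubkey_alg, hash_alg, hashed_subpackets):
            # Hashed portion of the v4 signature packet: version, type, algorithms,
            # a two-octet count of the hashed subpacket data, then that data itself.
            hashed_portion = (
                bytes([4, sig_type, pubkey_alg, hash_alg])
                + struct.pack(">H", len(hashed_subpackets))
                + hashed_subpackets
            )
            # Six-octet trailer: version, 0xFF, and the length of the hashed portion.
            trailer = b"\x04\xff" + struct.pack(">I", len(hashed_portion))
            # For a binary-document signature the "message" is the document itself,
            # followed by the hashed portion and the trailer.
            return hashlib.sha256(literal_data + hashed_portion + trailer).digest()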

  • Optimizing website - minification, sprites, etc...

    - by nivlam
    I'm looking at the product Aptimize Website Accelerator, an ISAPI filter that concatenates files, minifies CSS/JavaScript, and more. Does anyone have experience with this product, or with any other "all-in-one" solution? I'm interested in knowing whether something like this would be good long-term, or whether manually setting up all the components (integrating YUI Compressor into the build process, setting up gzip compression, tweaking expiration headers, etc.) would be more beneficial. An all-in-one solution like this looks very tempting, as it could save a lot of time if our website is "less than optimal". But how effective are these products? Would setting up the components manually produce better results, or is the gap between the all-in-one solution and a manual setup so small that it's negligible?

    Read the article

  • Ruby/Rails Audio Conversion Plugins?

    - by coneybeare
    I am looking for a good gem/plugin to convert user-uploaded audio files to different formats. One format in particular that I am interested in is Apple .caf with IMA4 compression, for inclusion in an iPhone app. I have been using afconvert on my Mac for this so far, but I need to do it server-side on my Linux box. Ideally, I would be able to work it into Paperclip. ffmpeg could also work, but I have not seen any .caf options for it. Anybody know of one? (A possible ffmpeg-based approach is sketched after this item.)

    Read the article
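    A possible server-side approach, assuming (and this is an assumption worth checking with ffmpeg -codecs and ffmpeg -formats) that the local ffmpeg build ships the CAF muxer and the adpcm_ima_qt encoder. The sketch is in Python only to keep it short; the same command line could be shelled out to from Ruby/Paperclip.

        import subprocess

        def wav_to_caf(src_path, dst_path):
            # Convert an uploaded file to IMA4-compressed CAF via ffmpeg.
            # Assumes ffmpeg is on PATH and was built with CAF and adpcm_ima_qt support.
            subprocess.run(
                ["ffmpeg", "-y", "-i", src_path, "-acodec", "adpcm_ima_qt", dst_path],
                check=True,
            )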

  • Tomcat gzip while chunked issue

    - by hoodoos
    I'm experiencing a problem with one of my data-source services. According to its HTTP response headers it's running on Apache-Coyote/1.1. The server sends responses with Transfer-Encoding: chunked; here is a sample response: HTTP/1.1 200 OK Server: Apache-Coyote/1.1 Content-Type: text/xml;charset=utf-8 Transfer-Encoding: chunked Content-Encoding: gzip Date: Tue, 30 Mar 2010 06:13:52 GMT The problem is that when I ask the server for a gzipped response, it often sends an incomplete one. I receive the response and see that the last chunk has arrived, but after ungzipping I find the body is only partial. I have never seen this behavior with gzip turned off in the request headers. So my question is: is this a common Tomcat issue? Could it be a problem in whichever module does the compression, or maybe some kind of proxy issue? I can't tell you which Tomcat version or gzip module they use, but feel free to ask and I'll check with my service provider. Thanks.

    Read the article

  • Embedding wav files in AS3 Flash/Flex project?

    - by aaaidan
    The Flash IDE is capable of embedding many types of uncompressed sound files, including WAV, and offers optional compression when publishing. However, the [Embed] tag only seems to allow embedding of MP3 files. Is it truly impossible to embed an uncompressed WAV file, or am I missing some magic, undocumented mimeType? I was hoping for something like: [Embed source="../../audio/wibble.wav" mimeType="audio/wav"] ...but I get: no transcoder registered for mimeType 'audio/wav'. It's possible to embed a WAV (or another format) as an octet-stream and parse it at runtime, but that's pretty heavy-handed, I think. I'm surprised that even though the Flash IDE can embed uncompressed sound data, [Embed] cannot, given that the SWF spec allows uncompressed sound data. Any takers?

    Read the article

  • Text in a circle

    - by kalininew
    Please suggest how to solve this problem. There is a block of text of unknown size. I need to draw a circle around this block (with a fill inside) so that the text sits inside the circle, regardless of the text's size. When the text size changes (for example, when the window shrinks), the circle should resize dynamically. This has to be done in the browser. Where should I start with this problem? Perhaps with JavaScript (jQuery is already included on the site)? P.S. The solution should be cross-browser.

    Read the article

  • Why does Google's Closure Compiler leave a few unnecessary spaces or line breaks?

    - by Bungle
    I've noticed that every time I use Google's Closure Compiler Service, it leaves a few unnecessary spaces in the compiled code presented on the right-hand side of the page. These correspond to line breaks in the hosted version of the compiled code. For example (note the line breaks, each of which seems unnecessary): http://troy.onespot.com/static/stack_overflow/closure_spaces.js To date, I've just been removing them manually, but I'm curious why they're there. Is it to limit the line length of the hosted version of the code and make it more readable? Could the compiler be deliberately leaving or inserting them to improve gzip compression? I know they have a trivial effect on file size, but with so much effort going into minifying every last byte of the source script, it seems counterintuitive that they're there.

    Read the article

  • Conditional configuration in maven pom.xml

    - by David Zhao
    I'd like to exclude certain files in maven-war-plugin ONLY when the property "skipCompress" is set to true. I thought I could do something like the following, but it doesn't work for me. By the way, I can't use a profile to achieve this, because I want skipCompress to turn compression on and off in both the development and deployment profiles. <plugin> <artifactId>maven-war-plugin</artifactId> <configuration> <if> <not> <equals arg1="${skipCompress}" arg2 = "true"/> </not> <then> <warSourceExcludes>**/external/dojo/**/*.js</warSourceExcludes> </then> </if> </configuration> </plugin> Thanks, David

    Read the article

  • IIS and Flash forceSmoothing issue

    - by Mark Kennerley
    Hi everyone, I am incorporating a Flash Flickr Polaroid file into my site, http://www.no3dfx.com/polaroid/, but I am having problems getting the images to appear smooth. I have edited the code throughout with forceSmoothing = true and _quality = best. Everything works and looks smooth if I test the file in the preview window or run the HTML file locally, but as soon as I put the file under IIS the smoothing stops. All my Flash players are v10+. I have turned IIS compression off, but no luck. Can anyone please help with this? Thanks, Clyde

    Read the article

  • Running Sybase ISQL scripts from windows batch file

    - by user1328709
    I have already researched this extensively, both on this site and on Google. I have created a number of batch files that perform certain automated tasks (backups etc.) on our production database. I want to further simplify my end-of-day process by taking the dumps with a script that accepts a few input parameters. The script is able to log in to the isql prompt but does not execute the commands: @ECHO ***Started*** @ECHO Enter MonthDay(MMDD) SET /p md= @ECHO %md% mkdir \\10.20.1.17\arch\212%md%_banking set run=isql -Uuser -SORBITS -Ppass %run% @echo dump database banking to '/media/newArch/212%md%_banking/212%md%EOD_banking.dmp' with compression=5 @echo dump database master to '/media/newArch/212%md%_banking/212%md%EOD_master.dmp' @echo go pause I have been unsuccessful at putting these commands in a separate script file because the script itself uses a passed parameter. Please give me hints and links. Thanks

    Read the article

  • Is there any simple way to test two PNGs for equality?

    - by Mason Wheeler
    I've got a bunch of PNG images, and I'm looking for a way to identify duplicates. By duplicates I mean, specifically, two PNG files whose uncompressed image data are identical, not necessarily whose files are identical. This means I can't do something simple like compare CRC hash values. I figure this can actually be done reliably, since PNG uses lossless compression, but I'm worried about speed. I know I can winnow things down a little by testing for equal dimensions first, but when it comes time to actually compare the images against each other, is there any way to do it reasonably efficiently? (i.e., faster than the brute-force double loop that checks pixel values against each other?) (One hashing-based approach is sketched after this item.)

    Read the article
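    One reasonably efficient approach, sketched in Python and assuming Pillow is available: decode each PNG once, hash the normalized pixel bytes, and group files by digest. Two files are then duplicates exactly when their digests collide, so no pairwise pixel comparison is needed.

        import hashlib
        from PIL import Image

        def pixel_digest(path):
            # Hash the decoded pixel data so that differently compressed PNGs
            # containing the same image produce the same digest.
            img = Image.open(path).convert("RGBA")
            h = hashlib.sha256()
            h.update(str(img.size).encode())  # guard against equal bytes at different dimensions
            h.update(img.tobytes())
            return h.hexdigest()

        def duplicate_groups(paths):
            groups = {}
            for p in paths:
                groups.setdefault(pixel_digest(p), []).append(p)
            return [g for g in groups.values() if len(g) > 1]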

  • git: Is it possible to save the packed objects of a dry run and push them later?

    - by shovavnik
    I'm trying to push a bunch of commits that contain a lot of code, plus a few thousand MP3 and PDF files (ranging from 5 to 40 MB each). Git successfully packs the objects: C:\MyProject> git push Counting objects: 7582, done. Delta compression using up to 2 threads. Compressing objects: 100% (7510/7510), done. But it fails to send the push for some as yet unknown reason. The problem is that repacking the files takes a very long time (I'm on a battery-powered laptop and it took about 20 minutes to pack). So I guess my question can be phrased thus: Is it possible to save the packed objects created in a dry run? Once saved, is it possible to push those packed objects and avoid repacking? I looked in the git manual and elsewhere and couldn't find anything conclusive. Any help or pointers are appreciated.

    Read the article

  • YUI Compressor + gzip causes Illegal Character error in jQuery

    - by lo_fye
    When I minify jQuery using YUI Compressor, it works fine. When I then add gzip compression (and serve the gzipped version via mod_rewrite), the gzipped version throws this error: illegal character in jquery.min.js on line 1 Line 1 is: ???????M???????????s?8?0???!sz?dKr?=? This results in a "jQuery is not defined" error. I am using the following rewrite rules to serve up the gzipped versions: #Check to see if browser can accept gzip files. ReWriteCond %{HTTP:accept-encoding} (gzip.*) #make sure there's no trailing .gz on the url ReWriteCond %{REQUEST_FILENAME} !^.+\.gz$ #check to see if a .gz version of the file exists. RewriteCond %{REQUEST_FILENAME}.gz -f #All conditions met so add .gz to URL filename (invisibly) RewriteRule ^(.+) $1.gz [L] I can't find any references to this happening to anyone else. Thoughts?

    Read the article

  • Image coding library

    - by Dmitry
    Is there any good library for lossless image encoding/decoding that has a compression rate more or less similar to PNG, but decodes to raw RGB bitmap data much faster than PNG? Alpha transparency is needed, but not essential, because the alpha channel could be taken from a separate image. The original problem lies in the slowness of reading and decoding PNG files on the iPhone using the standard libraries. The obvious and simplest solution would have been to store raw RGB bitmap data, but then the size of the unpacked ipa is too large - 4 times larger than the PNG files. So I am trying to find some compromise solution.

    Read the article

  • ZIPLIB problem on opening zip files

    - by Ahmet vardar
    I am using this class to create zip <?php // vim: expandtab sw=4 ts=4 sts=4: class zipfile { var $datasec = array(); var $ctrl_dir = array(); var $eof_ctrl_dir = "\x50\x4b\x05\x06\x00\x00\x00\x00"; var $old_offset = 0; function unix2DosTime($unixtime = 0) { $timearray = ($unixtime == 0) ? getdate() : getdate($unixtime); if ($timearray['year'] < 1980) { $timearray['year'] = 1980; $timearray['mon'] = 1; $timearray['mday'] = 1; $timearray['hours'] = 0; $timearray['minutes'] = 0; $timearray['seconds'] = 0; } // end if return (($timearray['year'] - 1980) << 25) | ($timearray['mon'] << 21) | ($timearray['mday'] << 16) | ($timearray['hours'] << 11) | ($timearray['minutes'] << 5) | ($timearray['seconds'] >> 1); } // end of the 'unix2DosTime()' method function addFile($data, $name, $time = 0) { $name = str_replace('\\', '/', $name); $dtime = dechex($this->unix2DosTime($time)); $hexdtime = '\x' . $dtime[6] . $dtime[7] . '\x' . $dtime[4] . $dtime[5] . '\x' . $dtime[2] . $dtime[3] . '\x' . $dtime[0] . $dtime[1]; eval('$hexdtime = "' . $hexdtime . '";'); $fr = "\x50\x4b\x03\x04"; $fr .= "\x14\x00"; // ver needed to extract $fr .= "\x00\x00"; // gen purpose bit flag $fr .= "\x08\x00"; // compression method $fr .= $hexdtime; // last mod time and date // "local file header" segment $unc_len = strlen($data); $crc = crc32($data); $zdata = gzcompress($data); $zdata = substr(substr($zdata, 0, strlen($zdata) - 4), 2); // fix crc bug $c_len = strlen($zdata); $fr .= pack('V', $crc); // crc32 $fr .= pack('V', $c_len); // compressed filesize $fr .= pack('V', $unc_len); // uncompressed filesize $fr .= pack('v', strlen($name)); // length of filename $fr .= pack('v', 0); // extra field length $fr .= $name; // "file data" segment $fr .= $zdata; // "data descriptor" segment (optional but necessary if archive is not // served as file) $fr .= pack('V', $crc); // crc32 $fr .= pack('V', $c_len); // compressed filesize $fr .= pack('V', $unc_len); // uncompressed filesize // add this entry to array $this -> datasec[] = $fr; // now add to central directory record $cdrec = "\x50\x4b\x01\x02"; $cdrec .= "\x00\x00"; // version made by $cdrec .= "\x14\x00"; // version needed to extract $cdrec .= "\x00\x00"; // gen purpose bit flag $cdrec .= "\x08\x00"; // compression method $cdrec .= $hexdtime; // last mod time & date $cdrec .= pack('V', $crc); // crc32 $cdrec .= pack('V', $c_len); // compressed filesize $cdrec .= pack('V', $unc_len); // uncompressed filesize $cdrec .= pack('v', strlen($name) ); // length of filename $cdrec .= pack('v', 0 ); // extra field length $cdrec .= pack('v', 0 ); // file comment length $cdrec .= pack('v', 0 ); // disk number start $cdrec .= pack('v', 0 ); // internal file attributes $cdrec .= pack('V', 32 ); // external file attributes - 'archive' bit set $cdrec .= pack('V', $this -> old_offset ); // relative offset of local header $this -> old_offset += strlen($fr); $cdrec .= $name; // optional extra field, file comment goes here // save to central directory $this -> ctrl_dir[] = $cdrec; } // end of the 'addFile()' method function file() { $data = implode('', $this -> datasec); $ctrldir = implode('', $this -> ctrl_dir); return $data . $ctrldir . $this -> eof_ctrl_dir . pack('v', sizeof($this -> ctrl_dir)) . // total # of entries "on this disk" pack('v', sizeof($this -> ctrl_dir)) . // total # of entries overall pack('V', strlen($ctrldir)) . // size of central dir pack('V', strlen($data)) . 
// offset to start of central dir "\x00\x00"; // .zip file comment length } // end of the 'file()' method function addFiles($files ) { foreach($files as $file) { if (is_file($file)) //directory check { $data = implode("",file($file)); $this->addFile($data,$file); } } } function output($file) { $fp=fopen($file,"w"); fwrite($fp,$this->file()); fclose($fp); } } // end of the 'zipfile' class ?> The class creates a zip file, but when I try to open it on Mac OS X Snow Leopard or Windows 7, it doesn't open. On the Mac I get this error: Error 1: operation not permitted. Any idea? Thanks

    Read the article

  • Fixing warning from git

    - by japancheese
    I've been doing a workflow of creating a git repository on a remote central server, cloning that repo on my local dev machine, doing some work, and then pushing the changes back to the same repo on the remote server. However (and I believe this started after a recent update to git), after pushing up a change I get the following warning: Counting objects: 2724, done. Delta compression using up to 2 threads. Compressing objects: 100% (2666/2666), done. Writing objects: 100% (2723/2723), 5.90 MiB | 313 KiB/s, done. Total 2723 (delta 219), reused 0 (delta 0) warning: updating the currently checked out branch; this may cause confusion, as the index and working tree do not reflect changes that are now in HEAD. Can someone explain to me exactly what this warning means, and what I'm doing wrong in my workflow to receive it?

    Read the article

  • New to AVL tree implementation.

    - by nn
    I am writing a sliding-window compression algorithm (LZ77) that searches for phrases in a "moving" dictionary. So far I have written a BST where each node is stored in an array, and its index in the array is also the starting position in the window itself. I am now looking at transforming the BST into an AVL tree. I am a little confused by the sample implementations I have seen: some appear to store only the balance factors, whereas others store the height of each subtree. Are there any performance advantages or disadvantages to storing the height and/or the balance factor for each node? Apologies if this is a very simple question, but I'm still not visualizing how I want to restructure my BST to implement height balancing. Thanks. (A small illustration of the store-the-height variant follows this item.)

    Read the article
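    For illustration, a minimal sketch (in Python, names are mine) of the store-the-height variant; the balance factor is then derived on demand. Storing only a balance factor per node saves a couple of bytes, but the rebalancing code then has to maintain those factors directly, which many people find harder to get right; the asymptotic complexity is the same either way.

        class AVLNode:
            def __init__(self, key):
                self.key = key
                self.left = None
                self.right = None
                self.height = 1  # height of the subtree rooted at this node

        def height(node):
            return node.height if node else 0

        def update_height(node):
            # call after a child pointer changes (insert, delete, rotation)
            node.height = 1 + max(height(node.left), height(node.right))

        def balance_factor(node):
            # a value > 1 or < -1 means this node needs a rotation
            return height(node.left) - height(node.right)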

  • Own data format for the iPhone

    - by Stefan
    Hi, I would like to create my own data format for an iPhone app. The files should be structured similarly to, e.g., Apple's iWork files (.pages). That means I have a folder with some files in it. The file 'Juicy.fruit' contains: Fruits ---> Apple.xml ---> Banana.xml ---> Pear.xml ---> PreviewPicture.png This folder "Fruits" should be packed into a single handy file, 'Juicy.fruit'. Compression isn't necessary. How could I achieve this? I've discovered some open-source ZIP libraries; however, I would like to build my own data format with the iPhone's built-in libraries (if possible). Best regards, Stefan

    Read the article

  • IE 8 dialog windows not decompressing files

    - by Mike
    Hi, I've got a website where we have pre-compressed all of our HTML files. In general this works fine, but since IE 8 came out, some people are finding that they cannot use parts of the website. We use the showModalDialog command to open a dialog window pointing at one of our pre-compressed files, but the content shows up as strange characters (i.e. not decompressed). It only happens in the dialog. I'm pretty sure our compression setup is fine, because the page used to open the dialog is also compressed. Has anyone else come across this, or got any suggestions? Because I'm stumped. Thanks, Mike

    Read the article

  • Extremely slow insert from Delphi to Remote MySQL Database

    - by MarkRobinson
    Having a major hair-pulling issue with extremely slow inserts from Delphi 2010 to a remote MySQL 5.09 server. So far, I have tried: ADO using the MySQL ODBC driver, and ZeosLib v7 alpha. I have used batching and direct insert with ADO (using table access), and with Zeos I have used SQL insertion with a query, then table direct mode, and also cached updates in table mode using ApplyUpdates and Commit. With both technologies I have tried compression on and off. So far I have seen pretty much the same result across the board: 7.5 records per second! Now, from this I would assume that the remote server is just slow, but MySQL Workbench is amazingly fast, and the Migration Toolkit managed the initial migration very quickly (to be honest, I don't recall how quickly, which probably means it was quick). I'm just about to try the MyDAC components, as we already use SDAC (wish there was a multi-buy discount, or that we'd chosen UniDAC instead!). Any ideas?

    Read the article

  • Untar, ungz, gz, tar - how do you remember all the useful options?

    - by deadprogrammer
    I am pretty sure I am not the only one with the following problem: every time I need to uncompress a file on *nix I can't remember all the switches, and I end up googling it, which is surprising considering how often I need to do this. Do you have a good compression cheat sheet, or a mnemonic for all those nasty switches in tar? I am making this a wiki so that we can build a nice cheat sheet here. Oh, and about man pages: if there's one thing they are not helpful for, it's figuring out how to uncompress a file. (One way to sidestep the switches entirely is sketched after this item.)

    Read the article
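    Not a mnemonic, but one way to sidestep the switches entirely: Python's standard tarfile module auto-detects the compression, so a tiny helper (a sketch; file names are placeholders) covers the everyday cases.

        import tarfile

        def extract(archive, dest="."):
            # mode "r:*" lets tarfile sniff gzip, bzip2, or plain tar automatically
            with tarfile.open(archive, "r:*") as tar:
                tar.extractall(dest)

        def create(archive, *paths):
            # always writes a gzip-compressed tarball
            with tarfile.open(archive, "w:gz") as tar:
                for p in paths:
                    tar.add(p)

    For the command line itself, the switches people usually need are c (create), x (extract), t (list), z (gzip), j (bzip2), v (verbose), and f (file name), e.g. tar -xzf archive.tar.gz to extract and tar -czf archive.tar.gz dir/ to create.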

  • sending binary data via POST on android

    - by wo_shi_ni_ba_ba
    Android ships a limited version of Apache's HTTP client (v4). Typically, if I want to send binary data with content type application/octet-stream via POST, I do the following: HttpClient client = getHttpClient(); HttpPost method=new HttpPost("http://192.168.0.1:8080/xxx"); System.err.println("send to server "+s); if(compression){ byte[]compressed =compress(s); RequestEntity entity = new ByteArrayRequestEntity(compressed); method.setEntity(entity); } HttpResponse resp=client.execute(method); However, ByteArrayRequestEntity is not supported on Android. What can I do?

    Read the article

  • Encode/compress sequence of repeating integers

    - by Alex
    Hey there! I have very long integer sequences of arbitrary length that look like this: 0000000001110002220033333 I need an algorithm to convert such a string into something compressed like a9b3a3c3a2d5 which means "a nine times, then b three times, then a three times", and so on, where "a" stands for 0, "b" for 1, "c" for 2 and "d" for 3. How would you do that? So far nothing suitable has come to mind, and I had no luck with Google because I didn't really know what to search for. What is this kind of encoding/compression called? PS: I am going to do the encoding with PHP and the decoding in JavaScript. (A sketch of the algorithm follows this item.)

    Read the article
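    What the question describes is run-length encoding. A compact sketch in Python, only to illustrate the algorithm (the same logic ports directly to PHP for encoding and to JavaScript for decoding); the letter alphabet is the one from the question, extended to cover the digits 0-9.

        import re
        from itertools import groupby

        ALPHABET = "abcdefghij"  # 'a' stands for 0, 'b' for 1, ... 'j' for 9

        def rle_encode(digits):
            # "0000000001110002220033333" -> "a9b3a3c3a2d5"
            return "".join(f"{ALPHABET[int(d)]}{len(list(run))}"
                           for d, run in groupby(digits))

        def rle_decode(encoded):
            # "a9b3a3c3a2d5" -> "0000000001110002220033333"
            return "".join(str(ALPHABET.index(letter)) * int(count)
                           for letter, count in re.findall(r"([a-j])(\d+)", encoded))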

  • Are these jobs for the developer, the designer, or the client himself on a website project? [closed]

    - by jitendra
    Are these tasks for the developer, the designer, or the client himself on a website project? The client is asking the XHTML/CSS/PHP coder to do all of the following: spell checking; grammar checking; descriptive alt text for large charts, graphs, and technical images; writing table summaries and captions; descriptive link text; color-contrast checking; deciding which content should be H2, H3, H4... and what should be <strong> or <span class="boldtext">; meta description and keywords for each page; image compression; deciding filenames for images, PDFs, etc.; and deciding the page <title> for each page.

    Read the article
