Search Results

Search found 40229 results on 1610 pages for 'deleted files'.


  • Capturing output of find . -print0 into a bash array

    - by Idris
    Using find . -print0 seems to be the only safe way of obtaining a list of files in bash due to the possibility of filenames containing spaces, newlines, quotation marks etc. However, I'm having a hard time actually making find's output useful within bash or with other command line utilities. The only way I have managed to make use of the output is by piping it to perl, and changing perl's IFS to null: find . -print0 | perl -e '$/="\0"; @files=<>; print $#files;' This example prints the number of files found, avoiding the danger of newlines in filenames corrupting the count, as would occur with: find . | wc -l As most command line programs do not support null-delimited input, I figure the best thing would be to capture the output of find . -print0 in a bash array, like I have done in the perl snippet above, and then continue with the task, whatever it may be. How can I do this? This doesn't work: find . -print0 | ( IFS=$'\0' ; array=( $( cat ) ) ; echo ${#array[@]} ) A much more general question might be: How can I do useful things with lists of files in bash?
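
    A minimal sketch of one approach, assuming bash 4.4+ for mapfile -d (the while-read loop works in older bash):

        # bash 4.4+: read NUL-delimited filenames straight into an array
        mapfile -d '' files < <(find . -print0)
        echo "${#files[@]} files found"

        # portable alternative for older bash
        files=()
        while IFS= read -r -d '' f; do
            files+=("$f")
        done < <(find . -print0)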

  • Does multithreading in Java take much time for task completion?

    - by Geeta
    I have to search for a string in 10 large files (in zip format, 70 MB) and print the lines containing the search string to 10 corresponding output files (i.e. file1's output goes to output_file1, file2's to output_file2, and so on). The program takes 15 minutes for a single file. But if I use 10 threads to read the 10 files and write to 10 different files, it should still complete in about 15 minutes, yet it takes 40. How can I solve this? Or does multithreading simply take this long?
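
    For reference, a minimal sketch of the usual pool-based approach, with searchFile standing in as a hypothetical per-file worker; if all 10 zips sit on one disk, the work is largely I/O-bound and extra threads mostly add seek contention, which is one common reason 10 threads run slower than expected:

        import java.util.concurrent.ExecutorService;
        import java.util.concurrent.Executors;
        import java.util.concurrent.TimeUnit;

        public class ParallelSearch {
            public static void main(String[] args) throws InterruptedException {
                // size the pool to the CPU count, not the file count;
                // for disk-bound work, more threads often just add contention
                ExecutorService pool = Executors.newFixedThreadPool(
                        Runtime.getRuntime().availableProcessors());
                for (int i = 1; i <= 10; i++) {
                    final int n = i;
                    // hypothetical worker: scans file n, writes matches to output_file n
                    pool.submit(() -> searchFile("file" + n + ".zip", "output_file" + n));
                }
                pool.shutdown();
                pool.awaitTermination(1, TimeUnit.HOURS);
            }

            static void searchFile(String in, String out) {
                // placeholder for the unzip-and-scan logic
            }
        }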

  • bash find xargs grep only single occurrence

    - by keftebub
    Hi. Maybe it's a bit strange, and maybe there are other tools to do this, but I am using the following classic bash command to find all files which contain some string: find . -type f | xargs grep "something" I have a great number of files, at multiple depths. The first occurrence of "something" is enough for me, but find continues searching and takes a long time to complete the rest of the files. What I would like is some kind of "feedback" from grep back to find, so that find could stop searching for more files. Is such a thing possible? Thank you.
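
    A minimal sketch, assuming GNU grep: -l makes grep stop after the first match in each file, and head -n 1 kills the pipeline (via SIGPIPE) once the first matching file has been printed:

        # print the first file containing the string, then stop
        find . -type f -print0 | xargs -0 grep -l "something" | head -n 1

        # GNU grep can also recurse by itself
        grep -rl "something" . | head -n 1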

  • RubyMine Folder Tree doesn't refresh

    - by user336514
    Hi. I'm using RubyMine 2.0.2 for the first time today, on Mac OS X Leopard. If I create files in the filesystem (with, say, script/generate), those new files do not appear in RubyMine. I have had limited success with restarting the program, in that files in the db/ folder are added and removed, but unfortunately new folders in the views directory are not shown. Pardon my French, but wtf? Thanks

  • PHP - WideImage class cropping

    - by Kolind
    I am trying to crop a large image to make thumbnails, but I can't get WideImage to save the file. I've tried the following:

        // 1st method
        $img = WideImage::load('../files/' . $file_name)->crop('center', 'center', 200, 200);
        $img->saveToFile('../files/thumb/thumb_' . $file_name);

        // 2nd method
        $img = WideImage::load('../files/' . $file_name)->crop('center', 'center', 200, 200);
        $img->copy('../files/thumb/thumb_' . $file_name);

    But nothing happens when I run either of them: no errors, and no thumbnails. Am I doing something wrong here?
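
    A sketch of the checks worth adding first, assuming the usual suspects: a missing or unwritable ../files/thumb/ directory will make both methods fail without an obvious error if error reporting is off:

        <?php
        // surface any warnings while debugging
        error_reporting(E_ALL);
        ini_set('display_errors', '1');

        $src  = '../files/' . $file_name;
        $dest = '../files/thumb/thumb_' . $file_name;

        // rule out filesystem problems before blaming the library
        if (!is_readable($src)) {
            die("Cannot read source: $src");
        }
        if (!is_dir(dirname($dest)) || !is_writable(dirname($dest))) {
            die("Thumb directory missing or not writable: " . dirname($dest));
        }

        $img = WideImage::load($src)->crop('center', 'center', 200, 200);
        $img->saveToFile($dest);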

  • PHP include doesn't work

    - by Chris
    I'm not sure how simple is this to solve, but I assume I'm doing something wrong. I'm new to PHP, so bear with me, please. When I started learning PHP, I always placed all my project files into the same folder along with index.php and thus included everything like this: <?php include('./translation.php'); ?> Later on in the process of learning as I gained experience and my skill increased, I had to start using folders and place my files into sub folders. I ended up successfully including my files with the following: <?php include('../translation.php'); ?> My trouble-free coding took an unexpected turn when I decided to start using sub-sub folders. After placing all the files even deeper into the file structure I was shocked to find out that I cannot include them anymore, using: <?php include('.../translation.php'); ?> Now I'm lost. What did I do wrong? Am I to understand that I cannot include files deeper than 2 directories in the project? Should I start using a different file system?
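
    For the record, "..." is not a path operator; each extra directory level is another "../". A minimal sketch, using __DIR__ (PHP 5.3+) so the include resolves relative to the including file instead of the current working directory:

        <?php
        // two directory levels up, relative to the working directory
        include '../../translation.php';

        // more robust: anchor the path to this file's own directory
        include __DIR__ . '/../../translation.php';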

  • Split string with delimiter in SQL Server

    - by Renju
    I have a varchar column that holds folder paths like "C:\Program Files\Internet Explorer\en-US", and I need to rename the root folder (Program Files to Program FilesNew). Can anyone please help? I tried this query:

        declare @val as varchar(100)
        set @val='C:\Program Files\Internet Explorer\en-US'
        select substring(@val,charindex(':\',@val),charindex('\',@val))

    but it does not produce the expected result: C:\Program FilesNew\Internet Explorer\en-US
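
    A minimal sketch of two ways to get there, assuming the goal is renaming the first path segment: REPLACE covers the simple literal case, and STUFF splices new text over just the first segment:

        DECLARE @val varchar(100);
        SET @val = 'C:\Program Files\Internet Explorer\en-US';

        -- simple case: replace the literal prefix
        SELECT REPLACE(@val, 'C:\Program Files\', 'C:\Program FilesNew\');

        -- or overwrite everything up to the backslash after the first segment
        SELECT STUFF(@val, 1, CHARINDEX('\', @val, 4) - 1, 'C:\Program FilesNew');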

  • SVN Export in Eclipse removes labels on tabs

    - by Sorcy
    I get a very strange effect when using Subclipse with Eclipse. Whenever I use Team > Export to export a file from the editor, the export works fine, but the label on the file's tab is removed. The effect can be seen here: http://www.daspferd.de/img/tabs.png Strangely enough, it happens with PHP, CSS, and HTML files, but NOT with JavaScript files, so I'm assuming it's some kind of setting I haven't found yet and not a bug in Subclipse. Does anyone know where I can turn off this behaviour?

  • Separate compilation in C++

    - by Pat Murray
    Suppose you are creating a class with multiple .cpp files (each containing the implementation of a member function) and have the class declaration in a .h file. Each .cpp file includes the .h file via the include directive. I was told that if you change the implementation of any of the member functions (.cpp files), you have to recompile every .cpp file in order to run the program. That is, if I had 5 member functions (each implemented in its own .cpp file) and I changed the implementation in 1 of them, I would have to compile the 1 .cpp file I changed AND the 4 .cpp files I didn't change in order to correctly run my program. My question, if the previous statement is true, is: why is it true? Any insight on this concept would be helpful.
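
    For what it's worth, with classic separate compilation only the changed .cpp must be recompiled; the other object files are reused and the linker stitches them back together. A sketch of the cycle, assuming g++ and hypothetical files f1.cpp..f5.cpp sharing myclass.h:

        # full build: each translation unit compiles independently to a .o file
        g++ -c f1.cpp f2.cpp f3.cpp f4.cpp f5.cpp
        g++ -o prog f1.o f2.o f3.o f4.o f5.o

        # after editing only f2.cpp: recompile it alone, relink with the old .o files
        g++ -c f2.cpp
        g++ -o prog f1.o f2.o f3.o f4.o f5.o

        # changing myclass.h, by contrast, forces recompiling every .cpp
        # that includes it, since each one textually pastes the header in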

  • .htaccess - Block all referrers but one

    - by HarryBeasant
    I am currently running a file-sharing website where all the files are stored remotely and hot linked from the download buttons on the main site. The current .htaccess force-downloads all files, images, etc.:

        <FilesMatch "\.(?i:doc|odf|pdf|rtf|txt|png|jpg|jpeg|mp3|mp4|wav|wmv|gif|bmp|avi|mts)$">
            Header set Content-Disposition attachment
        </FilesMatch>

    What I am trying to do is make sure people cannot hot link the files. So I was thinking: is there a way I can block all referrers to the domain that stores the files (it's an IP) apart from the main website (a domain)? Thanks!
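
    A minimal mod_rewrite sketch for the storage host, with example.com standing in for the main site's domain; note the Referer header is easily spoofed or stripped, so this only deters casual hot linking:

        RewriteEngine On
        # allow empty referers (some proxies strip them) and the main site
        RewriteCond %{HTTP_REFERER} !^$
        RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
        RewriteRule \.(doc|pdf|png|jpe?g|gif|bmp|mp3|mp4|wav|wmv|avi|mts)$ - [F,NC]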

  • How to handle build rule with unknown targets in OMake when target list generator is built

    - by Michael E
    I have a project which uses OMake for its build system, and I am trying to handle a rather tough corner case. I have some definition files and a tool which can take these definition files and create GraphViz files. There are two problems, though: Each definition file can produce multiple graphs, and the list of graphs it can produce is encoded in the file. My dump tool does have a -list option which lists all the graphs a definition file will produce. This dump tool is built in the source tree. I want this list available in the OMakefile so I can use other rules to convert the DOT files to SVG, and have a phony target depend on all the SVGs (goal: a single build command which builds SVG descriptions of all my graphs). If I only had the first problem, it would be easy - I would run the tool to build a list, and then use that list to build a target which invokes the dumper to output the GraphViz files. However, I am rather stuck on forcing the dump tool to be built before it is needed. If this were make, I would just run make recursively to build the dump tool. OMake does not allow recursive invocation, however, and the build function is only usable from osh. Any suggestions for a good solution to this problem?

  • Implement threading to work around a UI-blocking bug in an async function

    - by Marcx
    I think I ran up against a bug in an async function, specifically getDirectoryListingAsync() of the File class. This method is supposed to return an object containing the list of files in a specified folder. I found that calling this method on a directory with a lot of files (in my tests, more than 20k) freezes the UI after a few seconds, until the process is completed. I think the method is separated into two main parts: 1) get the list of files; 2) build the array with the details of the files. Part 1 seems to be async (for the first few seconds the UI is responsive); then, when the process passes from part 1 to part 2, the UI freezes until the complete event is dispatched. Here's some (simple) code:

        private function checkFiles(dir:File):void {
            if (dir.exists) {
                dir.addEventListener(FileListEvent.DIRECTORY_LISTING, listaImmaginiLocale);
                dir.getDirectoryListingAsync();
                // after this point, the UI responds well for the first
                // seconds (part 1); a few seconds later (part 2) it is frozen
            }
        }

        private function listaImmaginiLocale(event:FileListEvent):void {
            // from this point on the UI is responsive again...
        }

    Some functions in my projects are also CPU-heavy, and to keep the UI from freezing I use a simple pattern: every so many iterations, the function pauses briefly so the UI can refresh.

        private var maxIteration:int = 150000;

        private function sampleFunct(offset:int = 0):void {
            if (offset < maxIteration) {
                // do something

                // recurse via a timeout: every 1000th call waits 15 ms so the
                // UI can refresh; otherwise the call happens immediately.
                // (1000 is arbitrary here; I tune it to how heavy the work is.)
                setTimeout(function():void { sampleFunct(++offset); },
                           (offset % 1000 ? 0 : 15));
            }
        }

    Using this pattern I get a responsive UI without hurting performance. I'd like to apply it inside getDirectoryListingAsync(), but I don't know whether that's possible, how to do it, or which file to edit or extend. Any suggestions?

  • UIFileSharingEnabled: using folders

    - by Ali Shafai
    I want to allow users to add files to the application's Documents folder, so I used iTunes file sharing. The problem is they can only add single files in a flat structure. I want to drag and drop a whole folder (even with subfolders) and keep the structure. My questions: is this possible with iTunes file sharing? If not, is there an open-source project that helps with writing a PC-side app that talks to the iPhone-side app and pushes the files into it?

  • Jenkins - Publish Over CIFS Plugin

    - by Juvil
    I am getting confused with this plugin. Basically, my goal is to deploy files from Server1 to Server2. The build output sits in a specific location on Server1, for example E:\BuildOutput\Apps\Application1\Bin\, and I need to deploy it to C:\Program Files\Tools\Application1\Bin\ on Server2. How do I set up this plugin to do what I need? I am getting stressed by the number of files that need to be deployed to the other server; I just wish a simple xcopy to another server could work.
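
    A sketch of one plausible post-build configuration, assuming a share on Server2 is defined in the plugin's global settings and that the build output is inside (or copied into) the job's workspace, since the source pattern is workspace-relative (field names follow the Publish Over plugin family):

        Source files:     BuildOutput/Apps/Application1/Bin/**/*
        Remove prefix:    BuildOutput/Apps/Application1/Bin
        Remote directory: Tools/Application1/Bin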

  • Clean logging with BASH

    - by Matt Krouse
    I have a script that deletes files 7 days old or older and then logs them to a folder. It logs and deletes everything correctly, but the log file is very sloppy when I open it for viewing:

        log=$HOME/Deleted/$(date)
        find $HOME/OldLogFiles/ -type f -mtime +7 -delete -print > "$log"

    The log is difficult to read. Example output (when opened in Notepad):

        /home/u0146121/OldLogFiles/file1.txt/home/u0146121/OldLogFiles/file2.txt/home/u0146121/OldLogFiles/file3.txt

    Is there any way to log the files more cleanly, maybe with the filename, the date deleted, and how old the file was? Any suggestions help!
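
    A minimal sketch, assuming GNU find: -printf can stamp each line with the deletion date, the file's own mtime, and the path, and the \r\n line ending keeps the log readable in old Notepad (which shows bare-\n files as one long line):

        log="$HOME/Deleted/$(date +%F).log"

        # one line per file: deletion date, file mtime, then the path
        find "$HOME/OldLogFiles/" -type f -mtime +7 \
            -printf "deleted $(date +%F)  mtime=%TY-%Tm-%Td  %p\r\n" \
            -delete > "$log"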

  • Jenkins Build fails when artifacts are not there

    - by leifg
    I use Jenkins to run some integration tests on a web application (using Cucumber, Capybara and Selenium). Every time a test fails, a screenshot, the HTML source and a video of the process are saved. The path structure looks like this: results/output/<test_name>/<files> I use the archive-artifacts feature of Jenkins to provide the files (pattern: results/output/*/*), and it works great. However, as soon as a build succeeds there are no screenshots/videos, and the build fails because Jenkins cannot find any files matching the pattern. Is there a way to tell Jenkins to succeed without the files present? I don't want to do a dirty hack which involves creating an empty folder structure like results/output/success/hooray.txt.
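
    Depending on the Jenkins version, the archiving step can be told to tolerate an empty result; a sketch of the Pipeline spelling (the freestyle "Archive the artifacts" step gained a matching "do not fail if nothing to archive" option in later releases):

        archiveArtifacts artifacts: 'results/output/**/*', allowEmptyArchive: true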

  • Drupal, Video: why does the thumbnail image src link to a video instead of an image?

    - by Patrick
    Hi, I'm using the Video module with a CCK video upload field, and I want to play videos in a Lightbox when the user clicks a video thumbnail. I can select any option in the "Display Field" tab of the "Content Type" settings, such as: Lightbox2: galleryVideo -> original. The src attribute of the thumbnail image always links to the video instead of the image. What is the reason for this bug?

        <a rel="lightbox[field_video][Video Number 2&lt;br /&gt;&lt;br /&gt;&lt;a href=&quot;/lancelmaat/content/stalkshow&quot; id=&quot;node_link_text&quot; class=&quot;active&quot;&gt;View Image Details&lt;/a&gt;]" href="http://localhost/lancelmaat/sites/default/files/files/projects/Stalkshow/videos/stalkshowdvd21Mbps.flv" class="lightbox-processed">
            <img title="" alt="Video Number 2" src="http://localhost/lancelmaat/sites/default/files/imagecache/galleryVideo/files/projects/Stalkshow/videos/stalkshowdvd21Mbps.flv">
        </a>

    Thanks

  • Forwarding HTTP Request with Direct Server Return

    - by Daniel Crabtree
    I have servers spread across several data centers, each storing different files. I want users to be able to access the files on all servers through a single domain and have the individual servers return the files directly to the users. The following shows a simple example: 1) The user's browser requests http://www.example.com/files/file1.zip 2) Request goes to server A, based on the DNS A record for example.com. 3) Server A analyzes the request and works out that /files/file1.zip is stored on server B. 4) Server A forwards the request to server B. 5) Server B returns file1.zip directly to the user without going through server A. Note: steps 4 and 5 must be transparent to the user and cannot involve sending a redirect to the user as that would violate the requirement of a single domain. From my research, what I want to achieve is called "Direct Server Return" and it is a common setup for load balancing. It is also sometimes called a half reverse proxy. For step 4, it sounds like I need to do MAC Address Translation and then pass the request back onto the network and for servers outside the network of server A tunneling will be required. For step 5, I simply need to configure server B, as per the real servers in a load balancing setup. Namely, server B should have server A's IP address on the loopback interface and it should not answer any ARP requests for that IP address. My problem is how to actually achieve step 4? I have found plenty of hardware and software that can do this for simple load balancing at layer 4, but these solutions fall short and cannot handle the kind of custom routing I require. It seems like I will need to roll my own solution. Ideally, I would like to do the routing / forwarding at the web server level, i.e. in PHP or C# / ASP.net. However, I am open to doing it at a lower level such as Apache or IIS, or at an even lower level, i.e. a custom proxy service in front of everything.
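
    For the layer-4 plumbing, the classic Linux implementation of Direct Server Return is LVS in direct-routing mode; a sketch with placeholder addresses (VIP 203.0.113.10 on server A, real server B at 203.0.113.20). This covers steps 4 and 5 for whole services, but not the per-URL routing in step 3, which still needs custom logic in front:

        # on the director (server A): create a virtual service on the VIP
        ipvsadm -A -t 203.0.113.10:80 -s rr
        # add server B in direct-routing mode (-g); use -i for IPIP
        # tunnelling when the real server sits in another data center
        ipvsadm -a -t 203.0.113.10:80 -r 203.0.113.20:80 -g

        # on server B: hold the VIP on loopback and refuse to ARP for it
        ip addr add 203.0.113.10/32 dev lo
        sysctl -w net.ipv4.conf.all.arp_ignore=1
        sysctl -w net.ipv4.conf.all.arp_announce=2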

  • HTML5 Offline Storage on iPad and iPhone BUG

    - by scaraveos
    Hello everyone, I created a manifest file with 1000 items. Safari and Mozilla browsers save the files offline successfully, and even Android saves them correctly. On iPad and iPhone, when I try to save more than 300 items, at some point the applicationCache returns "error". When I try to save fewer (e.g. 200), it saves the files correctly and the applicationCache returns "cached". Any ideas? Thank you.

  • Best directory to store application data with read/write rights for all users?

    - by Wodzu
    Hi guys. Until Windows Vista I saved my application data in the directory where the program was located, most commonly "C:\Program Files\MyApplication". As we know, under Vista and later a common user doesn't have rights to write under the "Program Files" folder. So my first idea was to save the application data under the "All Users\Application Data" folder, but it seems that this folder has writing restrictions too! To sum up, my requirements are: the folder should exist on Windows XP and above; all users of the system should have read/write/create rights to the folder, its subfolders and files; and I want only one copy of the file(s) for all users. Thanks for your time.
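
    A minimal Win32 sketch, assuming C++: CSIDL_COMMON_APPDATA resolves to C:\ProgramData on Vista and later and to All Users\Application Data on XP. Note that by default only a file's creator gets write access there, so an installer typically has to loosen the ACL on the application's subfolder before all users can share the same files:

        #include <windows.h>
        #include <shlobj.h>
        #include <string>
        #include <iostream>

        int main() {
            wchar_t path[MAX_PATH];
            // per-machine (all users) application data folder, XP and later
            if (SUCCEEDED(SHGetFolderPathW(NULL, CSIDL_COMMON_APPDATA,
                                           NULL, SHGFP_TYPE_CURRENT, path))) {
                std::wcout << L"App data root: " << path << L"\n";
                std::wstring dir = std::wstring(path) + L"\\MyApplication";
                CreateDirectoryW(dir.c_str(), NULL);  // app's shared subfolder
            }
            return 0;
        }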

  • Apply email retention policy to Inbox but not subfolders?

    - by NaOH
    Our official email policy states that email older than 90 days in the Inbox is moved to Deleted Items, not including subfolders of the Inbox. This wasn't a problem to implement in Exchange 2003. In 2010, however, it appears that Policy Tags applied to the Inbox also apply to its subfolders. How can I prevent this from occurring? EDIT: Here is the output of Get-RetentionPolicy:

        RunspaceId              : b6a05d43-3e56-4348-9d0e-2d2bf7e6c283
        RetentionId             : 56417b54-af3b-4c14-bd3c-9dcf9bdd133e
        RetentionPolicyTagLinks : {Junk E-mail - 7 Days, Deleted Items - 7 Days, Sent Items - 90 Days, Inbox - 90 Days}
        AdminDisplayName        :
        ExchangeVersion         : 1.0 (0.0.0.0)
        Name                    : Default Company Policy
        DistinguishedName       : CN=Default Company Policy,CN=Retention Policies Container,CN=Company,CN=Microsoft Exchange,CN=Services,CN=Configuration,DC=domain,DC=com
        Identity                : Default Company Policy
        Guid                    : 56417b54-af3b-4c14-bd3c-9dcf9bdd133e
        ObjectCategory          : domain.com/Configuration/Schema/ms-Exch-Mailbox-Recipient-Template
        ObjectClass             : {top, msExchRecipientTemplate, msExchMailboxRecipientTemplate}
        WhenChanged             : 2/8/2013 2:18:11 PM
        WhenCreated             : 2/8/2013 2:11:18 PM
        WhenChangedUTC          : 2/8/2013 10:18:11 PM
        WhenCreatedUTC          : 2/8/2013 10:11:18 PM
        OrganizationId          :
        OriginatingServer       : server.domain.com
        IsValid                 : True
