Search Results

Search found 22480 results on 900 pages for 'internet archive'.

  • deleting old unused images

    - by Ayyash
    As our content-based websites grow, lots of images get dumped into the images folder, but we rarely come across disciplined developers who delete their files once they no longer need them. That means we end up with a huge list of images in one folder, and it is very tricky to clean up. My question is (and I don't know if this is the right site to ask it): is there a tool that lets me find out whether an image has been requested over the web in the last n months? My other, more general questions: how do you do it? How do you take control of your images folders? What policy do you enforce on developers to clean up? What measures do you take to decide what goes and what stays if you end up with an out-of-control situation? My suggestion was to rename the images folder, create a new one, copy the essential images across and wait for someone to complain about a broken image! :) I find this to be the most efficient approach.

    Read the article
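
    One way to answer the "has this image been requested in the last n months?" part is to mine the web server's access logs rather than the filesystem. A minimal sketch, assuming GNU find, combined-format Apache or nginx logs, and placeholder paths for the log and image directories (the log retention has to cover the whole window for the result to be meaningful):

        # List images that do NOT appear in any access log from the last 6 months.
        LOGS="/var/log/apache2"          # placeholder path
        IMAGES="/var/www/site/images"    # placeholder path

        # 1. Collect every /images/... path that was actually requested.
        find "$LOGS" -name 'access.log*' -newermt '6 months ago' -print0 \
          | xargs -0 zgrep -ho '/images/[^" ]*' \
          | sort -u > /tmp/requested.txt

        # 2. Compare against the files on disk; anything unmatched is a candidate for removal.
        ( cd "$IMAGES" && find . -type f -printf '/images/%P\n' ) \
          | sort -u \
          | comm -23 - /tmp/requested.txt > /tmp/unused-candidates.txt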

  • Concerns with compressing sensitive data in OSX

    - by Derek Adair
    Hi, I have some really sensitive data that I am trying to compress and back up so I can reformat. It's absolutely imperative that nothing happens to these files. I'm a developer, so I have very little room for error... plus I'd really hate it if any of my .mp3s got corrupted! The file formats include .mp3, .php, .js, .ai, .psd and .flv (and many more). Is the built-in OS X file compression safe enough, or should I look elsewhere? Is there a more efficient/secure archive format than .zip? (I'm assuming there is...)

    Read the article
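
    Whatever archiver ends up being used, verifying the archive and keeping independent checksums before wiping the disk costs very little. A minimal sketch using command-line tools that ship with OS X; the paths are placeholders:

        # Archive, then verify, before reformatting.
        cd ~/Projects
        zip -r -y ~/backup/projects.zip .        # -y stores symlinks instead of following them
        unzip -t ~/backup/projects.zip           # tests the CRC of every entry in the archive

        # Record checksums of the originals so the restore can be verified byte-for-byte later.
        find . -type f -exec shasum -a 256 {} + > ~/backup/projects.sha256
        # After restoring on the new system:  shasum -a 256 -c projects.sha256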

  • cpio VS tar and cp

    - by Tim
    I just learned that cpio has three modes: copy-out, copy-in and pass-through. I was wondering what the advantages and disadvantages of cpio in copy-out and copy-in modes are compared to tar. When is it better to use cpio, and when tar? A similar question for cpio in pass-through mode versus cp. Thanks and regards!

    Read the article
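
    For comparison, here is what the three cpio modes look like next to their tar/cp counterparts (GNU cpio assumed). The main practical difference is that cpio reads its file list from stdin, usually fed by find, which gives fine-grained control over what goes in, while tar walks directories itself:

        # copy-out (create an archive from a file list on stdin) vs. tar create
        find ./project -depth -print0 | cpio -0 -o > project.cpio
        tar -cf project.tar ./project

        # copy-in (extract) vs. tar extract
        cpio -id < project.cpio            # -d creates leading directories as needed
        tar -xf project.tar

        # pass-through (copy a tree with no intermediate archive) vs. cp
        find ./project -depth -print0 | cpio -0 -pdm /backup/
        cp -a ./project /backup/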

  • rename/delete a folder from multipart rar file

    - by kikio
    Hello. I have a question (I sent it before). I have a multipart RAR file. Its contents are:

        file.part01.rar: myfolder (a folder), data.cab
        file.part02.rar: myfolder (a folder), data.cab
        file.part03.rar: myfolder (a folder), data.cab
        file.part04.rar: difffolder (a folder), anfolder (a folder), data.cab
        file.part05.rar: myfolder (a folder), data.cab

    I want to extract it, so I right-click on "file.part01.rar" and select "Extract to...". It extracts the first three parts, but at part 4 WinRAR says: "CRC error. This file is corrupt." I think the problem is the folder names in part04.rar. Is there any way to rename the folders in part04.rar, and to move "data.cab" from "anfolder" to "difffolder"? I really need this; it is very urgent! Thank you.

    Read the article

  • Best Secure Encryption for Zip Files via Linux

    - by Daniel
    I want to use highly secure encryption for zipped files on Linux/Ubuntu from the command line. What is the best command-line tool to get this job done?

        zip -e -P PASSWORD file1 file2 file3 file4

    or

        7za a file.7z *.txt -pSECRET

    Which encryption does each use, and how secure is it?

    Read the article
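
    For context on the question above: Info-ZIP's zip -e typically applies the legacy ZipCrypto scheme, which is considered weak, while the .7z format uses AES-256 and can also encrypt file names. A hedged sketch of two stronger alternatives; both should prompt for the passphrase so the secret never appears in the shell history or process list:

        # 7z with AES-256 and encrypted file names (-mhe=on); -p without a value should prompt for the passphrase
        7za a -t7z -mhe=on -p archive.7z file1 file2 file3 file4

        # Or tar + GnuPG symmetric encryption (AES-256), which also hides the file names
        tar -czf - file1 file2 file3 file4 | gpg --symmetric --cipher-algo AES256 -o archive.tar.gz.gpg
        # Decrypt later with:  gpg -d archive.tar.gz.gpg | tar -xzf -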

  • How do I reassemble a zip file that has been emailed in multiple parts?

    - by Guy
    I received 3 emails, each containing part of a zip file. The extensions end in .z00, .z01 and .z02. (It was emailed this way to get around the typical 10 MB attachment limit per email.) I have put all 3 files into one directory. I can use both 7-Zip and WinZip to open the first file (the .z00 file), and it lists the contents of the zip, but when I try to extract the files both programs report errors. What is the least error-prone way of reassembling this zip and getting to the files?

    Read the article
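
    If Info-ZIP's zip and unzip are available (they ship with most Linux distributions and have Windows ports), one approach worth trying is to concatenate the segments in order and let zip's archive-fix mode rebuild the central directory. A hedged sketch; "parts" is a placeholder base name and the exact segment order may need adjusting:

        # Join the segments in order into a single file, then repair the archive structure.
        cat parts.z00 parts.z01 parts.z02 > combined.zip
        zip -FF combined.zip --out repaired.zip   # -FF scans for entries and rebuilds the central directory
        unzip -t repaired.zip                     # test before extracting
        unzip repaired.zip -d extracted/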

  • Is there a plugin for [Path] Finder to browse zip-archives as folders?

    - by Andrei
    Hi, I am migrating from Windows to OS X and looking for a good way to browse zip, rar, etc. archives. Ideally, I need a plugin for Finder that will allow me to open archives as folders. Is there one, or any other suitable solution? Preferably free. Update: As I understand it, many Mac users use the Path Finder app for file management. It is an awesome program, but surprisingly it doesn't have this functionality either. I guess the problem is in the way of thinking: my Windows thinking is not applicable to the Mac. Here are some threads from other former Windows users trying to push Cocoatech in the right direction: http://forum.cocoatech.com/showthread.php?t=2883 http://forum.cocoatech.com/showthread.php?t=5167

    Read the article

  • Backup solution to backup terabytes and lots of static files on linux server?

    - by user28679
    Which backup tool or solution would you use to back up terabytes of data and lots of files on a production Linux server? Note that the files are all different and almost never modified, and usage is mostly adding files, so the data volume is currently 3 TB, growing at around +15 GB/day. Please do not reply rsync. Basic Unix tools are not enough: rsync does not keep history, and rdiff-backup miserably fails from time to time and corrupts the history. Moreover these are all file-based backups, which generate a lot of IO wait just browsing directories and querying stat(). But I guess, except for R1Soft CDP, there is no way around that. We tried R1Soft CDP backup, which is block-level backup, and it proved good and efficient for all our other servers, but it systematically fails on the server with 3 terabytes and gazillions of files. The engineers of R1Soft and the datacenter have been playing a hot-potato game for more than 2 months now... and still no backup except regular rsync. We never tried the big commercial solutions, except R1Soft CDP, since it was offered as an optional service by the datacenter hosting our servers.

    Read the article

  • Reply to mailman archived message

    - by Jasper
    (Not exactly sure if this is the right place for this question, but I trust you will migrate it if it isn't.) I was having a problem with gdb, and while the issue appears to be recurring, I found only one instance of someone recently experiencing the same problem. I found that other instance in a mailman-archived mailing list. Then I tried some more things and finally solved the issue with gdb. So now I want to report the solution back to the mailing list. However, this is really only of use if mailman recognizes my mail as belonging to the same thread as the original problem, and I do not have that mail (just the online archived version of it), so I cannot reply to it. My question: how can I make sure mailman treats my mail as a reply to that thread? Is simply copying the subject enough?

    Read the article
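
    Threading in mailman archives (and in most mail clients) is driven primarily by the In-Reply-To and References headers, which must carry the Message-ID of the original post; matching the subject alone is usually not enough. The pipermail archive pages often expose that Message-ID in the page source or in the downloadable mbox file. A hedged sketch of sending a properly threaded reply from a Unix machine with a local MTA; the addresses and Message-ID are placeholders:

        # Compose reply.txt with headers like the following, then hand it to the local MTA.
        #
        #   From: me@example.com
        #   To: list@example.org
        #   Subject: Re: original subject line copied verbatim
        #   In-Reply-To: <original-message-id@example.org>
        #   References: <original-message-id@example.org>
        #
        #   Reporting back: the gdb problem turned out to be ...

        /usr/sbin/sendmail -t < reply.txt   # -t reads the recipients from the message headers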

  • Can MySQL use multiple data directories on different physical storage devices

    - by sirlark
    I am running MySQL with its data directory on a 128 GB SSD. I am dealing with large datasets (~20 GB) that are loaded and processed weekly, each stored in a separate database for the purposes of time-point comparisons. Putting all the data into a single database is unfeasible because the performance on such large databases is already a problem. However, I cannot keep more than 6 datasets on the SSD at a time. Right now I am manually dumping the oldest to a much larger 2 TB spinning disk every week, and dropping the database to make space for the new one. But if I need one of the 'archived' databases (a semi-regular occurrence) I have to drop a current one (after dumping it), reload the archived one, do what I need to, then reverse the whole process. Is there a way to configure MySQL to use multiple data directories, say one on the SSD and one on the 2 TB spinning disk, and 'merge' them transparently? If I could do this, then archiving would no longer mean "moved out of the database entirely", but instead would mean "moved onto the slow physical device". The time taken to run my queries on a spinning disk would be less than the time taken to completely dump, drop, load, drop and reload two entire databases, so this is a win. I thought of using something like unionfs, but I can't think of a way to control which database gets stored on which physical drive, because it merges at the directory level (from what I understand), so I'm still stuck with multiple directories. Any help appreciated, thanks in advance.

    Read the article
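
    MySQL does not merge multiple data directories transparently, but since each database lives in its own subdirectory under datadir, a common workaround is to move an individual database's directory to the larger disk and leave a symlink in its place. A hedged sketch; the paths and database name are placeholders, the server must be stopped while the directory moves, and on Ubuntu the AppArmor profile for mysqld has to allow the new location:

        sudo service mysql stop

        # Move one dataset's database directory from the SSD datadir to the 2 TB disk...
        sudo mv /var/lib/mysql/dataset_old /mnt/bigdisk/mysql/dataset_old

        # ...and leave a symlink behind so MySQL still finds it under the old name.
        sudo ln -s /mnt/bigdisk/mysql/dataset_old /var/lib/mysql/dataset_old
        sudo chown -h mysql:mysql /var/lib/mysql/dataset_old

        sudo service mysql start

    Note that this only helps when the tables are MyISAM or InnoDB with innodb_file_per_table enabled; with a single shared InnoDB tablespace the table data does not live in the per-database directory at all.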

  • optimal folder structure for storing 100k files on a USB drive

    - by cherouvim
    I need to store 100k files (around 40 GB) on a USB drive. Each file has a unique integer id (e.g. 45000). Option one is to put all files in a single folder:

        root/
        root/1.pdf
        root/2.pdf
        root/3.pdf
        ...
        root/567.pdf
        root/568.pdf
        root/569.pdf
        ...
        root/10001.pdf
        root/10002.pdf
        root/10003.pdf
        ...
        root/99998.pdf
        root/99999.pdf
        root/100000.pdf

    Option two is to create a [1-9][0-9]* folder hierarchy based on that id:

        root/
        root/1/file.pdf
        root/2/file.pdf
        root/3/file.pdf
        ...
        root/5/6/7/file.pdf
        root/5/6/8/file.pdf
        root/5/6/9/file.pdf
        ...
        root/1/0/0/0/1/file.pdf
        root/1/0/0/0/2/file.pdf
        root/1/0/0/0/3/file.pdf
        ...
        root/9/9/9/9/8/file.pdf
        root/9/9/9/9/9/file.pdf
        root/1/0/0/0/0/0/file.pdf

    Which option will scale better? I understand that the second option will require tons of folders, but each folder will contain at most 10 folders and 1 file. Maintenance will not be an issue, since everything will be controlled by an application. Note that this is a USB drive used on Linux, and based on the above I'd also like to know whether I should go with FAT32 or NTFS.

    Read the article
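
    For what it's worth, the digit-per-level mapping in option two is easy to generate mechanically. A minimal sketch of the id-to-path rule described above (the helper name is made up for illustration):

        # Print the nested path for a given id, one directory level per digit,
        # e.g. 45000 -> root/4/5/0/0/0/file.pdf
        path_for_id() {
            id=$1
            echo "root/$(echo "$id" | sed 's|.|&/|g')file.pdf"
        }

        path_for_id 567      # -> root/5/6/7/file.pdf
        path_for_id 100000   # -> root/1/0/0/0/0/0/file.pdf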

  • Exchange retention work cycle properties

    - by marcwenger
    I've set up retention tags and policies in Exchange, but they only run when I execute start-managedfolderassistant. Upon running the command get-mailboxserver | fl name,*workcycle*, *ManagedFolderAssistantSchedule*, I noticed the following fields (truncated):

        ManagedFolderWorkCycle           : 1.00:00:00
        ManagedFolderWorkCycleCheckpoint : 1.00:00:00
        ManagedFolderAssistantSchedule   : {Sun.1:00 AM-Sun.9:00 AM, Mon.1:00 AM-Mon.9:00 AM, Tue.1:00 AM-Tue.9:00 AM, Wed.1:00 AM-Wed.9:00 AM, Thu.1:00 AM-Thu.9:00 AM, Fri.1:00 AM-Fri.9:00 AM, Sat.1:00 AM-Sat.9:00 AM}

    The first two are set to run every day, but does this conflict with what is set in ManagedFolderAssistantSchedule? And what is the difference between ManagedFolderWorkCycle and ManagedFolderWorkCycleCheckpoint?

    Read the article

  • Is there a Utility that will scan selected locations and return all files older than a certain date?

    - by CT
    Can anyone recommend a utility that can scan specified directory locations (network shares specifically) and return all files older than a certain date? I am looking to implement a data retention policy at my workplace. As our amount of data grows, it puts a large strain on our backup routines. I would like to move old data to some sort of archival system. Extra points for the ability to move the matching old files to another location for archiving, and for the ability to schedule when this occurs. Many thanks. EDIT: Windows shop. Mostly Windows 2003 servers.

    Read the article

  • iPhone internet connection (Reachability)

    - by ludo
    Hi, I have seen several posts about Reachability, but none really gives an exact answer to the problem. In my application I use the Reachability code from Apple, and in my app delegate I have this:

        -(BOOL)checkInternet {
            Reachability *reachability = [Reachability reachabilityWithHostName:@"www.google.com"];
            NetworkStatus internetStatus = [reachability currentReachabilityStatus];
            BOOL internet;
            if ((internetStatus != ReachableViaWiFi) && (internetStatus != ReachableViaWWAN)) {
                internet = NO;
            } else {
                internet = YES;
            }
            return internet;
        }

    The problem is that even when I have an internet connection, this code tells me that I don't have one. Does anyone know what to do to make this work? Thanks,

    Read the article

  • ISA Server 2006 SP1 :: Allow unauthenticated users (non domain users) access to external (internet)

    - by Klaptrap
    Now that we have applied an internal-to-external rule blocking all users' access to the internet, other than the users in a whitelist, we have the obvious issue that unauthenticated users who are not on our domain, i.e. domain-less guests, cannot access the internet. Other than configuring each machine to use our alternative gateway, which would require a member of IT to be on site every time a guest arrives, can this be done through ISA and AD?

    Read the article

  • Windows server RAS VPN client can't connect to internet

    - by Dragouf
    I configured Windows Server 2008 RAS to connect automatically to a PPTP VPN server. The problem is that when it connects, I can't access the internet from this server (the VPN client connects through RAS). Usually I would set the VPN connection not to be used as the default gateway, but that setting is disabled in the network interface's VPN interface properties, and I can't find how to make the server reach the internet directly...

    Read the article

  • Windows Server 2003 R2 SP2 can't access the internet

    - by Ishmael
    Our 64-bit Windows server can no longer access the internet, although it is a web server hosting pages that are accessible from the internet. Since this started, the server can no longer receive Windows updates or Symantec virus definitions. We have checked our firewalls and nothing is blocked. Any insight into why this issue happened would be appreciated. Also, the server is running IE6, which we have been trying to update.

    Read the article
