Search Results

Search found 38088 results on 1524 pages for 'large scale project'.

Page 209/1524

  • How to rsync a large file, with as little CPU and bandwidth expense as possible?

    - by Johan Allgoth
    I have a 500 GB file that I plan on backing up remotely. The file changes often. I'll be rsyncing it from a desktop to a server. Both can run the rsync client or server. What is the proper command for this? The ones I've tried so far have taken forever or simply acted strangely. Example and results: rsync -cv --partial --inplace --no-whole-file /desktop/file1 myserver.com::module/file1 Seems to work, but only if I do it twice (?!). Also, slow. Does the above command do the checksumming on both computers, or only on the sending one? Is it correct otherwise?
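    A possible refinement, sketched here over SSH with placeholder paths (the daemon syntax from the question works the same way): drop -c, since rsync's delta-transfer algorithm already exchanges block checksums between both ends, and -c only adds an extra whole-file checksum pass on both machines before anything is sent, which is what makes runs over a 500 GB file so slow.

      # sketch only: resumable, in-place delta transfer of one large file
      rsync -v --partial --inplace /desktop/file1 user@myserver.com:/backup/file1
      # --inplace updates changed blocks inside the existing remote copy, so the
      # server never needs a second 500 GB of scratch space; --partial lets an
      # interrupted run resume instead of starting over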

    Read the article

  • Hard drive degradation from large memory usage and paging files?

    - by Stephen R
    I've had a few questions about computer degradation going through my head for a while and haven't found many good resources for researching them.
    1) First off, when does Windows use the virtual RAM/paging file on the hard drive? Is it used only when RAM is full, or is the paging file used as an intermediate cache between RAM and actual hard drive space all the time?
    2) If I run many applications on my computer at the same time, and have a bad habit of doing this for the entire lifetime of the computer, does it use more of the paging file than if I had fewer programs running? Just to note, the RAM never fills up on my computer, but it is used heavily.
    3) By extension of question 2, if the paging file is used more heavily, would that result in more rapid hard drive degradation?
    I have seen a pattern among all of the computers that I have owned or used in the past 5 years. I am the kind of person to leave my web browser up with 40 tabs, among other programs, which will typically eat up 40% of my memory. Over time my computer will slow down, browsers start crashing, programs start seizing up or crashing, and eventually the computer becomes essentially unusable. I have been racking my brain to come up with a solution other than purchasing a new PC just to have it die on me in the next couple of years as well. The only thought that has come to mind that might be a simple hardware fix is Windows ReadyBoost...maybe? I'd like to be able to discuss this so I can learn something about all of the above. Thanks.

    Read the article

  • How can one use online backup with large amounts of static data?

    - by Billy ONeal
    I'd like to set up an offsite backup solution for about 500GB of data that's currently stored across my various machines. I don't care about data retention rates, as this is only a backup of my data, not primary storage for it. If the backup is stored on crappy non-redundant systems, that does not matter. The data set is almost entirely static, and mostly consists of things like installers for Visual Studio and installer disk images for all of my games. I have found two services which meet most of this: Mozy and Carbonite. However, both services impose low bandwidth caps, on the order of 50kb/s, which prevent me from backing up a dataset of this size effectively (it would take somewhere on the order of 6 weeks), despite the fact that I get multiple MB/s upload speeds everywhere else from this location. Carbonite has the additional problem that it tries to ignore pretty much every file in my backup set by default, because they are mostly iso and vmdk files, which aren't backed up by default. There are other services such as EC2 which don't have such bandwidth caps, but such services typically store data on highly redundant servers, and therefore cost on the order of 10 cents/GB/month, which is insanely expensive for storage of this kind of data set. (At $50/month I could build my own NAS to hold the data, which would pay for itself after ~2-3 months.) (To be fair, they're offering quite a bit more service than I'm looking for at that price, such as public HTTP access to the data.) Does anything exist meeting those requirements, or am I basically hosed?
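    One possibility for mostly static data, sketched with the AWS CLI (the bucket name and paths are made up, and cold-storage pricing and retrieval delays vary): sync the archive into S3 under a cold storage class rather than standard storage, and re-run the same command whenever the installers change.

      # sketch: one-way sync of a static archive into cheap cold storage
      aws s3 sync /data/installers s3://example-backup-bucket/installers --storage-class GLACIER
      # subsequent runs upload only files that are new or have changed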

    Read the article

  • What is the easiest way to do a direct file transfer of an extremely large file over the Internet?

    - by Kenneth Cochran
    I would like to transfer a 20+ GB file to a friend. I would like the transfer to:
    1. Be fast
    2. Ensure data integrity
    3. Not require opening ports in either end's firewall
    4. Be free
    5. Not broadcast the file's existence to everyone on the Internet
    I've looked at several technologies and nothing seems to fit:
    Gnutella, BitTorrent, et al. satisfy 1, 2 and 4
    JetBytes... 1, 3, 4 and 5
    Yahoo Messenger, AIM, etc. 3, 4 and 5
    FTP, SFTP... 1?, 4 and 5
    rsync... 1, 2, 4 and 5
    For a file this size, speed and data integrity are the most important. No one wants a 20 GB file to fail an MD5 check after spending two days downloading it. Is there anything that meets all these requirements?
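    If one end can accept an SSH connection (which admittedly bends requirement 3 for that side), here is a hedged sketch of a resumable, integrity-checked transfer; the host and file names are placeholders.

      # sender: record a checksum, then push with progress and resume support
      sha256sum big-file.iso > big-file.iso.sha256
      rsync -avP big-file.iso big-file.iso.sha256 friend@friend-host:incoming/
      # receiver: verify the copy once the transfer finishes
      cd incoming && sha256sum -c big-file.iso.sha256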

    Read the article

  • How to manage a large email delivery volume from an Email Marketing App?

    - by Newtonx
    We provide an email marketing service through our online application. We have about 30 customers, and each one has its own mailing list (5k to 100k emails each). What we really want is to distribute email delivery across 2 or more servers. I was wondering what kind of approach/solution MailChimp and Constant Contact use to provide a great service: many servers? many IPs? Our spam policy suspends ANY user/customer that gets a 10% bounce rate. We currently rotate our outgoing mail IP once the delivery limit per remote host is reached. Is that the best approach/solution?
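    A crude way to spread a send across two outgoing hosts, purely as a sketch (the list file, hostnames, and spool path are hypothetical): split the recipient list in half and hand each half to a different server with its own IP.

      # split recipients.txt into two halves without breaking lines
      split -n l/2 -d recipients.txt batch-
      scp batch-00 mailer1:/var/spool/campaign/
      scp batch-01 mailer2:/var/spool/campaign/
      # each mailer then runs its own send job against its local batch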

    Read the article

  • Why can't I debug my ASP project through a remote desktop connection?

    - by Anthony Benavente
    I just asked this question on Stack Overflow, but I figured this Stack Exchange site is a better fit. It's been about a month of trying to figure out this problem and we've still not found a solution. We have about seven virtual machines on a server running Windows XP Professional w/ SP3, all with Visual Studio InterDev and IIS 5.1 installed. Running the programs works fine, but we just can't debug through Remote Desktop. When we are logged into the server console (through vSphere) and log into one of the virtual machines from there, we are able to debug properly. We figure the issue lies with some kind of permissions for Remote Desktop users. We've tried nearly every article on the internet (exaggerating, of course) and are about to give up hope. One more thing: when we are logged into the virtual machine through the server console and then remote in, the user that was logged into the console is kicked off, but debugging works! Does remoting in trick the computer into giving us the correct permissions? I'm really not sure how it works. I know this technology predates human history, but we are in the process of migrating from ASP Classic to ASP.NET. Specs: Windows XP Professional w/ SP3, IIS 5.1, Visual Studio 6 InterDev. EDIT: By "debug" I mean running the project with breakpoints; InterDev doesn't stop at breakpoints.

    Read the article

  • How can I crop every page of a large PDF file?

    - by Andrew
    I have a 1300 page PDF file of a scanned book that was unfortunately not cropped when scanned. The actual book page dimensions are around 6x9", but each scanned page is 8.5x11", the size of the scanner bed. For much smaller PDF files I could throw it into Photoshop and crop the page, but this is a huge file. What is the best way to losslessly crop all of the pages of the file, in either Windows or OS X?
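    One option that avoids re-rasterizing anything is pdfcrop from a TeX Live install, which re-embeds each original page with a new visible area; the bounding box below assumes the 6x9" page sits centered on the 8.5x11" scan.

      # 8.5x11 in = 612x792 pt; a centered 6x9 in area = 432x648 pt
      # left = (612-432)/2 = 90, bottom = (792-648)/2 = 72, right = 522, top = 720
      pdfcrop --bbox "90 72 522 720" scanned-book.pdf cropped-book.pdf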

    Read the article

  • Why is ext3 so slow to delete large files?

    - by Janis Peisenieks
    I have a server which makes an incremental backup of a system every night. On Saturdays there is a full backup. After the full backup has finished, a script kicks in that deletes the incrementals. The script sometimes breaks, because the incrementals are each about 10GB and deleting them sometimes takes too long for the script. Could someone explain to me, or point me in the direction of a resource that explains, why ext3 is so slow to delete large files compared to, let's say, NTFS? I know these are two completely different file systems, but I'm really interested in why there is such a big difference in deletion speed.
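    As a workaround sketch while the "why" gets answered (the paths are hypothetical): shrink each incremental in steps before unlinking it, so the script never asks ext3 to free a whole 10 GB of block pointers in one go.

      # sketch: shrink each incremental by 1 GiB at a time, then remove it
      for f in /backups/incremental-*.tar; do
          while [ "$(stat -c%s "$f")" -gt $((1024*1024*1024)) ]; do
              truncate -s -1G "$f"
              sleep 1
          done
          rm -f -- "$f"
      done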

    Read the article

  • Can I use a micro ec2 instance as a load balancer for my other large ec2 instances?

    - by Ryan Detzel
    The issue I'm having is that I want to upgrade that instance often (security patches, etc.), but I'm afraid something will fail and the site will be down. So I want to have another server set up and load balance between the two; that way I can easily disable one, upgrade it, and once it's working add it back into the mix and repeat. What kind of machine is needed for a load balancer? Would a micro instance work just fine? The site gets anywhere from 3-10k hits/day. I plan on using nginx as the load balancer.
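    For traffic in the few-thousand-hits-a-day range, the proxying itself is unlikely to be the bottleneck on a micro instance; RAM is the main constraint. A minimal sketch of the nginx side, with made-up private IPs for the two large instances:

      # contents of a hypothetical /etc/nginx/conf.d/loadbalancer.conf:
      #   upstream app_servers { server 10.0.0.11:80; server 10.0.0.12:80; }
      #   server { listen 80; location / { proxy_pass http://app_servers; } }
      # while patching instance B, change its line to: server 10.0.0.12:80 down;
      nginx -t && nginx -s reload    # validate the config, then apply it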

    Read the article

  • How can I convert a large number of Word documents to HTML as fast as possible?

    - by metal gear solid
    I have to convert 500 Microsoft Word 2003 files into HTML documents. What would be the quickest way to do this? I'm not just talking about renaming the extension from .doc to .html; I want to convert the Word files' content into HTML markup. Word 2007 is installed on my system. Any suggestion which can help accomplish this task quickly would be nice. If you suggest a tool, it should not be commercial; it should be free or portable.
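    One free, scriptable route, assuming a reasonably recent LibreOffice install; the output directory is a placeholder.

      # sketch: batch-convert every .doc in the current folder to HTML
      soffice --headless --convert-to html --outdir ./html ./*.doc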

    Read the article

  • How can I incrementally back up a large amount of data [with rsync]?

    - by Annan
    A website contains ~40GB of images and files which need to be backed up. Rollbacks need to be possible daily for the last 30 days, and the backup server has less than 1.2TB of space. My idea is to have one full backup from 30 days ago, then incremental backups for the last 30 days. Each day the oldest incremental backup is combined with the full backup and a new incremental backup is added. Can this strategy be implemented with rsync, and if so, how? Are there any problems with this plan? Is there a better plan? PS: I mean incremental backups, not backing up incrementally (which rsync does automatically).
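    This is close to what rsync's --link-dest option was built for: every day gets what looks like a full snapshot, but any file unchanged since the previous day is stored as a hard link rather than a new copy, so 30 daily "fulls" of mostly static data stay far below 1.2TB. A sketch with hypothetical paths:

      # daily snapshot that hard-links unchanged files to yesterday's snapshot
      TODAY=$(date +%F); YESTERDAY=$(date -d yesterday +%F)
      rsync -a --delete --link-dest=/backup/$YESTERDAY/ /var/www/site/ /backup/$TODAY/
      # keep only the last 30 snapshots; shared files survive as long as any
      # remaining snapshot still links to them
      find /backup/ -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +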

    Read the article

  • Do large corporations block jQuery content on web pages?

    - by Max Vernon
    We are currently redesigning our website. The company we've hired to do the redesign is advocating the use of jQuery to render the pages dynamically. Our SEO specialist is under the impression that many larger corporations may have jQuery blocked in their proxies to prevent their users from visiting sites like Facebook. Is this something you are aware of? Forgive me if this is off topic for SF.SE!

    Read the article

  • Oracle: 1 Large Server vs. 2 Smaller Servers?

    - by nvahalik
    We are in the planning stages of setting up our production Oracle 10gR2 environment. Our budget gives us the ability to buy 2 processor licenses of Oracle DB Standard Edition. We have minimal experience with Oracle, so I'll defer to anyone who has used it. We are trying to decide if we should set up a single dual quad-core box or 2 individual quad-core boxes in a RAC configuration. Our DB right now is about 60 GB, and at our peak we'll have up to 150 concurrent users. Most of the big stuff is done via batch processing at night. My gut tells me that having 2 boxes in a RAC configuration can't be a bad thing, because it provides a true hardware failover solution; the DB would be stored on a shared LUN on a SAN via iSCSI. Plus, if we ever need to add capacity, we already have boxes in place that can be upgraded with extra procs (I assume with zero downtime, since it's set up in a RAC config) if we add extra licenses, or with RAM. Does RAC have any performance penalties? Will it add extra latency? Is there any true advantage to having dual-processor boxes running these systems? If we build out the Oracle boxes with special hardware (hardware iSCSI cards, TOE NICs), will these boxes be solid? We are deploying on 64-bit Windows. So what would you do? One box or two?

    Read the article

  • Windows Explorer - How can a large file have a zero "Size on disk" value? What does it mean?

    - by Jaans
    I would expect some discrepancy between "Size" and "Size on disk" in Windows Explorer due to file system allocation, etc. An example is a file on a Windows 2012 R2 file server that has an 81.4 MB "Size" but a "Size on disk" of 0 bytes. What gives? I have other files doing the same, yet another set of files and folders behaves as expected, showing a size on disk relatively close to the actual file size. The volume is a basic disk, formatted with NTFS and the default 4K allocation units. No compression is set for any file or folder on the volume. (For the more paranoid, I did a malware scan and also confirmed there are no alternate data streams associated with the file in question.) The user account running Windows Explorer is the domain administrator, and the file owner is also the domain administrator. Thanks for reading!

    Read the article

  • Email discovery from a fairly large mailbox (15 GB) on Exchange 2003

    - by nysingh
    I have a request from our legal team to search a user's mailbox. The mailbox is 15 GB and it is on Exchange 2003. I am trying to run Windows Desktop Search and Google Desktop. I have gotten them to index the mailbox, but getting the results into a folder to back up on CD is proving a bit difficult; neither Windows Desktop Search nor Google Desktop Search allows you to copy the results to another folder. Can anyone point me in the right direction? What is the best way to index and copy the results from a PST, mailbox, or EDB file? What are the best discovery methods? Thanks

    Read the article

  • Is there a way to replicate very large file shares in real time?

    - by fsckin
    I have an hourly cron job that copies about 40GB of data from a source folder into a new folder with the hour appended on the end. When it's done, the job prunes anything older than 24 hours. This data changes very often during work hours and is on a Samba file share. Here's how the folder structure looks:
    \server\Version.1
    \server\Version.2
    \server\Version.3
    ...
    \server\Version.24
    The contents of each new folder compared to the last one usually don't change very much, since this is an hourly job. Now you might be thinking that I'm an idiot for dreaming this up. Truth is, I just found out about it. It's actually been used for years and is so incredibly simple that anyone could delete the ENTIRE 40GB share (imagine that dialog spooling up... deleting thousands and thousands of files) and it would actually be faster to restore by moving the latest copy back to the source than it took to delete. Brilliant! Now, to top this off, I need to efficiently replicate this 960GB of "mostly similar" data to a remote server over a WAN link, with the replication happening as close to real-time as possible -- think hot spare, disaster recovery, etc. My first thought was rsync. Total failure. Rsync sees a deletion of the folder that is 24 hours old and the addition of a new folder with 30GB of data to sync! I also looked at rdiff-backup and unison; they both appear to use similar algorithms and do not keep enough metadata to do this intelligently. The best thing I can find "out of the box" to do this is Windows Server "Distributed Filesystem Replication", which uses "Remote Differential Compression" -- after reading the background information on how this works, it actually looks like exactly what I need. Problem: both servers are running Linux. D'oh! One approach I'm looking at is this; say it's 5AM and the cron job finishes:
    1. The new Version.5 folder arrives on the local server.
    2. SSH to the remote server and copy Version.4 to Version.5.
    3. Run rsync on the local server, pushing changes to the remote server. Rsync finally knows to do a differential copy between Version.4 and Version.5.
    Is there a smarter way to replicate Samba shares as close to real-time as possible? Anything out there that does "Remote Differential Compression" on Linux?
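    A sketch of that 5 AM flow, with made-up paths and hostnames; using cp -al (hard links) instead of a plain copy makes the remote seeding step nearly free, and since rsync is not run with --inplace it replaces changed files with new ones, leaving the hard-linked Version.4 intact.

      NEW=5; PREV=4
      # 1. seed the new hourly folder on the remote side from the previous one
      ssh backup@remote "cp -al /mirror/Version.$PREV /mirror/Version.$NEW"
      # 2. push only what actually changed this hour
      rsync -a --delete /data/Version.$NEW/ backup@remote:/mirror/Version.$NEW/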

    Read the article

  • How are large companies handling the storing and cataloging of software installation disks?

    - by CT
    I just started working in the IT department of a small-to-medium sized construction company with about 200 users. One of my responsibilities is to set up and configure all new machines that come in. I would like suggestions on how to best manage the installation disks and licenses of the software that comes with them, plus any additional licensed software such as AutoCAD, Photoshop, etc., as well as peripheral driver disks for printers and scanners. Right now every machine is associated and labeled with an asset ID. All asset IDs are kept in a spreadsheet with applicable serial numbers, current user, warranty info, and software licenses. The physical disks are then kept in a folder in a cabinet. Each folder is marked with the asset ID number as well as the current user. My problem with this is that the system was not maintained very well before I came to the company. There are plenty of software folders with no asset IDs labeled on them, plenty of missing software folders (most likely many of the unlabeled ones), and folders with names but no asset IDs. Machines get switched to different users without the folders and spreadsheet being updated. I am not saying this method would necessarily be bad if it were better implemented and managed, but I am going to have to take a lot of time to fix the system currently in place. I thought I would ask the community first how others manage this process, in case there are easier, more efficient ways of doing so. Thank you.

    Read the article

  • How to contact an Email Administrator at a large company?

    - by Brett G
    Before, I've had to deal with other, larger companies (clients of ours) when we have had e-mail issues with them. Most of the time it's because our messages have been tagged as spam, although right now it's because one of their systems has gone haywire and is repeatedly sending us e-mails. Anyway, my question is: how do you get in touch with the email administrator? I've found that at some larger companies, unless you have the name of the person, the receptionist will refuse to connect you to them (I'd imagine they're acting as the gatekeeper against salespeople asking for "the person responsible for your printers"). I know we could always try to deal with our contact at the company, but sometimes that can be slow and difficult, and these issues are usually time sensitive.
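    Before fighting the switchboard, it can be worth trying the standard role addresses: postmaster@ is required to exist at any domain that accepts mail (see RFC 2142), and abuse@ is the usual address for reporting misbehaving systems. A small sketch (the domain is a placeholder):

      # find the hosts actually receiving the company's mail
      dig +short MX example.com
      # then write to postmaster@example.com or abuse@example.com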

    Read the article

  • Large OSX 10.6 to Windows 7 smb transfers fail?

    - by user41724
    I'm connecting to a Windows 7 box from an OS X 10.6 box via SMB: smb://ftp1. The connection works fine; I can transfer individual files one at a time, but as soon as I try to move an entire folder I get the following error: "The Finder can’t complete the operation because some data in “test” can’t be read or written. (Error code -36)". This error happens on all our OS X boxes when trying to push the entire folder to the Win7 box. The folder TEST in the above error message has 777 permissions, set recursively. I can move every image file to the Windows box one by one with no errors, but if I try to move the entire folder, bam, it errors out. This error seems to kill the SMB client on the Windows box as well. There's an FTP server on the Windows box, and I can FTP in from the OS X box and move everything just fine. Not sure what is going on here.
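    Finder's error -36 over SMB is very often triggered by the AppleDouble "._" companion files OS X creates to store extended attributes; here is a hedged sketch of things to try from Terminal (the paths are examples):

      # strip stray "._" AppleDouble files from the source folder
      dot_clean -m /path/to/TEST
      # stop Finder writing .DS_Store files to network volumes (log out/in to apply)
      defaults write com.apple.desktopservices DSDontWriteNetworkStores -bool true
      # copy from the shell, which reports the exact file that fails instead of aborting
      cp -Rv /path/to/TEST /Volumes/ftp1/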

    Read the article
