Search Results

Search found 3841 results on 154 pages for 'daily deals india'.


  • Smart backup software

    - by gisek
    I use a laptop on a daily basis. As I have a lot of important data on it, it would be nice to back up some directories every day. Can you recommend a specific application that would take care of this? Maybe there is an app that would instantly commit changes I make in a directory on my laptop to the backup folder? The important thing is that I have some big files (a few GB each) that get minor changes very often; I'm talking about VirtualBox disk images. It would be nice if the software could handle them smartly. Also note that I'd like to store the backup on an external USB HDD, which sometimes isn't plugged in. (A sketch of one approach follows this entry.)

    Read the article
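
    A minimal sketch of the kind of job such a tool could run, assuming the drive mounts at /media/backup; the source directories are placeholders. rsync's --inplace with --no-whole-file rewrites only the changed blocks of large files such as VM images:

        #!/bin/bash
        # Skip silently when the external drive is not plugged in.
        if ! mountpoint -q /media/backup; then
            echo "backup drive not mounted, skipping" >&2
            exit 0
        fi
        # --inplace --no-whole-file updates only the changed blocks of the
        # multi-GB VirtualBox images instead of rewriting the whole files.
        rsync -a --inplace --no-whole-file \
            "$HOME/Documents" "$HOME/VirtualBox VMs" /media/backup/laptop/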

  • How do I know if 'hg clone' is doing the work remotely?

    - by jjfine
    I've got a very simple Windows install of Mercurial on my machine. The 'central' repository is located at //mymachine/hg-repos/central. I want remote (VPN) users to be able to create clones of this repository in the hg-repos directory, because that directory gets daily backups. I have given these users full control of the hg-repos directory. My question is this: if I'm on a remote machine and I run the command hg clone //mymachine/hg-repos/central //mymachine/hg-repos/central-copy, is the copying being done on mymachine itself? I don't want the client to have to download the whole central repository and then upload it all back, because people are going to be using this from across the country. But I suspect that is what's happening here, since it works so easily. (A server-side approach is sketched after this entry.)

    Read the article
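
    When both arguments are UNC paths opened from the client, the repository data does round-trip through the client over the VPN. One way to guarantee the copy stays on the server is to run the clone on the server itself. A minimal sketch, assuming some remote-execution channel to mymachine (SSH here) and that the share corresponds to a local C:\hg-repos directory; the login and local path are placeholders:

        # Run from the remote machine, but the clone executes on the server,
        # so no repository data crosses the VPN.
        ssh user@mymachine "hg clone C:/hg-repos/central C:/hg-repos/central-copy"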

  • How should I organize my backups?

    - by Patrick
    I'm using rsync for the first time to create daily backups of my websites, and I was wondering: should I overwrite the previous copy, or should I create multiple copies and overwrite only the oldest one? (I might not have enough space for that, though.) I also have another question. Suppose most of the files are accidentally erased: does rsync delete those files from the backup space because they no longer exist at the source? How exactly does it work in this case? Thanks. (A rotation scheme is sketched after this entry.)

    Read the article
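
    A middle ground is rsync's --link-dest: each day gets its own dated copy, but files unchanged since the previous day are hard-linked rather than stored again, so many copies cost little extra space, and files accidentally erased at the source are still present in the older dated copies. A minimal sketch with placeholder paths:

        #!/bin/bash
        TODAY=$(date +%F)
        # Unchanged files are hard-linked against yesterday's copy.
        rsync -a --link-dest=/backup/site/latest \
            /var/www/site/ "/backup/site/$TODAY/"
        ln -sfn "/backup/site/$TODAY" /backup/site/latest
        # Prune dated copies older than about 30 days.
        find /backup/site -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +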

  • How can I programmatically renumber pages in a PDF?

    - by Andrew
    As a graduate student, I come across PDFs of articles and book chapters on a daily basis. Sometimes these PDFs are paginated correctly internally (that is, if an article starts on page 67, the PDF starts on page 67 as well, not on page 1). When they aren't, I have to open the file in Acrobat and renumber the pages in the "Page Thumbnails" panel. I would love to be able to automate this whole process with a script (bash, Python, AppleScript, whatever) that lets me pass the first actual page number, something like fixpagination example.pdf 67. However, I cannot find any terminal-based program that can re-paginate PDFs; neither pdftk nor PyPDF seems to be able to deal with pagination. Are there any scriptable programs that can internally re-paginate PDF files? (A Ghostscript-based sketch follows this entry.)

    Read the article
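
    Ghostscript can rewrite a PDF's page labels through its pdfmark interface, which makes the Acrobat step scriptable. A sketch of a fixpagination-style wrapper, assuming plain decimal numbering; the output file name is a placeholder:

        #!/bin/bash
        # usage: fixpagination example.pdf 67
        IN=$1
        FIRST=$2
        # PageLabels entry: from PDF page index 0 onward, label pages with
        # decimal numbers starting at $FIRST.
        printf '[{Catalog} <</PageLabels <</Nums [0 <</S /D /St %s>>]>>>> /PUT pdfmark\n' \
            "$FIRST" > /tmp/pagelabels.ps
        gs -dBATCH -dNOPAUSE -sDEVICE=pdfwrite \
           -sOutputFile="${IN%.pdf}-renumbered.pdf" "$IN" /tmp/pagelabels.ps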

  • Auto-scaling EC2 Servers and Updating Code

    - by jstats
    We've come to the point where we need to set up autoscaling for our web servers, and I'm unsure how to go about scaling servers and updating the existing code without building a new AMI and changing the autoscale config to use it every time. I've read a bit about people bundling the new code, uploading it to S3 and having new servers grab the bundle on boot, but that doesn't seem all that pleasant either. Currently the web app's files live in a git repo; when we update the code, we push it to GitHub, SSH into the web server and run a hook to bring down the latest code. So I was thinking that another option could be to just run that hook as an hourly or daily cron task. Unfortunately that doesn't cover every type of update (for example new blog posts' images and such, which aren't included in the git repo), but it's something. Could anyone provide some advice on what a common solution is, or on why my proposed solution is a bad idea? Thanks all. (A boot-script sketch follows this entry.)

    Read the article
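
    A common pattern is to keep a generic AMI and have each instance pull the current code in its boot (user-data) script, with assets that are not in git synced from S3. A minimal sketch; the repository URL, paths and bucket name are placeholders:

        #!/bin/bash
        # Boot-time hook for a freshly launched autoscaled instance.
        if [ ! -d /var/www/app/.git ]; then
            git clone git@github.com:example/app.git /var/www/app
        else
            cd /var/www/app && git fetch origin && git reset --hard origin/master
        fi
        # Content that is not tracked in git (e.g. uploaded blog images)
        # lives in an S3 bucket rather than in the AMI.
        aws s3 sync s3://example-app-assets/uploads /var/www/app/uploads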

  • Win Server 2003 - Task Scheduler - Tasks with GUI and Services

    - by august_month
    I need to run an Excel macro daily. I scheduled it with the Windows Task Scheduler and it worked fine until I had to change my password. I wonder if it's possible to have a task scheduled without a password? As an alternative we have third-party scheduling software, but it cannot launch Excel. Tech support said that since Excel has a GUI, and the scheduling software runs as a service with "Allow service to interact with desktop" disabled, it cannot launch Excel. Tech support also mentioned that "Allow service to interact with desktop" is no longer supported as of Vista. I totally trust the tech support guy; I just need a workaround that would make my network administrator and me happy. Regards.

    Read the article

  • How may I retrieve data from an Excel table based on a variable number of criteria?

    - by Eshwar
    I have the following salary data, for example:

        Country   State      2012   2013 -> 2027
        =======   =====      ====   ====
        China     Other      1000   1100
        China     Shanghai   1310   1400
        China     Tianjin    1450   1500
        India     Orissa     1500   1600

    Now, in another Excel sheet, I would want an answer to one of the following questions:
    - What is the salary in Shanghai for 2013? (Answer would be 1400.)
    - What is the salary in Hubei province for 2012? (Since it is not listed, use "Other": 1000.)
    - What is the average salary in China for 2013? (Answer would be 1450.)
    - What is the highest salary in China for 2012? (Answer is Tianjin.)
    So, in the above order of priority, I would like those numbers in another Excel sheet using some form of query. I considered PivotTables, but I was wondering if there is a better, more efficient way of doing this? I imagine SQL is suited for this, but I am not clued up on that, so some Excel-based solution would be much preferred. Suggestions on an appropriate format of data for such queries would also be appreciated.

    Read the article

  • Linux CentOS strange memory readings

    - by user2008937
    I am a young junior sysadmin trying to understand how Linux deals with memory, and while playing around with different monitoring programs I found something strange. When I run top on my laptop, it shows that the firefox process with PID 8778 takes 18.3% of memory (the %MEM column). grep "MemTotal" /proc/meminfo gives me 1848336 kB, i.e. 1848336/1024 = 1805 MB of memory (which is fine; I have 2 GB of RAM). So if the firefox process takes 18.3% of memory (according to top's %MEM column), then it takes 0.183 * 1805, which is roughly 330 MB. Quite a lot for Firefox... But then, in Linux there are lots of shared libraries that programs commonly use (like the famous libc), and those libraries are counted towards the memory utilization of every process that uses them, even though they are actually the same single object in memory. So top may show too big a memory utilization because of those shared libraries. Well, time to use pmap, which should show the real memory utilization of a process. But pmap -d $(pidof firefox) reports mapped: 983460K, writeable/private: 757164K, shared: 66416K, so pmap says about 960 MB is mapped into this process, which is in fact much bigger than the utilization shown by top. What's wrong here? How can pmap show more than top, when top even adds the shared libraries (which are in fact single objects in memory) to each process that uses them, and pmap supposedly doesn't? Regards, Krzysztof. (A Pss-based measurement is sketched after this entry.)

    Read the article
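
    A figure that accounts for sharing more honestly than either top's %MEM or pmap's mapped total is the proportional set size (Pss) in /proc/<pid>/smaps, which charges each shared page fractionally to the processes sharing it. A quick way to total it (result in kB), assuming a single firefox process:

        # Sum the Pss of every mapping of the process.
        awk '/^Pss:/ { total += $2 } END { print total " kB" }' \
            /proc/$(pidof firefox)/smaps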

  • How can I incrementally back up a large amount of data [with rsync]?

    - by Annan
    A website contains ~40 GB of images and files which need to be backed up. Rollbacks need to be possible daily for the last 30 days, and the backup server has less than 1.2 TB of space. My idea is to have one full backup from 30 days ago, then incremental backups for the last 30 days. Each day the oldest incremental backup is merged into the full backup and a new incremental backup is added. Can this strategy be implemented with rsync, and if so, how? Are there any problems with this plan? A better plan? PS: I mean incremental backups, not backing up incrementally (which rsync does automatically). (A sketch of one alternative follows this entry.)

    Read the article
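
    rsync itself has no notion of merging an old increment into a full copy, but two common substitutes for this scheme are rsync --link-dest (one dated tree per day, with unchanged files hard-linked, so 30 "full" copies cost little more than one) and rdiff-backup, which keeps a current mirror plus reverse increments and can expire them by age. A sketch of the latter; the host and paths are placeholders:

        # Mirror the site to the backup server; changed files are kept as
        # reverse diffs, so any of the last 30 days can be restored.
        rdiff-backup /var/www/site backup@backuphost::/backups/site
        # Expire increments older than 30 days to stay inside the space budget.
        rdiff-backup --remove-older-than 30D backup@backuphost::/backups/site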

  • Protecting an SVN server

    - by user35072
    For various reasons we are finding it increasingly difficult to work with remote workers. We are a very small developer shop and it's becoming impractical to do manual merges on a daily basis, so we're left with little choice (?) but to consider opening up our SVN server. I'm looking into the following: a full HTTPS session, running on a non-standard port (not 80), and a strong password policy. Is this enough to prevent someone from hacking in and stealing data? I will also look into a VPN, but first I would like to understand any alternative solutions. (An SSH-based alternative is sketched after this entry.)

    Read the article
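
    One alternative that avoids exposing a web front end at all is tunnelling Subversion over SSH, so access control reduces to SSH accounts or keys and the only open port is sshd's. A minimal sketch; the host, user and repository path are placeholders:

        # svnserve is spawned on demand over the encrypted SSH session and
        # never listens on a public port of its own.
        svn checkout svn+ssh://dev@svn.example.com/var/svn/project/trunk project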

  • rsync without a password: none of the Google (Server Fault) tutorials worked

    - by Jake Armstrong
    I need to use rsync for a daily backup operation, and in the past (on different servers) I managed to just use an RSA key, but now none of the Google (Server Fault) tutorials work at all: it keeps asking me for a password. I have Webmin and SSH/root access to both servers. My steps:
    - create a key on server1
    - send key.pub to server2
    - add key.pub to .ssh/authorized_keys
    - chmod 700 .ssh/authorized_keys
    - go back to server1 and try rsync, and it keeps asking for a password
    The rsync command: rsync -avz -e ssh file.txt root@server2:/root
    EDIT: I cleaned everything up and this time, instead of giving the key a custom name, I used the standard one on server1, sent the .pub to server2, and it worked like a charm. So the answer is that server1's ssh wasn't even using the right key. (The usual setup is sketched after this entry for reference.)

    Read the article
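
    For reference, a sketch of the usual key setup run from server1, including the custom-named-key case that caused the trouble here; the host names come from the question and the key file name is a placeholder:

        # Default key: ssh and rsync find ~/.ssh/id_rsa on their own.
        ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
        ssh-copy-id root@server2            # installs the public key and fixes permissions
        rsync -avz -e ssh file.txt root@server2:/root

        # Custom-named key: both ssh and rsync must be told which key to use.
        ssh-keygen -t rsa -N "" -f ~/.ssh/backup_key
        ssh-copy-id -i ~/.ssh/backup_key.pub root@server2
        rsync -avz -e "ssh -i ~/.ssh/backup_key" file.txt root@server2:/root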

  • MacBook Air with Bootcamp - How to partition?

    - by Andrew
    I want to buy a MacBook Air with a 128 GB SSD for my wife. She has to use Windows 7, but I would like to keep OS X for myself to use sometimes. Using Boot Camp, is it feasible to install the following? Mac partition: 36 GB with Mac OS X, Microsoft Office 2011 (Word, Excel and PowerPoint) and Skype (minimal use). Windows partition: 92 GB with Windows 7 Professional, Microsoft Office 2010 (Word, Excel and PowerPoint) and Skype (daily use). Media would be kept on an SD card or an external USB 3 drive. (Note: using Parallels might save space, but my wife won't go for the user experience.)

    Read the article

  • Software to monitor internet usage on an XP PC (browser and non-browser)?

    - by user39316
    Is there any (ideally open-source) software for Windows that can be used on a PC to monitor the internet usage of that PC? It would need to include both browser and non-browser sources (e.g. a service that syncs a calendar to Gmail). Any software on the PC that uses the internet would need to be configured to point to this local monitoring software/proxy, and the monitoring software/proxy would then be configured to point to the company proxy server (address, port and credentials). Things that come to mind that might be close, but aren't really focused on solving this, are Charles Proxy, Fiddler 2 and Squid. The idea would be that it could give you a daily/weekly/monthly report of internet upload/download usage on a per-program/process/service basis for the PC it is being run on. Thanks.

    Read the article

  • What is the best way to handle the multitude of different logs created all around the place?

    - by Low Kian Seong
    I run a few applications which create their own logs. I also run cron scripts on the same server to import data for my app. When these cron jobs error out, the default is that they send emails to the user that runs the cron job. There are just too many places where I need to check the logs and mail for things that might have gone wrong. My question is: what is the best way to handle this? Even better would be something like a log parser application that goes through all the system logs when something really goes wrong, instead of me having to go through them daily. (A digest-based sketch follows this entry.)

    Read the article
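
    One low-effort option is a nightly digest from a log summariser such as logwatch, so the scattered logs and cron mails collapse into a single morning report. A crontab sketch; the recipient address is a placeholder:

        # /etc/cron.d/logwatch-digest: one consolidated report every morning at 06:00.
        0 6 * * * root /usr/sbin/logwatch --output mail --mailto admin@example.com --range yesterday --detail med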

  • How to debug a BSOD IRQL_NOT_LESS_OR_EQUAL error

    - by Sev
    I'm getting the IRQL_NOT_LESS_OR_EQUAL BSOD. I tried looking at the Event Viewer for potential causes and cannot find any. I also checked the CPU temperature in the BIOS right after the error, and that was fine. I've tried two sets of RAM chips already; both give the issue. The error doesn't happen consistently: it happens daily, but many hours can pass without it, or only 10 minutes can pass and it might happen again. By the way, I just bought the parts and built the computer myself a couple of weeks ago. How do I debug the cause of this? Hardware info:
    - Asus P6X58D Premium motherboard
    - Intel Core i7 930 quad core 2.8 GHz
    - Kingston 128 GB SSD, 3 Gb/s
    - Nvidia GeForce GTX 465, PNY edition
    - Corsair 12 GB DDR3 1600 MHz RAM
    - Windows 7 Ultimate

    Read the article

  • Scheduled Folder Backup

    - by Junaid Saeed
    I have some folders on the C drive that I work in on a daily basis, and the data in them is very critical. So every night, when I shut down my PC, I copy, paste and overwrite these folders to a separate location, so that if the system crashes or something bad happens I can easily format C. I cannot move these folders off the C drive because they include C:\wamp\www\ of the WAMP server and similar folders. Is there a tool in which I can schedule that every day at time X these folders will be backed up to path 'Y'?

    Read the article

  • Can I use HP Recovery Discs for a different hard drive capacity and make?

    - by Fasih Khatib
    About two years ago I created HP Recovery Discs (3 of them). Now my hard drive has crashed and the new one is still a week from delivery. I was reading up on how to reinstall the genuine OS using the Recovery Discs, as I was not given any Windows 7 installation discs. After getting answers from the community on what these discs do, I did a bit of research and found on other sites that people run into issues when recovering their OS from the discs, especially when they change the make or capacity of the hard drive. Unfortunately I had to change the make, as the hard drive that came built in has gone out of production. This question is just part of my checklist to avoid problems when recovering the OS.
    - I have: HP DV4-2126TX (available only in India, I guess), Windows 7 Home Premium 32-bit
    - I had: Seagate Momentus 320 GB
    - I ordered: Western Digital Scorpio Black 500 GB
    Is there a possibility of encountering problems due to the changed capacity and make? I only want my genuine OS and drivers, not my data. I was told that Disc 1 contained the OS and drivers and the rest of the discs contained data, but I couldn't verify that.

    Read the article

  • What happens when I delete the first of several AWS EBS snapshots?

    - by Jepper
    On http://aws.amazon.com/ebs/pricing/ it says: "EBS Snapshots [...] For the first snapshot of a volume, Amazon EBS saves a full copy of your data to Amazon S3. For each incremental snapshot, only the changed part of your Amazon EBS volume is saved." I intend to snapshot some of my instances daily and keep snapshots for 7 days, after which the snapshots are destroyed. What happens when, eventually, I destroy the first snapshot? Will the subsequent snapshots be worthless, given that the first is no longer available? (A CLI sketch of this rotation follows this entry.)

    Read the article
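
    For reference, a sketch of the described daily-snapshot-plus-7-day-retention rotation with the AWS CLI; the volume and snapshot IDs are placeholders. (Deleting an older snapshot is safe in the sense that EBS keeps any blocks that later snapshots still reference.)

        # Take today's snapshot of the volume.
        aws ec2 create-snapshot --volume-id vol-0123456789abcdef0 \
            --description "daily $(date +%F)"
        # Once a snapshot falls outside the 7-day window, delete it by ID
        # (IDs can be listed with 'aws ec2 describe-snapshots --owner-ids self').
        aws ec2 delete-snapshot --snapshot-id snap-0123456789abcdef0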

  • Applications start very slowly from a network path

    - by Snowfox
    We have a Windows 2008 server which hosts the network share \\srvcompany\lib. This share contains several applications needed for the daily business. Every client/user (all Windows XP) has shortcuts to these apps on the desktop. The problem is that on several (but not all) clients the apps start very slowly. If I copy the application's program files to a local folder, they start quickly. When I watch the memory usage in Task Manager on such a "slow" machine while an application starts, I notice that the memory usage grows much more slowly than when I start the app on a "fast" machine. Yet when I copy files from this share with Windows Explorer, the speed is nearly the same. I've also checked the network driver; both tested clients have the same network card with the same driver version. Does anyone have an idea what I should check next to solve this problem? Thanks.

    Read the article

  • Sending a large number of mails causing problems on CentOS 6 / Plesk 10

    - by papakost
    I have a VPS running CentOS 6. When the system tries to send the daily newsletter, after some time (e.g. after sending about 2000 emails) I get the error "Unable to send mail" and the system memory goes really high. Up to that moment the mails are delivered normally. The other symptoms are:
    - I cannot see anything in /var/log/maillog (the file does not seem to be written to).
    - All files in /var/spool/mail have a size of 0 bytes.
    - From time to time the httpd log shows errors like: /usr/sbin/sendmail: error while loading shared libraries: libc.so.6: cannot open shared object file: Error 23
    - The "Activate mail service on domain" setting in Plesk is deactivated.
    Any idea what's going wrong here?

    Read the article

  • Launching multiple applications with a single command/script/shortcut

    - by Bill
    I realized a few days ago that every time I sit down at work, I do a few things after unlocking my computer: first I open Firefox, then Chrome, then I log in to Digsby. I realized I could probably save repeating this daily by writing a small batch script to open Firefox and Chrome, but I couldn't figure out how to make it work, and since the whole effort is to save time I don't want to bash my head around in the Windows command prompt to do it. I also tried this in PowerShell but ran into a bunch of security nonsense. Is there a way to do this that I am missing? Bonus points if somebody has figured out how to manipulate Digsby via COM, scripting, or Python. =)

    Read the article

  • How to Read XML and Generate SQL Insert

    - by hackerkatt
    I am trying to write a VBScript to read an XML file (downloaded daily) and insert the information into an MS SQL database. The content of the XML is a list of CDRs (call data records). I need to parse the file and insert the CDRs into a table. I'm a Ruby, Perl, PHP, JavaScript, SQL, ... programmer, but I've really never written any VBScript. I've done some googling and found a number of examples of how to generate XML from a SQL query, but not the reverse. Any help or suggestions would be greatly appreciated. Thank you! (A non-VBScript command-line sketch follows this entry.)

    Read the article
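
    Not the requested VBScript, but the same transformation can be sketched from a shell with xmlstarlet emitting INSERT statements and sqlcmd loading them; the element names, table, columns and connection details are all assumptions about the feed, and real code would also need to escape quotes in the values:

        # Turn each <cdr> element into an INSERT and pipe the batch into MSSQL.
        { xmlstarlet sel -t -m '//cdr' \
              -o "INSERT INTO cdr_log (caller, callee, seconds) VALUES ('" \
              -v caller -o "', '" -v callee -o "', " -v seconds -o ");" -n \
              cdrs.xml
          echo "GO"; } | sqlcmd -S dbserver -d cdrdb -U loader -P 'secret'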

  • Puppet: hanging at Schedule[weekly]

    - by Andrei Serdeliuc
    Why would puppet hang at Schedule[weekly]? I'm running puppet in a masterless setup, so to apply my manifest I'm just running puppet apply /etc/puppet/manifests/site.pp. In debug mode, these are the last things it says before it just hangs:
    debug: /Schedule[never]: Skipping device resources because running on a host
    debug: /Schedule[daily]: Skipping device resources because running on a host
    debug: /Schedule[monthly]: Skipping device resources because running on a host
    debug: /Schedule[puppet]: Skipping device resources because running on a host
    debug: /Schedule[hourly]: Skipping device resources because running on a host
    debug: /Schedule[weekly]: Skipping device resources because running on a host
    If I send a SIGINT, it says:
    Exiting
    debug: Storing state
    debug: Stored state in 0.03 seconds
    debug: Finishing transaction 69992657242500
    Thanks

    Read the article

  • Mailing list with dynamically generated addresses

    - by Joe Tomasone
    I am trying to implement a dynamic mailing list from a database that changes quite often. The conditions:
    - Postfix is the MTA.
    - Email addresses are in a MySQL database.
    - Postfix only allows senders whose addresses are in that database (via smtpd_sender_restrictions).
    - A cron job extracts the current addresses from the database nightly, puts them into an alias file, and then runs postalias on it.
    This works well, but since the sender address remains the original poster's own, many domains reject the email because my server is not a DNS-listed mail server for the sender's domain. So I either have to find a way to rewrite the outgoing address as "listserv@mydomain", or find a mailing list package that will use the database-retrieved addresses (either queried directly or from a flat file) as the subscriber list, with that list replaced daily. I've tried Sympa and am pretty much ready to give up on it (it's a nightmare to get working right), but that's the only open-source list server I have seen that works with dynamic mailing lists. Does anyone have any ideas? Thanks, Joe

    Read the article

  • How to execute a scheduled task with "schtasks" without opening a new command-line window?

    - by Misha Moroshko
    I have a batch file that creates a scheduled task using schtasks, like this: schtasks /create /tn my_task_name /tr "...\my_path\my_task.bat" /sc daily /st 10:00:00 /s \\my_computer_name /u my_username /p my_password It works OK except that when my_task.bat is executed, a new command-line window is opened (and closed after execution). I would like to avoid opening this new window, i.e. run the task quietly, in the background. I thought of using start /b ...\my_path\my_task.bat but I don't know how, because since I have to call start from the batch file I need to precede it with cmd /c, which again causes a new window to open. How could I solve this problem?

    Read the article
