Search Results

Search found 14173 results on 567 pages for 'online backup'.

  • Playing online video on the iPhone web: the iPhone seems to cache the reference movie

    - by Mad Oxyn
    We are working on an online mobile video app and are trying to play a reference movie on an iPhone from the iPhone browser. The problem is that the movie plays the first time, but when we try to play a different movie the second time, it doesn't; QuickTime gives a general error message saying "Cannot Play Movie". More precisely, this is what we are trying to do: connect with an iPhone to a webserver that serves a reference movie (generated with QuickTime Pro). The reference movie automatically gets downloaded to the iPhone by QuickTime. QuickTime then chooses one of the 3 references in the reference movie, based on the connection speed, and tries to download the designated movie via a relative path. A servlet gets called and forwards the relative path to the right movie. This all works the first time; the second time, however, when we want to download different movies, we get the QuickTime error. Test case:
    1. Open reference movie 1: the movie plays in QuickTime.
    2. Open reference movie 2: ERROR, QuickTime reports "Cannot Play Movie".
    3. Shut down the iPhone, then turn it on again.
    4. Open reference movie 2: the iPhone plays the movie.
    5. Open a movie: ERROR, QuickTime reports "Cannot Play Movie".
    Did anyone encounter similar issues with QuickTime and the use of reference movies, and is there something we can do to work around that?
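    One workaround worth trying, offered purely as an assumption (it only helps if the failure really is a cached reference movie): serve each reference-movie URL with a changing query string so QuickTime treats every request as a new resource. A minimal sketch, where `movie-link` is a hypothetical element id:

        <a id="movie-link" href="/movies/ref1.mov">Play movie 1</a>
        <script>
          // Append a timestamp so a second request cannot hit the cached copy.
          var link = document.getElementById('movie-link');
          link.href += '?nocache=' + new Date().getTime();
        </script>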

  • Need help to create database schema for wholesale online tee store

    - by techiepark
    Hi, I'm currently working on a wholesale online t-shirt shop. I have done this for a fixed quantity and price, and it's working fine. Now I need to do it for variable quantity and price (there is a reference link showing what I have to do). The basic tables I have created are:

        CREATE TABLE attribute (
          attribute_id int(11) NOT NULL auto_increment,
          name varchar(100) NOT NULL,
          PRIMARY KEY (attribute_id)
        );

        CREATE TABLE attribute_value (
          attribute_value_id int(11) NOT NULL auto_increment,
          attribute_id int(11) NOT NULL,
          value varchar(100) NOT NULL,
          PRIMARY KEY (attribute_value_id),
          KEY idx_attribute_value_attribute_id (attribute_id)
        );

        CREATE TABLE product (
          product_id int(11) NOT NULL auto_increment,
          name varchar(100) NOT NULL,
          description varchar(1000) NOT NULL,
          price decimal(10,2) NOT NULL,
          image varchar(150) default NULL,
          thumbnail varchar(150) default NULL,
          PRIMARY KEY (product_id),
          FULLTEXT KEY idx_ft_product_name_description (name,description)
        );

        CREATE TABLE product_attribute (
          product_id int(11) NOT NULL,
          attribute_value_id int(11) NOT NULL,
          PRIMARY KEY (product_id,attribute_value_id)
        );

    I'm not getting how to store the price based on variable quantity. Please help me create the product table and its related tables; my requirement is the same as the reference link above.
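    One common way to model quantity-based pricing is a tier table keyed on the minimum order quantity. A minimal sketch (the `product_price_tier` name and its columns are assumptions, not from the post):

        CREATE TABLE product_price_tier (
          product_id int(11) NOT NULL,
          min_quantity int(11) NOT NULL,
          price decimal(10,2) NOT NULL,
          PRIMARY KEY (product_id, min_quantity)
        );

        -- Price for a given quantity: the tier with the largest min_quantity
        -- that is still <= the ordered quantity.
        SELECT price
        FROM product_price_tier
        WHERE product_id = ? AND min_quantity <= ?
        ORDER BY min_quantity DESC
        LIMIT 1;

    With this design, product.price can remain the single-unit default, and rows such as (1, 10, 4.50) and (1, 50, 3.75) express the bulk discounts.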

  • Dynamic Collapsible Flow Chart Online

    - by Simon
    I've been looking through a number of other related posts about flowchart software. I have been asked to put together a document outlining some of the typical problems our users encounter with our software product. What I would like to do is create an interactive/online flowchart that lets users choose from 4-5 overall headings describing what's wrong, then dynamically expands more choices to pinpoint the problem, and so on, until they get a resolution. The key thing that I have not been able to find in the flowchart software out there is the click-and-expand element:
    - I don't want all options to appear to the end user in one huge flowchart, as it will distract from their specific issue.
    - I want them to be able to click away and go down a specific avenue that ends up giving them some good things to try, based on their decisions/clicks.
    I was originally thinking of putting something together in Flex or Silverlight (ideally someone would have a template out there), but I am now thinking of taking advantage of third-party (ideally free) software. This will need to be hosted in a browser. Any ideas?
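    If a plain web page is acceptable, nested <details>/<summary> elements give exactly this click-to-expand behaviour with no third-party software at all. A minimal sketch (all headings and advice text are placeholders):

        <details>
          <summary>Problem printing</summary>
          <details>
            <summary>Nothing prints at all</summary>
            <p>Check the printer is powered on, then restart the print spooler.</p>
          </details>
          <details>
            <summary>Output is garbled</summary>
            <p>Reinstall the printer driver.</p>
          </details>
        </details>

    Each branch stays collapsed until clicked, so users only ever see the avenue they chose.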

  • Developing an online music store

    - by Simon
    We need to develop an application to sell music online. Needless to say, everything will be done legally, and so we have to plan an interface to pay artists. However, we are confronted with a question: what is the best way to store music on the server? Should we save it to the server's disk from an HTTP file upload? Should we save it via FTP, or would it be wiser to save it in the database? It needs to be as safe as possible, so HTTPS is probably required here. What do you think is the best way? Any other ideas? In the all-HTTP case, uploading songs (for administration) is quite long and boring, but the upload is easily linkable to the song record the admin creates in the web application, compared to an FTP upload to the server followed by listing the directory in the admin area to link the uploaded file to the song information in the database. I know this may not be quite clear (I'm French), so tell me and I will try to explain any part you don't understand.
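    For the HTTP route, here is a minimal PHP sketch of what the admin upload handler could look like (the form field name and storage path are assumptions); serving the form over HTTPS covers the transport-security concern:

        <?php
        // Store uploads outside the web root so files cannot be fetched or
        // executed directly; stream them to buyers through a PHP script later.
        $target = '/var/music/' . basename($_FILES['song']['name']);
        if (move_uploaded_file($_FILES['song']['tmp_name'], $target)) {
            // Insert a row here linking $target to the song record the admin created.
            echo 'Upload OK';
        } else {
            echo 'Upload failed';
        }
        ?>

    Keeping the file on disk and only the path in the database is the usual choice over BLOB storage: it keeps the database small and lets the web server stream the audio efficiently.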

  • Ember Data Sync - LocalStorage+REST+RealTime+Online/Offline

    - by Miguel Madero
    We have a combination of requirements in terms of data access:
    - Pre-load some reference data. We need reference data to survive browser restarts instead of just living in memory, to avoid loading it all the time. I'm currently using the LocalStorageAdapter for that. Once we have it, we would like to sync changes (polling, or using Socket.IO in the background and updating LocalStorage, could do the trick).
    - There are other models that are more transactional, where we would need to go directly to the server to get/save them. It would be nice to use something like the RESTAdapter for that.
    - Lastly, there are some operations that should work offline, with the changes synced later.
    To make it more concrete:
    - We pre-load vendor and "favorite products" data into LocalStorage and work offline with it.
    - We need to sync server-side changes to vendor and product information.
    - Searching the full catalog requires users to be online.
    - When offline, we need to allow users to add something to their cart or even submit an order. We would like to queue this action and submit it when they have an Internet connection.
    A few questions derive from this:
    - Is there a way to use the RESTAdapter in combination with LocalStorage?
    - Is there some Socket.IO support? (Happy to do this part manually.)
    - Is there queueing support, ideally at the Ember Data level?
    I know we will have to do a lot of this manually and pull the different Lego pieces together, but I wanted to ask for some perspective from experienced Ember devs.
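    On the queueing question, a minimal framework-agnostic sketch of an offline action queue in localStorage (the function and key names are assumptions; an Ember-Data-level integration would wrap something like this):

        // Queue an action while offline; flush the queue when connectivity returns.
        function enqueue(action) {
          var q = JSON.parse(localStorage.getItem('pendingActions') || '[]');
          q.push(action);
          localStorage.setItem('pendingActions', JSON.stringify(q));
        }

        window.addEventListener('online', function () {
          var q = JSON.parse(localStorage.getItem('pendingActions') || '[]');
          q.forEach(function (action) {
            // POST each queued action (e.g. a submitted order) to the server here.
          });
          localStorage.removeItem('pendingActions');
        });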

  • YouTube video downloads successfully locally, but on the server only a 0-byte FLV is saved

    - by I Like PHP
    I'm using a PHP "tube" class to download YouTube videos: a video is automatically downloaded into a folder when you submit its YouTube URL. Everything works fine locally (XAMPP), but when I upload the code to the server, only a 0-byte FLV file is saved. I debugged the code on both sides. Locally I get a URL like this:

        http://www.youtube.com/get_video?video_id=1jNWCFwqPvM&=vjVQa1PpcFMyYCECyfmYENWklpwXGyhVgJpSFPgNAEc%3D

    whereas on the server I get a link like this:

        http://www.youtube.com/get_video?video_id=1jNWCFwqPvM&t=vjVQa1PpcFNS3SPxIXi8hy-NF4qZFFrScDXCjGLLrLc%3D

    When I copy the link generated locally and paste it into a browser, a download box appears; when I copy the link generated on the server and paste it into a browser, only a blank page appears. Where is the error? Also, how can I find out whether PEAR is installed on the server? On the server side I sometimes get a link like

        http://www.youtube.com/get_video?video_id=1jNWCFwqPvM&t=vjVQa1PpcFNS3SPxIXi8hy-NF4qZFFrScDXCjGLLrLc%3D

    but sometimes I get only

        http://www.youtube.com/get_video?video_id=&t=

    Please guide me on how to solve this. Thanks in advance.
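    As an aside, running pear version at a shell prompt on the server will tell you whether PEAR is installed. On the 0-byte file itself, a defensive guard may help pinpoint the failure; a sketch (the variable names are assumptions about the class internals):

        <?php
        // Hypothetical guard inside the download routine: refuse to fetch when
        // the scraped URL is missing the expected parameters, instead of
        // silently writing an empty FLV.
        parse_str(parse_url($get_video_url, PHP_URL_QUERY), $params);
        if (empty($params['video_id']) || empty($params['t'])) {
            die('Could not extract video_id/t from the YouTube page');
        }
        ?>

    A plausible (unverified) cause is that YouTube serves different markup to the server's datacenter IP than to your home connection, so the class sometimes fails to scrape the video_id and t token at all.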

  • PHP | Online Notepad

    - by user2947423
    I recently made a chat application in Visual Basic backed by PHP. I used this code:

        <?php
        $msg = $_GET['w'];
        $logfile = 'Chats.php';
        $fp = fopen($logfile, "a");
        fwrite($fp, $msg);
        fclose($fp);
        ?>

    I'm now trying to make an online notepad. What I want to do is generate a unique ID in Visual Basic; that unique ID has to be the filename of the "note". I'm not very good with PHP, so what I want is something like:

        $logfile = '{uniqueID}.php';

    Whenever the user opens the program, it will open his {uniqueID}.php file, and he can edit that in my program. Long story short (TL;DR):
    - The program generates a unique ID.
    - The unique ID becomes a new file: {uniqueID}.php.
    - On the next open it checks whether his/her {uniqueID}.php exists, else it makes a new one.
    I know this isn't really secure, but it's to learn something for myself.
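    A minimal server-side sketch of the notepad endpoint (the notes/ directory, the id parameter name, and saving as .txt rather than .php are all my assumptions):

        <?php
        // Sanitize the client-generated unique ID so it cannot escape notes/.
        $id   = preg_replace('/[^A-Za-z0-9]/', '', $_GET['id']);
        $file = 'notes/' . $id . '.txt';

        if (isset($_POST['content'])) {
            file_put_contents($file, $_POST['content']);  // save an edit
        } elseif (!file_exists($file)) {
            file_put_contents($file, '');                 // first open: create the note
        }

        echo file_get_contents($file);                    // return the note body
        ?>

    Saving the notes as .txt instead of .php matters: a user-writable .php file on the server could be executed as code.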

  • Error during Time Machine backups on OS X Lion

    - by user92401
    After I turn on my machine, the first couple of Time Machine backups seem to go OK, but after about an hour I get this error:

        Unable to complete backup. An error occurred while creating the backup folder.
        Latest successful backup: 7/31/11 at 12:32 PM

    I'm running 10.7. Time Machine is backing up an internal HD to an external USB HD. I've already run Disk Utility to repair the Time Machine partition. It's a relatively new hard drive and didn't have any issues. Here's what I've found in the Console's log, filtered for backupd:

        7/31/11 12:31:21.223 PM com.apple.backupd: Starting standard backup
        7/31/11 12:31:21.447 PM com.apple.backupd: Backing up to: /Volumes/MyMac TM Backup/Backups.backupdb
        7/31/11 12:31:29.146 PM com.apple.backupd: 983.7 MB required (including padding), 391.90 GB available
        7/31/11 12:32:19.471 PM com.apple.backupd: Copied 3156 files (36.0 MB) from volume Macintosh HD.
        7/31/11 12:32:20.017 PM com.apple.backupd: Copied 3173 files (36.0 MB) from volume LI.
        7/31/11 12:32:20.136 PM com.apple.backupd: 934.8 MB required (including padding), 391.86 GB available
        7/31/11 12:32:54.755 PM com.apple.backupd: Copied 916 files (117.8 MB) from volume Macintosh HD.
        7/31/11 12:32:54.894 PM com.apple.backupd: Copied 933 files (117.8 MB) from volume LI.
        7/31/11 12:32:55.937 PM com.apple.backupd: Starting post-backup thinning
        7/31/11 12:32:55.937 PM com.apple.backupd: No post-back up thinning needed: no expired backups exist
        7/31/11 12:32:55.960 PM com.apple.backupd: Backup completed successfully.
        7/31/11 1:21:28.624 PM com.apple.backupd: Starting standard backup
        7/31/11 1:21:28.631 PM com.apple.backupd: Backing up to: /Volumes/MyMac TM Backup/Backups.backupdb
        7/31/11 1:21:28.682 PM com.apple.backupd: Error: (22) setxattr for key:com.apple.backupd.HostUUID path:/Volumes/MyMac TM Backup/Backups.backupdb/Will’s Mac Pro size:37
        7/31/11 1:21:28.683 PM com.apple.backupd: Error: (22) setxattr for key:com.apple.backupd.HostUUID path:/Volumes/MyMac TM Backup/Backups.backupdb/Will’s Mac Pro size:37
        7/31/11 1:21:38.694 PM com.apple.backupd: Backup failed with error: 2
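    The failing call is setxattr on the backup set's root folder, so one avenue worth checking (an assumption, not a verified fix) is whether a stale com.apple.backupd.HostUUID attribute is stuck on that folder:

        # List the extended attributes Time Machine has written so far.
        xattr -l "/Volumes/MyMac TM Backup/Backups.backupdb/Will’s Mac Pro"

        # As a last resort, drop the stale host UUID so backupd can rewrite it.
        sudo xattr -d com.apple.backupd.HostUUID \
            "/Volumes/MyMac TM Backup/Backups.backupdb/Will’s Mac Pro"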

  • Time Machine is getting stuck at "Preparing to Back Up" and my Trash isn't emptying

    - by zarose
    I have encountered two separate problems, but I am putting them in the same question in case they are related. First, my Trash would not empty. It seems to be getting stuck on certain files: I will reset my MacBook and some of the files will be deleted, and then if I remove a file or two at random, more can be deleted. Some of these files had strange characters in their names. I tried changing the names to single characters, but this did not help. Next, I attempted to back up my MacBook using Time Machine. I plugged in the HDD I've been using for this, but every time I try to start the backup, Time Machine gets stuck at "Preparing to Back Up". I definitely need to know how to fix the Time Machine problem, but I am curious how to solve the Trash problem as well, and whether or not these problems are related. EDIT: Console.app logged the following this morning before I left on a trip. I did not bring the HDD with me.

        6/5/12 7:41:28.312 AM com.apple.backupd: Starting standard backup
        6/5/12 7:41:46.877 AM com.apple.backupd: Error -35 while resolving alias to backup target
        6/5/12 7:41:58.368 AM com.apple.backupd: Backup failed with error: 19
        6/5/12 7:59:08.999 AM com.apple.backupd: Starting standard backup
        6/5/12 7:59:10.187 AM com.apple.backupd: Backing up to: /Volumes/Seagate 3TB Mac/Backups.backupdb
        6/5/12 7:59:13.308 AM com.apple.backupd: Event store UUIDs don't match for volume: Macintosh HD
        6/5/12 7:59:13.331 AM com.apple.backupd: Event store UUIDs don't match for volume: Blank
        6/5/12 7:59:13.683 AM com.apple.backupd: Deep event scan at path:/ reason:must scan subdirs|new event db|
        6/5/12 8:23:31.807 AM com.apple.backupd: Backup canceled.
        6/5/12 8:23:33.373 AM com.apple.backupd: Stopping backup to allow backup destination disk to be unmounted or ejected.
        6/5/12 9:51:21.572 PM com.apple.backupd: Starting standard backup
        6/5/12 9:51:22.515 PM com.apple.backupd: Error -35 while resolving alias to backup target
        6/5/12 9:51:32.741 PM com.apple.backupd: Backup failed with error: 19
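    For the stuck Trash, one approach that sidesteps problem filenames entirely (a sketch; the inode number is a placeholder) is deleting by inode from Terminal:

        # Show inode numbers and escaped names of everything in the Trash.
        ls -lib ~/.Trash

        # Delete one stubborn entry by its inode number (12345 is a placeholder).
        find ~/.Trash -inum 12345 -delete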

  • XML-based website - how to create?

    - by Aravind
    I would like to create an XML-based website. I want to use XML files as data sources, since it is a kind of online directory site. Can someone please give me a starting point? Are there any good online resources that I can refer to? I am pretty comfortable with ASP and JavaScript. Thank you in advance. Best regards, Aravind
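    As a starting point on the JavaScript side, here is a minimal sketch that loads an XML data source and walks its records (directory.xml, the entry element name, and the results element id are all placeholders):

        // Fetch the XML file and list each entry's text content.
        var xhr = new XMLHttpRequest();
        xhr.open('GET', 'directory.xml', true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            var entries = xhr.responseXML.getElementsByTagName('entry');
            var list = document.getElementById('results');  // assumed <ul> element
            for (var i = 0; i < entries.length; i++) {
              var li = document.createElement('li');
              li.appendChild(document.createTextNode(entries[i].firstChild.nodeValue));
              list.appendChild(li);
            }
          }
        };
        xhr.send();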

  • navigator.onLine

    - by cf_PhillipSenn
    I'm playing with the incomplete example found at http://www.w3.org/TR/offline-webapps/ but I'm distressed to see comments in it like "renders the note somewhere", "report error", and "// …". So, will someone please help me write a valid example? Here's what I've got so far:

        <!DOCTYPE HTML>
        <html manifest="cache-manifest">
        <head>
        <script>
        var db = openDatabase("notes", "", "The Example Notes App!", 1048576);
        function renderNote(row) {
          // renders the note somewhere
        }
        function reportError(source, message) {
          // report error
        }
        function renderNotes() {
          db.transaction(function(tx) {
            tx.executeSql('CREATE TABLE IF NOT EXISTS Notes(title TEXT, body TEXT)', []);
            tx.executeSql('SELECT * FROM Notes', [], function(tx, rs) {
              for (var i = 0; i < rs.rows.length; i++) {
                renderNote(rs.rows[i]);
              }
            });
          });
        }
        function insertNote(title, text) {
          db.transaction(function(tx) {
            tx.executeSql('INSERT INTO Notes VALUES(?, ?)', [ title, text ], function(tx, rs) {
              // …
            }, function(tx, error) {
              reportError('sql', error.message);
            });
          });
        }
        </script>
        <style>
        label { display:block; }
        </style>
        </head>
        <body>
        <form>
          <label for="mytitle">Title:</label>
          <input name="mytitle">
          <label for="mytext">Text:</label>
          <textarea name="mytext"></textarea>
          <!-- There is no submit button because I want to save the info on every keystroke -->
        </form>
        </body>
        </html>

    I also know that I have to incorporate this in there somewhere:

        if (navigator.onLine) {
          // Send data using XMLHttpRequest
        } else {
          // Queue data locally to send later
        }

    But I'm not sure what event I would tie that to.
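    One hedged way to fill in the gaps; the element ids, the /notes endpoint, and the keystroke wiring are all my assumptions, not from the spec's example:

        // Render a note row into an assumed <ul id="notes"> element.
        function renderNote(row) {
          var li = document.createElement('li');
          li.appendChild(document.createTextNode(row.title + ': ' + row.body));
          document.getElementById('notes').appendChild(li);
        }

        function reportError(source, message) {
          alert('Error (' + source + '): ' + message);
        }

        // The event to tie navigator.onLine to: save on every keystroke.
        function onKeyUp() {
          var title = document.getElementsByName('mytitle')[0].value;
          var text  = document.getElementsByName('mytext')[0].value;
          if (navigator.onLine) {
            var xhr = new XMLHttpRequest();   // send to an assumed server endpoint
            xhr.open('POST', '/notes', true);
            xhr.send(title + '\n' + text);
          } else {
            insertNote(title, text);          // queue locally to send later
          }
        }

    Attaching onKeyUp to the keyup event of both form fields gives the save-on-every-keystroke behaviour the comment in the form asks for.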

  • Sybase bcp import fails

    - by chromeplatedbanana
    We're trying to export some tables from our production database to our test database using bcp. The bcp export seems to work fine, but the import always fails with a data type error (see below). We tested on our test database by exporting the table contents to a file and then importing them again immediately, but that failed too. For example,

        bcp TABLENAME out ~/tempfile -S servername -U username

    generates a file as expected (if we use the -c option, the number of lines is as expected). However,

        bcp TABLENAME in ~/tempfile -S servername -U username

    fails with:

        CTLIB Message: - L0/0D/S0/N0/0/0:
        blk_int(): blk_layer: CT library error: Cannot find an equivalent CS_TYPE for this TDS data type 49
        blk_init failed.

    We get this whenever we try to copy into TABLENAME, whether from the production or test table dump file. I don't understand why export and import for the same TABLENAME generate a data type error. What am I doing wrong here? Thanks
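    One thing worth trying (an educated guess from the CS_TYPE mapping failure, not a confirmed fix): run both sides in character format with -c, which serializes every column as text and sidesteps native TDS type mapping entirely:

        # Export and re-import in character format; supply the password as usual.
        bcp TABLENAME out ~/tempfile -c -S servername -U username
        bcp TABLENAME in  ~/tempfile -c -S servername -U username

    TDS data type 49 reportedly corresponds to one of the newer date/time types, which older Open Client libraries cannot map in native mode; if -c works, upgrading the client libraries on the importing side would be the cleaner long-term fix.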

  • What's the most efficient way to reclaim disk space after deleting lots of data from a database on Sybase ASE 15?

    - by Ernie Longmire
    As I understand it, based on some research but zero real-world experience with Sybase ASE, the only way to reclaim disk space once it's been allocated to a database is to export that database, create a new DB with the same schema, and reload all the exported data to the new database. Is this correct, or is there some other method? Then: assuming the above is correct and a full export-recreate-reload is required, what's the most efficient way to do that? Are there tools that will automate all or part of that process? I'm being told we would have to write separate bcp export and import commands for each and every object in the database, which if true sounds easily scriptable by someone who knows Sybase ASE well enough. (I don't.) This seems to me like a really basic housekeeping task, and it feels like I'm missing something obvious.
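    On the scripting point: the table list can be pulled from sysobjects, so generating the per-table bcp commands is only a few lines of shell. A sketch under assumed names (mydb, /dump, the sa login):

        # Emit one bcp-out line per user table; redirect into a shell script.
        isql -S SERVERNAME -U sa -P "$SYBPASS" -b <<'EOF' > bcp_out.sh
        set nocount on
        select 'bcp mydb..' + name + ' out /dump/' + name + '.bcp -c -S SERVERNAME -U sa'
        from mydb..sysobjects where type = 'U'
        order by name
        go
        EOF

    The matching bcp ... in script is generated the same way, pointed at the rebuilt database.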

  • Export shared services from MOSS

    - by vittocia
    Hello, using the stsadm command I have been able to export a MOSS website and restore it on a different server, which works fine. I tried the same for the Shared Services; it gave no errors, but it does not have all the import connections when I check around. Is there a better way to export and restore Shared Services, or a way to sync the import connections and user list?

  • MySQL equivalent to .pgpass, or automatic authentication in a cron job for MySQL

    - by Ibrahim
    I'm writing a bash script to back up my databases. Most are PostgreSQL, and in Postgres there's a way to avoid having to authenticate by creating a ~/.pgpass file which contains the Postgres password. I put this in root's home directory and made it chmod 0600, so that root can dump the Postgres databases without having to authenticate. Now I want to do something similar for MySQL, although I only have one MySQL database. How can I do this? I don't want to specify the password on the command line for mysqldump, because this is part of a script that might be somewhat visible to other users. Is there a better way (i.e. built into MySQL) than making a file that only root can read, reading it to get the MySQL password, and then using that in the bash script as a variable?
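    The direct MySQL equivalent of ~/.pgpass is a ~/.my.cnf in root's home directory, also chmod 0600; the client tools (including mysqldump) read its [client] section automatically. A minimal sketch (the password is a placeholder):

        # /root/.my.cnf  (chmod 0600)
        [client]
        user=root
        password=YOUR_MYSQL_PASSWORD

    With that in place, the cron script can simply run mysqldump --all-databases > /backup/mysql.sql with no password on the command line or in the script itself.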

  • Easy Oracle Log-Shipping

    - by ItsAMystery
    Hi all, I am looking for a decent way of keeping a secondary Oracle database up to date without exporting and importing the whole database each time. There are 3 users (schemas) on the instance that I would essentially like to 'log ship', if that's what it is called on Oracle! Can anyone suggest anything? The database is well under a GB in total, and we are running 10g Express (although I have thought about using 10g Standard, as we have a spare license). Cheers, Chris
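    Oracle's built-in log shipping (Data Guard) is not available in Express Edition, but for a sub-GB database one lightweight alternative is scheduled materialized-view refreshes over a database link. A sketch with placeholder names:

        -- On the secondary instance: link to production, then mirror each table.
        CREATE DATABASE LINK prod_link
          CONNECT TO app_user IDENTIFIED BY app_password USING 'PROD';

        CREATE MATERIALIZED VIEW orders_mv
          REFRESH COMPLETE START WITH SYSDATE NEXT SYSDATE + 1/24  -- hourly
          AS SELECT * FROM orders@prod_link;

    This is per-table rather than per-instance, so it suits a small, stable schema; it is not a substitute for real log shipping on a large or busy database.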

  • Automate uploading of videos to YouTube

    - by John
    Here's the problem: I would like to keep lots of home-made videos. Of course, they are subject to being lost: somebody could steal the computer, or water or fire could destroy them. Secondly, I have to plug in my hard drive every time I want to watch something, which I find slow and cumbersome. I was thinking that perhaps I could upload the videos to YouTube with the privacy set to invite-only and then delete the videos from the hard drive automatically. Could this be done?

  • Windows Home Server style redundancy/multi-disk-support on Windows Server 2008 R2?

    - by user19597
    I'm setting up a file server for our department; it'll be connected to the domain. I want it to have a very large amount of storage (several TB). Ideally, it should also preserve disk space by identifying identical files and only storing them once. It should be fault tolerant, so that if one of the drives fails, that drive can be replaced without losing any data. All of these features are available in Microsoft's consumer offering, Windows Home Server. However, I can't find these kinds of features in the enterprise Windows Server 2008 R2. Am I missing something? I know that I could buy a Drobo, or similar, and use that instead, but I would prefer to use a built-in feature of Windows Server, should it exist. It seems surprising to me that these features should be available in Home Server but not in an enterprise file server.

  • LTO(4) tape shelf life estimation?

    - by emilp
    LTO tapes (Maxell, in this case) are often marketed as having a shelf life of 30 years or more when stored under "optimal conditions". Is there a way to get a good estimate of the shelf life, given parameters such as relative humidity, temperature, etc.? Obsolescence of the tapes aside, is there a way of determining the impact on shelf life of any deviation from the optimal conditions? In other words, how many years are lost when storing, say, 1 degree above the specified range? Regards, Emil
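    A rough rule of thumb, offered as an editor's approximation from Arrhenius-style accelerated-aging models rather than any vendor figure: chemical degradation rates roughly double for every 10 °C rise in storage temperature, so the expected life scales as

        t(T) ≈ t_ref × 2^((T_ref − T) / 10)

    Under that approximation, a tape rated for 30 years at a 23 °C reference would last roughly 15 years stored at 33 °C, and a single degree over the reference costs about 7% of the rated life (30 × 2^(-1/10) ≈ 28 years). Humidity deviations are not captured by this model at all, so treat it as a sanity check, not a prediction.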

  • Bacula v5.0.2 Windows Installation Issues

    - by JohnyD
    First off, I am very new to Bacula, but I'm very intrigued by what I've read. I'm looking to set up Bacula 5.0.2 on a Windows 2008 R2 server. I've run the installer, and at the end it asks me to configure the DIR name, DIR password, and DIR address. Windows documentation is somewhat hard to come by, and I'm not certain what exactly I'm supposed to enter here. Do I need to create a local account that matches this info? Will the installation process create the account for me? Will this be the account that handles the FD daemon/service? I'm also not certain whether "address" means a network location or a local directory. I apologize for my ignorance. Currently I'm trying to use the following information:

        Name: john
        Pass: john
        Address: thin1 (the server name, although I have also tried thin1.fqdm.local and 10.0.0.104)

    This info allows the installer to complete successfully. However, when I run the BAT it hangs at "Connecting to Director thin1:9101". The Bacula File Service is currently running under the local system account. What am I doing wrong? What do I have yet to do? Once I get this working properly, I assume I will need to install clients on all my Windows boxes? Also, this is a 64-bit CPU, but I am installing the 32-bit client. Are there any issues with this? Should I be using the 32-bit client? Thanks very much for the help.
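    For what it's worth, the DIR name and password are Bacula-internal credentials, not Windows accounts: they simply have to match between the Director's configuration and the client's bacula-fd.conf. A sketch of the client side (every value below is a placeholder):

        # bacula-fd.conf -- who may contact this File Daemon
        Director {
          Name = thin1-dir      # must match the Director resource name in bacula-dir.conf
          Password = "john"     # must match the Password in the Director's Client resource
        }

        FileDaemon {
          Name = thin1-fd
          FDport = 9102
          WorkingDirectory = "C:\\Program Files\\Bacula\\working"
          Pid Directory = "C:\\Program Files\\Bacula\\working"
        }

    A hang at "Connecting to Director thin1:9101" is consistent with a name/password mismatch or a firewall blocking port 9101, so those two are worth checking first.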

  • Ubuntu: Move fsbackup backups to Amazon S3

    - by Alexander Gladysh
    I have a legacy server (Ubuntu 9.10 Karmic x86) where the previous admin set up backups with fsbackup. This server lives in a VPS (under some kind of Xen), and it is low on HDD space (16 GB total). Now it has come to the point where the fsbackup backups take more space than the rest of the data in the system. The filesystem is 100% full, and I have already cleaned up everything I could, aside from the actual backups. I do not have any experience managing fsbackup, and I do not want to break or lose the backups. Googling fsbackup gives surprisingly low-quality results... Here is what my backups look like:

        $ sudo ls -lh /var/archives
        total 8.1G
        -rw-rw---- 1 root root  318 2011-01-06 06:26 myserver-20110106.md5
        -rw-rw---- 1 root root  258 2011-01-07 06:26 myserver-20110107.md5
        -rw-rw---- 1 root root  318 2011-01-08 06:26 myserver-20110108.md5
        -rw-rw---- 1 root root  318 2011-01-09 06:26 myserver-20110109.md5
        -rw-rw---- 1 root root  346 2011-01-10 06:43 myserver-20110110.md5
        -rw-rw---- 1 root root  14M 2011-01-06 06:26 myserver-all-mysql-databases.20110106.sql.bz2
        -rw-rw---- 1 root root  14M 2011-01-07 06:26 myserver-all-mysql-databases.20110107.sql.bz2
        -rw-rw---- 1 root root  14M 2011-01-08 06:26 myserver-all-mysql-databases.20110108.sql.bz2
        -rw-rw---- 1 root root  14M 2011-01-09 06:26 myserver-all-mysql-databases.20110109.sql.bz2
        -rw-rw---- 1 root root  862 2011-01-10 06:43 myserver-all-mysql-databases.20110110.sql.bz2
        -rw-rw---- 1 root root 827K 2011-01-03 06:25 myserver-etc.20110103.master.tar.gz
        -rw-rw---- 1 root root  16K 2011-01-06 06:25 myserver-etc.20110106.tar.gz
        -rw-rw---- 1 root root  16K 2011-01-07 06:25 myserver-etc.20110107.tar.gz
        -rw-rw---- 1 root root  16K 2011-01-08 06:25 myserver-etc.20110108.tar.gz
        -rw-rw---- 1 root root  16K 2011-01-09 06:25 myserver-etc.20110109.tar.gz
        -rw-rw---- 1 root root 827K 2011-01-10 06:25 myserver-etc.20110110.master.tar.gz
        -rw------- 1 root root  36K 2011-01-10 06:25 myserver-etc.incremental.bin
        -rw-rw---- 1 root root  29M 2011-01-03 06:25 myserver-home.20110103.master.tar.gz
        -rw-rw---- 1 root root  11K 2011-01-06 06:25 myserver-home.20110106.tar.gz
        -rw-rw---- 1 root root  14K 2011-01-07 06:25 myserver-home.20110107.tar.gz
        -rw-rw---- 1 root root  11K 2011-01-08 06:25 myserver-home.20110108.tar.gz
        -rw-rw---- 1 root root  11K 2011-01-09 06:25 myserver-home.20110109.tar.gz
        -rw-rw---- 1 root root 2.0M 2011-01-10 06:25 myserver-home.20110110.master.tar.gz
        -rw------- 1 root root  27K 2011-01-10 06:25 myserver-home.incremental.bin
        -rw-rw---- 1 root root 1.5G 2011-01-03 06:29 myserver-opt.20110103.master.tar.gz
        -rw-rw---- 1 root root 1.5M 2011-01-06 06:25 myserver-opt.20110106.tar.gz
        -rw-rw---- 1 root root 1.5M 2011-01-07 06:25 myserver-opt.20110107.tar.gz
        -rw-rw---- 1 root root 1.5M 2011-01-08 06:25 myserver-opt.20110108.tar.gz
        -rw-rw---- 1 root root 1.5M 2011-01-09 06:25 myserver-opt.20110109.tar.gz
        -rw-rw---- 1 root root 1.5G 2011-01-10 06:30 myserver-opt.20110110.master.tar.gz
        -rw------- 1 root root 201K 2011-01-10 06:30 myserver-opt.incremental.bin
        -rw-rw---- 1 root root 2.3G 2011-01-03 06:41 myserver-srv.20110103.master.tar.gz
        -rw-rw---- 1 root root  44M 2011-01-06 06:26 myserver-srv.20110106.tar.gz
        -rw-rw---- 1 root root  27M 2011-01-07 06:25 myserver-srv.20110107.tar.gz
        -rw-rw---- 1 root root  39M 2011-01-08 06:26 myserver-srv.20110108.tar.gz
        -rw-rw---- 1 root root 2.0M 2011-01-09 06:25 myserver-srv.20110109.tar.gz
        -rw-rw---- 1 root root 2.7G 2011-01-10 06:42 myserver-srv.20110110.master.tar.gz
        -rw------- 1 root root 3.4M 2011-01-10 06:42 myserver-srv.incremental.bin

    I'm thinking about moving the backups to Amazon S3, but before that I have to free some space so the server can keep working. Perhaps I can mount /var/archives as an Amazon S3 bucket somehow... Any advice?
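    To free space without losing anything, one approach (a sketch using the s3cmd tool; the bucket name and retention window are placeholders) is to copy the archive off to S3 first and only then prune the oldest local files:

        # Copy the whole archive directory to S3, then keep only a week locally.
        s3cmd sync /var/archives/ s3://my-backup-bucket/myserver/
        find /var/archives -type f -mtime +7 -delete

    Mounting the bucket directly (e.g. with s3fs) is possible too, but having fsbackup write straight to a FUSE-mounted bucket on a tight VPS is fragile; sync-then-prune leaves the existing backup job untouched.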

  • Auto-archive IMAP mail folders on OS X

    - by Pradeep
    Hi, I am trying to achieve the following: download all messages from a mail server (and remove the downloaded messages from the server). Downloaded messages should end up in a local mailbox, preserving the folder structure as it was defined on the server. The download process should be automatic and shouldn't create duplicates. I am on OS X and looking for solutions using Apple Mail, Thunderbird, or similar. So far I have found that POP is not the way to go (as it loses the folder structure and can potentially cause duplicates). The solution described here seems very good but isn't yet available for Thunderbird or Apple Mail: http://getsatisfaction.com/mozilla_messaging/topics/auto_archive_and_keep_folder_structure. Another alternative is Outlook, which has auto-archive, but it is paid and I think it exports to PST instead of the more common mbox format. Yet another alternative is http://www.pop4.org/, which adds folder management to POP, but I don't think that is going to become usable soon. Any other, better solutions? Thank you
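    One tool that fits most of these constraints is OfflineIMAP: it mirrors the server's folder tree into a local Maildir and is safe to run repeatedly from cron without creating duplicates (removing mail from the server afterwards is a separate step). A minimal ~/.offlineimaprc sketch with placeholder names:

        [general]
        accounts = Main

        [Account Main]
        localrepository = Local
        remoterepository = Remote

        [Repository Local]
        type = Maildir
        localfolders = ~/Mail

        [Repository Remote]
        type = IMAP
        remotehost = imap.example.com
        remoteuser = username
        ssl = yes

    One caveat: Apple Mail and Thunderbird don't read Maildir natively, so this works best as a searchable archive, or with a local IMAP server (e.g. Dovecot) pointed at the Maildir.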

  • How to warehouse data that is not needed from MS SQL server

    - by I__
    I have been asked to truncate a large table in MS SQL Server 2008. The data is not needed but might be needed once every two years. It will NEVER have to be changed, only viewed. The question is, since I don't need the data on a day-to-day basis, what do I do with it to protect and back it up? Please keep in mind that I will need to have it accessible maybe once every two years, and it is FINE for us if the recovery process takes a few hours. The entire table is about 3 million rows and I need to truncate it to about 1 million rows.
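    A minimal T-SQL sketch of the copy-out-then-trim approach (the table and column names are placeholders); the archive table can then be backed up to a file and kept offline:

        -- Copy the cold two-thirds into an archive table in one shot.
        SELECT *
        INTO   dbo.BigTable_Archive
        FROM   dbo.BigTable
        WHERE  CreatedOn < '2010-01-01';

        -- Delete the archived rows in batches to keep the transaction log small.
        WHILE 1 = 1
        BEGIN
            DELETE TOP (10000) FROM dbo.BigTable WHERE CreatedOn < '2010-01-01';
            IF @@ROWCOUNT = 0 BREAK;
        END;

    After verifying the copy, a BACKUP DATABASE of the archive (or a bcp export of just the archive table) gives the once-every-two-years copy; restoring it on demand fits comfortably within the stated few-hours recovery tolerance.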
