Search Results

Search found 8979 results on 360 pages for 'backup sessions'.

Page 31/360

  • spring roo backup command lost my files

    - by Moddy
    I generated a Spring Roo project and modified the .jspx files to my own styles. Unfortunately, when I used the backup command, Spring Roo regenerated the files back to the originals, so my .jspx files no longer have my styles. How can I recover my files after running this command?

    Read the article

  • Simple oracle backup without using exp or expdp

    - by Jacob
    I have Oracle 10g installed on Windows in C:\oracle. If I stop all Oracle services, is it safe to back up by just copying the entire directory (e.g., to C:\oracle_bak), or am I significantly better off using expdp? Pointers to docs/websites are very welcome; I wasn't able to Google up anything relevant.

    Read the article

  • Copy files from subdirectories into one directory

    - by Derek Organ
    Ok, I have a bunch of files in this structure:

        /backup/daily/database1/database1-2011-01-01.sql
        /backup/daily/database1/database1-2011-01-02.sql
        /backup/daily/database1/database1-2011-01-03.sql
        /backup/daily/database1/database1-2011-01-04.sql
        /backup/daily/database1/database1-2011-01-05.sql
        /backup/daily/database1/database1-2011-01-06.sql
        /backup/daily/database1/database1-2011-01-07.sql
        /backup/daily/anotherdb/anotherdb-2011-01-01.sql
        /backup/daily/anotherdb/anotherdb-2011-01-02.sql
        /backup/daily/anotherdb/anotherdb-2011-01-03.sql
        /backup/daily/anotherdb/anotherdb-2011-01-04.sql
        /backup/daily/anotherdb/anotherdb-2011-01-05.sql
        /backup/daily/anotherdb/anotherdb-2011-01-06.sql
        /backup/daily/anotherdb/anotherdb-2011-01-07.sql
        /backup/daily/stuff/stuff-2011-01-01.sql
        /backup/daily/stuff/stuff-2011-01-02.sql
        /backup/daily/stuff/stuff-2011-01-03.sql
        /backup/daily/stuff/stuff-2011-01-04.sql
        /backup/daily/stuff/stuff-2011-01-05.sql
        /backup/daily/stuff/stuff-2011-01-06.sql
        /backup/daily/stuff/stuff-2011-01-07.sql

    and there are lots more. Ultimately I want to import all the 2011-01-07.sql files into my MySQL database. This works for one:

        mysql -u root -ppassword < /backup/daily/database1/database1-2011-01-07.sql

    That nicely restores that database from the backup file. I want to run a process that does this for all databases, so my plan is to first cp all 2011-01-07 sql files into a tmp dir, e.g.:

        cp /backup/daily/*/*2011-01-07*.sql /tmp/all

    Unfortunately the command above isn't working; I get an error:

        cp: cannot stat ..... No such file or directory

    So can you guys help me out with this? For bonus points, if you can tell me how to do the next step, importing all those databases one at a time with a single command, that would be great too. I really want to do this in two separate steps, because I need to delete a few sql files manually from the tmp dir before I run the restore command. So I need:
    1) a command to copy all 2011-01-07 sql files to a tmp dir
    2) a command to import all the files in that dir into mysql
    I know it's possible to do in one step, but for lots of reasons I would really prefer to do it in two.
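
    A minimal sketch of one possible approach, assuming GNU find and the exact paths and credentials quoted above (nothing here is a verified fix for the cp error, just one way the two steps could be scripted):

        #!/bin/bash
        # Step 1: collect every dump for the chosen date into a staging directory.
        DATE="2011-01-07"
        STAGE="/tmp/all"
        mkdir -p "$STAGE"
        find /backup/daily -type f -name "*${DATE}*.sql" -exec cp {} "$STAGE" \;

        # (Manually delete any unwanted .sql files from $STAGE at this point.)

        # Step 2: restore each remaining dump, one file at a time, exactly like the
        # single-file command that already works.
        for f in "$STAGE"/*.sql; do
            mysql -u root -ppassword < "$f"
        done

    Using find instead of a nested wildcard also sidesteps shell-globbing issues (for example, an unmatched pattern being passed to cp literally), which is one common cause of a "cannot stat" error.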

    Read the article

  • DPM 2007 clashing with existing SQL backup job

    - by Paul D'Ambra
    I've recently installed a DPM 2007 server on Server 2003 and have set up a protection group against a Server 2003 server running SQL 2005 SP3. The SQL server in question has a full backup (as a SQL Agent job) once a day and transaction log backups hourly. These are zipped up and FTP'd to a server offsite by a scheduled task. Since adding the DPM job I'm receiving many error messages: "DPM tried to do a SQL log backup, either as part of a backup job or a recovery to latest point in time job. The SQL log backup job has detected a discontinuity in the SQL log chain for database SERVER_NAME\DB_Name since the last backup. All incremental backup jobs will fail until an express full backup runs." My google-fu suggests that I need to change the full backup my SQL Agent job is running to a copy_only backup. But I think this means that I can't use that backup together with the transaction logs to restore the database if the building (including the DPM server) burns down. I'm sure I'm missing something obvious and thought I'd see what the hivemind suggests. It is an option to set up a co-located DPM server elsewhere and have DPM stream the backup, but that's obviously more expensive than the current setup. Many thanks in advance.

    Read the article

  • Exclusive Expert and Peer-Led Sessions—Only at Oracle OpenWorld

    - by jhpierce -Oracle
    With more than 2,500 sessions, dozens of hands-on labs, hundreds of demos, four Exhibition Halls, and countless meet-ups, Oracle OpenWorld is the place to learn, share, and network. Planning ahead is always a smart move, and here are some links to help you plan your Oracle OpenWorld schedule. You will hear directly from Oracle thought leaders, Oracle Support experts and their peers about how to succeed across the Oracle stack, from Oracle Consulting thought-leader sessions dedicated to the cloud to hands-on demos showing the value of My Oracle Support. Oracle OpenWorld is your one-stop shop for everything Oracle. Featured sessions include:
    - Is Your Organization Trying to Focus on an ERP Cloud Strategy?
    - Modernize Your Analytics Solutions
    - Is Your Organization Trying to Focus on a CX Cloud Strategy?
    - Best Practices for Deploying a DBaaS in a Private Cloud Model
    Visit the Support & Services Oracle OpenWorld website to discover how you can take advantage of all Oracle OpenWorld has to offer. With 500 Services experts, 50+ sessions, networking events and demos of powerful new support tools, customers will find relevant, useful information about how Oracle Services enables the success of their Oracle hardware and software investments.

    Read the article

  • SQL Server transaction log backups

    - by krimerd
    Hi there, I have a question regarding transaction log backups in SQL Server 2008. I currently take full backups once a week (Sunday) and transaction log backups daily. I put the full backup in folder1 on Sunday and then on Monday I also put the 1st transaction log backup in the same folder. On Tuesday, before I take the 2nd transaction log backup, I move the 1st transaction log backup from folder1 into folder2, and then I take the 2nd transaction log backup and put it in folder1. Same thing on Wednesday, Thursday and so on. Basically, folder1 always holds the latest full backup and the latest transaction log backup, while the older transaction log backups are in folder2. My question is: when SQL Server is about to take, let's say, the 4th (Thursday) transaction log backup, does it look for the previous transaction log backups (1st, 2nd and 3rd) so that this new backup will only include the transactions since the last backup, or does it have some other way of knowing whether there are other transaction log backups? Basically, I am asking because all my transaction log backups seem to be about the same size, and I thought their size would depend on the amount of transactions since the last transaction log backup. Can anyone please explain if my assumptions are right? Thanks...

    Read the article

  • NTDS Replication Warning (Event ID 2089)

    - by Chris_K
    I have a simple little network with 3 AD servers in 2 sites. Site A has Win2k3 SP2 and Win2k SP4 servers, site B has a single Win2k3 SP2 server. All have been in place for at least 3 years now. Just last week I started getting Event 2089 "not backed up" warnings (example below) on both of the Win2k3 servers. I understand what the message means, no need to send me links to the TechNet article explaining it. I'll improve my backups. What I'm more curious about is why did I just start getting this message now? Why haven't I been getting it for the past 3 years?!? Perhaps this is related: I recently decommissioned a few other sites and AD controllers (there used to be 3 more sites, each with their own controller). Don't worry, I did proper DCpromo exercises and made sure we didn't lose anything. But would shutting those down possibly be related to why I get this error now? This won't keep me awake at night but I am curious as to what changed...

        Event Type: Warning
        Event Source: NTDS Replication
        Event Category: Backup
        Event ID: 2089
        Date: 3/28/2010
        Time: 9:25:27 AM
        User: NT AUTHORITY\ANONYMOUS LOGON
        Computer: RedactedName
        Description: This directory partition has not been backed up since at least the following number of days.
        Directory partition: DC=MyDomain,DC=com
        'Backup latency interval' (days): 30
        It is recommended that you take a backup as often as possible to recover from accidental loss of data. However if you haven't taken a backup since at least the 'backup latency interval' number of days, this message will be logged every day until a backup is taken. You can take a backup of any replica that holds this partition. By default the 'Backup latency interval' is set to half the 'Tombstone Lifetime Interval'. If you want to change the default 'Backup latency interval', you could do so by adding the following registry key.
        'Backup latency interval' (days) registry key: System\CurrentControlSet\Services\NTDS\Parameters\Backup Latency Threshold (days)
        For more information, see Help and Support Center at http://go.microsoft.com/fwlink/events.asp.

    Read the article

  • How do I restore a backup of my keyring (containing ssh key passphrases, nautilus remote filesystem passwords and wifi passwords)?

    - by con-f-use
    I changed the disk in my laptop and installed Ubuntu on the new disk. The old disk had 12.04, upgraded to 12.10, on it. Now I want to copy my old keyring with WiFi passwords, FTP passwords for Nautilus and ssh key passphrases. I still have all the data from the old disk (it is now a USB disk and I did not delete the old data yet or do anything with it; I could still put it back in the laptop and boot from it like nothing happened). The old method of just copying ~/.gconf/... and ~/.gnome2/keyrings won't work. Did I miss something?

    Edit 1: I figure one needs to copy files not located in the user's home directory as well. I copied the whole old /home/confus (which is my home directory) to the fresh install, to no effect. That whole copy is now reverted to the fresh install's home directory, so my /home/confus is as it was after the fresh install.

    Edit 2: The folder /etc/NetworkManager/system-connections seems to be the place for WiFi passwords. It could be that /usr/share/keyrings is important as well for ssh keys; that's the only sensible thing a search came up with: find /usr/ -name "*keyring*"

    Edit 3: Still no ssh and FTP passwords from the keyring. What I did:
    1) Converted the old hard drive to a USB drive
    2) Put the new drive in the laptop and installed a fresh version of 12.10 there
    3) Booted from the old HDD via USB and copied its /etc/NetworkManager/system-connections, ~/.gconf/, ~/.gnome2/keyrings and ~/.ssh over to the new disk
    4) Confirmed that all keys on the old install work
    5) Booted from the new disk
    Result: no passphrases for ssh keys, no FTP passwords in the keyring. At least the WiFi passwords were migrated.
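
    A minimal sketch of the kind of copy described above, assuming the old disk is mounted at /media/olddisk and the old username is confus (both placeholders). Note that newer GNOME versions may also look for the login keyring under ~/.local/share/keyrings rather than ~/.gnome2/keyrings, so copying both locations here is a hedge, not a confirmed fix:

        #!/bin/bash
        # Placeholder mount point for the old disk; adjust to the real path.
        OLD=/media/olddisk/home/confus

        # Keyring files (old and new GNOME locations), ssh keys, and saved
        # network connections. The keyring itself is encrypted with the old
        # account's login password.
        mkdir -p ~/.gnome2/keyrings ~/.local/share/keyrings ~/.ssh
        cp -a "$OLD"/.gnome2/keyrings/. ~/.gnome2/keyrings/ 2>/dev/null
        cp -a "$OLD"/.local/share/keyrings/. ~/.local/share/keyrings/ 2>/dev/null
        cp -a "$OLD"/.ssh/. ~/.ssh/
        sudo cp -a /media/olddisk/etc/NetworkManager/system-connections/. \
            /etc/NetworkManager/system-connections/

    Logging out and back in (or rebooting) after the copy is usually needed before gnome-keyring and NetworkManager pick the files up.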

    Read the article

  • Amazon EC2 EBS automatic backup one-liner works manually but not from cron

    - by dan
    I am trying to implement an automatic backup system for my EBS volume on Amazon AWS. When I run this command as ec2-user:

        /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    everything works fine. But if I add this line to /etc/crontab and restart the crond service:

        15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-********

    it doesn't work. I checked /var/log/cron and there is this line, so the command does get executed:

        Dec 13 12:15:01 ip-10-204-111-94 CROND[4201]: (ec2-user) CMD (/opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** )

    Can you please help me troubleshoot the problem? I guess it is some environment problem, maybe the lack of some variable. If that's the case I don't know what to do about it. Thanks.
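
    Cron runs jobs with a very minimal environment, so variables that the EC2 API tools normally pick up from a login shell (JAVA_HOME, EC2_HOME, PATH and so on) may be missing. One way to test that theory is to call the tool through a small wrapper that loads the login environment first; the wrapper path and the use of /etc/profile are assumptions, not a confirmed fix:

        #!/bin/bash
        # /home/ec2-user/ebs-snapshot.sh (hypothetical path)
        # Load the same environment a login shell would get, so JAVA_HOME,
        # EC2_HOME and PATH are defined for the EC2 API tools.
        source /etc/profile

        /opt/aws/bin/ec2-create-snapshot --region us-east-1 \
            -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem \
            -d "vol-******** snapshot" vol-******** \
            >> /home/ec2-user/ebs-snapshot.log 2>&1

    The /etc/crontab line would then point at the wrapper instead of the tool itself, e.g. 15 12 * * * ec2-user /home/ec2-user/ebs-snapshot.sh, and the log file captures any error message the tool prints when run from cron.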

    Read the article

  • Using rsync to take backup of folder

    - by Ali
    Hi, I have a Linux server with a NAS which is mounted as the folder "mount". I have a website in the "public_html" folder. I want to back up the website into the mount folder automatically at certain intervals, e.g. every hour. I read that there is something called "rsync" which keeps two folders in sync, and that it doesn't copy all files every time; instead it checks whether a file has changed and only updates changed files. How do I use it to make automatic backups? I have root access to the server. Thanks
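
    A minimal sketch of how that could look, assuming the NAS is mounted at /mount and the site lives at /home/user/public_html (both paths are placeholders standing in for the folders described above):

        #!/bin/bash
        # /usr/local/bin/backup-site.sh (hypothetical path)
        # -a preserves permissions, ownership and timestamps; rsync skips files
        # that have not changed, and --delete removes files from the copy that
        # were deleted from the site, keeping the backup an exact mirror.
        rsync -a --delete /home/user/public_html/ /mount/public_html-backup/

    An hourly run could then be scheduled from root's crontab (crontab -e), for example:

        0 * * * * /usr/local/bin/backup-site.sh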

    Read the article

  • How to schedule daily backup in SQL Server 2008 Web Edition

    - by Xenon
    In SQL Server Management Studio I created a maintenance plan, but it won't work. The error is:

        "Message Executed as user: LITESPELL-19C34\Administrator. Microsoft (R) SQL Server Execute Package Utility Version 10.0.1600.22 for 32-bit Copyright (C) Microsoft Corp 1984-2005. All rights reserved. The SQL Server Execute Package Utility requires Integration Services to be installed by one of these editions of SQL Server 2008: Standard, Enterprise, Developer, or Evaluation. To install Integration Services, run SQL Server Setup and select Integration Services. The package execution failed. The step failed."

    But on the Microsoft page http://www.microsoft.com/sqlserver/2008/en/us/web.aspx, in the "Automate tasks and policies" section, it is written that backups can be scheduled in this edition, but how?

    Read the article

  • Backup AWS Dynamodb to S3

    - by Ali
    It has been suggested in the Amazon docs (http://aws.amazon.com/dynamodb/), among other places, that you can back up your DynamoDB tables using Elastic MapReduce. I have a general understanding of how this could work, but I couldn't find any guides or tutorials on it. So my question is: how can I automate DynamoDB backups (using EMR)? So far, I think I need to create a "streaming" job with a map function that reads the data from DynamoDB and a reduce function that writes it to S3, and I believe these could be written in Python (or Java or a few other languages). Any comments, clarifications, code samples or corrections are appreciated.

    Read the article

  • Exchange 2010 and ESE Backup API

    - by Hannes de Jager
    Exchange 2010 does not support the ESE API for doing backups like it did in 2003 and 2007, according to MSDN. I quote: "Exchange 2010 no longer supports the ESE streaming APIs for backup and restore of program files or data. Instead, Exchange 2010 supports only VSS-based backups." So my question is, if this is the case, why is the DLL (ESEBCLI2.DLL) still shipped with Exchange 2010? I found it under C:\Program Files\Microsoft\Exchange Server\V14\Bin. Am I missing something here?

    Read the article

  • Flash messages in ASP.NET MVC without sessions

    - by Fernando Correia
    I'm developing a web application for Windows Azure using ASP.NET MVC 4. I would like to enforce one restriction in the architecture: do not use Session. To achieve availability on Azure, and since there are no sticky sessions, I would need to store the session data in some central service, probably either SQL Azure or the Caching Service. I would rather avoid sessions in the SQL database to avoid the increased latency, and the Caching Service on Azure is very expensive for the amount of memory offered. On the other hand, I would like to have the ability to easily pass Flash-style messages across redirects. TempData is the recommended way to do this, but by default it uses the session object. So I would like to know: Is there an alternative way to use TempData that doesn't require sessions or shared data between servers? Cookies perhaps? Is there a better alternative I'm overlooking?

    Read the article
