Search Results

Search found 31902 results on 1277 pages for 'sql backup'.

Page 196/1277 | < Previous Page | 192 193 194 195 196 197 198 199 200 201 202 203  | Next Page >

  • Moving SQL 2008 from 2003 OS to virtual 2012 OS

    - by David
    If we wanted to move SQL 2008 from a 2003 OS to a virtual 2012 OS (using VMware), does anyone know if there are any licensing or technical problems that would get in the way? All the instructions I've seen on moving SQL Server from one machine to another assume the new machine has the same OS. I realize that there are licenses that offer more cores and failover capability, but for now we are fine with a simple installation.
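
    A side-by-side backup/restore is the usual way to move the databases themselves, and it does not care what OS sits underneath. A minimal T-SQL sketch, assuming a hypothetical database name MyDb and made-up drive paths:

        -- On the old server (2003-era OS): take a full backup.
        BACKUP DATABASE MyDb
            TO DISK = N'D:\Backups\MyDb.bak'
            WITH INIT;

        -- On the new SQL 2008 instance on the virtual 2012 OS:
        -- check the logical file names first, then restore.
        RESTORE FILELISTONLY FROM DISK = N'D:\Backups\MyDb.bak';

        RESTORE DATABASE MyDb
            FROM DISK = N'D:\Backups\MyDb.bak'
            WITH MOVE 'MyDb'     TO N'E:\Data\MyDb.mdf',
                 MOVE 'MyDb_log' TO N'F:\Logs\MyDb_log.ldf',
                 RECOVERY;

    Logins, jobs and anything else stored in master/msdb have to be scripted across separately.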

    Read the article

  • SQL Transactional Replication snapshot not applying

    - by dmch2
    Hi, I'm using SQL transactional replication with pull subscriptions to replicate databases (each hosting its own distribution database) from several servers across a VPN to a central server. I've got the first two databases working fine, but the third one is causing me problems. My subscription server is SQL 2008; the source systems are all SQL 2005. The source databases are a few hundred MB in size and contain audit data, so they are simply growing slowly by adding new records at approx 1 KB a second.

    As far as the replication monitor, agent logs and event logs show, everything is working fine - except that no data appears in my subscription database. The distribution agent doesn't seem to want to read the snapshot (and hence the initial state and schema) from the publisher. New transactions aren't applied, although they do seem to be arriving OK, as the replication monitor shows things like '5 transactions with 10 commands were delivered'. I would expect (as in previous times) to see statements about data being BCPed in the replication monitor.

    The snapshot is on the publisher in a shared folder. The subscriber can view the snapshot OK (\\repldata) and the alt snapshot folder is pointing at it. But the distribution agent doesn't seem to be making an attempt to read it. I tried changing the snapshot path to something that's incorrect and didn't even get an error saying that it couldn't access it.

    After lots of googling etc. I found that sp_MSget_repl_commands is called by the subscriber on the distribution database on the publisher. Running a profiler trace I can see that it's only called for one agent id. After a reinit it's called for sequence number 0x0 as expected, so I thought that would mean it would look for the snapshot. However, looking on the publisher I see that there's data for two agents - the snapshot agent and the log reader agent (which is the one being queried). So I guess I need to tell the distribution agent to get the data for both. But how? And more importantly - why? It worked fine on the other two servers I've replicated.

    I'm not an SQL novice, but this is pretty much my first go at replication, so don't be afraid to accuse me of missing something obvious/stupid! I can get log files (eg from the distribution agent) if you want, but they don't seem to have any errors in them - it just starts up and starts applying log reader agent changes. Cheers Dave
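
    One way to see what the distributor actually knows about, assuming the default distribution database name and that you can query it on the publisher (these are the standard replication system tables, column lists trimmed):

        -- Which agents the distribution database is tracking:
        SELECT id, name, subscriber_db
        FROM distribution.dbo.MSdistribution_agents;

        -- Commands queued for delivery, in sequence order:
        SELECT publisher_database_id, article_id, xact_seqno, command_id
        FROM distribution.dbo.MSrepl_commands
        ORDER BY xact_seqno;

    Comparing the agent id seen in the profiler trace against the ids returned here would at least confirm whether the distribution agent is ever asking for the snapshot agent's entries.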

    Read the article

  • How to log all sql going through JBoss datasource with log4j

    - by Ichorus
    I've looked at log4jdbc (which does not support datasources); p6spy, which seems to be what I am looking for, but it has not been updated since 2003, which makes me nervous, and it lists only JBoss 3.x (we use JBoss 5); and JAMon, which seems heavyweight for what I am trying to accomplish (a simple log of all SQL statements running through a JBoss application server). I was hoping that JBoss itself would have a switch to log all the SQL (as WebSphere does), but I cannot find any documentation for it, so that functionality might not exist.

    Read the article

  • Retrieve MS SQL database or table structure in XML

    - by clutch
    Is there a way to export the database schema of a MS SQL Server 2000 instance as well-formed XML? I'm looking for just the structure, not the data, and the more detailed the better. The XML may be used in a migration process. I'm more familiar with MySQL than with SQL Server, so please be detailed if you have time. Thanks
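
    There's no single built-in "schema to XML" export on SQL Server 2000 (Enterprise Manager's Generate SQL Scripts produces DDL, not XML), but a rough table/column-level dump can be pulled from the INFORMATION_SCHEMA views with FOR XML. A sketch, which should run on SQL Server 2000 and later:

        SELECT  c.TABLE_SCHEMA, c.TABLE_NAME, c.COLUMN_NAME,
                c.DATA_TYPE, c.CHARACTER_MAXIMUM_LENGTH, c.IS_NULLABLE
        FROM    INFORMATION_SCHEMA.COLUMNS AS c
        ORDER BY c.TABLE_SCHEMA, c.TABLE_NAME, c.ORDINAL_POSITION
        FOR XML AUTO;

    That covers tables and columns only; constraints, indexes and defaults would need the other INFORMATION_SCHEMA views or a scripting tool.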

    Read the article

  • SQL server in VMware

    - by UndertheFold
    Please provide your tips and best practices for virtualizing SQL Server in VMware ESX. I am interested in advanced configurations and settings; please provide the reasoning behind your recommendations. Edit: Just to clarify, I already have over 70 virtual SQL servers in separate clusters using an iSCSI EqualLogic SAN - what I am really looking for are those advanced configurations, like: how you configured your disks / RDMs; whether you make use of settings like Mem.ShareScanGHz - http://communities.vmware.com/thread/143828 - that are not well documented.

    Read the article

  • SQL Server 2008 hosting (for development)

    - by hazimdikenli
    Hello, We are doing distributed development, working at home, at the office and sometimes at customers. We are using Assembla for our source repository, and we need centralized, remote SQL Server 2008 database hosting (similar to SVN on Assembla) for our SQL development server. Can you name / recommend any service providers?

    Read the article

  • Which ports to open for Microsoft SQL Server?

    - by dnolan
    Having searched the internet a few times for the best way to open up SQL Server connectivity through Windows Firewall, I've yet to find a definitive answer. Does anyone have a guaranteed way of finding which ports SQL Server is running on so you can open them in Windows Firewall?
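
    For reference, a default instance normally listens on TCP 1433, with the SQL Server Browser service on UDP 1434 for named instances, but instances can be configured for other or dynamic ports. Assuming SQL 2005 or later, a quick check from any existing connection is a DMV query like the sketch below; the "Server is listening on" lines in the SQL Server error log give the same information.

        -- Ports in use by current connections to this instance (SQL 2005+):
        SELECT DISTINCT local_net_address, local_tcp_port
        FROM sys.dm_exec_connections
        WHERE local_tcp_port IS NOT NULL;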

    Read the article

  • Preventing users from deleting SQL data

    - by me2011
    We just purchased a program that requires the users to have an account on the MS SQL Server, with read/write access to the program's database. My concern is that since these users will now have write access to the database, they could connect directly to the SQL Server outside of the program's client and then mess with the data directly in the tables. Is there any way I can prevent access to the database while still allowing access via the client program?
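
    If the vendor's client can be made to use an application role, one hedged approach is to leave the users themselves read-only and put the write permissions on the role, which the application activates with a password the users never see. The names below are hypothetical, and this only works if the program supports sp_setapprole:

        -- Users keep read access only (SQL 2005/2008 syntax):
        EXEC sp_droprolemember 'db_datawriter', 'DOMAIN\OfficeUser';

        -- Write access lives in an application role instead:
        CREATE APPLICATION ROLE billing_app WITH PASSWORD = 'Str0ng-P@ssw0rd';
        GRANT SELECT, INSERT, UPDATE, DELETE ON SCHEMA::dbo TO billing_app;

        -- The client program would then activate it after connecting:
        -- EXEC sp_setapprole 'billing_app', 'Str0ng-P@ssw0rd';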

    Read the article

  • How can a Perfmon "% Processor Time" counter be over 100%?

    - by Bill Paetzke
    The counter Process: % Processor Time (sqlservr) is hovering around 300% on one of my database servers. This counter reflects the percent of total time SQL Server spent running on the CPU (user mode + privileged mode). The book SQL Server 2008 Internals and Troubleshooting says that anything greater than 80% is a problem. How is it possible for that counter to be over 100%?
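
    For context, Process: % Processor Time is summed across all logical processors, so on a multi-core server it can legitimately exceed 100%; 300% on an 8-core box works out to roughly 37.5% of total capacity. Assuming SQL 2005 or later, the divisor is easy to get from inside the instance:

        -- Logical CPUs visible to this instance; divide the Process counter
        -- by this number to get a 0-100% figure.
        SELECT cpu_count FROM sys.dm_os_sys_info;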

    Read the article

  • Can a SQL Server have a CPU bottleneck when Processor Time is under 30%

    - by Sleepless
    Is it in principle possible for the CPU to be the bottleneck on a SQL Server if the performance counter Processor\% Processor Time is constantly under 30% on all cores? Or does low Processor Time automatically allow me to rule out the CPU as a potential trouble source? I am asking this because SQL Nexus lists CPU as the top bottleneck on a server with low Processor Time values.
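
    One rough cross-check, assuming SQL 2005 or later, is the signal-wait ratio: signal wait time is the time tasks spent waiting for a CPU after the resource they were blocked on became available, so a high ratio can point to scheduling pressure even when the averaged Processor counters look idle.

        SELECT  SUM(signal_wait_time_ms) AS signal_wait_ms,
                SUM(wait_time_ms)        AS total_wait_ms,
                CAST(100.0 * SUM(signal_wait_time_ms)
                     / NULLIF(SUM(wait_time_ms), 0) AS DECIMAL(5, 2)) AS signal_wait_pct
        FROM sys.dm_os_wait_stats;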

    Read the article

  • SQL Server 2008 Restore hangs on 100%

    - by CL4NCY
    Hi, I have backed up a large database from SQL 2005 and am trying to restore it to a SQL 2008 instance. It seems to work OK until it gets to 100%, where it hangs indefinitely. I've managed to restore smaller databases to this server OK. Any ideas?
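
    For what it's worth, the percentage a RESTORE reports tracks the data-copy phase; the recovery (redo/undo) phase that follows isn't reflected in it, so a large database can sit at 100% for quite a while without actually being hung. Assuming SQL 2005 or later on the restoring server, a quick way to check whether the request is still alive:

        SELECT session_id, command, status, percent_complete,
               wait_type, wait_time, estimated_completion_time
        FROM sys.dm_exec_requests
        WHERE command LIKE 'RESTORE%';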

    Read the article

  • Backup Azure Tables, schedule Azure scripts… and more

    - by Herve Roggero
    Well, months of effort are now officially over… or should I say it's just the beginning? Enzo Cloud Backup 2.0 (beta) is now officially out!!! This tool will let you do the following:
    * Backup SQL Database (and SQL Server, to a limited extent)
    * Backup Azure Tables
    * Restore SQL backups into another SQL environment
    * Restore Azure Tables in Azure Storage, or in a SQL environment
    * Manage and schedule database maintenance scripts
    * Drop database schema containers (with preview) for SaaS environments
    * Receive alerts (SMTP) when operations complete or fail

    That's it at a high level, but you need to see the flexibility around these features. For example, you can select a specific backup strategy for Azure Tables, allowing faster backup operations when partition keys use GUIDs. You can also call custom stored procedures during the restore operation of Azure Tables, allowing you to transform the data along the way. You can also set a performance threshold during Azure Table backup operations to help you control possible throttling conditions in your Storage Account. Regarding database scripts, you can now define T-SQL scripts and schedule them for execution in a specific order. You can also tell Enzo to execute a pre and post script during Azure Table restore operations against a SQL environment.

    The backup operation now supports backing up to multiple devices at the same time, so you can execute a backup request to both a local file and a blob at the same time, guaranteeing that both will contain the exact same data. And given the number of options available, you can save backup definitions for later reuse. The screenshot below backs up Azure Tables to two devices (a blob and a SQL Database).

    You can also manage your database schemas for SaaS environments that use schema containers to separate customer data. This new edition allows you to see how many objects you have in each schema, back up specific schemas, and even drop all objects in a given schema. For example, the screenshot below shows that the EnzoLog database has 4 user-defined schemas, and the AFA schema has 5 tables and 1 module (stored proc, function, view…). Selecting the AFA schema and trying to delete it will prompt another screen showing which objects will be deleted.

    As you can see, Enzo Cloud Backup provides amazing capabilities that can help you safeguard your data in SQL Database and Azure Tables, and gives you advanced management functions for your Azure environment. Download a free trial today at http://www.bluesyntax.net.

    Read the article

  • Windows Server Backup "Reading Data; please wait..."

    - by Reafidy
    On Windows Server 2008 R2 I have recently added the Windows Server Backup (WSB) feature. Opening WSB I get the message "Reading Data; please wait...". This message fails to go away, even after leaving the server for over 12 hours. I also notice in Task Manager that svchost.exe (username: NETWORK SERVICE) is using all available processing power. So I terminated that process and then WSB comes online. However, after restarting the server and WSB, the issue reoccurs. WSB also fails to recognize my Store 'n' Go flash drive (2 GB). What is the underlying problem here?

    Read the article

  • SQL Server Issue: Could not allocate space for object ... primary filegroup is full

    - by Luke
    Trying to figure out a problem at an office that has SQL Server 2005 installed on Windows SBS Server 2008. Here's the setup: it's an office, and the person who set this all up is nowhere to be found. I'm the best hope they have... One of the programs they use on a workstation gives them an error of "Could not allocate space for object 'Billing' in database "MyDatabase" because primary filegroup is full" when trying to save an entry in their software.

    I searched around for hours, looking for possible solutions. One was to check for available disk space, and another was to defrag. I checked the hard drives on the server, and there is plenty of space free. I also defragged, which may have helped the problem somewhat. It's hard to say, because it seems like with the nature of the error, if you try over and over you might get it to actually save. My next step was to see if autogrowth was enabled on the database. This would seem to be a likely / possible solution, but I can't access the database! If I run SQL Server Management Studio, I can log in as my Windows user and view the list of databases. However, if I try to do anything (actually view the database, view the properties, add or edit users), I get errors that I don't have permission. For what it's worth, I also tried running Management Studio as Administrator, in case that would help. No difference, though.

    Now, what I'm guessing is going on -- from my limited knowledge of SQL and from reading online -- is that though I'm logged in as a Windows administrator, that account does NOT have SQL access. I do see a list of SQL users, including sa, but I again don't have permission to add one or to change the password on an existing one. And nobody at the office has any idea what the SQL passwords could be. So... here's my thinking thus far:

    1 - The "Could not allocate" error likely points to a database that needs to be allowed to autogrow, especially since I verified there is plenty of free space and the HD has been defragmented.

    2 - Enabling autogrow would be very easy to do if I had the proper access within SQL Server Management Studio. That leads me to this link: http://blogs.technet.com/b/sqlman/archive/2011/06/14/tips-amp-tricks-you-have-lost-access-to-sql-server-now-what.aspx It sounds like a step-by-step guide for regaining the access I need to SQL. I'm guessing that if I followed this guide, I would be able to log in to the SQL Server via Management Studio with the proper permissions, and would be able to enable autogrow (or simply view the status of the existing database), and hopefully solve the "Could not allocate space" problem!

    So I guess I have a few questions:

    1 - Would you guys agree with my "diagnosis"? Think I'm barking up the right tree?

    2 - Is there any risk at all in hurting / disabling / wrecking the current SQL database or setup by going through the guide to regain SQL access? I understand that per the guide, I would have to temporarily shut down SQL, so obviously it wouldn't be accessible during that time. But it wouldn't be worth the risk if there's a chance I could mess anything up... Like I said, the workstations ARE currently accessing the database somehow, but nobody knows with what login info or anything. Basically, it's set up, it works (usually), but if they had to reload the software, nobody would know how.

    Any feedback would be appreciated! The problem is such that it's not an emergency for them, but an annoyance. If I could fix it, it would be wonderful. But if not, I think they'll manage, especially as they are going to eventually stop using this software. Thank you so much for your time! Luke
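
    For reference, once sysadmin (or at least db_owner) rights are back, checking and enabling autogrow is only a couple of statements. The logical file name below is an assumption; use whatever sp_helpfile reports in its name column:

        USE MyDatabase;
        EXEC sp_helpfile;   -- current files, sizes and growth settings

        -- Let the primary data file grow in 256 MB increments:
        ALTER DATABASE MyDatabase
        MODIFY FILE (NAME = MyDatabase_Data, FILEGROWTH = 256MB, MAXSIZE = UNLIMITED);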

    Read the article

  • File History - Unable to scan user libraries for changes and perform backup of modified files for configuration

    - by azl
    When trying to run the File History tool in Windows 8 it runs for about 2 seconds then stops. No files are backed up to the selected drive. In the event viewer the only error that appears is: Unable to scan user libraries for changes and perform backup of modified files for configuration C:\Users\win8User\AppData\Local\Microsoft\Windows\FileHistory\Configuration\Config I've tried deleting both the configuration files and the FileHistory directory on the target drive. Setting up File History again results in the same error. Is there a better way to track down what is causing the failure? Or somehow get the File History tool to create a more verbose log file that shows what is causing the problem?

    Read the article

  • What is the best backup solution for VMware Infrastructure system that hosts a wide variety of VMs?

    - by SBWorks
    In a situation where you are running:
    * VMware Infrastructure 4.x with multiple hosts
    * Over 150 VMs with a wide variety of operating systems (Linux in a half dozen distros, Solaris, every MS version, etc.) in multiple languages, with almost every mix of installed software (luckily, no Exchange mail servers)
    * An EMC Fibre Channel SAN
    * The VMs that need to be backed up use about 2 terabytes of data (total)
    * The goal is to keep backups for about 3 months
    At this rough scale, what backup solutions have worked well for you? And, as an add-on question, did any of them have de-duplication that you thought was effective and useful?

    Read the article

  • Are there any home/soho NAS devices that will backup/sync to the cloud?

    - by 3rdparty
    Looking for a home office (SOHO) priced network hard drive (NAS) that will sync some or all of its content to a cloud-based backup service. The only option I've been able to find so far is NetGear's ReadyNAS Vault; however, from what I've read it's not as secure as it could be, and the service is quite expensive ($200/yr for 50 GB of cloud storage) - it's 'powered' by ElephantDrive. Ideally I would love to see something like Wuala integrated into a LaCie network HDD - conveniently, I suspect this is in the works as LaCie recently acquired Wuala, but nothing has come of it yet. I know there are options to use rsync with a customizable NAS (such as the very versatile and hackable D-Link DNS-323), but the easier this is to set up and maintain, the better. Thanks! ps. I had many links posted within this question, but was limited to posting only one due to anti-spam restrictions - gotta get my 'reputation' higher!

    Read the article

  • Best way to use my windows box as a backup?

    - by DFischer
    I put a 1.5 TB HD in my Windows 7 box, and my main computer is an MBP. I have a lot of professional files/folders on a FireWire 800 external HD connected to the MBP, and I want to use the 1.5 TB HD in my Windows 7 box as a backup for both the external HD and the MBP. Right now I am just copying files manually to the HD over the network, and that's very slow and open to failure (not rsync'd). Can anyone suggest some appropriate solutions? Should I just figure out how to set up rsync on the Windows box, or is there a better alternative? Thanks!

    Read the article

  • Why /home folder is missing from my backup archive created by tar?

    - by Konstantin Pereyaslov
    So I'm doing a full backup of my VPS using the following command (as root, of course): tar czvf 20120604.tar.gz / Everything seems to be fine; all files seem to appear in the list. The size of the archive is 6 GB, and the gunzipped version is 11 GB, which suggests /home is in there, because I have 11 GB of data on the VPS in total. But when I actually try to unpack the archive, or open it using mc or WinRAR, there's no /home folder. And WinRAR reports: 20120604.tar.gz - TAR+GZIP archive, unpacked size 894 841 346 bytes. It can't be a WinRAR bug, because when I type tar xzvf 20120604.tar.gz, the /home folder isn't unpacked either. Why is /home missing from my archive? And what can I do to include it there? tar --version outputs the following: tar (GNU tar) 1.15.1

    Read the article

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes library) into a JungleDisk cloud. If a few files change there, no big deal; the file gets re-uploaded. For larger files JungleDisk backup isn't helpful - things like movies and VMware images that change a lot, but that I want backed up. Just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync the changes, or deltas, to my personal server on a schedule. That way I can keep a copy of my larger items without paying JungleDisk a ton more, since they run to many gigabytes. Right now these few items are backed up over FTP and take forever. Both the client and server are Windows environments.

    Read the article

  • Ad Agency storage/file server +backup needed (NAS or something else?)

    - by Rob
    Looking for a "this is all you need" recommendation. We're a small ad agency with both mac & pcs that access and share files from a 3 yr old Windows 2000 box (no server software). We currently have 1TB on the "server" and back it up to 2 different Seagate Free Agent Pro 1TB external drives. But we're low on space and are looking for something that's bigger, that we can still access from Mac & PC, EASY backup system, secure from viruses, firewall enabled. Not sure if a NAS will work or if we should have a real server. We don't really get on that box except to restore files, or run Norton on it. I hope I've provided enough for a general recommendation. Thanks. Rob Phx

    Read the article
