Search Results

Search found 89127 results on 3566 pages for 'backup server'.


  • Windows 2008 as home file server and more

    - by Christian W
    I currently have a FreeNAS unit as a NAS and a Win2k8R2 unit as a server, and I would like to consolidate these two units into one. What I really like about the FreeNAS unit is the ZFS filesystem, and the only reason I care about ZFS is how easily I can grow an existing filesystem just by inserting a new drive. How would this work in Win2k8? Say I set up the unit with a separate drive as C: and a 1TB drive as D:, with D: divided into D:\Videos, D:\Music, and D:\Pictures. When the storage drive gets close to full, I would like to expand the storage, but I wouldn't want to end up with E:\Videos or D:\Videos2 (using the NTFS folder mount thingy). I still want all my videos to reside in D:\Videos, with the OS deciding which physical drive they are stored on... some kind of on-the-fly JBOD expansion :) Is this at all possible in Windows 2008?
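
    One route that existed in Windows Server 2008 is dynamic disks: convert the data disks to dynamic and extend D: as a spanned volume. A minimal diskpart sketch, assuming D: currently lives on disk 1 and the newly added drive shows up as disk 2:

        diskpart
        DISKPART> select disk 1
        DISKPART> convert dynamic
        DISKPART> select disk 2
        DISKPART> convert dynamic
        DISKPART> select volume D
        DISKPART> extend disk=2

    Note that a spanned volume has no redundancy: losing either physical disk loses the whole D: volume.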


  • SQL Server 7 Transaction Logs Issues

    - by nate
    Over the week my database server's transaction log filled up. With our app, people could select from the database but could not update or insert into it. In the past we have just truncated the transaction logs, and after that everything was back to normal. This week I truncated the transaction logs and shrank the database. Now we can select, update, and insert into the database. The only issue is that when we run a big job that does a lot of inserting or updating, we get the following error: Database error: S1008:[Microsoft][ODBC]Operation canceled. We never had this issue before; I am assuming it is the same as a timeout error. Has anyone else had this issue, or does anyone know how I can resolve it?
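
    For reference, the truncate-and-shrink sequence on SQL Server 7/2000 looks roughly like this; the database and log file names are placeholders, and the TRUNCATE_ONLY option was removed in SQL Server 2008 and later:

        -- placeholder names; run against the affected database
        BACKUP LOG MyDatabase WITH TRUNCATE_ONLY   -- discard the inactive portion of the log
        DBCC SHRINKFILE (MyDatabase_Log, 100)      -- then shrink the log file to ~100 MB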


  • Download databasename.bak file

    - by Jordon
    I have downloaded a databasename.bak file from my hosting company. When I try to restore that DB file in SQL Server 2008 it keeps giving me the following error: The media family on device 'C:\go4sharepoint_1384_8481.bak' is incorrectly formed. SQL Server cannot process this media family. RESTORE HEADERONLY is terminating abnormally. (Microsoft SQL Server, Error: 3241) According to this error and the following link, http://www.sqlcoffee.com/Troubleshooting047.htm, it is clear that the file I am downloading is either already corrupt or is getting corrupted on the way. Any idea why I keep receiving this error? I have tried almost everything but have been unable to fix this problem. Please help me.
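
    A quick sanity check, reusing the path from the error message, is to ask SQL Server what it can read from the file; if the download was corrupted in transit (a classic cause is an FTP transfer in ASCII mode instead of binary), both of these fail with similar errors, and the local file size will usually differ from the copy on the host:

        RESTORE HEADERONLY FROM DISK = 'C:\go4sharepoint_1384_8481.bak';
        RESTORE VERIFYONLY FROM DISK = 'C:\go4sharepoint_1384_8481.bak';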


  • How Can I Automate the Backup of a Quickbooks Server?

    - by Nick
    I have three computers: the first is the company file server, which holds the QuickBooks company file, is always on, and lives in the closet. The other two are QuickBooks clients. All are XP Pro. I need a way to automatically back up the QB data file without any user intervention. QuickBooks has a built-in scheduled backup utility, but from what I've read it only works when the software is running in single-user mode (and obviously putting the server into single-user mode defeats the concept of having a server). Also, I'm not actually running QB itself on the server, just the "QB Database Server" process that sits in the system tray. Surely there must be a way to automate this? I'm open to any ideas/suggestions. Thanks!
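
    One possible sketch, with invented paths and share names: schedule a plain file copy outside business hours with the built-in Task Scheduler. Copying a .QBW file that the database server holds open is not guaranteed to be consistent, so stopping the QuickBooks database service (its name varies by version) around the copy is safer:

        rem create a daily 2 AM task that mirrors the QB data folder to another machine
        schtasks /create /tn "QB Backup" /sc daily /st 02:00 /tr "robocopy C:\QuickBooksData \\otherpc\qb-backup *.qbw /Z /R:2 /W:5"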


  • Amazon EC2 EBS automatic backup one-liner works manually but not from cron

    - by dan
    I am trying to implement an automatic backup system for my EBS volumes on Amazon AWS. When I run this command as ec2-user: /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** everything works fine. But if I add this line to /etc/crontab and restart the crond service: 15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** it doesn't work. I checked /var/log/cron and this line shows that the command does get executed: Dec 13 12:15:01 ip-10-204-111-94 CROND[4201]: (ec2-user) CMD (/opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** ) Can you please help me troubleshoot the problem? I guess it is some environment problem, maybe a missing variable, but if that's the case I don't know what to do about it. Thanks.
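
    The usual culprit with the EC2 API tools is that cron runs with a minimal environment: the interactive shell picks up JAVA_HOME and EC2_HOME from /etc/profile.d, but crond does not. A sketch of /etc/crontab with the environment set explicitly (the paths shown are the Amazon Linux defaults, so verify them on the instance) and output captured for debugging:

        SHELL=/bin/bash
        JAVA_HOME=/usr/lib/jvm/jre
        EC2_HOME=/opt/aws/apitools/ec2
        PATH=/sbin:/bin:/usr/sbin:/usr/bin:/opt/aws/bin
        15 12 * * * ec2-user /opt/aws/bin/ec2-create-snapshot --region us-east-1 -K /home/ec2-user/pk.pem -C /home/ec2-user/cert.pem -d "vol-******** snapshot" vol-******** >> /tmp/snapshot.log 2>&1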


  • Using rsync to take backup of folder

    - by Ali
    Hi, I have a Linux server with a NAS that is mounted as the folder "mount". My website lives in the "public_html" folder. I want to back up the website into the mount folder automatically at certain intervals, e.g. every hour. I read that there is something called "rsync" which is used to keep two folders in sync; it doesn't copy all files every time, but instead checks whether a file has changed and only updates changed files. How do I use it to make automatic backups? I have root access to the server. Thanks
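
    A minimal sketch, assuming the site lives in /home/user/public_html and the NAS is mounted at /mount: add a root crontab entry (crontab -e) that runs rsync hourly. The trailing slash on the source copies the directory's contents rather than the directory itself, and --delete makes the copy a true mirror, so leave it out if deleted files should survive in the backup:

        # m h dom mon dow  command
        0 * * * * rsync -a --delete /home/user/public_html/ /mount/public_html_backup/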


  • Backup AWS Dynamodb to S3

    - by Ali
    It has been suggested in the Amazon docs (http://aws.amazon.com/dynamodb/), among other places, that you can back up your DynamoDB tables using Elastic MapReduce. I have a general understanding of how this could work, but I couldn't find any guides or tutorials on it. So my question is: how can I automate DynamoDB backups (using EMR)? So far, I think I need to create a "streaming" job with a map function that reads the data from DynamoDB and a reduce that writes it to S3, and I believe these could be written in Python (or Java or a few other languages). Any comments, clarifications, code samples, and corrections are appreciated.
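
    A rough sketch of the Hive-on-EMR pattern that AWS documents (the table name, columns, and bucket are placeholders): expose the DynamoDB table and an S3 location as external Hive tables, then copy one into the other. Scheduling the script from a cron-driven job flow would automate it:

        -- Hive script, run on an EMR cluster with the DynamoDB storage handler available
        CREATE EXTERNAL TABLE ddb_orders (id string, payload string)
        STORED BY 'org.apache.hadoop.hive.dynamodb.DynamoDBStorageHandler'
        TBLPROPERTIES ("dynamodb.table.name" = "Orders",
                       "dynamodb.column.mapping" = "id:id,payload:payload");

        CREATE EXTERNAL TABLE s3_orders_backup (id string, payload string)
        ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
        LOCATION 's3://my-backup-bucket/orders/';

        INSERT OVERWRITE TABLE s3_orders_backup SELECT * FROM ddb_orders;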


  • Temporary "Backup" of SharePoint Content During Feature and Solution Deployment

    - by ccomet
    I need to decide on a method for storing a subset of the content in a SharePoint site, so that when I delete and recreate certain lists as part of a feature activation, I can re-insert all of this content back where it should belong. I have an idea myself, but I don't know if it's the only method and, more importantly, the right method.

    My client has me creating a SharePoint system for them to communicate with their clients. The business process has maybe 5 stages in it (maybe more, I don't even know, because they don't tell me everything), and the current system I've written over the past months is maybe 2 stages through. This meets our deadline of completing those stages by Monday next week... but at that point my client is planning on making the site live. In effect, their work with their clients will run parallel with my work for them. As I complete each following stage of the process on a separate test server, I'll push it onto the live server. Scheduled downtimes during non-business times (like a weekend) will be available for me to perform these pushes. Keeping pace so that my development is faster than the actual business process is my own problem and off-topic... so let's get back to the problem I stated at the start of this post.

    In this system, we have sets of features which create lists for their associated content types and field types when activated, and delete these lists when the feature is deactivated. Most updates, such as workflow changes, custom actions, custom forms, and similar ilk, don't need these features to be deactivated and reactivated. But there are some parts which do require this. On my test server it's okay for me to obliterate lists, but once the site is live and there's real correspondence data, it's absolutely unacceptable to do this. So when I need to implement a new change in functionality, I need to be able to store the data currently present in several lists, deactivate the feature, reactivate the feature, and restore all of this data. Perhaps I have hoisted myself with my own petard with the feature system I implemented. Unfortunately, the necessity to later make several of these "project sites" meant I had to write a lot of my code with the concept of "can be deployed repeatedly" in mind.

    My current plan is to run through the lists and libraries which will be affected by the particular feature that is to be reset. Files and all of their versions will be saved in a directory on the server. Then, a set of text files will be used to store all of the important field values for the items. This includes a lot of cross-list reference lookups that will need to be maintained, but that's simple enough. Then I deactivate the feature, deploy the new solution, and reactivate the feature. We upload all of the files in the order specified by their versions and update them with the stored fields for those versions, so that we retain the version structure. As each one is first uploaded, the new ID is picked out, and all relevant lookups in the rest of the files are updated (in some manner that ensures I don't re-update it later with an incorrect value, of course). After that, we run through all the rest of the items in the order most conducive to keeping the relational data correct. That roughly summarizes my current plan.

    To my advantage, there are no long-running workflows in the system that will be affected by this, so there's nothing to worry about in making sure nothing is "still running" when I do this. I don't really know all the cons of this approach... I can imagine they're quite hefty. But I'm unsure what other choices I even have, and my searches haven't turned up anything. Is there anyone who can think of a better idea? Or will anyone just tell me that I really have no other choice? Thanks in advance!
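
    For the field-capture step, a bare-bones sketch against the server object model; the site URL and list name are invented, and real code would also walk the files and versions as described above:

        using Microsoft.SharePoint;

        // Sketch: dump writable field values for one list before deactivating the feature
        static void CaptureListFields()
        {
            using (SPSite site = new SPSite("http://server/sites/client"))
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["Correspondence"];      // hypothetical list
                foreach (SPListItem item in list.Items)
                {
                    foreach (SPField field in list.Fields)
                    {
                        if (!field.ReadOnlyField && item[field.Id] != null)
                        {
                            // persist field.InternalName -> item[field.Id].ToString()
                            // into the per-item text file described above
                        }
                    }
                }
            }
        }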


  • SQLAuthority News – Statistics Used by the Query Optimizer in Microsoft SQL Server 2008 – Microsoft Whitepaper

    - by pinaldave
    I recently presented a session on Statistics and Best Practices at Virtual Tech Days on Nov 22, 2010. The session was very popular and I got many questions right after it. The number one question I received was where everybody can get further information. I am very happy that my session created some curiosity about one of the most important features of SQL Server. Statistics are the heart of SQL Server. Microsoft has published a white paper on how statistics are used by the Query Optimizer. Here is the abstract of that white paper from Microsoft. Statistics Used by the Query Optimizer in Microsoft SQL Server 2008. Writers: Eric N. Hanson and Yavor Angelov. Microsoft SQL Server 2008 collects statistical information about indexes and column data stored in the database. These statistics are used by the SQL Server query optimizer to choose the most efficient plan for retrieving or updating data. This paper describes what data is collected, where it is stored, and which commands create, update, and delete statistics. By default, SQL Server 2008 also creates and updates statistics automatically, when such an operation is considered to be useful. This paper also outlines how these defaults can be changed on different levels (column, table, and database). In addition, it presents how certain query language features, such as Transact-SQL variables, interact with use of statistics by the optimizer, and it provides guidance for using these features when writing queries so you can obtain good query performance. Link to white paper: Statistics Used by the Query Optimizer in Microsoft SQL Server 2008. Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: Pinal Dave, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology
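
    As a taste of the commands the paper covers, these are the standard ways to inspect and refresh statistics; the table and index names below are placeholders:

        DBCC SHOW_STATISTICS ('dbo.Orders', 'IX_Orders_CustomerId'); -- view one statistics object
        UPDATE STATISTICS dbo.Orders;                                -- refresh statistics on one table
        EXEC sp_updatestats;                                         -- refresh stale statistics database-wide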


  • Insufficient storage available to create shadow copy

    - by Bob.at.SBS
    I have used the "Windows 7 File Recovery" tool under Windows 8 to create system image backups to an external USB hard drive. I built a new Windows 8.1 machine, and I want to create my first system image backup of that machine to the same USB hard drive. The "Windows 7 File Recovery" tool is gone in Windows 8.1, but wbAdmin is alive and well:

        wbAdmin start backup -backupTarget:\\?\Volume{2a2b...994f} -allCritical -quiet

    fails with this text displayed:

        wbadmin 1.0 - Backup command-line tool
        (C) Copyright 2013 Microsoft Corporation. All rights reserved.
        Retrieving volume information...
        This will back up (EFI System Partition),(C:),Recovery (300.00 MB) to \\?\Volume{2a2b1255-3a86-11e3-be86-b8ca3a83994f}.
        The backup operation to F: is starting.
        Creating a shadow copy of the volumes specified for backup...
        Summary of the backup operation:
        The backup operation stopped before completing.
        The backup operation stopped before completing.
        Detailed error: ERROR - A Volume Shadow Copy Service operation error has occurred: (0x8004231f)
        Insufficient storage available to create either the shadow copy storage file or other shadow copy data.

    The EFI System Partition is 100 MB. The Recovery Partition is 300 MB. The C: partition is 1.72 TB, NTFS, 218 GB used, 1.51 TB free. The destination drive is 1.81 TB, NTFS, 678 GB used, 1.15 TB free. I've fiddled with vssadmin resize shadowstorage, with no change in the error. vssadmin list shadowstorage displays:

        Shadow Copy Storage association
          For volume: (C:)\\?\Volume{37a0...263}\
          Shadow Copy Storage volume: (C:)\\?\Volume{37a0...263}\
          Used Shadow Copy Storage space: 2.39 GB (0%)
          Allocated Shadow Copy Storage space: 2.81 GB (0%)
          Maximum Shadow Copy Storage space: 531 GB (30%)

        Shadow Copy Storage association
          For volume: (F:)\\?\Volume{2a2...94f}\
          Shadow Copy Storage volume: (F:)\\?\Volume{2a2...94f}\
          Used Shadow Copy Storage space: 334 GB (17%)
          Allocated Shadow Copy Storage space: 337 GB (18%)
          Maximum Shadow Copy Storage space: UNBOUNDED (922154758%)

    (Yeah, the "percent calculation" for UNBOUNDED is seriously bogus.) I've run SFC /verifyonly and it seems happy. I've verified that the new "Volume Shadow Copy" service starts when I start the backup operation. Any suggestions?
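
    For reference, an explicit resize for the destination volume takes this shape (the size is an arbitrary example to adjust; the 0x8004231f error sometimes clears once the target volume's shadow storage has an explicit limit rather than UNBOUNDED):

        vssadmin resize shadowstorage /for=F: /on=F: /maxsize=400GB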


  • The Feature Pack beta is available for Team Foundation Server 2010 and Project Server Integration

    The Feature Pack beta is available for Team Foundation Server 2010 and Project Server Integration. Microsoft has just announced the availability of the beta of the Feature Pack for Team Foundation Server 2010 and Project Server Integration, which marks the end of the CTPs (community technology previews). The beta of the Team Foundation Server 2010 and Project Server (TFS-PS) Feature Pack is available only to MSDN subscribers and under a "Go Live" license, which means it can already be used in a production environment. As a reminder, Team Foundation Server is a collaborative work tool accompanying the Visual Studio Team System (VSTS) suite. It enables the manag...


  • Bacula & Multiple Tape Devices, and so on

    - by Tom O'Connor
    Bacula won't make use of 2 tape devices simultaneously. (Search for #-#-# for the TL;DR)

    A little background, perhaps. In the process of trying to get a decent working backup solution (backing up 20TB ain't cheap, or easy) at $dayjob, we bought a bunch of things to make it work. Firstly, there's a Spectra Logic T50e autochanger, 40 slots of LTO5 goodness, and that robot's got a pair of IBM HH5 Ultrium LTO5 drives, connected via FibreChannel Arbitrated Loop to our backup server.

    Then there's the backup server: a Dell R715 with 2x 16-core AMD 62xx CPUs and 32GB of RAM. Yummy. That server's got 2 Emulex FCe-12000E cards and an Intel X520-SR dual-port 10GE NIC.

    We were also sold CommVault Backup (non-NDMP). Here's where it gets really complicated. Spectra Logic and CommVault both sent respective engineers, who set up the library and the software. CommVault was running fine, in so far as the controller was working fine. The Dell server runs Ubuntu 12.04 server with the MediaAgent for CommVault, and mounts our BlueArc NAS over NFS at a few mountpoints, like /home, and some stuff in /mnt.

    When backing up from the NFS mountpoints, we were seeing ~290GB/hr throughput. That's CRAP, considering we've got 20-odd TB to get through in a <48 hour backup window. The rated maximum on the BlueArc is 700MB/s (2460GB/hr), and the rated maximum write speed on the tape devices is 140MB/s per drive, so that's 492GB/hr (or double it, for the total throughput).

    So, the next step was to benchmark NFS performance with IOzone, and it turns out that we get epic write performance (across 20 threads), like 1.5-2.5TB/hr write, but read performance is fecking hopeless. I couldn't ever get higher than 343GB/hr maximum. So let's assume that 343GB/hr is a theoretical maximum for read performance on the NAS; then we should in theory be able to get that performance out of a) CommVault, and b) any other backup agent. Not the case. CommVault seems to only ever give me 200-250GB/hr throughput, and out of experimentation, I installed Bacula to see what the state of play there is. If, for example, Bacula gave consistently better performance and speeds than CommVault, then we'd be able to say "**$.$ Refunds Plz $.$**"

    #-#-#

    Alas, I found a different problem with Bacula. CommVault seems pretty happy to read from one part of the mountpoint with one thread and stream that to a tape device, whilst reading from some other directory with the other thread and writing to the 2nd drive in the autochanger. I can't for the life of me get Bacula to mount and write to two tape drives simultaneously.

    Things I've tried:

      Setting Maximum Concurrent Jobs = 20 in the Director, File and Storage Daemons
      Setting Prefer Mounted Volumes = no in the Job Definition
      Setting multiple devices in the Autochanger resource

    Documentation seems to be very single-drive centric, and we feel a little like we've strapped a rocket to a hamster with this one. The majority of example Bacula configurations are for DDS4 drives, manual tape swapping, and FreeBSD or IRIX systems. I should probably add that I'm not too bothered if this isn't possible, but I'd be surprised. I basically want to use Bacula as proof to stick it to the software vendors that they're overpriced ;) I read somewhere that @KyleBrandt has done something similar with a modern tape solution.
    Configuration files:

    bacula-dir.conf:

      # Default Bacula Director Configuration file
      Director {                            # define myself
        Name = backuphost-1-dir
        DIRport = 9101                      # where we listen for UA connections
        QueryFile = "/etc/bacula/scripts/query.sql"
        WorkingDirectory = "/var/lib/bacula"
        PidDirectory = "/var/run/bacula"
        Maximum Concurrent Jobs = 20
        Password = "yourekiddingright"      # Console password
        Messages = Daemon
        DirAddress = 0.0.0.0
        #DirAddress = 127.0.0.1
      }

      JobDefs {
        Name = "DefaultFileJob"
        Type = Backup
        Level = Incremental
        Client = backuphost-1-fd
        FileSet = "Full Set"
        Schedule = "WeeklyCycle"
        Storage = File
        Messages = Standard
        Pool = File
        Priority = 10
        Write Bootstrap = "/var/lib/bacula/%c.bsr"
      }

      JobDefs {
        Name = "DefaultTapeJob"
        Type = Backup
        Level = Incremental
        Client = backuphost-1-fd
        FileSet = "Full Set"
        Schedule = "WeeklyCycle"
        Storage = "SpectraLogic"
        Messages = Standard
        Pool = AllTapes
        Priority = 10
        Write Bootstrap = "/var/lib/bacula/%c.bsr"
        Prefer Mounted Volumes = no
      }

      # Define the main nightly save backup job
      # By default, this job will back up to disk in /nonexistant/path/to/file/archive/dir
      Job {
        Name = "BackupClient1"
        JobDefs = "DefaultFileJob"
      }

      Job {
        Name = "BackupThisVolume"
        JobDefs = "DefaultTapeJob"
        FileSet = "SpecialVolume"
      }

      #Job {
      #  Name = "BackupClient2"
      #  Client = backuphost-12-fd
      #  JobDefs = "DefaultJob"
      #}

      # Backup the catalog database (after the nightly save)
      Job {
        Name = "BackupCatalog"
        JobDefs = "DefaultFileJob"
        Level = Full
        FileSet = "Catalog"
        Schedule = "WeeklyCycleAfterBackup"
        # This creates an ASCII copy of the catalog
        # Arguments to make_catalog_backup.pl are:
        #   make_catalog_backup.pl <catalog-name>
        RunBeforeJob = "/etc/bacula/scripts/make_catalog_backup.pl MyCatalog"
        # This deletes the copy of the catalog
        RunAfterJob = "/etc/bacula/scripts/delete_catalog_backup"
        Write Bootstrap = "/var/lib/bacula/%n.bsr"
        Priority = 11                       # run after main backup
      }

      # Standard Restore template, to be changed by Console program
      # Only one such job is needed for all Jobs/Clients/Storage ...
      Job {
        Name = "RestoreFiles"
        Type = Restore
        Client = backuphost-1-fd
        FileSet = "Full Set"
        Storage = File
        Pool = Default
        Messages = Standard
        Where = /srv/bacula/restore
      }

      FileSet {
        Name = "SpecialVolume"
        Include {
          Options { signature = MD5 }
          File = /mnt/SpecialVolume
        }
        Exclude {
          File = /var/lib/bacula
          File = /nonexistant/path/to/file/archive/dir
          File = /proc
          File = /tmp
          File = /.journal
          File = /.fsck
        }
      }

      # List of files to be backed up
      FileSet {
        Name = "Full Set"
        Include {
          Options { signature = MD5 }
          File = /usr/sbin
        }
        Exclude {
          File = /var/lib/bacula
          File = /nonexistant/path/to/file/archive/dir
          File = /proc
          File = /tmp
          File = /.journal
          File = /.fsck
        }
      }

      Schedule {
        Name = "WeeklyCycle"
        Run = Full 1st sun at 23:05
        Run = Differential 2nd-5th sun at 23:05
        Run = Incremental mon-sat at 23:05
      }

      # This schedule does the catalog. It starts after the WeeklyCycle
      Schedule {
        Name = "WeeklyCycleAfterBackup"
        Run = Full sun-sat at 23:10
      }

      # This is the backup of the catalog
      FileSet {
        Name = "Catalog"
        Include {
          Options { signature = MD5 }
          File = "/var/lib/bacula/bacula.sql"
        }
      }

      # Client (File Services) to backup
      Client {
        Name = backuphost-1-fd
        Address = localhost
        FDPort = 9102
        Catalog = MyCatalog
        Password = "surelyyourejoking"      # password for FileDaemon
        File Retention = 30 days
        Job Retention = 6 months
        AutoPrune = yes                     # Prune expired Jobs/Files
      }

      # Second Client (File Services) to backup
      # You should change Name, Address, and Password before using
      #Client {
      #  Name = backuphost-12-fd
      #  Address = localhost2
      #  FDPort = 9102
      #  Catalog = MyCatalog
      #  Password = "i'mnotjokinganddontcallmeshirley"  # password for FileDaemon 2
      #  File Retention = 30 days
      #  Job Retention = 6 months
      #  AutoPrune = yes
      #}

      # Definition of file storage device
      Storage {
        Name = File
        # Do not use "localhost" here
        Address = localhost                 # N.B. Use a fully qualified name here
        SDPort = 9103
        Password = "lalalalala"
        Device = FileStorage
        Media Type = File
      }

      Storage {
        Name = "SpectraLogic"
        Address = localhost
        SDPort = 9103
        Password = "linkedinmakethebestpasswords"
        Device = Drive-1
        Device = Drive-2
        Media Type = LTO5
        Autochanger = yes
      }

      # Generic catalog service
      Catalog {
        Name = MyCatalog
        # Uncomment the following line if you want the dbi driver
        # dbdriver = "dbi:sqlite3"; dbaddress = 127.0.0.1; dbport =
        dbname = "bacula"; DB Address = ""; dbuser = "bacula"; dbpassword = "bbmaster63"
      }

      # Reasonable message delivery -- send most everything to email address
      # and to the console
      Messages {
        Name = Standard
        mailcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: %t %e of %c %l\" %r"
        operatorcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula: Intervention needed for %j\" %r"
        mail = root@localhost = all, !skipped
        operator = root@localhost = mount
        console = all, !skipped, !saved
        # WARNING! the following will create a file that you must cycle from
        # time to time as it will grow indefinitely. However, it will
        # also keep all your messages if they scroll off the console.
        append = "/var/lib/bacula/log" = all, !skipped
        catalog = all
      }

      # Message delivery for daemon messages (no job).
      Messages {
        Name = Daemon
        mailcommand = "/usr/lib/bacula/bsmtp -h localhost -f \"\(Bacula\) \<%r\>\" -s \"Bacula daemon message\" %r"
        mail = root@localhost = all, !skipped
        console = all, !skipped, !saved
        append = "/var/lib/bacula/log" = all, !skipped
      }

      # Default pool definition
      Pool {
        Name = Default
        Pool Type = Backup
        Recycle = yes                       # Bacula can automatically recycle Volumes
        AutoPrune = yes                     # Prune expired volumes
        Volume Retention = 365 days         # one year
      }

      # File Pool definition
      Pool {
        Name = File
        Pool Type = Backup
        Recycle = yes
        AutoPrune = yes
        Volume Retention = 365 days
        Maximum Volume Bytes = 50G          # Limit Volume size to something reasonable
        Maximum Volumes = 100               # Limit number of Volumes in Pool
      }

      Pool {
        Name = AllTapes
        Pool Type = Backup
        Recycle = yes
        AutoPrune = yes                     # Prune expired volumes
        Volume Retention = 31 days          # one month
      }

      # Scratch pool definition
      Pool {
        Name = Scratch
        Pool Type = Backup
      }

      # Restricted console used by tray-monitor to get the status of the director
      Console {
        Name = backuphost-1-mon
        Password = "LastFMalsostorePasswordsLikeThis"
        CommandACL = status, .status
      }

    bacula-sd.conf:

      # Default Bacula Storage Daemon Configuration file
      Storage {                             # definition of myself
        Name = backuphost-1-sd
        SDPort = 9103                       # Director's port
        WorkingDirectory = "/var/lib/bacula"
        Pid Directory = "/var/run/bacula"
        Maximum Concurrent Jobs = 20
        SDAddress = 0.0.0.0
        # SDAddress = 127.0.0.1
      }

      # List Directors who are permitted to contact Storage daemon
      Director {
        Name = backuphost-1-dir
        Password = "passwordslinplaintext"
      }

      # Restricted Director, used by tray-monitor to get the
      # status of the storage daemon
      Director {
        Name = backuphost-1-mon
        Password = "totalinsecurityabound"
        Monitor = yes
      }

      Device {
        Name = FileStorage
        Media Type = File
        Archive Device = /srv/bacula/archive
        LabelMedia = yes;                   # lets Bacula label unlabeled media
        Random Access = Yes;
        AutomaticMount = yes;               # when device opened, read it
        RemovableMedia = no;
        AlwaysOpen = no;
      }

      Autochanger {
        Name = SpectraLogic
        Device = Drive-1
        Device = Drive-2
        Changer Command = "/etc/bacula/scripts/mtx-changer %c %o %S %a %d"
        Changer Device = /dev/sg4
      }

      Device {
        Name = Drive-1
        Drive Index = 0
        Archive Device = /dev/nst0
        Changer Device = /dev/sg4
        Media Type = LTO5
        AutoChanger = yes
        RemovableMedia = yes;
        AutomaticMount = yes;
        AlwaysOpen = yes;
        RandomAccess = no;
        LabelMedia = yes
      }

      Device {
        Name = Drive-2
        Drive Index = 1
        Archive Device = /dev/nst1
        Changer Device = /dev/sg4
        Media Type = LTO5
        AutoChanger = yes
        RemovableMedia = yes;
        AutomaticMount = yes;
        AlwaysOpen = yes;
        RandomAccess = no;
        LabelMedia = yes
      }

      # Send all messages to the Director,
      # mount messages also are sent to the email address
      Messages {
        Name = Standard
        director = backuphost-1-dir = all
      }

    bacula-fd.conf:

      # Default Bacula File Daemon Configuration file

      # List Directors who are permitted to contact this File daemon
      Director {
        Name = backuphost-1-dir
        Password = "hahahahahaha"
      }

      # Restricted Director, used by tray-monitor to get the
      # status of the file daemon
      Director {
        Name = backuphost-1-mon
        Password = "hohohohohho"
        Monitor = yes
      }

      # "Global" File daemon configuration specifications
      FileDaemon {                          # this is me
        Name = backuphost-1-fd
        FDport = 9102                       # where we listen for the director
        WorkingDirectory = /var/lib/bacula
        Pid Directory = /var/run/bacula
        Maximum Concurrent Jobs = 20
        #FDAddress = 127.0.0.1
        FDAddress = 0.0.0.0
      }

      # Send all messages except skipped files back to Director
      Messages {
        Name = Standard
        director = backuphost-1-dir = all, !skipped, !restored
      }
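
    One workaround that comes up in Bacula discussions, offered here only as an untested sketch: define one Director-side Storage resource per physical drive, so that two jobs can each be pointed at their own drive instead of contending for the single "SpectraLogic" storage:

      # bacula-dir.conf sketch: one Storage per drive instead of one per changer
      Storage {
        Name = SpectraLogic-Drive-1
        Address = localhost
        SDPort = 9103
        Password = "linkedinmakethebestpasswords"
        Device = Drive-1
        Media Type = LTO5
        Autochanger = yes
      }
      Storage {
        Name = SpectraLogic-Drive-2
        Address = localhost
        SDPort = 9103
        Password = "linkedinmakethebestpasswords"
        Device = Drive-2
        Media Type = LTO5
        Autochanger = yes
      }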


  • SQLAuthority News – Download Whitepaper – SQL Server 2008 R2 Analysis Services Operations Guide

    - by pinaldave
    SQL Server Analysis Services (SSAS) has always been an interesting subject for research. Analysis Services cubes are a very powerful tool in the hands of the business intelligence (BI) developer. They provide an easy way to expose even large data models directly to business users. Microsoft has published a very informative white paper, the Analysis Services Operations Guide, authored by Thomas Kejser, John Sirmon, and Denny Lee. In this guide you will find information on how to test and run Microsoft SQL Server Analysis Services in SQL Server 2005, SQL Server 2008, and SQL Server 2008 R2 in a production environment. The focus of this guide is how you can test, monitor, diagnose, and remove production issues on even the largest scaled cubes. This paper also provides guidance on how to configure the server for best possible performance. It is the goal of this guide to make your operations processes as painless as possible, and to have you run with the best possible performance without any additional development effort to your deployed cubes. In this guide, you will learn how to get the best out of your existing data model by making changes transparent to the data model and by making configuration changes that improve the user experience of the cube. Download SQL Server 2008 R2 Analysis Services Operations Guide. Note: Abstract taken from the white paper. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, SQLAuthority News, T SQL, Technology


  • Problem During Installation SQL Server 2005 on Windows 7

    - by mlife
    Yesterday I was trying to install SQL Server 2005 on Windows 7. During installation, a popup error dialog was shown with this message: The SQL Server service failed to start. For more information, see the SQL Server Books Online topics, "How to: View SQL Server 2005 Setup Log Files" and "Starting SQL Server Manually." But in Books Online there was no useful information! After some hours of googling I had not found anything useful, and at 3 o'clock in the morning I was scratching my head! Believe it or not, I attempted to install SQL Server more than 15 times in different ways (with command prompt and parameters, and so on). Eventually I found the source of the problem: "BitDefender Internet Security 2010"! After uninstalling BitDefender Internet Security, I installed SQL Server 2005 and then reinstalled BitDefender. Just that! Problem resolved. Conclusion: after installing a new version of Windows and its prerequisites (like IIS, language packs, and the like), first install SQL Server and Visual Studio, and then the other applications. Hope this is helpful.


  • ldap client cannot contact ldap server

    - by Van
    I have followed these instructions: https://help.ubuntu.com/12.04/serverguide/openldap-server.html#openldap-auth-config

    The LDAP server works fine; I can log into it using an LDAP account. However, I configured another Ubuntu 12.04 server as an LDAP client for authentication and I cannot contact the server. Here is the error, on the client:

        # ldapsearch -Q -LLL -Y EXTERNAL -H ldapi://ldap01.domain.local -b cn=config dn
        ldap_sasl_interactive_bind_s: Can't contact LDAP server (-1)

    The server can receive requests. On the client:

        # telnet ldap01.domain.local 389
        Trying 10.3.17.10...
        Connected to sisn01.domain.local.
        Escape character is '^]'.

    On the client:

        # ldapsearch -x -h ldap01.domain.local -b cn=config dn
        # extended LDIF
        #
        # LDAPv3
        # base <cn=config> with scope subtree
        # filter: (objectclass=*)
        # requesting: dn
        #

        # search result
        search: 2
        result: 32 No such object

        # numResponses: 1

    On the server:

        # ps aux | grep slapd
        openldap  3759  0.0  0.2 564820  8228 ?  Ssl  08:39  0:00 /usr/sbin/slapd -h ldap:/// ldapi:/// -g openldap -u openldap -F /etc/ldap/slapd.d

    I suspect I am missing a configuration parameter either on the server or on the client; I just cannot figure out what. Any help here would be appreciated.
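
    One thing worth checking: ldapi:// is the local Unix-socket transport, and SASL EXTERNAL authenticates as the local root identity, so that exact command can only succeed on the server itself, where cn=config is normally readable only that way. From the client, a remote query has to go over TCP against the real naming context, roughly:

        # on the client: simple bind over the network, against the directory's base DN
        ldapsearch -x -H ldap://ldap01.domain.local -b dc=domain,dc=local

        # on the server itself: cn=config via the local socket as root
        sudo ldapsearch -Q -LLL -Y EXTERNAL -H ldapi:/// -b cn=config dn

    The base DN above is a guess from the hostname; substitute whatever suffix the server was configured with.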


  • SQLAuthority News – Download SQL Server 2012 SP1 CTP4

    - by pinaldave
    There are a few trends I often see in the industry, for example: i) running servers on the n-1 version, and ii) waiting until SP1 is released to adopt a product. Microsoft has recently released SQL Server 2012 SP1 CTP4. CTP stands for Community Technology Preview, and it is not the final version yet. The SQL Server 2012 SP1 CTP release is available for testing purposes and for use in non-production environments. What's new in SQL Server 2012 SP1:

      AlwaysOn Availability Group OS Upgrade
      Selective XML Index
      FIX: DBCC SHOW_STATISTICS works with SELECT permission
      New dynamic function returns statistics properties
      SSMS Complete in Express
      SlipStream Full installation
      Business Intelligence Excel Update

    You can download SQL Server 2012 SP1 CTP4 from here. The SQL Server 2012 SP1 CTP4 feature pack is available for download from here. Additionally, SQL Server 2012 SP1 CTP Express is available to download from here as well. Note that SQL Server 2012 SP1 CTP includes SSMS. Reference: Pinal Dave (http://blog.SQLAuthority.com)       Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology


  • How to clear the recent server name list in SQL Server Management Studio

    - by Pavan Kumar Pabothu
    If you use SQL Server Management Studio a lot, you will notice the list of server names in its login dialog growing. As you can imagine, after 6 months or a year you will see a long list of server names in the login dialog. How do you clear this list? Management Studio doesn't provide a mechanism to clear it, so you'll have to do a little browsing through your file system. For SQL Server 2005 Management Studio, delete the file C:\Documents and Settings\<user>\Application Data\Microsoft\Microsoft SQL Server\90\Tools\Shell\mru.dat. For SQL Server 2008 Management Studio, delete the file C:\Documents and Settings\<user>\Application Data\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin. After deletion, re-launch Management Studio and you will see an empty list.
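
    In other words, from a command prompt (paths assume a default profile; back the file up first in case you want the list back):

        rem SSMS 2005
        del "%APPDATA%\Microsoft\Microsoft SQL Server\90\Tools\Shell\mru.dat"
        rem SSMS 2008
        del "%APPDATA%\Microsoft\Microsoft SQL Server\100\Tools\Shell\SqlStudio.bin"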


  • SQL SERVER – Puzzle to Win Print Book and Free 30 Days Online Training Material

    - by pinaldave
    Yesterday I asked a simple question, SQL SERVER – Puzzle to Win Print Book – Write T-SQL Self Join Without Using LEAD and LAG, with two simple intentions: 1) we can all learn about a new feature of SQL Server 2012, and 2) we can learn a new feature of SQL Server 2012 while practicing on an earlier version of SQL Server. While I was creating the question, due to a copy-paste error the question was not correctly written. In simple words: I made a mistake. This created some confusion and I feel bad about it. Here is what we will do: please read the question again and attempt to answer the question which I have asked in the blog post. Yesterday the giveaway was my SQL Server Interview Questions and Answers book. As the question was corrected after a while, the giveaways have now become sweeter: SQL Server Interview Questions and Answers book – 2 copies; 30 days of online training material from Pluralsight. They have excellent learning resources – I have written about my 6-hour learning experience in Learning SSAS (SQL Server Analysis Services) Online in 6 Hours. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Training, SQLServer, T SQL, Technology


  • Server-side Architecture for Online Game

    - by Draiken
    Hi, basically I have a game client that has to communicate with a server for almost every action it takes. The game is in Java (using LWJGL), and right now I am about to start writing the server. The base of the game is normally one client communicating with the server alone, but later on I will need several clients to work together for some functionality. I've already read that the authentication server should be separated, and I intend to do that. The problem is that I am completely inexperienced in this kind of server-side programming; all I've ever programmed were JSF web applications. I imagine I'll use socket connections for pretty much all game communication, since HTTP is very slow, but I still don't really know where to start on my server. I would appreciate reading material or guidelines on where to start, what architecture the game server should have, and maybe some suggestions on frameworks that could help with the client-server communication. I've looked into JNAG, but I have no experience with this kind of thing, so I can't really tell if it is a solid and good messaging layer. Any help is appreciated... Thanks!
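
    As a first experiment, the simplest possible blocking-socket server in Java looks like the sketch below: a toy that spawns one thread per client, whereas a real game server would move to java.nio selectors or a library such as Netty once the protocol is worked out. The port number is arbitrary:

        import java.io.*;
        import java.net.*;

        public class GameServer {
            public static void main(String[] args) throws IOException {
                try (ServerSocket listener = new ServerSocket(5000)) {
                    while (true) {
                        Socket client = listener.accept();        // one connection per player
                        new Thread(() -> handle(client)).start(); // naive: thread per client
                    }
                }
            }

            static void handle(Socket client) {
                try (BufferedReader in = new BufferedReader(new InputStreamReader(client.getInputStream()));
                     PrintWriter out = new PrintWriter(client.getOutputStream(), true)) {
                    String line;
                    while ((line = in.readLine()) != null) {
                        out.println("ack: " + line);              // echo each client action back
                    }
                } catch (IOException ignored) {
                    // client disconnected
                }
            }
        }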


  • can't access SAMBA shares on UBUNTU-server from other computers

    - by larand
    Installed Ubuntu Server 12.04 and configured /etc/samba/smb.conf as:

      #======================= Global Settings =======================
      [global]
        workgroup = HEMMA
        server string = %h server (Samba, Ubuntu)
        security = user
        wins support = yes
        dns proxy = no
        log file = /var/log/samba/log.%m
        max log size = 1000
        syslog = 0
        panic action = /usr/share/samba/panic-action %d
        encrypt passwords = no
        passdb backend = tdbsam
        obey pam restrictions = yes
        unix password sync = yes
        passwd program = /usr/bin/passwd %u
        passwd chat = *Enter\snew\s*\spassword:* %n\n *Retype\snew\s*\spassword:* %n\n *password\supdated\ssuccessfully* .
        pam password change = yes
        map to guest = bad user

      ############ Misc ############
        usershare allow guests = yes

      #======================= Share Definitions =======================
      [printers]
        comment = All Printers
        browseable = no
        path = /var/spool/samba
        printable = yes
        guest ok = no
        read only = yes
        create mask = 0700

      # Windows clients look for this share name as a source of downloadable
      # printer drivers
      [print$]
        comment = Printer Drivers
        path = /var/lib/samba/printers
        browseable = yes
        read only = yes
        guest ok = no

      [Bilder original]
        comment = Original bilder
        path = /mnt/bilder/org
        browseable = yes
        read only = no
        guest ok = no
        create mask = 0755

      [Bilder publika]
        comment = Bilder för allmän visning
        path = /mnt/bilder/public
        browseable = yes
        read only = yes
        guest ok = yes

      [Musik]
        comment = Musik
        path = /mnt/music/public
        browseable = yes
        read only = yes
        guest ok = yes

    I have a network set up around a 4G router, a HUAWEI B593, where some computers are connected by WiFi and others by LAN. The server is connected by LAN. On one computer running Windows XP I can see the server but am not allowed to access the shares. On another computer on the WiFi net, running Win7, I cannot see the server at all, but I can ping it, and I can see the SMB protocol in use when sniffing with Wireshark. I don't primarily want to use passwords; computers on the LAN and WiFi should be able to connect without any login procedure. I'm sure my config is not sufficient, but I find it hard to understand what I should do. There are lots of descriptions on the net, but most are old and none have been of any help. I'm also confused by the fact that I cannot see the server from my Win7 machine even though it communicates with the Samba server. I would be very happy if anyone could shed some light on this mess.
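
    Two quick checks before changing the config further (substitute your actual server name): testparm validates smb.conf and prints the effective settings, and smbclient exercises a guest connection the same way Windows would. Note also that encrypt passwords = no is a likely culprit for the failures, since modern Windows clients refuse to send plaintext passwords without registry changes:

        testparm -s                                  # validate smb.conf, print parsed settings
        smbclient -L //servername -N                 # list shares as guest (no password)
        smbclient '//servername/Musik' -N -c ls      # try reading a guest-ok share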


  • SQL SERVER – Fix Visual Studio Error : Connections to SQL Server files (.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the URL

    - by pinaldave
    In one of my virtual environments, while I was trying to add a SQL Server database (.mdf) file to an ASP.NET project, I encountered the following error: Connections to SQL Server files (.mdf) require SQL Server Express 2005 to function properly. Please verify the installation of the component or download from the URL. For a long time I have been using SQL Server 2012, so this error was a bit interesting to me. I realized that there should not be any need for a SQL Server 2005 installation. I quickly figured out that I can remove this error if I do the following:

      1. Open Microsoft Visual Studio.
      2. Select Tools >> Options >> Database Tools >> Data Connections.
      3. Enter the name of an installed instance in the "SQL Server Instance Name" field.
      4. Click OK.

    If you do not know the instance name, you can follow either of these options: 1) use the command-line sqlcmd utility, or 2) use SQL Server Management Studio. Is there any other way to resolve this error? Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Error Messages, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: sqlcmd, Visual Studio
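
    For the instance-name lookup mentioned above, the command-line route is simply the following (it relies on network browsing, so the output can be incomplete); once connected, SELECT @@SERVERNAME returns the exact name to paste into the dialog:

        sqlcmd -L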


  • Windows Server or Linux for final project

    - by user1433490
    A few weeks ago I came up with an idea to develop a mobile app which will direct students at my university to the nearest available printer. The whole thing is part of my final project. The Android-based app will need to perform the following tasks:

      1. The user's location on campus is sent to the server. Assume this part works just fine.
      2. The server sends an SNMP request to the printers in the user's vicinity. I'll probably use PHP or Python for that part.
      3. The data requested over SNMP is processed and sent back to the client.

    My question concerns the server. The university's IT manager offered me a designated server for development, which sounds great. Now I need to choose which OS I want installed on the server: Windows Server or Linux (I don't know which versions). I don't have any server programming or administration experience, but generally speaking I feel more comfortable in a Windows environment (just because that has always been my OS). I don't have much time for learning a new OS, but when does it generally make sense to host or develop server-side applications on a Windows environment versus a Linux environment?
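
    The SNMP leg can be prototyped from the command line on either OS before any PHP or Python is written. With Net-SNMP installed, and with a made-up printer hostname and the common "public" community string:

        # hrPrinterStatus: 1=other, 2=unknown, 3=idle, 4=printing, 5=warmup
        snmpget -v2c -c public printer1.campus.example 1.3.6.1.2.1.25.3.5.1.1.1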


  • Which really cuts TCO!? WebLogic Server vs. JBoss Enterprise Application Platform

    - by Tatsuhiro Yamaguchi
    WebLogic Server and JBoss Enterprise Application Platform (EAP) are both Java EE application servers, and JBoss EAP is often assumed to be the cheaper choice because its license fee is zero. Once support subscriptions and platform lifecycles are counted, though, the TCO picture is less one-sided.

    The first factor is the Java SE lifecycle (see the Java SE support roadmap on the Oracle Technology Network). Public updates for Java SE 6 end in February 2013. JBoss EAP 5 runs on Java SE 6, and JBoss EAP 6, which supports Java SE 7, was only released in August 2012, so JBoss users face migration work to stay on a patched Java SE. WebLogic Server, on the other hand, carries Oracle's own Java SE support with it: Java SE 6 is supported for WebLogic users through December 2016 and Java SE 7 through July 2019, so existing applications can keep running on a supported Java SE without a forced upgrade.

    The second factor is subscription versus perpetual licensing. JBoss EAP is sold as an annual subscription per unit of up to 16 cores, with Premium (24x365) and Standard (weekday business hours) support levels; WebLogic Server Standard Edition is a perpetual license with annual support priced at 22% of the license. Using the relative figures from the comparison (two 8-core CPUs, 16 cores total, 6 years of use):

      JBoss EAP Standard: no license fee, 85 per year; 6 years = 85 x 6 = 510
      JBoss EAP Premium: no license fee, 125 per year; 6 years = 125 x 6 = 750
      WebLogic Server Standard Edition: license 216 (108 x 2 CPU) plus support 48 per year; 6 years = 216 + 48 x 6 = 504

    Over a typical 3-6 year server lifetime, then, WebLogic Server Standard Edition comes out at or below the cost of either JBoss EAP subscription, and the perpetual license survives even if support lapses. The conclusion of the comparison is that once Java SE support and multi-year support fees are included, WebLogic Server can match or beat JBoss EAP on TCO despite its upfront license cost.


  • Why does tracerpt use up all of my Sql Server's memory?

    - by Cypher
    We have an MS SQL Server 2008 machine with 12 GB of RAM... twice now within the last week this server was knocked on its backside by a process called "tracerpt.exe", which was found to have taken up ALL of the system's memory, leaving nothing for sqlserver. I've done my homework and figured out what this program is... but I still have no idea why it's hogging so much RAM (though I have an idea), nor what application is actually executing it. This server is the back-end to a Microsoft Dynamics CRM 4.0 application which is hosted on a separate server, and it is our production database used for just about everything. If this program is necessary, I would like to be able to find the application that is executing this thing and remove it, or disable whatever feature is causing this quite annoying occurrence. Any ideas?
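
    To identify what launches it, catching the parent process while tracerpt.exe is alive is straightforward; tracerpt is the Event Tracing (ETW) log-processing tool, so a scheduled Data Collector Set is one plausible trigger. The PID in the second command is whatever the first one reports:

        wmic process where name="tracerpt.exe" get ProcessId,ParentProcessId,CommandLine
        wmic process where ProcessId=1234 get Name,ExecutablePath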


  • How to get remote firewall administration working with Windows Server Core 2008 R2?

    - by Daniel15
    I'm setting up a Windows Server Core 2008 R2 installation in a VMware virtual machine before setting it up on a live VPS. I've gotten remote administration via MMC working from my computer (a PC running Windows 7) for things like event logs, but I can't seem to get the firewall administration working. No matter what I do, I get the following error message: You do not have the correct permissions to open the Windows Firewall with Advanced Security console. You must be a member of the Administrators group or the Network Operators group to perform this task. For more information, contact your system administrator. Error code: 0x5. I've used cmdkey to add valid server credentials on my computer, and enabled remote management with the following commands:

        netsh advfirewall firewall set rule group="remote administration" new enable=yes
        netsh advfirewall firewall set rule group="windows firewall remote management" new enable=yes
        netsh advfirewall set currentprofile settings remotemanagement enable

    I am not running on a domain (just a workgroup); this is the only Windows Server 2008 computer I have. I've tried turning off the firewall completely, but remote administration still fails. How do I debug this issue? Does anyone know how to fix it? I found a few forum topics about it (e.g. Remotely managing Windows Firewall on Server Core gives access denied (error 0x5) on Windows Server TechCenter) but they didn't help (I've already tried most of the fixes listed).
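
    One known gotcha in workgroup (non-domain) setups is UAC remote token filtering: local administrator accounts get a filtered token over the network, which yields exactly this kind of 0x5 access-denied error. A standard registry tweak on the Core box is worth trying (reboot afterwards):

        reg add HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System /v LocalAccountTokenFilterPolicy /t REG_DWORD /d 1 /f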

