Search Results

Search found 95301 results on 3813 pages for 'client server'.

  • SQL Server Licensing in a VMware vSphere Cluster

    - by Helvick
    If I have SQL Server 2008 instances running in virtual machines on a VMware vSphere cluster with vMotion/DRS enabled, so that the VMs can (potentially) run on any one of the physical servers in the cluster, what precisely are the license requirements? For example, assume that I have 4 physical ESX hosts with dual physical CPUs and 3 separate single-vCPU virtual machines running SQL Server 2008 in that cluster.
    How many SQL Standard Processor licenses would I need? Is it 3 (one per VM), 12 (one per VM on each physical host), or something else? How many SQL Enterprise Processor licenses would I need? Is it 3 (one per VM), 8 (one for each physical CPU in the cluster), or, again, something else? The range in list prices for these options goes from $17k to $200k, so getting it right is quite important.
    Bonus question: If I choose the Server+CAL licensing model, do I need to buy multiple Server instance licenses for each of the ESX hosts (so 12 copies of the SQL Server Standard server license, so that there are enough licenses on each host to run all the VMs), or again can I just license the VM? And what difference would using Enterprise per-server licensing make?
    Edited to add: Having spent some time reading the SQL 2008 Licensing Guide (63 pages! Includes maps!*) I've come across this:
    • Under the Server/CAL model, you may run unlimited instances of SQL Server 2008 Enterprise within the server farm, and move those instances freely, as long as those instances are not running on more servers than the number of licenses assigned to the server farm.
    • Under the Per Processor model, you effectively count the greatest number of physical processors that may support running instances of SQL Server 2008 Enterprise at any one time across the server farm and assign that number of Processor licenses.
    And earlier: "For SQL Server, these rule changes apply to SQL Server 2008 Enterprise only."
    By my reading this means that for my 3 VMs I only need 3 SQL 2008 Enterprise Processor licenses, or one copy of Server Enterprise + CALs, for the cluster. By implication it means that I have to license all processors if I choose SQL 2008 Standard Processor licensing, or that I have to buy a copy of SQL Server 2008 Standard for each ESX host if I choose to use CALs.
    *There is a map demonstrating that a Server Farm cannot extend across an area broader than 3 time zones unless it's in the European Free Trade Area; I wasn't expecting that when I started reading it.

    Read the article

  • Windows Server RAS VPN client can't connect to the internet

    - by Dragouf
    I configured a Windows Server 2008 RAS to connect automatically to a PPTP VPN server. The problem is that when it connects, I can't access the internet from this server (the VPN client connects through RAS). Usually I would set the VPN not to be used as the default gateway, but that setting is disabled in the network interface's VPN interface properties. And I can't find how to make the server connect to the internet directly...

    Read the article

  • DNS - Redirect from old server to new.

    - by jyoseph
    I have a Windows Server 2003 server (Server A) that I was hosting some sites on. Now they are hosted on a different server (Server B). I recently switched the DNS at GoDaddy to point to the new nameservers. Is there something I can do on Server A to send all requests that reach Server A over to Server B (basically a redirect from Server A to B)? What type of record would that be? This is while I'm waiting for the DNS changes I made to fully propagate.
    Edit: To further clarify, test.com may still be resolving to Server A; I'd like a DNS record on Server A that tells it to go to the new server. Is that possible?

    Read the article

  • ByPass credential prompting on drive map - windows server 2k3

    - by Tone
    I have two Windows Server 2003 machines, Server A and Server B. Server A is on the company domain, and Server B is not. I need to bypass the credential prompt that appears every time I map a drive from Server B to Server A. The reason I need to do this is that I'm running a program called SourceAnywhere on Server A that points to a VSS database on Server B (SourceAnywhere solves the slowness issues that VSS HTTP access has). While I can configure SourceAnywhere to point to this VSS database (after mapping the drive and getting access), I cannot connect to the VSS database from my development machine; I get an error saying I cannot access the database alias. I'm thinking this might have to do with Server B prompting for credentials from Server A, since it isn't on the domain. Is there any way to store these credentials, or do I need to get Server B added to the domain?

    Read the article

  • Windows Server 2008 (sp2) stops responding on network share requests from Windows Vista and 7 client

    - by Peter LaComb Jr.
    I have two Windows Server 2008 SP2 machines (TFS and TFSBUILD). Periodically, the TFSBUILD server shares (\\TFSBUILD\ShareName or \\TFSBUILD\C$) become unresponsive to requests from Windows Vista/Server 2008 and Windows 7 clients. Windows XP machines are still able to connect. No events in the server log indicate any problem. A simple restart corrects the issue temporarily, but it always returns. No, it is not this: http://support.microsoft.com/kb/976266 (we aren't using that software). All anti-virus software has been disabled, and the firewall is disabled by policy. No other network activity is affected. Any help would be greatly appreciated.

    Read the article

  • Can't set up IIS Web Server on Server 2008 x64 correctly

    - by balexandre
    Using a VM, I installed Windows Server 2008 x64 and, as the screenshots (not included here) showed, added the IIS role and assigned all of its role features. But if I have an ASP.NET (.aspx) page that does (C#) Session["test-session"] = "A"; and then reads it in another page, I always get nothing! NOTE: I do have an entire ASP.NET web application; the example above is just to be succinct and explicit about the problem I'm facing. Does anyone know what I have to do to the server so I can use Session variables? All help is greatly appreciated, thank you.

    Read the article

  • Remotely managing Hyper-V VMs from Windows 8 Client

    - by Vazgen
    Currently, I have a core Hyper-V Server hosting VMs for a domain controller and several domain-joined VDI infrastructure servers. The VMs are connected in that domain environment, but remote management of the physical Hyper-V server is set up using the same WORKGROUP as the Windows 8 client I'm managing from. This makes it cumbersome to manage the VMs hosted on the physical server from my remote Windows 8 management client, because I can only connect to the physical Hyper-V server and not the individual VMs hosted inside it. Can I make my setup more flexible by hosting a second domain controller in a VM on my Windows 8 machine and switching my remote management setup to use the same domain throughout, meaning ALL physical and virtual machines, including the VDI infrastructure, under the same domain? I'm new to this and just looking for some suggestions.

    Read the article

  • Choosing the right e-mail client

    - by CFP
    Hi all, I'm currently using Outlook 2007 (under Windows 7), but I much prefer free software (open source being the best, of course), so I thought I'd ask for expert advice here. I thought it might be easier if I included a small "wanted list":
    • I receive about 15 to 30 e-mails every day, but I have large archives (10,000 emails) which I frequently need to access.
    • I usually open and close my mail program many times, so I'd like it to start pretty fast.
    • I cannot use an online mailbox, because I have too many email addresses (about 5: 1 for work, 1 for home, 1 semi-private, 1 for specific emails, and 1 for newsletters).
    In order of importance, the things I'd like my mail client to be able to do:
    • Efficiently categorize e-mails. Until now, I've mostly been using Outlook folders, because filtering by tags was not easy, but I'd rather have one large list of mails, neatly tagged so I can easily filter. I'd love being able to select mails by tags (e.g. in a click or two (could be a tab) show all mails tagged with "software").
    • Create "tagging rules", such as "if the mail was sent to this address, add this tag", or "if the body contains ..., add that tag".
    • Sync contacts with Gmail, handle tasks (syncing with Toodledo would be awesome), possibly provide a calendar.
    • Create e-mail templates, signatures...
    Other ideas: a timeline, scripting support, being able to import MS Outlook emails, a nice backup format...
    Thanks for sharing ideas and suggestions!

    Read the article

  • Recommend an SFTP solution for Windows (Server & Client) that integrates well with Dreamweaver

    - by aaandre
    Could you please recommend an SFTP solution for updating files on a remote Windows server from Windows workstations? We would like to replace the FTP-based workflow with a secure one. The remotely hosted web server runs Windows Server 2003; the workstations run Windows XP. We manage the files via Dreamweaver's built-in FTP. My understanding is that Dreamweaver supports SFTP out of the box, so I guess I am looking for a good SFTP server for Windows Server 2003, ideally one that does not require Cygwin. Ideally, the solution would use authentication based on the existing Windows accounts and permissions. Thank you!

    Read the article

  • Migrating SQL Server Compact Edition (SQL CE) database to SQL Server using Web Matrix

    - by Harish Ranganathan
    One of the things that is keeping us busy is the Web Camps we are delivering across 5 cities. If you are a reader of this blog and also attended one of these web camps, there is a good chance that you have seen me, since I have been at all the locations so far. The topics that we cover include Visual Studio 2010 SP1, SQL CE, ASP.NET MVC & HTML5.
    Whenever I talk about SQL CE, the immediate response is that people are wowed that Microsoft has shipped a FREE compact edition database, an embedded database that can be xcopy-deployed. You might think, well, didn't Microsoft ship SQL Express, which is FREE? The difference is that SQL Express runs as a service on the machine (if you open SQL Server Configuration Manager, you can see that SQL Express runs as a service alongside your SQL Server engine, if you have it installed). This means that even if you are willing to use SQL Express when you deploy your application, it needs to be installed on the production machine (hosting provider) and it needs to run as a service. Many hosters don't allow such services to run on their space.
    SQL CE, on the other hand, comes as an xcopy-deployable database with just a few DLLs required to run it, and they don't even need to be installed in the GAC on the production machine. In fact, if you have Visual Studio 2010 SP1 installed, you can use the "Add Deployable Dependencies" option in Project Properties, and it will detect that SQL CE is something you would probably want to add as a deployable dependency for your project. With that, it bundles the required DLLs as part of the "_bin_deployableAssemblies" folder, so your project can be xcopy-deployed and just works.
    However, SQL CE has a limit of 4GB of storage space. Real-world applications often require more than 4GB of data storage, and it often turns out that people would like to use SQL CE for the development/ramp-up stages but migrate to full-fledged SQL Server after a while. So it's only natural that the question arises: "How do I move my SQL CE database to SQL Server?" Honestly, it doesn't come across as straightforward. I was talking to Ambrish Mishra (PM in the SQL CE team, Hyderabad), since I got this question in almost all the places where we talked about SQL CE, and he was kind enough to demonstrate how this can be accomplished using WebMatrix:
    1. Open WebMatrix (it can be installed for free from www.microsoft.com/web) and click on "Site from Template".
    2. Click on the "Bakery" template (since by default it uses a SQL CE database and has all the required sample data) and click "OK".
    3. In the project, navigate to the Database tab and you will find that the Bakery site uses a SQL CE database, "bakery.sdf".
    4. Select "bakery.sdf" and you will see the "Migrate" button at the top right.
    5. Once you click the "Migrate" button, a popup wizard opens, configured for SQL Express by default. You can edit it to point to your local SQL Server instance, or to a remote server.
    6. Upon filling in the server name, username and password, when you click "OK", a couple of things happen:
    • The database is migrated to SQL Server (local or remote, subject to permissions on the remote server). You can open SQL Server Management Studio and connect to the server to verify that the "bakery" database exists under the "Databases" node.
    • You can also notice in WebMatrix that when you navigate to the "Files" tab and open the web.config file, the connection string now points to the SQL Server instance (yes, the Migrate button was smart enough to make this change too).
    And there it is: your SQL Server Compact Edition database, now migrated to SQL Server! In a future post, I will explain the steps involved when using Visual Studio. Cheers!!!
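    To double-check the first point from SQL Server itself, a quick query works as well. This is a minimal sketch, not from the original post; it assumes the migrated database kept the default name "bakery" used by the template:

        -- Run against the target SQL Server instance after the migration completes.
        -- Confirms the migrated database is present (name assumed from the Bakery template).
        SELECT name, create_date, compatibility_level
        FROM sys.databases
        WHERE name = N'bakery';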

    Read the article

  • Worker roles in Windows Azure to host a multiplayer server

    - by MrWiggels
    I've been doing research on where to host a simple multiplayer backend for a simple game I'm developing. As a first choice I downloaded the Windows Azure SDK, which provides a nice and simple emulator environment where you can test out your application before uploading. I also downloaded the Azure Social Game Toolkit and followed along as far as my understanding can take me. So, down to the main question: is there anybody with experience developing Azure applications? I'm developing an action RPG game, in a similar vein to Diablo III, and I was thinking of putting up matchmaking, friends lists, etc. Is there a way to connect to Azure services via something like UDP or TCP for sending packets, or does everything have to go through HTTP requests? Is it even possible to use HTTP request/response for something like this? All game commands will be simple. Because the game server and the clients will be kept in sync and will have deterministic actions, I'm just going to send actions like "Use Primary Skill" and "Use Secondary Skill". Any hints, ideas, light bulbs or a smack-in-the-face presentation will be much appreciated.

    Read the article

  • sql-server: Can I update two tables with a single query?

    - by RedsDevils
    How can I write a single UPDATE query to change the value of COL1 to 'X' if COL2 < 10 and otherwise change it to 'Y', where the following two tables are linked by ID?
    CREATE TABLE TEMP(ID TINYINT, COL1 CHAR(1))
    INSERT INTO TEMP(ID,COL1) VALUES (1,'A')
    INSERT INTO TEMP(ID,COL1) VALUES (2,'B')
    INSERT INTO TEMP(ID,COL1) VALUES (11,'A')
    INSERT INTO TEMP(ID,COL1) VALUES (17,'B')
    CREATE TABLE TEMP2(ID TINYINT, COL2 TINYINT)
    INSERT INTO TEMP2(ID,COL2) VALUES (1,1)
    INSERT INTO TEMP2(ID,COL2) VALUES (2,5)
    INSERT INTO TEMP2(ID,COL2) VALUES (11,10)
    INSERT INTO TEMP2(ID,COL2) VALUES (17,15)
    Thanks in advance!
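    One possible shape for such a query (a sketch, not taken from the original post) is an UPDATE joined to TEMP2, with a CASE expression deciding the new value per row:

        -- Sketch: set COL1 for each TEMP row based on the matching TEMP2.COL2 value.
        UPDATE T
        SET COL1 = CASE WHEN T2.COL2 < 10 THEN 'X' ELSE 'Y' END
        FROM TEMP AS T
        INNER JOIN TEMP2 AS T2
            ON T2.ID = T.ID;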

    Read the article

  • what's a good way to synchronize a sql server 2008 database from a 2005 database automatically?

    - by Keith Nicholas
    OK, the scenario is: two servers, on completely different parts of the internet. The SQL Server 2008 database just needs to get data updates and schema changes; it doesn't need to send anything back to the 2005 database. Basically it should just pull data and schema as efficiently as possible, automatically, as a scheduled task. The database is quite huge, but the changes per day are probably around 20-30 megabytes of data. I can't run any of the built-in replication on the 2005 database. I've had a wee look at the Sync Framework; I think that might do what I want, but it seems a bit painful and requires a bit of work to get going. I'm wondering if there is tooling out there to make this easier, or what my other options are.
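    One blunt option, not mentioned in the post and only practical if shipping a full backup across the link on a schedule is acceptable given the database size, is a scheduled backup on the 2005 server and a restore on the 2008 server (SQL Server 2008 can restore a 2005 backup, though not the other way around). A rough sketch, with placeholder database, path, and logical file names:

        -- On the SQL Server 2005 source (run on a schedule, e.g. via a SQL Agent job):
        BACKUP DATABASE SourceDb
        TO DISK = N'D:\Transfer\SourceDb.bak'
        WITH INIT;

        -- After copying SourceDb.bak to the 2008 server:
        RESTORE DATABASE SourceDb
        FROM DISK = N'E:\Transfer\SourceDb.bak'
        WITH REPLACE,
             MOVE 'SourceDb'     TO N'E:\Data\SourceDb.mdf',
             MOVE 'SourceDb_log' TO N'E:\Data\SourceDb_log.ldf';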

    Read the article

  • secure data transport between web server and database server

    - by atypicalgeek
    I'm planning on provisioning a web server and a database server in a server farm environment. They will be on the same network but not in the same domain; both run Windows Server 2008, and the database server runs SQL Server 2008. My question is: what is the best way to secure data in transit between the servers? I've looked into IPsec and SSL but am not sure how to go about implementing either.
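    As a side note on the SSL route: SQL Server can be configured to force encrypted connections (via the Force Encryption setting and a certificate in SQL Server Configuration Manager), and once that is in place you can verify from T-SQL whether sessions are actually encrypted. A small sketch using a system DMV, not part of the original question:

        -- Shows, per connection, whether traffic is encrypted and how the login authenticated.
        SELECT session_id, encrypt_option, auth_scheme, client_net_address
        FROM sys.dm_exec_connections;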

    Read the article

  • Client/Server app

    - by Knowing me knowing you
    Guys, could anyone tell me what I'm doing wrong here? Below are four files: Client, Main (client), Server, Main (server). I'm getting an error on the client side after trying fromServer.readLine(). Error:
        Error in Client Software caused connection abort: recv failed

        package client;

        import java.io.*;
        import java.net.*;
        import java.util.Scanner;

        public class Client {

            private PrintWriter toServer;
            private BufferedReader fromServer;
            private Socket socket;

            public Client() throws IOException {
                socket = new Socket("127.0.0.1", 3000);
            }

            public void openStreams() throws IOException {
                //
                // InputStream is = socket.getInputStream();
                // OutputStream os = socket.getOutputStream();
                // fromServer = new BufferedReader(new InputStreamReader(is));
                // toServer = new PrintWriter(os, true);
                toServer = new PrintWriter(socket.getOutputStream(), true);
                fromServer = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            }

            public void closeStreams() throws IOException {
                fromServer.close();
                toServer.close();
                socket.close();
            }

            public void run() throws IOException {
                openStreams();
                String msg = "";
                Scanner scanner = new Scanner(System.in);
                toServer.println("Hello from Client.");
                // msg = scanner.nextLine();
                while (msg != "exit") {
                    System.out.println(">");
                    // msg = scanner.nextLine();
                    toServer.println("msg");
                    String tmp = fromServer.readLine();
                    System.out.println("Server said: " + tmp);
                }
                closeStreams();
            }
        }

        package server;

        import java.net.*;
        import java.io.*;

        public class Server {

            private ServerSocket serverSocket;
            private Socket socket;
            private PrintWriter toClient;
            private BufferedReader fromClient;

            public void run() throws IOException {
                System.out.println("Server is waiting for connections...");
                while (true) {
                    openStreams();
                    processClient();
                    closeStreams();
                }
            }

            public void openStreams() throws IOException {
                serverSocket = new ServerSocket(3000);
                socket = serverSocket.accept();
                toClient = new PrintWriter(socket.getOutputStream(), true);
                fromClient = new BufferedReader(new InputStreamReader(socket.getInputStream()));
            }

            public void closeStreams() throws IOException {
                fromClient.close();
                toClient.close();
                socket.close();
                serverSocket.close();
            }

            public void processClient() throws IOException {
                System.out.println("Connection established.");
                String msg = fromClient.readLine();
                toClient.println("Client said " + msg);
            }
        }

        package client;

        import java.io.IOException;

        public class Main {

            /**
             * @param args the command line arguments
             */
            public static void main(String[] args) {
                // TODO code application logic here
                try {
                    Client client = new Client();
                    client.run();
                } catch (IOException e) {
                    System.err.println("Error in Client " + e.getMessage());
                }
            }
        }

        package server;

        import java.io.IOException;

        public class Main {

            /**
             * @param args the command line arguments
             */
            public static void main(String[] args) {
                // TODO code application logic here
                Server server = new Server();
                try {
                    server.run();
                } catch (IOException e) {
                    System.err.println("Error in Server " + e.getMessage());
                }
            }
        }

    Read the article

  • SQLAuthority News – Microsoft SQL Server 2005 Service Pack 4 RTM

    - by pinaldave
    Service Pack 4 (SP4) for Microsoft SQL Server 2005 is now available for download. SQL Server 2005 service packs are cumulative, and this service pack upgrades all service levels of SQL Server 2005 to SP4.
    Download Microsoft SQL Server 2005 Service Pack 4 RTM
    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Service Pack, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology
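    After applying the service pack, a quick way to confirm that an instance actually reports SP4 is to query SERVERPROPERTY. This is a small sketch, not part of the original announcement:

        -- ProductLevel should report 'SP4' once the service pack is installed.
        SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,
               SERVERPROPERTY('ProductLevel')   AS ProductLevel,
               SERVERPROPERTY('Edition')        AS Edition;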

    Read the article

  • SQL SERVER – How to Recover SQL Database Data Deleted by Accident

    - by Pinal Dave
    In Repair a SQL Server database using a transaction log explorer, I showed how to use ApexSQL Log, a SQL Server transaction log viewer, to recover a SQL Server database after a disaster. In this blog, I’ll show you how to use another SQL Server disaster recovery tool from ApexSQL in a situation when data is accidentally deleted. You can download ApexSQL Recover here, install, and play along. With a good SQL Server disaster recovery strategy, data recovery is not a problem. You have a reliable full database backup with valid data, a full database backup and subsequent differential database backups, or a full database backup and a chain of transaction log backups. But not all situations are ideal. Here we’ll address some sub-optimal scenarios, where you can still successfully recover data. If you have only a full database backup This is the least optimal SQL Server disaster recovery strategy, as it doesn’t ensure minimal data loss. For example, data was deleted on Wednesday. Your last full database backup was created on Sunday, three days before the records were deleted. By using the full database backup created on Sunday, you will be able to recover SQL database records that existed in the table on Sunday. If there were any records inserted into the table on Monday or Tuesday, they will be lost forever. The same goes for records modified in this period. This method will not bring back modified records, only the old records that existed on Sunday. If you restore this full database backup, all your changes (intentional and accidental) will be lost and the database will be reverted to the state it had on Sunday. What you have to do is compare the records that were in the table on Sunday to the records on Wednesday, create a synchronization script, and execute it against the Wednesday database. If you have a full database backup followed by differential database backups Let’s say the situation is the same as in the example above, only you create a differential database backup every night. Use the full database backup created on Sunday, and the last differential database backup (created on Tuesday). In this scenario, you will lose only the data inserted and updated after the differential backup created on Tuesday. If you have a full database backup and a chain of transaction log backups This is the SQL Server disaster recovery strategy that provides minimal data loss. With a full chain of transaction logs, you can recover the SQL database to an exact point in time. To provide optimal results, you have to know exactly when the records were deleted, because restoring to a later point will not bring back the records. This method requires restoring the full database backup first. If you have any differential log backup created after the last full database backup, restore the most recent one. Then, restore transaction log backups, one by one, it the order they were created starting with the first created after the restored differential database backup. Now, the table will be in the state before the records were deleted. You have to identify the deleted records, script them and run the script against the original database. Although this method is reliable, it is time-consuming and requires a lot of space on disk. How to easily recover deleted records? The following solution enables you to recover SQL database records even if you have no full or differential database backups and no transaction log backups. 
To understand how ApexSQL Recover works, I’ll explain what happens when table data is deleted. Table data is stored in data pages. When you delete table records, they are not immediately deleted from the data pages, but marked to be overwritten by new records. Such records are not shown as existing anymore, but ApexSQL Recover can read them and create undo script for them. How long will deleted records stay in the MDF file? It depends on many factors, as time passes it’s less likely that the records will not be overwritten. The more transactions occur after the deletion, the more chances the records will be overwritten and permanently lost. Therefore, it’s recommended to create a copy of the database MDF and LDF files immediately (if you cannot take your database offline until the issue is solved) and run ApexSQL Recover on them. Note that a full database backup will not help here, as the records marked for overwriting are not included in the backup. First, I’ll delete some records from the Person.EmailAddress table in the AdventureWorks database.   I can delete these records in SQL Server Management Studio, or execute a script such as DELETE FROM Person.EmailAddress WHERE BusinessEntityID BETWEEN 70 AND 80 Then, I’ll start ApexSQL Recover and select From DELETE operation in the Recovery tab.   In the Select the database to recover step, first select the SQL Server instance. If it’s not shown in the drop-down list, click the Server icon right to the Server drop-down list and browse for the SQL Server instance, or type the instance name manually. Specify the authentication type and select the database in the Database drop-down list.   In the next step, you’re prompted to add additional data sources. As this can be a tricky step, especially for new users, ApexSQL Recover offers help via the Help me decide option.   The Help me decide option guides you through a series of questions about the database transaction log and advises what files to add. If you know that you have no transaction log backups or detached transaction logs, or the online transaction log file has been truncated after the data was deleted, select No additional transaction logs are available. If you know that you have transaction log backups that contain the delete transactions you want to recover, click Add transaction logs. The online transaction log is listed and selected automatically.   Click Add if to add transaction log backups. It would be best if you have a full transaction log chain, as explained above. The next step for this option is to specify the time range.   Selecting a small time range for the time of deletion will create the recovery script just for the accidentally deleted records. A wide time range might script the records deleted on purpose, and you don’t want that. If needed, you can check the script generated and manually remove such records. After that, for all data sources options, the next step is to select the tables. Be careful here, if you deleted some data from other tables on purpose, and don’t want to recover them, don’t select all tables, as ApexSQL Recover will create the INSERT script for them too.   The next step offers two options: to create a recovery script that will insert the deleted records back into the Person.EmailAddress table, or to create a new database, create the Person.EmailAddress table in it, and insert the deleted records. I’ll select the first one.   The recovery process is completed and 11 records are found and scripted, as expected.   
To see the script, click View script. ApexSQL Recover has its own script editor, where you can review, modify, and execute the recovery script. The insert into statements look like: INSERT INTO Person.EmailAddress( BusinessEntityID, EmailAddressID, EmailAddress, rowguid, ModifiedDate) VALUES( 70, 70, N'[email protected]' COLLATE SQL_Latin1_General_CP1_CI_AS, 'd62c5b4e-c91f-403f-b630-7b7e0fda70ce', '20030109 00:00:00.000' ); To execute the script, click Execute in the menu.   If you want to check whether the records are really back, execute SELECT * FROM Person.EmailAddress WHERE BusinessEntityID BETWEEN 70 AND 80 As shown, ApexSQL Recover recovers SQL database data after accidental deletes even without the database backup that contains the deleted data and relevant transaction log backups. ApexSQL Recover reads the deleted data from the database data file, so this method can be used even for databases in the Simple recovery model. Besides recovering SQL database records from a DELETE statement, ApexSQL Recover can help when the records are lost due to a DROP TABLE, or TRUNCATE statement, as well as repair a corrupted MDF file that cannot be attached to as SQL Server instance. You can find more information about how to recover SQL database lost data and repair a SQL Server database on ApexSQL Solution center. There are solutions for various situations when data needs to be recovered. Reference: Pinal Dave (http://blog.sqlauthority.com)Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
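    For reference, the manual point-in-time approach described earlier (full backup, optional differential, then the transaction log chain with STOPAT) translates into a RESTORE sequence along these lines. This is a sketch, not taken from the original post: the database name, backup paths, logical file names, and STOPAT time are all placeholders, and the restore targets a copy so the deleted rows can be scripted back into the original database afterwards.

        -- Restore the last full backup to a copy, without recovering yet
        -- (logical file names and paths are placeholders).
        RESTORE DATABASE AdventureWorks_Copy
        FROM DISK = N'D:\Backup\AdventureWorks_full.bak'
        WITH NORECOVERY,
             MOVE 'AdventureWorks_Data' TO N'D:\Data\AdventureWorks_Copy.mdf',
             MOVE 'AdventureWorks_Log'  TO N'D:\Data\AdventureWorks_Copy.ldf';

        -- Restore the most recent differential backup, if one exists.
        RESTORE DATABASE AdventureWorks_Copy
        FROM DISK = N'D:\Backup\AdventureWorks_diff.bak'
        WITH NORECOVERY;

        -- Restore log backups in the order they were taken;
        -- on the last one, stop just before the accidental delete.
        RESTORE LOG AdventureWorks_Copy
        FROM DISK = N'D:\Backup\AdventureWorks_log_1.trn'
        WITH NORECOVERY;

        RESTORE LOG AdventureWorks_Copy
        FROM DISK = N'D:\Backup\AdventureWorks_log_2.trn'
        WITH STOPAT = '20030109 09:55:00', RECOVERY;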

    Read the article

  • SQL SERVER – Automation Process Good or Ugly

    - by pinaldave
    This blog post is written in response to T-SQL Tuesday, hosted by SQL Server Insane Asylum. The idea of this post really caught my attention. Automation – something getting itself done after the initial programming – is my understanding of the subject. The very next thought was: is it good or evil? The reality is there is no right answer. However, let's quickly note a few things, and then I would like to request your help to complete this post. We will start with the positive parts of SQL Server where automation happens.
    The Good
    If I start thinking of SQL Server and automation, the very first thing that comes to my mind is SQL Agent, which runs various jobs. Once I configure any task or job, it runs fine (till something goes wrong!). Well, automation has its own advantages. We have all used SQL Agent for so many things – backups, various validation jobs, maintenance jobs and numerous other things. What other kinds of automation tasks do you run on your database server?
    The Ugly
    This part is very interesting, because it can get really ugly(!). During my career I have found so many bad automated agent jobs:
    • A client had an agent job that dropped the clean buffers every hour.
    • A client used Database Mail to send regular emails instead of only the necessary alert-related emails.
    • The best one – a client used the new Missing Index and Unused Index scripts in a SQL Agent job to follow the suggestions 100%. Believe me, I have never seen such a badly performing and hard-to-optimize database. (I ended up dropping all non-clustered indexes on the development server, ran the production workload on the development server again, and then configured it with optimal indexes.)
    Shrinking a database is a performance killer. It should never be automated: SQL SERVER – Shrinking Database is Bad – Increases Fragmentation – Reduces Performance. The one I hate the most is AutoShrink Database. It has given me a hard time in my career quite a few times: SQL SERVER – SHRINKDATABASE For Every Database in the SQL Server.
    Automation is necessary, but common sense is a must when creating automation.
    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
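    On the AutoShrink point: a quick way to find databases with the option turned on, and to turn it off, is shown below. This is a small sketch, not from the original post; the database name is a placeholder:

        -- List databases that currently have auto-shrink enabled.
        SELECT name
        FROM sys.databases
        WHERE is_auto_shrink_on = 1;

        -- Turn it off for a given database (placeholder name).
        ALTER DATABASE [YourDatabase] SET AUTO_SHRINK OFF;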

    Read the article

  • problem installing mysql on ubuntu server 10.10 machine

    - by badperson
    Hi, I tried installing MySQL a couple of times and I'm having problems. First of all, when I install it, it gives me a message that it's setting up and then it just hangs. I can't Ctrl+C out of it, so I reboot the server and try to log into the db with:
        sudo mysql -u root -p
    I enter my password and then get:
        ERROR 2002 (HY000): Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)
    I restart the server:
        sudo /etc/init.d/mysql start
        Rather than invoking init scripts through /etc/init.d, use the service(8) utility, e.g. service mysql start
        Since the script you are attempting to invoke has been converted to an Upstart job, you may also use the start(8) utility, e.g. start mysql
    I try this:
        aptitude search mysql | grep ^i
        i A libdbd-mysql-perl       - Perl5 database interface to the MySQL data
        i   libmysql-java           - Java database (JDBC) driver for MySQL
        i A libmysqlclient16        - MySQL database client library
        i   mysql-client-5.1        - MySQL database client binaries
        i A mysql-client-core-5.1   - MySQL database core client binaries
        i   mysql-common            - MySQL database common files, e.g. /etc/mys
        i   mysql-embedded          - MySQL - embedded library
        i   mysql-server-core-5.1   - MySQL database server binaries
    When I navigate to the folder to see if the *.sock file exists ('/var/run/mysqld/mysqld.sock'), it does not. I also try this:
        service mysql status
        status: Unable to connect to system bus: Failed to connect to socket /var/run/dbus/system_bus_socket: No such file or directory
    Any ideas? On my other machines installing MySQL has been a snap; I'm not sure what the problem is here.
    bp

    Read the article
