Search Results

Search found 28733 results on 1150 pages for 'sql cheat sheet'.

  • SQL Server 2008 R2 Released (RTM)

    - by Aamir Hasan
    Microsoft announced the release to manufacturing (RTM) of SQL Server 2008 R2 on 21st April 2010. See the official announcement here. The key enhancements Microsoft emphasized in the release note are: managed self-service business intelligence (BI) for reporting and analysis; enterprise-class scalability and greater IT efficiency; and platform integration spanning the data center to the cloud. How to get started: download a trial of SQL Server 2008 R2 from the official download page.

    Read the article

  • Sell Yourself! Presentation

    - by Mike C
    Thanks to everyone who attended my "Sell Yourself!" presentation at SQLSaturday #61 in Washington, D.C., and thanks to NOVA SQL for setting up the event! I'm uploading the presentation deck here in PDF, at its original length and with new materials (I had to cut some slides due to time limits). This deck includes a new section on recruiters and a little more information on the resume. BTW, if you're rewriting your resume, I highly recommend the book Elements of Resume Style by S. Bennett. I've used it as...(read more)

    Read the article

  • SQL Server 2008 Cluster Installation - First network name always fails

    - by boflynn
    I'm testing failover clustering in Windows Server 2008 to host a SQL Server 2008 installation, using this installation guide. My base cluster is installed and working properly, as is clustering of the DTC service. However, when it comes time to install SQL Server, my first attempt at installation always fails with the same message and seems to "taint" the network name. For example, with my previous cluster attempt, I was installing SQL Server as VSQL. After approximately 15 attempts at installation and trying to resolve the errors (e.g. changing domain accounts for SQL, setting SPNs, etc.), I typoed the network name as VQSL and the installation worked. Similarly, on my current cluster I tried installing with the SQL service named PROD-C1-DB and got the same errors as last time until I tried changing the name to anything else (e.g. PROD-C1-DB1, SQL, TEST, etc.), at which point the install works. It will even install to VSQL now. While testing, my install routine was:
    1. Run setup.exe from patched media, selecting appropriate options.
    2. After the install fails, choose "Remove node from a SQL Server failover cluster" and remove the single failed node.
    3. Attempt to diagnose the problem, inspect event logs, etc.
    4. Delete the computer account that was created for the SQL service from Active Directory.
    5. Delete the MSSQL10.MSSQLSERVER folder from the shared data drive.
    The error message I receive from the SQL Server installer is:
    The following error has occurred: The cluster resource 'SQL Server' could not be brought online. Error: The group or resource is not in the correct state to perform the requested operation. (Exception from HRESULT: 0x8007139F)
    Along with hundreds of the following errors in the Application event log:
    [sqsrvres] checkODBCConnectError: sqlstate = 28000; native error = 4818; message = [Microsoft][SQL Server Native Client 10.0][SQL Server]Login failed for user 'NT AUTHORITY\ANONYMOUS LOGON'.
    System configuration notes:
    - Windows Server 2008 Enterprise Edition x64
    - SQL Server 2008 Enterprise Edition x64 using slipstreamed SP1+CU1 media
    - Dell PowerEdge servers
    - Fibre attached storage

    Read the article

  • SQL Server 2008 - Shrinking the Transaction Log - Any way to automate?

    - by Albert
    I went in and checked my transaction log the other day and it was something crazy like 15GB. I ran the following code:
    USE mydb
    GO
    BACKUP LOG mydb WITH TRUNCATE_ONLY
    GO
    DBCC SHRINKFILE(mydb_log, 8)
    GO
    This worked fine and shrank the log down to 8MB... but the DB in question is a log shipping publisher, and the log is already back up to some 500MB and growing quickly. Is there any way to automate this log shrinking, outside of creating a custom "Execute T-SQL Statement Task" Maintenance Plan task and hooking it onto my log backup task? If that's the best way then fine... but I was just thinking that SQL Server would have a better way of dealing with this. I thought it was supposed to shrink automatically whenever you took a log backup, but that's not happening (perhaps because of my log shipping, I don't know). Here's my current backup plan:
    - Full backups every night
    - Transaction log backups once a day, late morning (maybe hook the log shrinking onto this... though it doesn't need to be shrunk every day)
    Or maybe I just run it once a week, after I run a full backup task? What do you all think?
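
    One caveat on the snippet above: BACKUP LOG ... WITH TRUNCATE_ONLY is deprecated (it was removed in SQL Server 2008), and truncating the log breaks the backup chain that log shipping depends on. If only the shrink step is to be automated, a minimal sketch of a job step scheduled after the regular log backup, reusing the names from the question, could be:

    -- Minimal sketch: shrink the physical log file after the scheduled log
    -- backup has freed internal space. mydb/mydb_log are the names from the
    -- question; the 8MB target is arbitrary.
    USE mydb;
    GO
    DBCC SHRINKFILE(mydb_log, 8);
    GO

    Repeated shrink/grow cycles fragment the file, so sizing the log for its steady-state load and leaving it alone is usually the better long-term option.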

    Read the article

  • MSBuild / PowerShell: Copy SQL Server 2012 database to SQL Azure via BACPAC (for Continuous Integration)

    - by giveme5minutes
    I'm creating a continuous integration MSBuild script which copies a database on an on-premise SQL Server 2012 instance to SQL Azure. Easy, right?
    Methods
    After a fair bit of research I've come across the following methods:
    - Use PowerShell to access the DAC library directly, then use the MSBuild PowerShell extension to wrap the script. This would require installing PowerShell 3 and working out how to make the MSBuild PowerShell extension work with it, as apparently MS moved the DAC API to a different namespace in the latest version of the library. PowerShell would give direct access to the API, but may require quite a bit of boilerplate.
    - Use the sample DAC Framework Client Side Tools, which requires compiling them myself, as the downloads available from Codeplex only include the Hosted version. It would also require fixing them to use DAC 3.0 classes, as they appear to currently use an earlier version of DAC. I could then call these tools from an <Exec Command="" /> in the MSBuild script. Less boilerplate, and if I hit any bumps in the road I can just make changes to the source.
    Processes
    Using either method, the process could be:
    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Upload the BACPAC to blob storage
    3. Import the BACPAC to SQL Azure via the Hosted DAC
    Or:
    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Import the BACPAC to SQL Azure via the Client DAC
    Question
    All of the above seems to be quite a lot of effort for something that seems to be a standard feature... so before I start reinventing the wheel and documenting the results for all to see, is there something really obvious that I've missed here? Is there a pre-written script that MS has released that I have not yet uncovered? There's a command in the GUI of SQL Server Management Studio 2012 that does EXACTLY what I'm trying to do (right-click on a local database, click "Tasks", click "Deploy Database to SQL Azure"). Surely if it's a few clicks in the GUI it must be a single command on the command line somewhere??

    Read the article

  • Upgrading log shipping from 2005 to 2008 or 2008R2

    - by DavidWimbush
    If you're using log shipping you need to be aware of some small print. The general idea is to upgrade the secondary server first and then the primary server, because you can continue to log ship from 2005 to 2008R2. But this won't work if you're keeping your secondary databases in STANDBY mode rather than IN RECOVERY. If you're using native log shipping you'll have some work to do. If you've rolled your own log shipping (ahem) you can convert a STANDBY database to IN RECOVERY like this:
    restore database [dw] with norecovery;
    and then change your restore code to use WITH NORECOVERY instead of WITH STANDBY. (Finally all that aggravation pays off!) You can either upgrade the secondary server in place or rebuild it. A secondary database doesn't actually get upgraded until you recover it, so the log sequence chain is not broken and you can continue shipping from the primary. Just remember that it can take quite some time to upgrade a database, so you need to factor that into the expectations you give people about how long it will take to fail over. For more details, check this out: http://msdn.microsoft.com/en-us/library/cc645954(SQL.105).aspx

    Read the article

  • Introducing SSIS Reporting Pack for SQL Server code-named Denali

    - by jamiet
    In recent blog posts I have introduced the new SSIS Catalog that is forthcoming in SQL Server code-named Denali:
    - What's new in SSIS in Denali
    - Introduction to SSIS Projects in Denali
    - Parameters in SSIS in Denali
    - SSIS Server, Catalogs, Environments and Environment Variables in SSIS in Denali
    The SSIS Catalog is responsible for executing SSIS packages and also for capturing the metadata from those executions. However, at the time of writing there is no mechanism provided to view, analyse and drill into that metadata, and that is the reason that I am, in this blog post, introducing a suite of SSIS Catalog reports called the SSIS Reporting Pack, which you can download from my SkyDrive at http://cid-550f681dad532637.office.live.com/self.aspx/Public/SSIS%20Reporting%20Pack/SSISReportingPack%20v0.1.zip. In this first release the SSIS Reporting Pack includes five reports:
    - Catalog – A high-level summary of all activity in the Catalog
    - Folders – A summary of activity in each Catalog Folder
    - Folder – Project-level activity per single Folder
    - Executions – A visualisation of all executions per Folder/Project/Package/Environment or subset thereof
    - Execution – Information about an individual execution
    Here is a screenshot of the Executions report. Notice that the SSIS Reporting Pack provides a visual overview of all executions in the Catalog: each execution is represented as a bar on the bar chart, the success or otherwise of each execution is indicated by the colour of the bar, and the execution time is indicated by the bar height. I have recorded a video that gives an overview of the SSIS Reporting Pack, which I have embedded below. If you are having any trouble viewing the video, go see it at http://vimeo.com/17617974. I must stress that this is a very early version of the SSIS Reporting Pack and I am expecting it to change a lot over the coming year. I am very keen to get some feedback about this, specifically:
    - let me know if anything does not work as you expect
    - give me your feature requests
    The easiest way to get hold of me for now is within the comments section of this blog post. That's all for now. I hope the SSIS Reporting Pack proves useful and I look forward to hearing your feedback. Lastly, that download link again: http://cid-550f681dad532637.office.live.com/self.aspx/Public/SSIS%20Reporting%20Pack/SSISReportingPack%20v0.1.zip. @jamiet

    Read the article

  • SQL Server 2008 R2 Installation and the Phantom of SQL Server 2005 Express

    - by Davide Mauri
    Today I happily started to install SQL Server 2008 R2 on my development machine, which has this software installed:
    - Windows Server 2008 R2 Standard
    - SQL Server 2008 SP1 CU5
    - Visual Studio 2008 SP1
    - BOL October 2009
    - AdventureWorks2008 Databases SR4
    - Visual Studio 2010 RTM
    So, all the basic standard stuff. SQL Server 2008 R2 installation went smoothly until somewhere in the middle, where the rule engine checks that software prerequisites are satisfied before starting to copy files. Here I had this @][@@[?!?! error:
    "The SQL Server 2005 Express Tools are installed. To continue, remove the SQL Server 2005 Express Tools."
    Funnily enough, I don't have and have never had SQL Server 2005 Express on my machine. Armed with patience, I analyzed the install log here:
    C:\Program Files\Microsoft SQL Server\100\Setup Bootstrap\Log\yyyymmdd_hhmmss\Detail.txt
    and I found that the rule "Sql2005SsmsExpressFacet" is the one in charge of this check: it looks for the existence of the registry key
    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Microsoft SQL Server\90\Tools\ShellSEM (on x86)
    HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Microsoft SQL Server\90\Tools\ShellSEM (on x64)
    In my registry that key existed, due to the installation of the uber-cool Red Gate SQL Search. I removed the registry key and here it is! SQL Server 2008 R2 is installing while I'm writing this post. A note to Microsoft: can you please add more detailed information to the setup when such an error happens? Just saying "you have SQL Server 2005 Express installed" is not enough. Please show us what the rule looks for and why it has failed, directly in the detailed report, so that we don't have to spend time looking for the needle in the logs. Thanks! :) PS I did a side-by-side installation with the existing SQL Server 2008 instance.

    Read the article

  • Six in Six - SQL Server 2012 Webinars

    - by JustinL
    We're running six webinars over the next six months covering our experiences with SQL Server 2012 and customer deployments. I'm presenting the first, on upgrading to SQL Server 2012, next month; subsequent sessions will be delivered by colleagues:
    - NOVEMBER: SQL Server 2012 Upgrade Approach and Considerations – Friday 23rd November 12:00–13:00. Presents approaches for upgrade testing, managing risk and rollback. The session will include details on minimizing downtime and upgrading from SQL Server 2000, 2005, and 2008, including.... More details and register.
    - DECEMBER: Delivering Mission Critical BI with SQL Server 2012 – Friday 14th December 12:00–13:00. Information is the lifeblood of many organisations, and the availability of timely, accurate information is critical to strategic decision making. This session covers the features and capabilities… More details and register.
    - JANUARY: Architecting Highly Available Solutions with SQL Server 2012 – Friday 18th January 12:00–13:00. Overview and comparison of the high availability features available within SQL Server 2012. The session considers business requirements for availability and recoverability and presents a number of alternative solution designs to meet… More details and register.
    - FEBRUARY: Private Cloud Deployments with SQL Server 2012 – Friday 15th February 12:00–13:00. Cloud based technologies provide cost effective scale and flexibility. This session provides an overview of the benefits organisations can realise through private cloud… More details and register.
    - MARCH: Visualising Data Patterns with SQL Server 2012 – Friday 22nd March 12:00–13:00. This webinar demonstrates the ease of delivering business insight by exploring information and identifying trends through data visualisation. SQL Server 2012 provides new capability with enhanced performance and… More details and register.
    - APRIL: Architecting Highly Available Solutions with SQL Server 2012 – Friday 26th April 12:00–13:00. Customers are increasingly interested in leveraging the benefits of cloud based solutions to provide scalable and flexible infrastructure to host their applications. This session looks at common design patterns and workloads… More details and register.
    Justin Langford - Coeo Ltd SQL Server Consultants | SQL Server Remote DBA

    Read the article

  • Oracle SQL Developer: Fetching SQL Statement Result Sets

    - by thatjeffsmith
    Running queries, browsing tables – you are often faced with many thousands, if not millions, of rows. Most people are happy with looking at just the first few rows, but occasionally you need to see more. SQL Developer doesn't show you all records at once; instead, it brings the records down in 'chunks', as needed.
    How It Works
    There is a preference that tells SQL Developer how many records to get in a single request, or 'fetch', of records. The default is 50. So if I run a query that returns MORE than 50 rows, there are more than 50 records in the result set, but we have 50 in the grid to start with. We don't actually know how many records are in the result set; to show a record count we would have to physically query the database with a row-count type query. All we know is that the query has finished executing and that there are rows available to fetch. As you scroll through the grid, when you get to record 50 and scroll further, we'll fetch 50 more records. Or, you can cheat to get to the 'bottom' of the result set: you can ask SQL Developer to get all the records at once. Once all the records have been fetched, you'll see the message "All rows fetched!"
    A word of caution
    There's a reason we have the default set to 50 and not 1000. Bringing back data can get expensive and heavy. We've found the best performance to be in the 50 to 200 record range.

    Read the article

  • LINQ to SQL vs Entity Framework for an app with a future SQL Azure version

    - by Craig L
    I've got a vertical market Dot Net Framework 1.1 C#/WinForms/SQL Server 2000 application. Currently it uses ADO.Net and Microsoft's SQLHelper for CRUD operations. I've successfully converted it to Dot Net Framework 4 C#/WinForms/SQL Server 2008. What I'd like to do is also offer my customers the ability to use SQL Azure as backend storage for their data instead of a local/LAN SQL Server. If I know SQL Azure is in my application's future, should I:
    A. Switch to LINQ to SQL
    B. Switch to Entity Framework
    C. Stick with ADO.Net and SQLHelper
    Thanks!

    Read the article

  • Database Firewall

    - by ???02
    Oracle Database Firewall addresses a gap that web-tier defenses leave open: SQL injection. A web application firewall (WAF) or IDS/IPS inspects HTTP traffic, but it cannot fully validate the SQL statements that ultimately reach the database, so injected SQL can still slip through to both B-to-B and B-to-C web systems, an attack vector the IPA has repeatedly warned about. Oracle Database Firewall, released in 2011, sits on the network between applications and the database and inspects the SQL itself. Rather than pattern-matching, it parses each statement against the SQL grammar standardized in ISO/IEC 9075 (on the order of 400 grammar elements), learns a whitelist of the statements an application normally issues, and then decides whether to Pass or Block each incoming statement. It can be deployed in-line to block traffic, or out-of-band via a SPAN (port mirroring) port for monitoring only. In addition to Oracle Database, it supports SQL Server, DB2 and Sybase. (Oracle Direct)

    Read the article

  • Whether to use UNION or OR in SQL Server Queries

    - by Dinesh Asanka
    Recently I came across an article on DB2 about using UNION instead of OR, so I thought I would carry out some research on SQL Server into which scenarios UNION is optimal in and which scenarios OR would be best. I will analyze this with a few scenarios, using samples taken from the AdventureWorks database Sales.SalesOrderDetail table.
    Scenario 1: Selecting all columns
    We are going to select all columns, and we have a non-clustered index on the ProductID column.
    --Query 1 : OR
    SELECT * FROM Sales.SalesOrderDetail
    WHERE ProductID = 714 OR ProductID = 709 OR ProductID = 998
       OR ProductID = 875 OR ProductID = 976 OR ProductID = 874
    --Query 2 : UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 714
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 709
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 998
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 875
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 976
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 874
    So Query 1 uses OR and Query 2 uses UNION. Let us analyze the execution plans for these queries. As expected, Query 1 uses a Clustered Index Scan, but Query 2 uses all sorts of things; in this case, since it is using multiple CPUs, you might see CX_PACKET waits as well. The profiler results for these two queries:
            CPU   Reads   Duration   Row Count
    OR       78    1252        389        3854
    UNION   250    7495        660        3854
    You can see from the above table that the UNION query does not perform as well as the OR query, though both return the same number of rows (3854). These results indicate that for this scenario OR should be used.
    Scenario 2: Non-clustered and clustered index columns only
    --Query 1 : OR
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail
    WHERE ProductID = 714 OR ProductID = 709 OR ProductID = 998
       OR ProductID = 875 OR ProductID = 976 OR ProductID = 874
    GO
    --Query 2 : UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 714
    UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 709
    UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 998
    UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 875
    UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 976
    UNION
    SELECT ProductID, SalesOrderID, SalesOrderDetailID FROM Sales.SalesOrderDetail WHERE ProductID = 874
    GO
    This time we will be selecting only index columns, which means these queries will avoid a data page lookup. As in the previous case, Query 2's execution plan is more complex than Query 1's. The profiler analysis:
            CPU   Reads   Duration   Row Count
    OR        0      24        208        3854
    UNION     0      38        193        3854
    In this scenario there is only a slight difference between OR and UNION.
    Scenario 3: Selecting all columns for different fields
    Up to now we were using only one column (ProductID) in the WHERE clause. What if we have two columns in the WHERE clause? Let us assume both are covered by non-clustered indexes.
    --Query 1 : OR
    SELECT * FROM Sales.SalesOrderDetail
    WHERE ProductID = 714 OR CarrierTrackingNumber LIKE 'D0B8%'
    --Query 2 : UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 714
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE CarrierTrackingNumber LIKE 'D0B8%'
    As we can see, the query plan for the second query has improved. The profiler results:
            CPU   Reads   Duration   Row Count
    OR       47    1278        443        1228
    UNION    31    1334        400        1228
    So in this case too, there is little difference between OR and UNION.
    Scenario 4: Selecting clustered index columns for different fields
    Now let us go with clustered index columns only:
    --Query 1 : OR
    SELECT * FROM Sales.SalesOrderDetail
    WHERE ProductID = 714 OR CarrierTrackingNumber LIKE 'D0B8%'
    --Query 2 : UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE ProductID = 714
    UNION
    SELECT * FROM Sales.SalesOrderDetail WHERE CarrierTrackingNumber LIKE 'D0B8%'
    Now both execution plans are almost identical, except that an additional Stream Aggregate is used in the first query. This means UNION has an advantage over OR in this scenario. The profiler results again:
            CPU   Reads   Duration   Row Count
    OR        0     319        366        1228
    UNION     0      50        193        1228
    Now see the difference: in this scenario UNION has somewhat of an advantage over OR.
    Conclusion
    Whether to use UNION or OR depends on the scenario you are faced with, so you need to do your analysis before selecting the appropriate method. Also, the above four scenarios are not an exhaustive list; I selected them for broad description purposes only.

    Read the article

  • How to avoid the "divide by zero" error in SQL?

    - by Henrik Staun Poulsen
    I hate this error message:
    Msg 8134, Level 16, State 1, Line 1
    Divide by zero error encountered.
    What is the best way to write SQL code so that I will never see this error message again? I mean, I could add a WHERE clause so that my divisor is never zero. Or I could add a CASE statement, so that there is special treatment for zero. Is using a NULLIF clause the best way? Is there a better way, or how can this be enforced?
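
    For what it's worth, a minimal sketch of the NULLIF approach mentioned above (the values are made up for illustration): dividing by NULL yields NULL instead of error 8134, and ISNULL can map that NULL back to a default.

    -- NULLIF returns NULL when @divisor = 0, so the division yields NULL
    -- instead of raising Msg 8134; ISNULL then substitutes a default of 0.
    DECLARE @numerator int = 100, @divisor int = 0;
    SELECT ISNULL(@numerator / NULLIF(@divisor, 0), 0) AS safe_result;  -- returns 0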

    Read the article

  • Excel cell from one sheet to another sheet

    - by Eric
    Excel 2007. I have two Excel spreadsheets. Sheet1 has a few cells on it that I would like to be populated with a few cells from Sheet2, and I want Sheet1 to be replicated about 400 times, once for each employee in the agency. Here is the example.
    Sheet 1:
    Person name   number1   number2   number3
    cell          cell      cell      cell
    Sheet 2 (information sheet):
    Person name   number1   number2   number3
    Jim           23        32        54
    Sally         25        22        53
    End result:
    Sheet 3:
    Person name   number1   number2   number3
    Jim           23        32        54
    Sheet 4:
    Person name   number1   number2   number3
    Sally         25        22        53
    Any help will be appreciated, thank you.

    Read the article

  • Dynamic openrowset in T-Sql Function or viable alternative?

    - by IronicMuffin
    I'm not quite sure how to phrase this. Here is the problem: I have 1-n items that I need to join to a different system (AS400) to get some data. The OPENROWSET takes forever if I specify the WHERE criteria outside of the OPENROWSET, e.g.:
    select * from openrowset('my connection string', 'select code, myfield from myTable') where code = @code
    My idea was to create a function that takes in the item number and uses dynamic SQL to inject it into the OPENROWSET string, à la:
    declare @cmd varchar(1000)
    set @cmd = 'select * from openrowset(''my connection string'', ''select code, myfield from myTable where code = ''''' + @code + ''''''')'
    Apparently I can't use the INSERT .. EXEC .. strategy inside of a function. Is there any better way to achieve this? I was going to use this in joins where I needed the external data, using CROSS APPLY. I'm not married to TVFs and CROSS APPLY, but I do need a method of getting this data quickly. Thanks for any help.
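
    Since INSERT .. EXEC is not allowed inside a function, one workable shape is a stored procedure wrapping the dynamic OPENROWSET. A minimal sketch, reusing the connection string and table names from the question (the procedure name is made up, and ad hoc distributed queries must be enabled on the server):

    -- Sketch: the whole statement is built as dynamic SQL because OPENROWSET
    -- only accepts literal strings. @code is concatenated into the string,
    -- so validate it first to guard against SQL injection.
    CREATE PROCEDURE dbo.GetRemoteRows
        @code varchar(20)
    AS
    BEGIN
        DECLARE @cmd nvarchar(1000);
        SET @cmd = N'SELECT * FROM OPENROWSET(''my connection string'', '
                 + N'''SELECT code, myfield FROM myTable WHERE code = '''''
                 + @code + N''''''')';
        EXEC (@cmd);
    END

    Callers can then land the rows in a temp table with INSERT INTO #remote EXEC dbo.GetRemoteRows @code = '714'; and join against that, at the cost of losing the inline CROSS APPLY style.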

    Read the article

  • Generated LinqtoSql Sql 5x slower than SAME EXACT hand-written sql

    - by JasonM
    I have a SQL statement which is hardcoded in an existing VB6 app. I'm writing an upgraded version in C# using LINQ to SQL. I was able to get LINQ to SQL to generate the same SQL (before I start refactoring), but for some reason the SQL generated by LINQ to SQL is 5x slower than the original SQL. This is running the generated SQL directly in LINQPad. The only real difference my meager SQL eyes can spot is the WITH (NOLOCK), but adding that to the generated SQL makes no difference. Can someone point out what I'm doing wrong here? Thanks!
    Existing hard-coded SQL (5.0 seconds):
    SELECT DISTINCT CH.ClaimNum, CH.AcnProvID, CH.AcnPatID, CH.TinNum, CH.Diag1, CH.GroupNum, CH.AllowedTotal
    FROM Claims.dbo.T_ClaimsHeader AS CH WITH (NOLOCK)
    WHERE CH.ContractID IN ('123A','123B','123C','123D','123E','123F','123G','123H')
      AND ( ( (CH.Transmited Is Null or CH.Transmited = '')
          AND CH.DateTransmit Is Null
          AND CH.EobDate Is Null
          AND CH.ProcessFlag IN ('Y','E')
          AND CH.DataSource NOT IN ('A','EC','EU')
          AND CH.AllowedTotal > 0 ) )
    ORDER BY CH.AcnPatID, CH.ClaimNum
    Generated SQL from LINQ to SQL (27.6 seconds):
    -- Region Parameters
    DECLARE @p0 NVarChar(4) SET @p0 = '123A'
    DECLARE @p1 NVarChar(4) SET @p1 = '123B'
    DECLARE @p2 NVarChar(4) SET @p2 = '123C'
    DECLARE @p3 NVarChar(4) SET @p3 = '123D'
    DECLARE @p4 NVarChar(4) SET @p4 = '123E'
    DECLARE @p5 NVarChar(4) SET @p5 = '123F'
    DECLARE @p6 NVarChar(4) SET @p6 = '123G'
    DECLARE @p7 NVarChar(4) SET @p7 = '123H'
    DECLARE @p8 VarChar(1) SET @p8 = ''
    DECLARE @p9 NVarChar(1) SET @p9 = 'Y'
    DECLARE @p10 NVarChar(1) SET @p10 = 'E'
    DECLARE @p11 NVarChar(1) SET @p11 = 'A'
    DECLARE @p12 NVarChar(2) SET @p12 = 'EC'
    DECLARE @p13 NVarChar(2) SET @p13 = 'EU'
    DECLARE @p14 Decimal(5,4) SET @p14 = 0
    -- EndRegion
    SELECT DISTINCT [t0].[ClaimNum], [t0].[acnprovid] AS [AcnProvID], [t0].[acnpatid] AS [AcnPatID], [t0].[tinnum] AS [TinNum], [t0].[diag1] AS [Diag1], [t0].[GroupNum], [t0].[allowedtotal] AS [AllowedTotal]
    FROM [Claims].[dbo].[T_ClaimsHeader] AS [t0]
    WHERE ([t0].[contractid] IN (@p0, @p1, @p2, @p3, @p4, @p5, @p6, @p7))
      AND (([t0].[Transmited] IS NULL) OR ([t0].[Transmited] = @p8))
      AND ([t0].[DATETRANSMIT] IS NULL)
      AND ([t0].[EOBDATE] IS NULL)
      AND ([t0].[PROCESSFLAG] IN (@p9, @p10))
      AND (NOT ([t0].[DataSource] IN (@p11, @p12, @p13)))
      AND ([t0].[allowedtotal] > @p14)
    ORDER BY [t0].[acnpatid], [t0].[ClaimNum]
    New LINQ to SQL code (30+ seconds... times out):
    var contractIds = T_ContractDatas.Where(x => x.EdiSubmissionGroupID == "123-01").Select(x => x.CONTRACTID).ToList();
    var processFlags = new List<string> {"Y","E"};
    var dataSource = new List<string> {"A","EC","EU"};
    var results = (from claims in T_ClaimsHeaders
                   where contractIds.Contains(claims.contractid)
                      && (claims.Transmited == null || claims.Transmited == string.Empty)
                      && claims.DATETRANSMIT == null
                      && claims.EOBDATE == null
                      && processFlags.Contains(claims.PROCESSFLAG)
                      && !dataSource.Contains(claims.DataSource)
                      && claims.allowedtotal > 0
                   select new
                   {
                       ClaimNum = claims.ClaimNum,
                       AcnProvID = claims.acnprovid,
                       AcnPatID = claims.acnpatid,
                       TinNum = claims.tinnum,
                       Diag1 = claims.diag1,
                       GroupNum = claims.GroupNum,
                       AllowedTotal = claims.allowedtotal
                   }).OrderBy(x => x.ClaimNum).OrderBy(x => x.AcnPatID).Distinct();
    I'm using the lists of constants above to make LINQ to SQL generate IN ('xxx','xxx', ...); otherwise it uses subqueries, which are just as slow...

    Read the article

  • SQL Developer Debugging, Watches, Smart Data, & Data

    - by thatjeffsmith
    After presenting the SQL Developer PL/SQL debugger for about an hour yesterday at KScope12 in San Antonio, my boss came up and asked, "Now, would you really want to know what the Smart Data panel does?" Apparently I had 'made up' my own story about what that panel's intent is, based on my experience with it. Not good Jeff, not good. It was a very small point of my presentation, but I probably should have read the docs: "The Smart Data tab displays information about variables, using your Debugger: Smart Data preferences. You can also specify these preferences by right-clicking in the Smart Data window and selecting Preferences." (The preference controls the number of variables to display.)
    The Smart Data panel auto-inspects the last X accessed variables. So if you have a program with 26 variables, instead of showing you all 26 it will just show you the last two variables that were referenced in your program. If you were to click on the Data debug panel, you'd see EVERYTHING. And if you only want to see a very specific set of values, then you should use Watches.
    The Smart Data Panel
    As I step through the code, the variables being tracked change as they are referenced; only the most recent ones display. This is controlled by the 'Maximum Locations to Remember' preference.
    The Data Panel
    All variables are displayed, all the time. This might be information overload on large PL/SQL programs where you have many dozens or even hundreds of variables to track.
    Watches
    Watches are added manually and only show what you ask for: data on demand.
    Remember, you can interact with your data
    If you want to do more than just watch, you can mouse-right on a data element and change the value of the variable as the program is running. This is one of the primary benefits of debugging over using DBMS_OUTPUT to track what's happening in your program. Change the values while the program is running to test your 'What if?' scenarios.

    Read the article

  • Using a .MDF SQL Server Database with ASP.NET Versus Using SQL Server

    - by Maxim Z.
    I'm currently writing a website in ASP.NET MVC, and my database (which doesn't have any data in it yet, only the correct tables) uses SQL Server 2008, which I have installed on my development machine. I connect to the database from my application by using the Server Explorer, followed by LINQ to SQL mapping. Once I finish developing the site, I will move it over to my hosting service, which is a virtual hosting plan. I'm concerned that the SQL Server setup currently working on my development machine will be hard to reproduce on the production server, as I'll have to import all the database tables through the hosting control panel. I've noticed that it is possible to create a SQL Server database from inside Visual Studio, which is then stored in the App_Data directory. My questions are the following:
    1. Does it make sense to move my SQL Server DB out of SQL Server and into the App_Data directory as an .mdf file?
    2. If so, how can I move it? I believe this is called the Detach command, is it not?
    3. Are there any performance/security issues that can occur with an .mdf file like this?
    4. Would my intended setup work OK with a typical virtual hosting plan? I'm hoping that the .mdf database won't count against the limited number of SQL Server databases that can be created with my plan.
    I hope this question isn't too broad. Thanks in advance!
    Note: I'm just starting out with ASP.NET MVC and all this, so I might be completely misunderstanding how this is supposed to work.
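
    On the "Detach" part of question 2, a minimal sketch of what that step could look like (the database name is a placeholder; detaching releases the .mdf/.ldf files so they can be copied into App_Data):

    -- Detach the development database so its .mdf file can be copied into
    -- App_Data; run from the master database. 'MySiteDb' is a made-up name.
    USE master;
    GO
    EXEC sp_detach_db @dbname = N'MySiteDb';
    GO

    After copying the .mdf into App_Data, a connection string using AttachDbFilename=|DataDirectory|\MySiteDb.mdf can attach it on demand; this relies on SQL Server Express user instances, so it is worth confirming the host supports that before committing to the approach.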

    Read the article

  • SQL, moving million records from a database to other database [migrated]

    - by Ryoma
    I am a C# developer; I am not really good with SQL. I have a simple question here. I need to move more than 50 million records from one database to another. I tried to use the import function in MS SQL, but it got stuck because the log was full (I got the error message "The transaction log for database 'mydatabase' is full due to 'LOG_BACKUP'"). The database recovery model was set to simple. My friend said that importing millions of records using the task-import data will cause the log to be massive, and told me to use a loop instead to transfer the data. Does anyone know how and why? Thanks in advance.
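
    On the "how and why": one big INSERT must be able to roll back as a single transaction, so the log has to hold all 50 million rows at once; copying in small batches commits as it goes, which lets the simple recovery model truncate the log at each checkpoint. A rough sketch, assuming both databases are on the same instance and the table has an ascending key column (all names here are made up):

    -- Copy in 50,000-row batches keyed on an ascending id column, so each
    -- batch commits separately and the log can truncate between batches.
    DECLARE @lastId bigint = 0;
    WHILE 1 = 1
    BEGIN
        INSERT INTO TargetDb.dbo.MyTable (id, col1, col2)
        SELECT TOP (50000) id, col1, col2
        FROM SourceDb.dbo.MyTable
        WHERE id > @lastId
        ORDER BY id;

        IF @@ROWCOUNT = 0 BREAK;   -- nothing left to copy

        -- rows arrive in id order, so the target's max id marks our position
        SELECT @lastId = MAX(id) FROM TargetDb.dbo.MyTable;
    END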

    Read the article

  • Migrating MOSS 2007 from SQL 2000 to SQL 2005 - Addendum

    - by lunacrescens
    This is a continuation of an earlier question I had about moving the databases for a MOSS 2007 installation from SQL 2000 to SQL 2005. Here's the URL for the original question: http://stackoverflow.com/questions/254517/migrating-moss-2007-from-sql-2000-to-sql-2005. In my test environment, I've successfully moved the databases to the SQL 2005 test machine and things appear to be working fine. But on the "Servers in Farm" page of Central Admin | Operations, it still shows the old (i.e. SQL 2000) server as the Configuration Database Server, and it shows the old config database as the Configuration Database. I know that the SQL 2000 server and old config database showing on this page are NOT being used, because we've deactivated the SQL instance on SQL 2000. I've tried "removing" the server, and got a message that "Uninstalling SharePoint products and technologies" would be the better route. So I disconnected from the test databases, uninstalled SharePoint from the test WFE server, and reinstalled it. That didn't do anything. Before uninstalling/reinstalling I also tried simply rerunning the SharePoint Configuration Wizard, and that didn't do anything either. Does anyone know how to update the Config Server and Config Database on the "Servers in Farm" page after having moved the Config and Content DBs? Is there something I'm missing or overlooking? Thanks.

    Read the article

  • sql server 2008 insert statement question

    - by user61752
    I am learning SQL Server 2008 T-SQL. To insert a varchar value I just need to insert a string such as 'abc', but for an nvarchar value I need to add an N in front (N'abc'). I have a table employee with 2 fields, firstname and lastname, both nvarchar(20).
    insert into employee values('abc', 'def');
    I tested it and it works, so it seems the N is not required. Why do we need to add the N in front for the nvarchar type, and what are the pros and cons of not using it?
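
    A small illustration of the difference, assuming the employee table from the question: a plain string literal is varchar, so any character outside the database's code page is converted (often to '?') before it ever reaches the nvarchar column, while the N prefix keeps the literal Unicode end to end.

    -- 'abc' works either way because every character exists in the code page.
    -- With characters outside it, only the N'...' literal survives intact:
    insert into employee values('Łukasz', N'Łukasz');
    select firstname, lastname from employee;
    -- firstname may come back as 'Lukasz' or '?ukasz' (code-page converted),
    -- while lastname is stored as the original Unicode string.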

    Read the article

  • Proper Settings for SQL Server 2005 Profiler User Auditing

    - by Irwin M. Fletcher
    I have been tasked with creating a catalog of all users/applications that access a particular database at our company. The issue is that we have no idea which users are using the database or what they are doing (which queries and stored procedures they run). I would like to run this over a few days, but want to minimize the amount of information returned in the trace. Could anyone please advise me on the minimal Profiler settings for this?

    Read the article

  • Deleted user default database - SQL Server 2008

    - by RadiantHex
    Hi folks, I cannot connect to the database anymore; I'm getting:
    Cannot open user default database. Login failed.
    I deleted the database during a previous session and then tried to recreate it, but the recreate failed. Now I am stuck with this error. What can I do?
    Edit: I'm using Windows Authentication. Any ideas?
    Fixed: use the command
    sqlcmd -E -d master
    then type:
    ALTER LOGIN [Your Windows Login] WITH DEFAULT_DATABASE=master
    GO
    :)

    Read the article
