Search Results

Search found 847 results on 34 pages for 'sqlserver'.

Page 20/34

  • REPLACENULL in SSIS 2012

    - by Davide Mauri
    While preparing my slides and demos for the forthcoming SQL Server Conference 2012 in Italy, I’ve come across a nice addition to the DTS Expression language which I never noticed before and that also seems unknown to the blogosphere: REPLACENULL. REPLACENULL is the equivalent of ISNULL in T-SQL. It’s *very* useful, especially when loading a fact table of your BI solution and you need to replace non-existing references to dimensions with dummy values. Here’s an example of how it can be used (please notice that in this example I’m NOT loading a fact table). I’ve noticed that the feature was requested by fellow MVP John Welch: http://connect.microsoft.com/SQLServer/feedback/details/636057/ssis-add-a-replacenull-function-to-the-expression-language So: thanks John, and thanks SSIS Team! Ah, btw, the online Help is here: http://msdn.microsoft.com/en-us/library/hh479601(v=sql.110).aspx Enjoy!
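
    As a hedged illustration (the author's original example screenshot is not included in this excerpt), a Derived Column expression of this shape is what REPLACENULL enables; the column name and the dummy key value here are hypothetical:

        SSIS expression:  REPLACENULL(DimCustomerKey, -1)
        T-SQL equivalent: ISNULL(DimCustomerKey, -1)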

    Read the article

  • SQL Server PowerShell Provider And PowerShell Version 2 Get-Command Issue

    - by BuckWoody
    The other day I blogged that the version of the SQL Server PowerShell provider (sqlps) follows the version of PowerShell. That’s all goodness, but it appears to cause an issue for PowerShell 2.0: the Get-Command PowerShell cmdlet returns an error (Object reference not set to an instance of an object) if you are using PowerShell 2.0 and sqlps – it’s a known bug, and I’m happy to report that it is fixed in SP2 for SQL Server 2008 – something that will be released soon. You can read more about this issue here: http://connect.microsoft.com/SQLServer/feedback/details/484732/sqlps-and-powershell-v2-issues

    Read the article

  • What to do after a servicing fails on TFS 2010

    - by Martin Hinshelwood
    What do you do if you run a couple of hotfixes against your TFS 2010 server and you start to see some odd behaviour? A customer of mine encountered that very problem, but they could not simply, or at least not easily, go back a version. You see, around the time of the TFS 2010 launch this company decided to upgrade their entire 250+ development team from TFS 2008 to TFS 2010. They encountered a few problems, owing mainly to the size of their TFS deployment, and the way they were using TFS. They were not doing anything wrong, but when you have the largest deployment of TFS outside of Microsoft you tend to run into problems that most people will never encounter. We are talking half a terabyte of source control in TFS with over 80 proxy servers. It’s certainly the largest deployment I have ever heard of. When they did their upgrade way back in April, they found two major flaws in the product that meant that they had to back out of the upgrade and wait for a couple of hotfixes: KB983504 – Hotfix, KB983578 – Patch, KB2401992 – Hotfix. In the time since they got the hotfixes they have run 6 successful trial migrations, but we are not talking minutes or hours here. When you have 400+ GB of data it takes time to copy it around. It takes time to do the upgrade and it takes time to do a backup. Well, last week it was crunch time: with their developers off for Christmas they had a window of opportunity to complete the upgrade. Now these guys are good, but they wanted Northwest Cadence to be available “just in case”. They did not expect any problems as they already had 6 successful trial upgrades. The problems surfaced around 20 hours in, after the first set of hotfixes had been applied. The new Team Project Collection, the only thing of importance, had disappeared from the Team Foundation Server Administration console. The collection would not reattach either. It would not even list the new collection as attachable! Figure: We know there is a database there, but it does not This was a dire situation, as 20+ hours to repeat would leave the customer out of time, with 250+ developers sitting around doing nothing. We tried everything, and then we stumbled upon the command of last resort. TFSConfig Recover /ConfigurationDB:SQLServer\InstanceName;TFS_ConfigurationDBName /CollectionDB:SQLServer\instanceName;"Collection Name" - http://msdn.microsoft.com/en-us/library/ff407077.aspx WARNING: Never run this command! Now this command does something a little nasty. It assumes that there really should not be anything wrong and sets about fixing it. It ignores any servicing levels in the Team Project Collection database and forcibly applies the latest version of the schema. I am sure you can imagine the types of problems this may cause when the schema is updated leaving the data behind. That said, as far as we could see this collection looked good, and we were even able to find and attach the team project collection to the Configuration database. Figure: After attaching the TPC it enters a servicing mode After reattaching the team project collection we found the message “Re-Attaching”. Well, fair enough, that sounds like something that may need to happen, and after checking that there was disk IO we left it to it. 14+ hours later, it was still not done, so the customer raised a priority support call with MSFT and an engineer helped them out. Figure: Everything looks good, it is just offline. Tip: Did you know that these logs are not represented in the ~/Logs/* folder until they are opened once? 
The engineer dug around a bit and listened to our situation. He knew that we had run the dreaded “tfsconfig recover”, but was not fazed. Figure: This message looks suspiciously like the wrong servicing version As it turns out, the servicing version was slightly out of sync with the schema: KB983504 – schema 341 – yes; KB983578 – schema 344 – sort of; KB2401992 – schema 360 – nope. Figure: KB/Schema table with notation as to its success The schema version above represents the final end-of-run version for that hotfix or patch. The only way forward: the problem was that the version was somewhere between 341 and 344. This is not a nice place to be in, and the engineer gave us the only way forward as the removal of the servicing number from the database, so that the re-attach process would apply the latest schema. If this sounds a little like the “tfsconfig recover” command then you are exactly right. Figure: Sneakily changing that 3 to a 1 should do the trick Figure: Changing the status and dropping the version should do it Now that we have done that we should be able to safely reattach and enable the Team Project Collection. Figure: The TPC is now all attached and running You may think that this is the end of the story, but it is not. After a while of mulling and seeking expert advice we came to the opinion that the database was, for want of a better term, “hosed”. There could well be orphaned data in there, and the likelihood that we would have problems later down the line is pretty high. We contacted the customer back and made them aware that in all likelihood the repaired database was more like a “cut and shut” than anything else, and at the first sign of trouble later down the line it was likely to split in two. So with 40+ hours invested in getting this new database ready the customer threw it away and started again. What would you do? Would you take the “cut and shut” to production and hope for the best?

    Read the article

  • Some datatypes don't honor localization

    - by Peter Larsson
    This bug has haunted me for a while, until today when I decided not to accept it anymore. So I filed a bug over at connect.microsoft.com, https://connect.microsoft.com/SQLServer/feedback/details/636074/some-datatypes-doesnt-honor-localization, and if you feel the way I do, please vote for this bug to be fixed. Here is a very simple repro of the problem:

        DECLARE @Sample TABLE
        (
            a DECIMAL(38, 19),
            b FLOAT
        )

        INSERT @Sample (a, b) VALUES (1E / 7E, 1E / 7E)

        SELECT * FROM @Sample

    Here is the actual output:

        a                                       b
        --------------------------------------- ----------------------
        0.1428571428571428400                   0,142857142857143

    I think that both columns should have the same decimal separator, don't you? //Peter

    Read the article

  • Why do we keep using CSV?

    - by Stephen
    Why do we keep using CSV? I recently made a shift to working in the health domain, and despite the wonderful work in data transfer standards, all data transfer is in CSV, both for reporting to external organisations and for data migrations when implementing new systems. Unfortunately the use of CSV is the cause of the endless repetition of the same stupid errors, with the same waste of developer time (bad escaping, failing to handle null fields, etc.). I know we can do better, and anything between JSON and XML (depending on the instance) would be fine. (Most of the time this is data going from one MS SQL Server 2005 instance to another!) I feel as if each time I see this happening I am literally watching one developer waste another's time. So why do we keep shafting each other? When will we stop?

    Read the article

  • SQL Server MCM Changes and Readiness Videos

    - by Enrique Lima
    Towards the end of 2010, Microsoft made some changes to the Microsoft Certified Master for SQL Server 2008 program. The certification process used to require a 3-week bootcamp/course in Redmond. This has now changed: it has been mapped to 2 exams. Get information from Microsoft Learning with regard to the changes, process, resources and pricing for the certification exams: http://www.microsoft.com/learning/en/us/certification/master-sql-path.aspx What has also happened is that some SQL MCM rotation instructors and SQL MCMs have created materials to prep for those exams. I see this as a huge benefit for individuals who are planning to take on the MCM, but it really is of huge benefit for all individuals who work with SQL Server on a regular basis. Check the Readiness Videos as a great starting point: http://technet.microsoft.com/en-us/sqlserver/ff977043.aspx

    Read the article

  • SQL Server 2012 RTM Available!

    - by Davide Mauri
    SQL Server 2012 is available for download! http://www.microsoft.com/sqlserver/en/us/default.aspx The Evaluation version is available here: http://www.microsoft.com/download/en/details.aspx?id=29066 and along with the SQL Server 2012 RTM the Feature Pack is also available: http://www.microsoft.com/download/en/details.aspx?id=29065 The Feature Pack is rich with useful and interesting stuff: some items are required by specific features, like the Semantic Language Statistics Database, and others are a very good (I would say necessary) download if you use certain technologies, like MDS or Data Mining. Btw, the updated Excel Add-in for Data Mining has also been released and is available in the Feature Pack. As if this were not enough, the SQL Server Data Tools IDE has also been released as RTM: http://msdn.microsoft.com/en-us/data/hh297027 Remember that SQL Server Data Tools is completely free and can be used with SQL Server 2005 and later. Happy downloading!

    Read the article

  • In SQLCMD mode, should CONNECT be an implicit batch separator?

    - by Greg Low
    Hi Folks, I've been working with SQLCMD mode again today and one thing about it always bites me. If I execute a script like:

        ::CONNECT SERVER1
        SELECT @@VERSION;
        ::CONNECT SERVER2
        SELECT @@VERSION;
        ::CONNECT SERVER3
        SELECT @@VERSION;

    I'm sure I'm not the only person that would be surprised to see all three SELECT commands executed against SERVER3 and none executed against SERVER1 or SERVER2. If you think that's odd behavior, here's where to vote: https://connect.microsoft.com/SQLServer/feedback/details/611144/sqlcmd-connect-to-a-different-server-should-be-an-implicit-batch-separator#detail...(read more)
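
    A hedged aside, not part of the original post: the behaviour follows from batching. Without GO the three SELECTs form a single batch, and that batch is sent over whichever connection is current when it finally runs. Splitting the script into one batch per CONNECT, roughly as sketched below, should make each SELECT run against its intended server:

        ::CONNECT SERVER1
        SELECT @@VERSION;
        GO
        ::CONNECT SERVER2
        SELECT @@VERSION;
        GO
        ::CONNECT SERVER3
        SELECT @@VERSION;
        GO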

    Read the article

  • EJB3 - using 2 persistence units within a transaction (Exception: Local transaction already has 1 no

    - by Sorcha
    I am trying to use 2 persistence units within the same transaction in a JEE application deployed on Glassfish. The 2 persistence units are defined in persistence.xml, as follows: <persistence-unit name="BeachWater"> <jta-data-source>jdbc/BeachWater</jta-data-source> ... <persistence-unit name="LIMS"> <jta-data-source>jdbc/BeachWaterLIMS</jta-data-source> ... These persistence units correspond to JDBC resources and connection pools which I had defined in Glassfish as follows (include one here as both are identical apart from names & database connection info): JDBC Resource: JNDI Name: jdbc/BeachWaterLIMS Pool Name: BEACHWATER_LIMS Connection Pool: Name: BEACHWATER_LIMS Datasource Classname: com.microsoft.sqlserver.jdbc.SQLServerConnectionPoolDataSource Resource Type: javax.sql.ConnectionPoolDataSource There are 3 stateless session beans, LimsServiceBean, AnalysisServiceBean and AnalysisDataTransformationServiceBean. Here are the relevant snippets from LimsServiceBean: @PersistenceContext(unitName = "LIMS") EntityManager em; ... public ArrayList<Sample> getLatestLIMSData() { Query q = em.createNamedQuery("Sample.findBySubTypeStatus"); return new ArrayList<Sample>(q.getResultList()); } From AnalysisServiceBean: @PersistenceContext(unitName = "BeachWater") EntityManager em; ... public ArrayList<AnalysisType> getAllAnalysisTypes() { Query q = em.createNamedQuery("AnalysisType.findAll"); return new ArrayList<AnalysisType>(q.getResultList()); } And from AnalysisDataTransformationServiceBean: @EJB private AnalysisService analysisService; @EJB private LimsService limsService; public void transformData() { List<AnalysisType> analysisTypes = analysisService.getAllAnalysisTypes(); ArrayList<Sample> samples = limsService.getLatestLIMSData(); This call to limsService.getLatestLIMSData() caused the following exception: [exec] Caused by: javax.ejb.TransactionRolledbackLocalException: Exception thrown from bean; nested exception is: Exception [TOPLINK-4002] (Oracle TopLink Essentials - 2.1 (Build b60e-fcs (12/23/2008))): oracle.toplink.essentials.exceptions.DatabaseException [exec] Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: java.lang.IllegalStateException: Local transaction already has 1 non-XA Resource: cannot add more resources. Having consulted this page, http://msdn.microsoft.com/en-us/library/ms378484.aspx (among many others), I tried changing the definition of the connection pools to: Connection Pool: Name: BEACHWATER_LIMS Datasource Classname: com.microsoft.sqlserver.jdbc.SQLServerXADataSource Resource Type: javax.sql.XADataSource Ping via the Glassfish admin console succeeds, but call to analysisService.getAllAnalysisTypes() now throws an exception: Caused by: javax.ejb.TransactionRolledbackLocalException: Exception thrown from bean; nested exception is: Exception [TOPLINK-4002] (Oracle TopLink Essentials - 2.1 (Build b60e-fcs (12/23/2008))): oracle.toplink.essentials.exceptions.DatabaseException Internal Exception: java.sql.SQLException: Error in allocating a connection. Cause: javax.transaction.SystemException Any ideas?
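
    A hedged note beyond what the question covers: with the Microsoft JDBC driver, switching the pools to SQLServerXADataSource also requires XA support to be enabled on the SQL Server side (enabling XA transactions in MSDTC and running the xa_install.sql script that ships with the driver, as described on the MSDN page already linked in the question), and the SQL login used by the pools must be added to the SqlJDBCXAUser role. A sketch, assuming a hypothetical login/user named beachwater_user:

        -- run after executing the driver's xa_install.sql in the master database
        USE master;
        CREATE USER beachwater_user FOR LOGIN beachwater_user;
        EXEC sp_addrolemember 'SqlJDBCXAUser', 'beachwater_user';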

    Read the article

  • SQL 2008 R2 login/network issue

    - by martinjd
    I have a new, clean install of Windows Server 2008 R2 (not a VM) that I have added to a Windows Server 2003 based domain using my account, which has domain admin rights. The domain functional level is 2003. I performed a clean install of SQL Server 2008 R2 using my account, which has domain admin rights. The installation completed without any errors. I logged into SSMS locally and attempted to add another domain account by clicking Search, Advanced and finding the user in the domain. When I return to the "Dialog - New" window and click OK I receive the following error: Create failed for Login 'Domain\User'. (Microsoft.SqlServer.Smo) An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo) Windows NT user or group 'Domain\User' not found. Check the name again. (Microsoft SQL Server, Error: 15401) I have verified that the firewall is off, tried adding a different domain user, tried using SA to add a user, installed the hotfix for KB 976494 and verified that the Local Security Policy settings Domain Member: Digitally encrypt or sign secure channel, Domain Member: Digitally encrypt secure channel and Domain Member: Digitally sign secure channel are disabled, none of which has made a difference. I can RDP to a Server 2003 server running SQL 2008 and add the same domain user without issue. Also, if I try to connect with SSMS to the SQL Server from another system on the domain using my account, I get the following error: Login failed. The login is from an untrusted domain and cannot be used with Windows authentication. (Microsoft SQL Server, Error: 18452) and on the database server I see the following in the security event log: An account failed to log on. Subject: Security ID: NULL SID Account Name: - Account Domain: - Logon ID: 0x0 Logon Type: 3 Account For Which Logon Failed: Security ID: NULL SID Account Name: myUserName Account Domain: MYDOMAIN Failure Information: Failure Reason: An Error occured during Logon. Status: 0xc000018d Sub Status: 0x0 Process Information: Caller Process ID: 0x0 Caller Process Name: - Network Information: Workstation Name: MYWKS Source Network Address: - Source Port: - Detailed Authentication Information: Logon Process: NtLmSsp Authentication Package: NTLM Transited Services: - Package Name (NTLM only): - Key Length: 0 I am sure that the "NULL SID" has some significant meaning but have no idea at this point what the issue could be.
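
    A hedged diagnostic sketch, not from the original question: running the equivalent T-SQL directly takes the SSMS dialog out of the picture and shows whether error 15401 comes from the server's account lookup against the domain (the usual suspects being DNS, the secure channel, or the NETBIOS vs. FQDN form of the domain name). The account name below is a placeholder:

        -- same operation the login dialog performs, minus the UI
        CREATE LOGIN [MYDOMAIN\someuser] FROM WINDOWS;
        -- check whether this server can resolve the domain account to a SID at all
        SELECT SUSER_SID('MYDOMAIN\someuser');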

    Read the article

  • SQLSTATE 08001 error

    - by Joseph
    Hi all, our SQL Server 2008 froze and we were not able to connect to the database, so we restarted the server (Windows 2008). We were not even able to log in to Windows (restarted via HP iLO). Has anybody faced this issue and found a resolution? These are the three errors we got at the time: [165] ODBC Error: 0, Login timeout expired [SQLSTATE HYT00] [298] SQLServer Error: 258, Shared Memory Provider: Timeout error [258]. [SQLSTATE 08001] [382] Logon to server '(local)' failed (ConnUpdateJobActivity_NextScheduledRunDate) Thanks in advance

    Read the article

  • SSMS 2008 Add-In - Execute Query

    - by ca8msm
    I'm loading a sql script up to an SSMS 2008 add-in like so:

        ' create a new blank document
        ServiceCache.ScriptFactory.CreateNewBlankScript(Microsoft.SqlServer.Management.UI.VSIntegration.Editors.ScriptType.Sql)
        ' insert SQL statement to the blank document
        Dim doc As EnvDTE.TextDocument = CType(ServiceCache.ExtensibilityModel.Application.ActiveDocument.Object(Nothing), EnvDTE.TextDocument)
        doc.EndPoint.CreateEditPoint().Insert(_Output.ToString())

    Is there a way to automatically execute the statement as well? Thanks, Mark
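
    A hedged suggestion, not part of the original question: the add-in already reaches the DTE automation model through ServiceCache, and the shell exposes its query commands via ExecuteCommand, so something along these lines should run the script just inserted. The command name is the standard SSMS/Visual Studio one, but treat both it and the exact property path as assumptions to verify:

        ' execute the script in the active query window (assumes the window is connected)
        ServiceCache.ExtensibilityModel.Application.ExecuteCommand("Query.Execute")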

    Read the article

  • SQL Server Express : Reporting Services.. limitations on charts?

    - by Brett
    Hi, if you look at the edition comparison... http://www.microsoft.com/sqlserver/2008/en/us/editions-compare.aspx it says that this option is not enabled for SQL Server Express: Enhanced Gauges and Charting. Does anyone know what the limitations are on this item? I can't seem to find any more information. Many thanks, Brett

    Read the article

  • What’s the Minimum System Spec Recommended For Developer Laptop

    - by DaveDev
    I'll be regularly running Visual Studio 2010 Professional, SQL Server Express, Office and at least 1 virtual environment running a Linux distro. I want the machine to be snappy and responsive even when doing a reasonable amount of development work. I want to spend what it takes for this, but I don't want to go overboard spending more than I need to. I won't be playing many games or doing graphics processing, so I won't need a monster of a machine. Any recommendations?

    Read the article

  • Are there any reserved words in SQLite?

    - by DanM
    Three questions about reserved words: Are there any reserved words in SQLite? If so, what are they? If there are reserved words, is the correct syntax for using one of them as a column or table name still to surround it with brackets? E.g., [User] or [Name]? Are there any implications with using words that are reserved in other flavors of SQL (e.g., SQLServer) but not reserved in SQLite when using ADO.NET to query a SQLite database?
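
    A hedged illustration to go with the question: SQLite does publish a reserved keyword list (on sqlite.org), and it accepts several quoting styles for identifiers that collide with it. Double quotes are the standard SQL form, while square brackets and backticks are accepted for compatibility with SQL Server and MySQL respectively. The names below are only an example:

        -- ORDER, GROUP and INDEX are all SQLite keywords; each quoting style below works
        CREATE TABLE "Order" ( [Group] TEXT, `Index` INTEGER );
        SELECT "Group", `Index` FROM [Order];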

    Read the article

  • sonarinstall Sonar to mssql

    - by senzacionale
    My server name in SQL Server Management Studio is MITJAB-NOTEBOOK\SQL2008. I am now trying to install Sonar on this server:

        sonar.jdbc.url: jdbc:microsoft:sqlserver://MITJAB-NOTEBOOK\SQL2008:1433;DatabaseName=Sonar;SelectMethod=cursor
        sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver
        sonar.jdbc.validationQuery: select 1

    but it is not working. The DB is Sonar, and the username and password are the same. Is the URL OK?
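
    A hedged observation, not from the original post: the driverClassName above is the jTDS driver, while the URL uses the old Microsoft driver format (jdbc:microsoft:sqlserver://...), so the two don't match. A jTDS-style URL for a named instance would look roughly like the sketch below; the server, instance and database names are taken from the question, but the exact property set is an assumption to verify against the jTDS documentation:

        sonar.jdbc.url: jdbc:jtds:sqlserver://MITJAB-NOTEBOOK/Sonar;instance=SQL2008
        sonar.jdbc.driverClassName: net.sourceforge.jtds.jdbc.Driver
        sonar.jdbc.validationQuery: select 1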

    Read the article

  • How do I drop SQL Databases? sp_delete_database_backuphistory woes

    - by rlb.usa
    I want to delete some SQL Databases on my server, but I'm having problems. My login has the roles: public dbcreator serveradmin When I right click the database and hit Delete, it says that Delete backup history failed for server 'MYSERVER' (Microsoft.SqlServer.Smo) Additional Information: The EXECUTE permission was denied on the object 'sp_delete_database_backuphistory' How do I delete these databases?
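
    A hedged workaround sketch, not from the original question: the dbcreator role should cover DROP DATABASE itself; the failing step is only the msdb backup-history cleanup that the SSMS dialog runs first. Dropping from T-SQL (or unchecking the delete-backup-history option in the dialog) sidesteps it. The database name below is a placeholder, and the history cleanup still needs EXECUTE permission on the msdb procedure or a sysadmin to run it:

        USE master;
        -- force out any open connections, then drop
        ALTER DATABASE MyOldDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;
        DROP DATABASE MyOldDb;
        -- optional history cleanup, run by someone with rights in msdb
        EXEC msdb.dbo.sp_delete_database_backuphistory @database_name = N'MyOldDb';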

    Read the article

  • CrystalReports.NET - varbinary blob

    - by BhejaFry
    Hi folks, what's the proper way to display an image stored in a SQL Server database as a varbinary(max) datatype in Crystal Reports for .NET? I have added a 'blobfieldobject' item in Crystal Reports and it is bound to a DataTable with a column of type 'varbinary(max)', but the image won't show up; instead a dark background is displayed. TIA

    Read the article

  • DataTable identity column not set after DataAdapter.Update/Refresh on table with "instead of"-trigge

    - by Arno
    Within our unit tests we use plain ADO.NET (DataTable, DataAdapter) for preparing the database resp. checking the results, while the tested components themselves run under NHibernate 2.1. .NET version is 3.5, SqlServer version is 2005. The database tables have identity columns as primary keys. Some tables apply instead-of-insert/update triggers (this is due to backward compatibility, nothing I can change). The triggers generally work like this: create trigger dbo.emp_insert on dbo.emp instead of insert as begin set nocount on insert into emp ... select @@identity end The insert statement issued by the ADO.NET DataAdapter (generated on-the-fly by a thin ADO.NET wrapper) tries to retrieve the identity value back into the DataRow: exec sp_executesql N' insert into emp (...) values (...); select id, ... from emp where id = @@identity ' But the DataRow's id-Column is still 0. When I remove the trigger temporarily, it works fine - the id-Column then holds the identity value set by the database. NHibernate on the other hand uses this kind of insert statement: exec sp_executesql N' insert into emp (...) values (...); select scope_identity() ' This works, the NHibernate POCO has its id property correctly set right after flushing. Which seems a little bit counter-intuitive to me, as I expected the trigger to run in a different scope, hence @@identity should be a better fit than scope_identity(). So I thought no problem, I will apply scope_identity() instead of @@identity under ADO.NET as well. But this has no effect, the DataRow value is still not updated accordingly. And now for the best part: When I copy and paste those two statements from SqlServer profiler into a Management Studio query (that is including "exec sp_executesql"), and run them there, the results seem to be inverse! There the ADO.NET version works, and the NHibernate version doesn't (select scope_identity() returns null). I tried several times to verify, but to no avail. Of course this just shows the resultset coming from the database, whatever happens inside NHibernate and ADO.NET is another topic. Also, several session properties defined by T-SQL SET are different in the two scenarios (Management Studio query vs. application at runtime) This is a real puzzle to me. I would be happy about any insights on that. Thank you!

    Read the article

  • ELMAH not logging in ASP.NET MVC 2

    - by PsychoCoder
    I cannot figure out what I'm doing wrong here, trying to use ELMAH in my MVC 2 application, and it doesn't log anything, ever. Here's what I have in my web.config (relevant parts):

        <sectionGroup name="elmah">
          <section name="security" requirePermission="false" type="Elmah.SecuritySectionHandler, Elmah" />
          <section name="errorLog" requirePermission="false" type="Elmah.ErrorLogSectionHandler, Elmah" />
          <section name="errorMail" requirePermission="false" type="Elmah.ErrorMailSectionHandler, Elmah" />
          <section name="errorFilter" requirePermission="false" type="Elmah.ErrorFilterSectionHandler, Elmah" />
        </sectionGroup>
        <elmah>
          <security allowRemoteAccess="0" />
          <errorLog type="Elmah.SqlErrorLog, Elmah" connectionStringName="ELMAH.SqlServer" />
          <!--
          <errorMail from="[email protected]" to="[email protected]" cc="" subject="Elmah Error" async="true" smtpPort="25" smtpServer="[EmailServerName]" userName="" password="" />
          <errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/App_Data" />
          -->
        </elmah>
        <connectionStrings>
          ...
          <add name="ELMAH.SqlServer" connectionString="data source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\ELMAH_Logging.mdf;Integrated Security=SSPI;Connect Timeout=30;User Instance=True;" providerName="System.Data.SqlClient"/>
        </connectionStrings>
        <system.web>
          <httpHandlers>
            <add verb="POST,GET,HEAD" path="elmah.axd" type="Elmah.ErrorLogPageFactory, Elmah" />
            ...
          </httpHandlers>
          <httpModules>
            <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" />
            <add name="ErrorFilter" type="Elmah.ErrorFilterModule, Elmah" />
            <add name="ErrorMail" type="Elmah.ErrorMailModule, Elmah" />
            ...
          </httpModules>
        </system.web>
        <system.webServer>
          <validation validateIntegratedModeConfiguration="false" />
          <modules runAllManagedModulesForAllRequests="true">
            <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" />
            <add name="ErrorFilter" type="Elmah.ErrorFilterModule, Elmah" />
            <add name="ErrorMail" type="Elmah.ErrorMailModule, Elmah" />
          </modules>
          <handlers>
            <add name="Elmah" verb="POST,GET,HEAD" path="elmah.axd" type="Elmah.ErrorLogPageFactory, Elmah"/>
            ...
          </handlers>
        </system.webServer>

    I'm using the code from DotNetDarren.com, but no matter what I do, no exceptions are ever logged.
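
    A hedged note, not from the original post: with the configuration above, ELMAH's ErrorLogModule only sees unhandled exceptions, and ASP.NET MVC's default HandleErrorAttribute marks exceptions as handled (when custom errors are enabled) before the module gets a chance to log them, which commonly results in nothing being logged. One way to verify the ELMAH pipeline itself is to signal an error explicitly; the controller below is purely illustrative:

        using System;
        using System.Web.Mvc;
        using Elmah;

        namespace ElmahDemo.Controllers
        {
            public class DiagnosticsController : Controller
            {
                public ActionResult ThrowTest()
                {
                    try
                    {
                        throw new InvalidOperationException("ELMAH logging test");
                    }
                    catch (Exception ex)
                    {
                        // hand the exception to ELMAH directly, bypassing HandleErrorAttribute
                        ErrorSignal.FromCurrentContext().Raise(ex);
                        throw;
                    }
                }
            }
        }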

    Read the article

  • Why are there connections open to my databases?

    - by Everett
    I have a program that stores user projects as databases. Naturally, the program should allow the user to create and delete the databases as they need to. When the program boots up, it looks for all the databases in a specific SQLServer instance that have the structure the program is expecting. These database are then loaded into a listbox so the user can pick one to open as a project to work on. When I try to delete a database from the program, I always get an SQL error saying that the database is currently open and the operation fails. I've determined that the code that checks for the databases to load is causing the problem. I'm not sure why though, because I'm quite sure that all the connections are being properly closed. Here are all the relevant functions. After calling BuildProjectList, running "DROP DATABASE database_name" from ExecuteSQL fails with the message: "Cannot drop database because it is currently in use". I'm using SQLServer 2005. private SqlConnection databaseConnection; private string connectionString; private ArrayList databases; public ArrayList BuildProjectList() { //databases is an ArrayList of all the databases in an instance if (databases.Count <= 0) { return null; } ArrayList databaseNames = new ArrayList(); for (int i = 0; i < databases.Count; i++) { string db = databases[i].ToString(); connectionString = "Server=localhost\\SQLExpress;Trusted_Connection=True;Database=" + db + ";"; //Check if the database has the table required for the project string sql = "select * from TableExpectedToExist"; if (ExecuteSQL(sql)) { databaseNames.Add(db); } } return databaseNames; } private bool ExecuteSQL(string sql) { bool success = false; openConnection(); SqlCommand cmd = new SqlCommand(sql, databaseConnection); try { cmd.ExecuteNonQuery(); success = true; } catch (SqlException ae) { MessageBox.Show(ae.Message.ToString()); } closeConnection(); return success; } public void openConnection() { databaseConnection = new SqlConnection(connectionString); try { databaseConnection.Open(); } catch(Exception e) { MessageBox.Show(e.ToString(), "Error", MessageBoxButtons.OK, MessageBoxIcon.Error); } } public void closeConnection() { if (databaseConnection != null) { try { databaseConnection.Close(); } catch (Exception e) { MessageBox.Show(e.ToString(), "Error", MessageBoxButtons.OK, MessageBoxIcon.Error); } } }
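
    A hedged explanation, not part of the original question: even after closeConnection(), ADO.NET connection pooling keeps the physical connections that BuildProjectList opened to each database alive in the pool, and a pooled connection is enough to make DROP DATABASE fail with "currently in use". Below is a sketch of a drop routine added to the same class as the code above; the database name handling and lack of error checks are illustrative only:

        public void DropProjectDatabase(string dbName)
        {
            // release the pooled physical connections held by this process
            SqlConnection.ClearAllPools();

            // connect to master rather than to the database being dropped
            connectionString = "Server=localhost\\SQLExpress;Trusted_Connection=True;Database=master;";

            // force out any remaining sessions, then drop
            ExecuteSQL("ALTER DATABASE [" + dbName + "] SET SINGLE_USER WITH ROLLBACK IMMEDIATE");
            ExecuteSQL("DROP DATABASE [" + dbName + "]");
        }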

    Read the article

  • 'The default schema does not exist' on deploy of SQL CLR assembly onto SQL Server 2008

    - by abatishchev
    I'm deploying an example SQL CLR stored procedure which has a SQL CLR type as a parameter, using Visual Studio 2008 and the Project -> Deploy menu:

        public partial class StoredProcedures
        {
            [Microsoft.SqlServer.Server.SqlProcedure]
            public static void TakeTariff(TariffInfo tariffInfo) { }
        }

        public class TariffInfo
        {
            public SqlDecimal Amount { get; private set; }
        }

    but I am getting the following strange error: The default schema does not exist. How can I fix that? My user was created this way:

        CREATE USER myUser FOR LOGIN myLogin_mod WITH DEFAULT_SCHEMA = mySchema

    Read the article
