Search Results

Search found 36186 results on 1448 pages for 'sql 11'.

  • Using openrowset to read an Excel file into a temp table; how do I reference that table?

    - by mattstuehler
    I'm trying to write a stored procedure that reads an Excel file into a temp table, massages some of the data in that table, and then inserts selected rows from that table into a permanent table. So, it starts like this:

        SET @SQL = "select * into #mytemptable FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0', 'Excel 8.0;Database=" + @file + ";HDR=YES', 'SELECT * FROM [Sheet1$]')"
        EXEC (@SQL)

    That much seems to work. However, if I then try something like this:

        Select * from #mytemptable

    I get an error: Invalid object name '#mytemptable'. Why isn't #mytemptable recognized? Is there a way to make #mytemptable accessible to the rest of the stored procedure? Many thanks in advance!
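
    A temp table created inside EXEC() belongs to the dynamic batch and is dropped when that batch ends, which is why the outer procedure cannot see it. A temp table created in the outer scope is, however, visible to dynamic SQL, so one common workaround is to create #mytemptable first and have the dynamic statement insert into it. A minimal sketch, assuming the worksheet's columns are known (the column list and @file value below are illustrative):

        -- Column list is hypothetical; adjust to match the actual worksheet.
        CREATE TABLE #mytemptable (Col1 nvarchar(255), Col2 nvarchar(255))

        DECLARE @file nvarchar(260), @SQL nvarchar(4000)
        SET @file = N'C:\data\book.xls'
        SET @SQL = N'INSERT INTO #mytemptable
            SELECT * FROM OPENROWSET(''Microsoft.Jet.OLEDB.4.0'',
                ''Excel 8.0;Database=' + @file + N';HDR=YES'',
                ''SELECT * FROM [Sheet1$]'')'
        EXEC (@SQL)

        -- #mytemptable was created in this scope, so it is still visible here.
        SELECT * FROM #mytemptable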

  • Windows Azure Platform, latest version?

    - by Vimvq1987
    I searched through the internet but found nothing. The whitepapers for the Windows Azure Platform say things like:

        In its first release, the maximum size of a single database in SQL Azure Database is 10 gigabytes.

        A few things are omitted in the technology's first release, however, such as the SQL Common Language Runtime (CLR) and support for spatial data. (Microsoft says that both will be available in a future version.)

    I want to know whether Microsoft has since updated the Windows Azure Platform and removed these limits. I decided to post this question here instead of serverfault.com because it's more related to programming than administration. Thank you

  • Last byte missing when casting from varbinary to varchar

    - by xaxie
    The last byte is lost when converting varbinary to varchar in some cases. For example:

        DECLARE @binary varbinary(8000), @char varchar(8000)
        set @binary = 0x000082
        set @char = CAST(@binary as varchar(8000))
        select BinaryLength=DATALENGTH(@binary), CharLength=DATALENGTH(@char)

    The result is:

        BinaryLength CharLength
        3            2

    The affected byte values run from 0x81 to 0xFE. The stranger thing is that if I use varchar(MAX) instead of varchar(8000) when casting, there is no problem at all. Could someone tell me the root cause of this issue? PS: I am running the SQL on MS SQL Server 2008. Thanks!
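
    This pattern (bytes 0x81 through 0xFE, with varchar(MAX) behaving differently) is consistent with a double-byte character set (DBCS) default collation, such as a Chinese or Japanese code page, where 0x81-0xFE are lead bytes of a two-byte character: a trailing lead byte with no following byte can be discarded during conversion. A small diagnostic sketch, offered as a way to confirm that hypothesis rather than a definitive answer:

        -- Which code page does the server collation use?
        SELECT SERVERPROPERTY('Collation') AS ServerCollation,
               COLLATIONPROPERTY(CONVERT(nvarchar(128), SERVERPROPERTY('Collation')),
                                 'CodePage') AS CodePage;
        -- In a database using a single-byte code page (e.g. 1252), the same
        -- CAST should report CharLength = 3.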

  • Sub-query problem on Oracle 10g

    - by Eric
    The following query works on Oracle 10.2.0.1.0 on Windows, but doesn't work on Oracle 10.2.0.2.0 on Linux. What's the problem? How can I make it work? Thanks!

        CREATE TABLE AUDITHISTORY(
            CASENUM numeric(20, 0) NOT NULL,
            AUDIT_DATE date NOT NULL,
            USER_NAME varchar(255) NULL,
            AUDIT_USECS numeric(6, 0) NOT NULL
        )

    Query:

        SELECT T.CASENUM, T.USER_NAME, T.AUDIT_DATE AS STARTED,
            (SELECT * FROM
                (SELECT S.AUDIT_DATE FROM AUDITHISTORY S
                 WHERE S.CASENUM=T.CASENUM AND S.USER_NAME=T.USER_NAME
                   AND (S.AUDIT_DATE > T.AUDIT_DATE
                        OR (S.AUDIT_DATE = T.AUDIT_DATE AND S.AUDIT_USECS > T.AUDIT_USECS))
                 ORDER BY S.AUDIT_DATE ASC, S.AUDIT_USECS ASC)
             WHERE rownum <= 1) AS ENDED
        FROM AUDITHISTORY T

    Banner of the working instance:

        Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Prod
        PL/SQL Release 10.2.0.1.0 - Production
        CORE 10.2.0.1.0 Production
        TNS for 32-bit Windows: Version 10.2.0.1.0 - Production
        NLSRTL Version 10.2.0.1.0 - Production

    Banner of the failing instance:

        Oracle Database 10g Enterprise Edition Release 10.2.0.2.0 - Prod
        PL/SQL Release 10.2.0.2.0 - Production
        CORE 10.2.0.2.0 Production
        TNS for Linux: Version 10.2.0.2.0 - Production
        NLSRTL Version 10.2.0.2.0 - Production
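
    The query correlates T.CASENUM and T.AUDIT_DATE through two levels of subquery nesting; Oracle has historically only guaranteed that correlation names are visible one level deep, so this construct can work on one patch level and fail on another. A sketch of an equivalent query using the LEAD analytic function, which sidesteps the nested correlation entirely (assuming ENDED means the next audit row for the same CASENUM and USER_NAME):

        SELECT CASENUM,
               USER_NAME,
               AUDIT_DATE AS STARTED,
               -- Next audit timestamp within the same case/user, in audit order.
               LEAD(AUDIT_DATE) OVER (PARTITION BY CASENUM, USER_NAME
                                      ORDER BY AUDIT_DATE, AUDIT_USECS) AS ENDED
        FROM AUDITHISTORY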

  • Any suggestions on data synchronization from a server to a desktop?

    - by Jamey McElveen
    I have a web-based CRM system that stores the data from all the clients in one database (MS SQL Server). We need to build a system that maintains a local copy of the data for each client. So basically the client database will have all the tables and columns except for the ClientId column that is in every server table. I am aware that I will need to add fields to the server to support synchronization. Are there any good solutions or components already out there to help me accomplish this? We are using MS SQL Server and .NET C#.
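
    Whatever component ends up doing the work (merge replication, the Microsoft Sync Framework, or something hand-rolled), the server-side fields it needs usually amount to a per-row change marker plus a way to detect deletes. A sketch of that kind of schema addition, using a hypothetical Orders table:

        -- rowversion is bumped automatically on every insert/update, so a client
        -- can pull only rows changed since the last anchor value it saw.
        ALTER TABLE dbo.Orders ADD RowVer rowversion;

        -- Deletes need a tombstone table (or a soft-delete flag) so clients
        -- can remove rows that disappeared on the server.
        CREATE TABLE dbo.Orders_Tombstone (
            OrderId   int      NOT NULL PRIMARY KEY,
            ClientId  int      NOT NULL,
            DeletedAt datetime NOT NULL DEFAULT GETUTCDATE()
        );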

  • How can I set MAXLENGTH on input fields automatically?

    - by Gary McGill
    I'm using ASP.NET MVC RC2. I have a set of auto-generated LINQ to SQL classes, where each class models a table in my database. I have some default "edit" views that use the neat-o Html.TextBoxFor helper extension to render HTML input fields for the columns in a table. However, I notice that there is no MAXLENGTH attribute specified in the generated HTML. A quick Google shows that you can add explicit attributes using TextBoxFor etc., but that would mean I'd need to hard-code the values (and remember to do it for every input). This seems pretty poor, considering that everything was generated automatically straight from the DB, so the size of the columns is known. (Although, to be fair, the web-side stuff has to work with any model objects, not just with LINQ to SQL objects.) Is there any nice way of getting this to do the right thing?

  • Prevent password leakage while using SQL*Loader

    - by Jai
    I have a shell script calling the SQL*Loader utility, which in turn takes a username/password as arguments. These details cannot be stored on the server in any form due to security-related policies. I have two approaches to handle this situation:

    1. Create a hidden parameter file with the login details and limit access to the owner. Again, the implication is that I cannot store login data in any format on the server.
    2. Create the user as OS-authenticated and log straight into SQL without any userid/password.

    I am not able to figure out the risks involved in the 2nd approach, which you experienced folks may have come across. Let me know if there is any other approach to handle the password-leakage issue.
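
    For reference, a sketch of what the 2nd approach looks like on the database side, assuming the default OS_AUTHENT_PREFIX of ops$ and a hypothetical OS account named loader:

        -- As a DBA: create a user tied to the OS account, with no password.
        CREATE USER ops$loader IDENTIFIED EXTERNALLY;
        GRANT CREATE SESSION TO ops$loader;

        -- The shell script, running as that OS account, then invokes
        -- SQL*Loader with a bare slash and no credentials:
        --   sqlldr userid=/ control=load.ctl

    The main risk is that database access now rides on OS access: anyone who can run code as that OS account silently gets its database privileges, so the OS account must be locked down accordingly.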

  • Problem with SQL Server 2005 when opening connections

    - by Jose Obregon
    I have a WinForms application that uses EntLib to connect to a SQL Server 2005 DB. The application works OK, but sometimes (lately more often) we receive this error from the DB when opening the connection:

        A connection was successfully established with the server, but then an error occurred during the login process. (provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.) (Microsoft SQL Server, Error: 233)

    The problem is intermittent. A user works fine for a couple of hours and then suddenly the exception is thrown. Sometimes it happens when we run a small process that loads a file and then inserts the data into the DB.

  • Creating Membership Tables, SPROCs, Views in Attached DB

    - by azamsharp
    I have the AdventureWorks database .mdf file in my VS 2010 project. Is there any way I can create the Membership tables inside my AdventureWorks database? I know I could attach the database in SQL Server 2008, create the tables, and then detach it, but I don't have SQL Server 2008 and I want to see if this can be done using the command-line tool. I tried this, but with no luck:

        aspnet_regsql.exe -d AdventureWorks.mdf -A mr -E -S .\SQLEXPRESS

    Update: If I right-click and view the properties of the AdventureWorks.mdf database, it shows the name as "C4BE6C8DA139A060D14925377A7E63D0_64A_10\ADVENTUREWORKSWEBFORMS\ADVENTUREWORKSWEBFORMS\ADVENTUREWORKS\APP_DATA\ADVENTUREWORKS.MDF". This is interesting!
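
    The -d switch expects the name of a database already known to the server, not a path to an .mdf file, which would explain the failure. One possible workaround, offered as an untested sketch (the path is illustrative, and it assumes SQL Express user instances are enabled), is to pass a full connection string with AttachDBFilename via the -C switch instead of -S/-E/-d:

        aspnet_regsql.exe -A mr -C "Data Source=.\SQLEXPRESS;Integrated Security=True;AttachDBFilename=C:\Projects\MySite\App_Data\AdventureWorks.mdf;User Instance=True"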

  • How to manage and capture database changes across several developers?

    - by Matt Greer
    We have three developers and one tester all working against the same database. We change the schema of the database quite often, and every time we do, it tends to have a ripple effect of headaches for everyone else. Are there good practices for .NET-oriented development against MS SQL Server 2008 for managing this? I am thinking of something similar to Rails Migrations, where each dev and tester has their own local database. Or is that overkill? It'd at least be nice to have separate test and dev databases, but currently keeping two databases in sync manually is probably worse than our current predicament. LiquiBase seems promising; has anyone successfully used it in a similar environment? Or are there better approaches? We are using SQL Server 2008, VS 2008 and .NET 3.5, if that matters at all.

  • cakePHP - ACL tutorial & SQL Error: 1104

    - by vector
    Greetings! I've run into a problem with SQL Error 1104 in the production environment when deploying a project secured with the ACL tutorial. The full error:

        SQL Error: 1104: The SELECT would examine more than MAX_JOIN_SIZE rows; check your WHERE and use SET SQL_BIG_SELECTS=1 or SET SQL_MAX_JOIN_SIZE=# if the SELECT is okay [CORE/cake/libs/model/datasources/dbo_source.php, line 666]

    I've built a small site and worked through the ACL tutorial from the official Cake book, and locally everything is spiffy.

    Dev setup: XAMPP (MySQL 5.1.33, PHP 5.3), CakePHP 1.2.7.
    Prod setup: MySQL 5.0.33, PHP 5.2.2.

    One difference I noticed between the local and production MySQL setups is that SQL_BIG_SELECTS is disabled on the production server (and I don't have privileges to change that; I posted a support ticket but haven't heard anything yet). Is there anything I can do about this from my end? Thanks in advance.
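
    SQL_BIG_SELECTS also exists as a session variable that an ordinary user can normally set for their own connection, so even without privileges to change the server default, issuing the statement once per connection (for example right after CakePHP connects) may be enough. A sketch:

        -- Run once per connection, before the offending SELECT.
        SET SESSION SQL_BIG_SELECTS = 1;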

  • ADO.NET managing connections

    - by madlan
    Hi, I'm populating a ListView with a list of databases on a selected SQL instance, then retrieving a value from each database (its internal product version; the column doesn't always exist). I'm calling the function below to populate the second column of the ListView:

        item.SubItems.Add(DBVersionCheck(serverName, database.Name))

        Function DBVersionCheck(ByVal SelectedInstance As String, ByVal SelectedDatabase As String)
            Dim m_Connection As New SqlConnection("Server=" + SelectedInstance + ";User Id=sa;Password=password;Database=" + SelectedDatabase)
            Dim db_command As New SqlCommand("select Setting from SystemSettings where [Setting] = 'version'", m_Connection)
            Try
                m_Connection.Open()
                Return db_command.ExecuteScalar().Trim()
            Catch ex As Exception
                'MessageBox.Show(ex.Message)
                Return "NA"
            Finally
                m_Connection.Dispose()
            End Try
        End Function

    This works fine, except that it creates a connection to each database and leaves it open. My understanding is that Close()/Dispose() only releases the connection back to the ADO.NET pool rather than closing the actual connection to SQL Server. How would I close the actual connections after I've retrieved the value? Leaving them open creates hundreds of connections to databases that will probably not be touched again that session.
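
    Since pooling keeps the physical connection alive by design, the usual fixes are to append Pooling=false to the connection string for one-off probes like this, or to call SqlConnection.ClearPool(m_Connection) after the read. To verify from the server side how many physical connections actually remain, a diagnostic sketch (sys.sysprocesses is available on SQL Server 2005):

        -- Count current physical connections per database.
        SELECT DB_NAME(dbid) AS DatabaseName, COUNT(*) AS Connections
        FROM sys.sysprocesses
        WHERE dbid > 0
        GROUP BY dbid
        ORDER BY DatabaseName;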

  • Page uploads data again on page refresh in ASP.NET

    - by Etienne
    For some reason, when the user clicks the submit button and then refreshes the page, the same data gets uploaded again to my SQL Server 2005 database. I do not want this to happen. Why is it happening? I am making use of a SQL Data Source. My code:

        Try
            'See if user typed the correct code.
            If Me.txtSecurity.Text = Session("Captcha") Then
                If Session("NotifyMe") = "Yes" Then
                    SendEmailNS()
                End If
                RaterRate.Insert()
                RaterRate.Update()
                DisableItems()
                lblResultNS.Text = "Thank you for leaving a comment"
                LoadCompanyList()
                LoadRateRecords()
                txtCommentNS.Text = ""
                txtSecurity.Text = ""
                lblResultNS.Focus()
            Else
                Session("Captcha") = GenerateCAPTCHACode()
                txtSecurity.Text = ""
                txtSecurity.Focus()
                Validator10.Validate()
            End If
        Catch ex As Exception
            lblResultNS.Visible = True
            lblResultNS.Text = ex.Message.ToString()
            lblResultNS.Focus()
        End Try

  • NHibernate - I have many, but I only want one!

    - by MartinF
    Hello, I have a User which can have many Emails. This is mapped through a List collection (exposed as IEnumerable Emails on the User). For each User, one of the Emails will be the primary one (a Boolean IsPrimary property on Email). How can I get the primary Email from a User without NHibernate loading every email for the User? I have the following two entities, with a corresponding table for each:

        public class User
        {
            public virtual int Id { get; set; }
            public virtual IEnumerable<Email> Emails { get; set; }
            // public virtual Email PrimaryEmail { get; set; } - Possible somehow?
        }

        public class Email
        {
            public virtual int Id { get; set; }
            public virtual String Address { get; set; }
            public virtual Boolean IsPrimary { get; set; }
            public virtual User User { get; set; }
        }

    Can I somehow map a PrimaryEmail property on the User to the Email which has IsPrimary=1? Maybe using an SQL formula? A view? A one-to-one relationship? Or another way? It should be possible to change the primary email to one of the other emails, so I would like to keep them all in one table and just change the IsPrimary property. If I use an SQL formula, is it possible to keep the PrimaryEmail property on the User up to date if I set IsPrimary on the current primary email to false and then point PrimaryEmail at the email which should become primary, setting its IsPrimary to true? Will NHibernate track changes on the old/current primary Email loaded by the SQL formula? And what about the first-level and second-level caches when using an SQL formula? I don't know whether it could work through a view; then I guess the Email could be mapped like a component, but will updating still work when the Email data is loaded from the view? Is there a better way? Since I have a bi-directional relationship between User and Email, I could in many cases of course query the primary Email and then use its User property to get the User (instead of the other way around, going from User to the primary Email). Hope someone can help!
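
    For the view idea the question raises, the SQL side would be simple; a sketch, noting that the Email table's foreign-key column name (UserId here) is an assumption, since the mapping isn't shown:

        -- One row per user: that user's primary email.
        CREATE VIEW UserPrimaryEmail AS
        SELECT e.Id, e.Address, e.UserId  -- UserId is a hypothetical FK column name
        FROM Email e
        WHERE e.IsPrimary = 1;

    Whether NHibernate then tracks changes through such a view (or through a formula-mapped many-to-one) is the harder part of the question, and this sketch doesn't answer that.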

  • Unit Testing, IDataContext and Stored Procedures via Linq

    - by Terry_Brown
    Hey folks, I'm currently using Stephen Walther's approach to unit testing LINQ to SQL and the data context (http://stephenwalther.com/blog/archive/2008/08/17/asp-net-mvc-tip-33-unit-test-linq-to-sql.aspx), which is working a treat for all things LINQ. My DAL layer takes in an IDataContext; I use DI (via Unity) to concrete that up as the correct DataContext object, and all works well. But we have a company policy here that writes go via stored procs. Has anyone come up with a solution for mocking/faking the data context that still allows stored procs to be unit tested effectively? I've got no real experience with any of the mocking frameworks (Rhino etc.), so if they are the correct means of doing this, I'd love some pointers to guides for them. Many thanks, Terry

  • Creating a CLR UDF with variable number of parameters

    - by josephj1989
    Hi, I want a function that finds the greatest of a list of string values passed in, invoked from SQL Server as:

        Select greatest('Abcd','Efgh','Zxy','EAD')

    It should return Zxy. The number of parameters is variable. Incidentally, it is very similar to Oracle's GREATEST function. So I wrote a very simple CLR function (VS 2008) and tried to deploy it; see below:

        public partial class UserDefinedFunctions
        {
            [Microsoft.SqlServer.Server.SqlFunction]
            public static SqlString Greatest(params SqlString[] p)
            {
                SqlString max = p[0];
                foreach (SqlString s in p)
                    max = s.CompareTo(max) > 0 ? s : max;
                return max;
            }
        };

    But when I try to compile or deploy it, I get the following error: "Cannot find data type SqlString[]". Is it possible to satisfy my requirement using SQL CLR?
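
    T-SQL has no variadic parameter type, so a SQL CLR function can't expose a params array; the usual workarounds are a fixed number of nullable parameters or a single delimited string that the function splits. If the server is SQL Server 2008 or later, plain T-SQL can also express this without CLR at all; a sketch:

        -- Greatest of an arbitrary list via a table value constructor.
        SELECT MAX(v) AS Greatest
        FROM (VALUES ('Abcd'), ('Efgh'), ('Zxy'), ('EAD')) AS t(v);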

  • SSIS - How do I see/set the field types in a Recordset?

    - by thursdaysgeek
    I'm looking at an inherited SSIS package, in which a stored procedure sends records to a recordset called USER:NEW_RECORDS. The variable is of type Object, and its value is System.Object. It is then used to insert that data into a SQL table. We're getting an error: it seems the numeric results of the stored procedure are being put in a DT_WSTR field, which then fails when inserted into a decimal field in the database. Most of the records work, but one, which happens to have more decimal digits, is failing. I want to see exactly what my SSIS recordset field types are, and probably change them, so I can force the data to be truncated properly and copied. Or perhaps I'm not even looking at this correctly. The data is put into the recordset by an Execute SQL Task that runs the stored procedure.

  • Get correct output from UTF-8 stored in VarChar using Entity Framework or Linq2SQL?

    - by jasonpenny
    Borland StarTeam seems to store its data as UTF-8 encoded text in varchar fields. I have an ASP.NET MVC site that produces custom HTML reports from the StarTeam database, and I would like to find a better solution for getting the correct data out, for a rewrite with MVC 2. I tried a few things with Encoding.GetBytes and GetString, but I couldn't get it to work (we use mostly Delphi at work). I then figured out a T-SQL function that returns an nvarchar from UTF-8 stored in a varchar, and created new SQL views which return the data as nvarchar, but it's slow. The actual problem appears like this: “description†instead of “description”, in both SSMS and in a web page when using LINQ to SQL. Is there a way to get the proper data out of these fields using Entity Framework or LINQ to SQL?

  • Database on the fly with scripting languages

    - by afilatun
    I have a set of .csv files that I want to process. It would be far easier to process them with SQL queries, so I wonder if there is some way to load a .csv file and query it with SQL from a scripting language like Python or Ruby. Loading it with something similar to ActiveRecord would be awesome. The problem is that I don't want to have to run a database server prior to running my script; I shouldn't need additional installations beyond the scripting language and some modules. My question is which language and which modules I should use for this task. I looked around and couldn't find anything that suits my needs. Is it even possible?
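
    SQLite is the usual fit here: it runs in-process, needs no server, and both Python (the bundled sqlite3 module) and Ruby have bindings. As a quick illustration of the idea, even the sqlite3 command-line shell can do it; a sketch, assuming a file named data.csv (depending on the sqlite3 version, the header row may be imported as data unless the table is created first):

        .mode csv
        .import data.csv mytable
        -- From here it's ordinary SQL:
        SELECT COUNT(*) FROM mytable WHERE some_column = 'some value';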

  • Filesystem synchronization library?

    - by IsaacB
    Hi, I've got 10 GB of files to back up daily to another site. The client is way out in the country, so bandwidth is an issue. Does anyone know of existing software or libraries that help keep a folder and its files synchronized across a slow link, i.e. that only send files across if they have changed? Some kind of hash checking would be nice too, to at least confirm the two sides are the same. I don't mind paying some money for it, seeing as it might take me several weeks to a month to implement something decent on my own; I just don't want to reinvent the wheel here. BTW, it is a Windows shop (they have an in-house Windows IT guy), so Windows is preferred. I also have 10 GB of SQL Server 2000 databases to send across. Is SQL Server replication reliable? Thanks!

  • Retrieving a filtered list of stored procedures using T-SQL

    - by DanDan
    I'm trying to get a list of stored procedures in T-SQL. I am using the line:

        exec sys.sp_stored_procedures;

    I would like to filter the results, though, so I only get user-created stored procedures: filtering out sp_*, dt_*, fn_*, xp_* and everything else that is a system stored procedure of no interest to me. How can I manipulate the result set returned? Using SQL Server 2008 Express. Solved! Here is what I used:

        SELECT name FROM sys.procedures
        WHERE [type] = 'P'
          AND name NOT LIKE 'sp_%'
          AND name NOT LIKE 'dt_%'
        ORDER BY name ASC;
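
    One caveat about the pattern: in LIKE, the underscore matches any single character, so 'sp_%' also filters out legitimate names such as spares (escaping it as 'sp[_]%' avoids that). An alternative sketch that leans on SQL Server's own flag for shipped objects instead of name prefixes:

        -- is_ms_shipped marks objects created by SQL Server itself.
        SELECT name
        FROM sys.procedures
        WHERE is_ms_shipped = 0
        ORDER BY name;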

  • Efficiency of checking for null varbinary(max) column ?

    - by Moe Sisko
    Using SQL Server 2008. Example table:

        CREATE TABLE dbo.blobtest (
            id int PRIMARY KEY NOT NULL,
            name nvarchar(200) NOT NULL,
            data varbinary(max) NULL
        )

    Example query:

        SELECT id, name,
               CAST((CASE WHEN data IS NULL THEN 0 ELSE 1 END) AS bit) AS DataExists
        FROM dbo.blobtest

    The query needs to return a DataExists column that is 0 if the blob is null, else 1. This all works fine, but I'm wondering how efficient it is: does SQL Server need to read the whole blob into memory, or is there some optimization so that it does just enough reads to figure out whether the blob is null or not? (FWIW, the sp_tableoption "large value types out of row" option is set to OFF for this example.)
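
    One way to answer this empirically is to compare logical reads (including LOB reads) with and without the null test, after populating the table with some large blobs; a sketch:

        SET STATISTICS IO ON;

        SELECT id, name FROM dbo.blobtest;  -- baseline

        SELECT id, name,
               CAST(CASE WHEN data IS NULL THEN 0 ELSE 1 END AS bit) AS DataExists
        FROM dbo.blobtest;                  -- with the null test

        SET STATISTICS IO OFF;
        -- If both statements report similar logical reads and the second shows
        -- zero "lob logical reads", the null test is not touching the blob data.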
