Search Results

Search found 28900 results on 1156 pages for 'sql 2005'.


  • SQL Server 2008 types in a SQL CLR stored procedure

    - by BadEnglish
    I have a table-valued parameter type in SQL Server 2008, e.g.:

        CREATE TYPE UserType AS TABLE
        (
            UserID int,
            UserName nvarchar(100),
            UserPassword nvarchar(100)
        )

    Can I use this type somehow in my SQL CLR stored procedure, for example as an input parameter?

        [SqlProcedure]
        public static void SomeFunction(/* what type should be here? */) { }

    I would appreciate even your attention, let alone any help!
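
    For comparison, a minimal sketch of how the same table type is consumed by an ordinary T-SQL stored procedure (dbo.AddUsers and dbo.Users are hypothetical names); table-valued parameters must be declared READONLY:

        CREATE PROCEDURE dbo.AddUsers
            @Users UserType READONLY
        AS
        BEGIN
            -- copy the rows passed by the caller into a (hypothetical) permanent table
            INSERT INTO dbo.Users (UserID, UserName, UserPassword)
            SELECT UserID, UserName, UserPassword
            FROM @Users;
        END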

    Read the article

  • Dump Microsoft SQL Server database to an SQL script

    - by Matt Sheppard
    Is there any way to export a Microsoft SQL Server database to an SQL script? I'm looking for something which behaves similarly to mysqldump: taking a database name and producing a single script which will recreate all the tables, stored procedures, reinsert all the data, etc. I've seen http://vyaskn.tripod.com/code.htm#inserts, but ideally I want something that recreates everything (not just the data) and works in a single step to produce the final script.

    Read the article

  • SQL Server 2008: If Multiple Values Set In Other Multiple Values Set

    - by AJH
    In SQL, is there any way to accomplish something like this? This is based on a report built in SQL Server Report Builder, where the user can specify multiple text values as a single report parameter. The query for the report grabs all of the values the user selected and stores them in a single variable. I need a way for the query to return only records that have associations to EVERY value the user specified.

        -- Assume there's a table of Elements with thousands of entries.
        -- Now we declare a list of properties for those Elements to be associated with.
        create table #masterTable
        (
            ElementId int,
            Text varchar(10)
        )
        insert into #masterTable (ElementId, Text) values (1, 'Red');
        insert into #masterTable (ElementId, Text) values (1, 'Coarse');
        insert into #masterTable (ElementId, Text) values (1, 'Dense');
        insert into #masterTable (ElementId, Text) values (2, 'Red');
        insert into #masterTable (ElementId, Text) values (2, 'Smooth');
        insert into #masterTable (ElementId, Text) values (2, 'Hollow');
        -- Element 1 is Red, Coarse, and Dense. Element 2 is Red, Smooth, and Hollow.
        -- The real table is actually much much larger than this; this is just an example.

        -- This is me trying to replicate how SQL Server Report Builder treats
        -- report parameters in its queries. The user selects one, some, all,
        -- or no properties from a list. The written query treats the user's
        -- selections as a single variable called @Properties.

        -- Example scenario 1: User only wants to see Elements that are BOTH Red and Dense.
        select e.*
        from Elements e
        where (@Properties) --ideally a set containing only Red and Dense
          in (select Text from #masterTable where ElementId = e.Id) --ideally a set containing only Red, Coarse, and Dense
        -- Both Red and Dense are within Element 1's properties (Red, Coarse, Dense),
        -- so Element 1 gets returned, but not Element 2.

        -- Example scenario 2: User only wants to see Elements that are BOTH Red and Hollow.
        select e.*
        from Elements e
        where (@Properties) --ideally a set containing only Red and Hollow
          in (select Text from #masterTable where ElementId = e.Id)
        -- Both Red and Hollow are within Element 2's properties (Red, Smooth, Hollow),
        -- so Element 2 gets returned, but not Element 1.

        -- Example scenario 3: User only picked the Red option.
        select e.*
        from Elements e
        where (@Properties) --ideally a set containing only Red
          in (select Text from #masterTable where ElementId = e.Id)
        -- Red is within both Element 1 and Element 2's properties,
        -- so both Element 1 and Element 2 get returned.

    The above syntax doesn't actually work because SQL doesn't seem to allow multiple values on the left side of the "in" comparison. The error returned is:

        Subquery returned more than 1 value. This is not permitted when the subquery
        follows =, !=, <, <= , >, >= or when the subquery is used as an expression.

    Am I even on the right track here? Sorry if the example looks long-winded or confusing.
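
    One common way to express "has ALL of the selected properties" is relational division via grouping. A minimal sketch, assuming the selected values have first been split into a table variable @Props (the split of the @Properties report parameter is not shown, and @Props is a hypothetical name):

        declare @Props table (Text varchar(10));
        -- (populate @Props by splitting the @Properties report parameter; not shown)

        select m.ElementId
        from #masterTable m
        join @Props p on p.Text = m.Text
        group by m.ElementId
        -- keep only elements that match every selected property
        having count(distinct m.Text) = (select count(*) from @Props);

    Joining Elements to that result then returns only the qualifying rows.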

    Read the article

  • Converting SQL Query to LINQ 2 SQL expression

    - by Shyju
    How can I rewrite the SQL query below as its equivalent LINQ to SQL expression (in both C# and VB.NET)?

        SELECT t1.itemnmbr, t1.locncode, t1.bin, t2.Total
        FROM IV00200 t1 (NOLOCK)
        INNER JOIN IV00112 t2 (NOLOCK)
            ON t1.itemnmbr = t2.itemnmbr
            AND t1.bin = t2.bin
            AND t1.bin = 'MU7I336A80'

    Read the article

  • Select into from one SQL Server into another?

    - by blasto
    I want to select data from one table (T1, in DB1) on one server (Data.Old.S1) into another table (T2, in DB2) on another server (Data.Latest.S2). How can I do this? Please note the way the servers are named; the query should take care of that too. That is, SQL Server should not be confused by the fully qualified table names. For example, this could confuse SQL Server: Data.Old.S1.DB1.dbo.T1.
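
    A minimal sketch of one way to do this, assuming a linked server named Data.Old.S1 has been registered on Data.Latest.S2 and that T2 already exists (the column names are hypothetical). Bracketing the server name keeps the dots inside it from being misparsed:

        INSERT INTO DB2.dbo.T2 (Col1, Col2)
        SELECT Col1, Col2
        FROM [Data.Old.S1].DB1.dbo.T1;   -- four-part name: [linked server].database.schema.table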

    Read the article

  • Hierarchical data in Linq - options and performance

    - by Anthony
    I have some hierarchical data - each entry has an id and a (nullable) parent entry id. I want to retrieve all entries in the tree under a given entry. This is in a SQL Server 2005 database. I am querying it with LINQ to SQL in C# 3.5. LINQ to SQL does not support Common Table Expressions directly. My choices are to assemble the data in code with several LINQ queries, or to make a view on the database that surfaces a CTE. Which option (or another option) do you think will perform better when data volumes get large? Is SQL Server 2008's HierarchyId type supported in Linq to SQL?
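
    For reference, a minimal sketch of the recursive CTE such a view or query would be built around, assuming a hypothetical table dbo.Entries(Id, ParentId):

        DECLARE @RootId int;
        SET @RootId = 1;   -- the entry whose descendants we want (value is arbitrary here)

        WITH Subtree AS
        (
            -- anchor: the entry itself
            SELECT Id, ParentId
            FROM dbo.Entries
            WHERE Id = @RootId

            UNION ALL

            -- recursive step: children of anything already in the subtree
            SELECT e.Id, e.ParentId
            FROM dbo.Entries e
            INNER JOIN Subtree s ON e.ParentId = s.Id
        )
        SELECT Id, ParentId
        FROM Subtree;

    Wrapping this in an inline table-valued function (rather than a view) keeps the root id available as a parameter.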

    Read the article

  • Best solution to import records from MySQL database to MS SQL (Hourly)

    - by xkingpin
    I need to import records stored in a MySQL database that I do not maintain into my SQL Server 2005 database (x64). We should import the records on an interval basis (probably every hour). What would be the best solution to perform the regular import?

    1. Windows Service (referencing the MySql.Data dll)
    2. Windows client application (could be automated)
    3. SQL extended stored procedure (is it possible to reference the MySql.Data dll?)
    4. SSIS package (install the MySQL ODBC driver)

    The problem with #4 is that I do not really want to support the ODBC driver on the SQL server. For #3, I'm not sure you can even reference the x86 MySql.Data dll in an x64 SQL Server process (or reference that dll within a SQL Server project at all).

    Read the article

  • Have you really fixed that problem?

    - by DavidWimbush
    The day before yesterday I saw our main live server's CPU go up to a constant 100%, with just the occasional short drop to a lower level. The exact opposite of what you'd want to see. We're log shipping every 15 minutes, and part of that involves calling WinRAR to compress the log backups before copying them over. (We're on SQL 2005 so there's no native compression, and we have bandwidth issues with the connection to our remote site.) I realised the log shipping jobs were taking about 10 minutes and that most of that was spent shipping a 'live' reporting database that is completely rebuilt every 20 minutes. (I'm just trying to keep this stuff alive until I can improve it.) We can rebuild this database in minutes if we have to fail over, so I disabled log shipping of that database. The log shipping went down to less than 2 minutes and I went off to the SQL Social evening in London feeling quite pleased with myself.

    It was a great evening - fun, educational and thought-provoking. Thanks to Simon Sabin & co for laying that on, and thanks too to the guests for making the effort when they must have been pretty worn out after doing DevWeek all day first.

    The next morning I came down to earth with a bump: CPU still at 100%. WTF? I looked in the activity monitor, but it was confusing because some sessions have been running for a long time, so it's not a good guide to what's using the CPU right now. I tried the standard reports showing queries by CPU (average and total), but they only show the top 10, so they just show my big overnight archiving and data cleaning stuff. But Profiler showed it was four queries used by our new website usage tracking system. Four simple indexes later, the CPU was back where it should be: about 20% with occasional short spikes.

    So the moral is: even when you're convinced you've found the cause and fixed the problem, you HAVE to go back and confirm that the problem has gone. And, yes, I have checked the CPU again today and it's still looking sweet.

    Read the article

  • Using SQL Server specific code in Access linked to SQL Server database

    - by Brennan Vincent
    Hi, I have an Access file that is linked (through an ODBC connection) to a SQL Server 2008 database. I am trying to write some reports against this database. However, Access chokes when the report's SELECT query uses SQL syntax that is specific to SQL Server and doesn't exist in Access. Shouldn't this work, since it's the SQL Server engine running the queries and just sending the data back to Access to display? Is there any way to get this to work? I need this to work on any combination of Access 2007/2010 and SQL Server 2005/2008. Edit note: I cannot create a SQL Server stored procedure or function, or otherwise modify the original (SQL Server) schema in any way.

    Read the article

  • General monitoring for SQL Server Analysis Services using Performance Monitor

    - by Testas
    A recent customer engagement required the setup of a monitoring solution for SSAS. Due to the time restrictions placed upon this, the native Windows Performance Monitor (Perfmon) and SQL Server Profiler monitoring tools were used, as using a third-party tool would have meant the customer providing an additional monitoring server that was not available. I wanted to outline the performance monitoring counters that were used to monitor the system on which SSAS was running. Due to the slow query performance that was occurring during certain scenarios, Perfmon was used to establish whether any pressure was being placed on the disk, CPU or memory subsystems when concurrent connections ran the same query, and Profiler to pinpoint how the query was being managed within SSAS (Profiler I will leave for another blog).

    This guide is not designed to provide a definitive list of what should be used when monitoring SSAS; different situations may require the addition or removal of counters as presented by the situation. However, I hope that it serves as a good basis for starting your monitoring of SSAS. I would also like to acknowledge Chris Webb's awesome chapters from "Expert Cube Development" that also helped shape my monitoring strategy: http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!6657.entry

    Simulating Connections
    To simulate the additional connections to the SSAS server whilst monitoring, I used ascmd to run multiple concurrent copies of the typical and worst-performing queries that were identified by the customer. A similar script can be downloaded from CodePlex at http://www.codeplex.com/SQLSrvAnalysisSrvcs (file name: ASCMD_StressTestingScripts.zip).

    Performance Monitor
    Within Performance Monitor, a counter log was created that contained the list of counters below. The important point to note when running the counter log is that the RUN AS property within the counter log properties should be changed to an account that has rights to the SSAS instance when monitoring MSAS counters. Failure to do so means that the counter log runs under the system account; no errors or warnings are given while running the counter log, and it is not until you need to view the MSAS counters that you find they were not collected. If your connection simulation takes hours, this could prove quite frustrating if not done beforehand.

    The counters used, listed as Object \ Counter (instance), with the justification for each:

    System \ Processor Queue Length (N/A): Indicates how many threads are waiting for execution against the processor. If this counter is consistently higher than around 5 when processor utilization approaches 100%, that is a good indication that there is more work (active threads) available (ready for execution) than the machine's processors are able to handle.

    System \ Context Switches/sec (N/A): Measures how frequently the processor has to switch from user mode to kernel mode to handle a request from a thread running in user mode. The heavier the workload running on your machine, the higher this counter will generally be, but over the long term the value of this counter should remain fairly constant. If this counter suddenly starts increasing, however, it may be an indication of a malfunctioning device, especially if the Processor\Interrupts/sec (_Total) counter on your machine shows a similar unexplained increase.

    Process \ % Processor Time (sqlservr): Should definitely be used if Processor\% Processor Time (_Total) is maxing out at 100%, to assess the effect of the SQL Server process on the processor.

    Process \ % Processor Time (msmdsrv): As above, but to assess the effect of the Analysis Services process on the processor.

    Process \ Working Set (sqlservr): If the Memory\Available MBytes counter is decreasing, this counter can be run to indicate whether the process is consuming larger and larger amounts of RAM. Process(instance)\Working Set measures the size of the working set for each process, which indicates the number of allocated pages the process can address without generating a page fault.

    Process \ Working Set (msmdsrv): As above, for the Analysis Services process.

    Processor \ % Processor Time (_Total and individual cores): Measures the total utilization of your processor by all running processes. If the machine is multi-processor, be mindful that only an average is provided.

    Processor \ % Privileged Time (_Total): To see how the OS is handling basic IO requests. If kernel-mode utilization is high, your machine is likely underpowered, as it is too busy handling basic OS housekeeping functions to be able to effectively run other applications.

    Processor \ % User Time (_Total): To see how applications are behaving from a processor perspective; a consistently high utilisation indicates that the server is dealing with too many applications and may require more hardware or scaling out.

    Processor \ Interrupts/sec (_Total): The average rate, in incidents per second, at which the processor received and serviced hardware interrupts. Should be consistent over time, but a sudden unexplained increase could indicate a device malfunction, which can be confirmed using the System\Context Switches/sec counter.

    Memory \ Pages/sec (N/A): Indicates the rate at which pages are read from or written to disk to resolve hard page faults. This counter is a primary indicator of the kinds of faults that cause system-wide delays, and it is the primary counter to watch for indication of possible insufficient RAM to meet your server's needs. A good idea here is to configure a Perfmon alert that triggers when the number of pages per second exceeds 50 per paging disk on your system. You may also want to check the configuration of the page file on the server.

    Memory \ Available MBytes (N/A): The amount of physical memory available to processes running on the computer. If this counter is greater than 10% of the actual RAM in your machine then you probably have more than enough RAM. Monitor it regularly to see if any downward trend develops, and set an alert to trigger if it drops below 2% of the installed RAM.

    Physical Disk \ Disk Transfers/sec (each physical disk): If it goes above 10 disk I/Os per second then you've got poor response time for your disk.

    Physical Disk \ % Idle Time (_Total): If Disk Transfers/sec is above 25 disk I/Os per second, use this counter, which measures the percentage of time that your hard disk is idle during the measurement interval. If you see this counter fall below 20% then you've likely got read/write requests queuing up for your disk, which is unable to service these requests in a timely fashion.

    Physical Disk \ Disk Queue Length (the OLAP and SQL physical disks): A value that is consistently less than 2 means that the disk system is handling the IO requests against the physical disk.

    Network Interface \ Bytes Total/sec (the NIC): Should be monitored over a period of time to see if there is an increase or decrease in network utilisation.

    Network Interface \ Current Bandwidth (the NIC): An estimate of the current bandwidth of the network interface in bits per second (bps).

    MSAS 2005: Memory \ Memory Limit High KB (N/A): Shows (as a percentage) the high memory limit configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Config\msmdsrv.ini.

    MSAS 2005: Memory \ Memory Limit Low KB (N/A): Shows (as a percentage) the low memory limit configured for SSAS in the same msmdsrv.ini file.

    MSAS 2005: Memory \ Memory Usage KB (N/A): Displays the memory usage of the server process.

    MSAS 2005: Memory \ File Store KB (N/A): Displays the amount of memory that is reserved for the cache. Note that if the total memory limit in msmdsrv.ini is set to 0, no memory is reserved for the cache.

    MSAS 2005: Storage Engine Query \ Queries from Cache Direct/sec (N/A): Displays the rate of queries answered directly from the cache.

    MSAS 2005: Storage Engine Query \ Queries from Cache Filtered/sec (N/A): Displays the rate of queries answered by filtering an existing cache entry.

    MSAS 2005: Storage Engine Query \ Queries from File/sec (N/A): Displays the rate of queries answered from files.

    MSAS 2005: Storage Engine Query \ Average time/query (N/A): Displays the average time of a query.

    MSAS 2005: Connection \ Current connections (N/A): Displays the number of connections against the SSAS instance.

    MSAS 2005: Connection \ Requests/sec (N/A): Displays the rate of query requests per second.

    MSAS 2005: Locks \ Current Lock Waits (N/A): Displays the number of connections waiting on a lock.

    MSAS 2005: Threads \ Query Pool Job Queue Length (N/A): The number of queries in the job queue.

    MSAS 2005: Proc Aggregations \ Temp File Bytes Written/sec (N/A): Shows the number of bytes of data processed in a temporary file.

    MSAS 2005: Proc Aggregations \ Temp File Rows Written/sec (N/A): Shows the number of rows of data processed in a temporary file.

    Read the article

  • Need help limiting a join in Transact-SQL

    - by MsLis
    I'm somewhat new to SQL and need help with query syntax. My issue involves 2 tables within a larger multi-table join under Transact-SQL (MS SQL Server 2000 Query Analyzer). I have ACCOUNTS and LOGINS, which are joined on 2 fields: Site & Subset. Both tables may have multiple rows for each Site/Subset combination.

        ACCOUNTS:
        SITE   SUBSET   FIELD  FIELD  FIELD
        alpha  bravo    blah   blah   blah
        alpha  charlie  blah   blah   blah
        alpha  charlie  bleh   bleh   blue
        delta  bravo    blah   blah   blah
        delta  foxtrot  blah   blah   blah

        LOGINS:
        SITE   SUBSET   USERID   PASSWD
        alpha  bravo    foo      bar
        alpha  bravo    bar      foo
        alpha  charlie  id       ego
        delta  bravo    john     welcome
        delta  bravo    jane     welcome
        delta  bravo    ken      welcome
        delta  bravo    barbara  welcome

    I want to select all rows in ACCOUNTS which have LOGIN entries, but only 1 login per account.

        DESIRED RESULT:
        SITE   SUBSET   FIELD  FIELD  FIELD  USERID  PASSWD
        alpha  bravo    blah   blah   blah   foo     bar
        alpha  charlie  blah   blah   blah   id      ego
        alpha  charlie  bleh   bleh   blue   id      ego
        delta  bravo    blah   blah   blah   jane    welcome

    I don't really care which row from the login table I get, but the UserID and Password have to correspond. [Don't return invalid combinations like foo/foo or bar/bar.] MS Access has a handy FIRST function, which can do this, but I haven't found an equivalent in T-SQL. Also, if it makes a difference, other tables are joined to ACCOUNTS, but this is the only use of LOGINS in the structure. Thank you very much for any assistance.
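
    A sketch of one approach that works on SQL Server 2000 (the FIELD1/FIELD2/FIELD3 column names are placeholders matching the sample layout): pick a deterministic login per Site/Subset with a correlated TOP 1 subquery, so the USERID and PASSWD always come from the same LOGINS row:

        SELECT a.SITE, a.SUBSET, a.FIELD1, a.FIELD2, a.FIELD3,
               l.USERID, l.PASSWD
        FROM ACCOUNTS a
        INNER JOIN LOGINS l
            ON  l.SITE   = a.SITE
            AND l.SUBSET = a.SUBSET
        WHERE l.USERID = (SELECT TOP 1 l2.USERID
                          FROM LOGINS l2
                          WHERE l2.SITE   = a.SITE
                            AND l2.SUBSET = a.SUBSET
                          ORDER BY l2.USERID);   -- "first" login, by USERID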

    Read the article

  • Schema objects not visible in SQL Server Management Studio 2008

    - by Germ
    I'm experiencing a weird problem with a SQL login. When I connect to the server in Microsoft SQL Server Management Studio (2008) using this account, I cannot see any of the tables, stored procedures, etc. that this account should have access to on a particular database. When I connect to the same server within Visual Studio (2008) with the same account, everything is there. When I connect with the same account on a virtual machine, everything is there. I've also had a co-worker connect to the server using the same login and he's able to view everything as well. I use Microsoft SQL Server Management Studio all day connecting to different servers and databases and I've never experienced this problem. Does anyone have any suggestions on how I can diagnose it? I've checked to make sure I don't have any table filters, etc. There are several databases on this server, and I'm able to see the correct tables that this account has access to in the other databases just fine. Running this query lists the tables I'm expecting to see:

        SELECT * FROM INFORMATION_SCHEMA.TABLES
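
    A couple of quick checks that can help narrow it down from the problem connection (dbo.SomeTable is a placeholder for one of the tables that should be visible):

        -- what the catalog views expose to this login in the current database
        SELECT name, SCHEMA_NAME(schema_id) AS [schema] FROM sys.tables ORDER BY name;

        -- the login's effective permissions on one specific object
        SELECT * FROM fn_my_permissions('dbo.SomeTable', 'OBJECT');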

    Read the article

  • SQL Server: invalid object name how to solve it?

    - by Phsika
    The code below returns the following error:

        Msg 208, Level 16, State 1, Line 1
        Invalid object name 'ENG_PREP'.

        insert into ENG_PREP VALUES('572012-01-1,572012-01-2,572012-01-3,572013-01-1,572013-01-2',
            '', '500', '', 'A320 P.001-A', 'Removal of the LH Wing Safety Rope',
            '', '', '', '0', '', 'AF', '12-00-00-081-001', '', '', '', '', '', '', '')
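
    The usual causes are running the statement against the wrong database, or the table living in a schema other than the one being resolved. A couple of checks, run on the same connection (names taken from the question):

        -- NULL means the name does not resolve in the current database/schema
        SELECT OBJECT_ID('dbo.ENG_PREP') AS ObjectId;

        -- find the table and its actual schema anywhere in the current database
        SELECT name, SCHEMA_NAME(schema_id) AS [schema]
        FROM sys.tables
        WHERE name = 'ENG_PREP';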

    Read the article

  • What's the diff between VSS 6.0 and VSS 2005?

    - by Peter Turner
    We've been using VSS 6.0 since time began, but yesterday I nabbed VSS 2005 off of our MSDN subscription. It wouldn't let me install it off the ISO through Daemon Tools (not sure why, but I submitted an error report to MS...). I noticed it had a Program Files directory right on the ISO, so I just copied the folder onto my hard drive. Well, I opened up the client and, behold, a glamorous version of VSS 6.0 connected to the exact same DB. Does anyone know if I'm going to destroy everything by using it?

    Read the article

  • Apply a recursive CTE on grouped table rows (SQL Server 2005)

    - by Evan V.
    Hi all, I have a table (ROOMUSAGE) containing the times people check in and out of rooms, grouped by PERSONKEY and ROOMKEY. It looks like this:

        PERSONKEY | ROOMKEY | CHECKIN         | CHECKOUT        | ROW
        ----------------------------------------------------------------
        1         | 8       | 13-4-2010 10:00 | 13-4-2010 11:00 | 1
        1         | 8       | 13-4-2010 08:00 | 13-4-2010 09:00 | 2
        1         | 1       | 13-4-2010 15:00 | 13-4-2010 16:00 | 1
        1         | 1       | 13-4-2010 14:00 | 13-4-2010 15:00 | 2
        1         | 1       | 13-4-2010 13:00 | 13-4-2010 14:00 | 3
        13        | 2       | 13-4-2010 15:00 | 13-4-2010 16:00 | 1
        13        | 2       | 13-4-2010 15:00 | 13-4-2010 16:00 | 2

    I want to select just the consecutive rows for each PERSONKEY, ROOMKEY grouping. So the desired resulting table is:

        PERSONKEY | ROOMKEY | CHECKIN         | CHECKOUT        | ROW
        ----------------------------------------------------------------
        1         | 8       | 13-4-2010 10:00 | 13-4-2010 11:00 | 1
        1         | 1       | 13-4-2010 15:00 | 13-4-2010 16:00 | 1
        1         | 1       | 13-4-2010 14:00 | 13-4-2010 15:00 | 2
        1         | 1       | 13-4-2010 13:00 | 13-4-2010 14:00 | 3
        13        | 2       | 13-4-2010 15:00 | 13-4-2010 16:00 | 1

    I want to avoid using cursors, so I thought I would use a recursive CTE. Here is what I came up with:

        ;with CTE (PERSONKEY, ROOMKEY, CHECKIN, CHECKOUT, ROW) as
        (
            select RU.PERSONKEY, RU.ROOMKEY, RU.CHECKIN, RU.CHECKOUT, RU.ROW
            from ROOMUSAGE RU
            where RU.ROW = 1
            union all
            select RU.PERSONKEY, RU.ROOMKEY, RU.CHECKIN, RU.CHECKOUT, RU.ROW
            from ROOMUSAGE RU
            inner join CTE on RU.ROW = CTE.ROW + 1
            where CTE.CHECKIN = RU.CHECKOUT
              and CTE.PERSONKEY = RU.PERSONKEY
              and CTE.ROOMKEY = RU.ROOMKEY
        )

    This worked OK for very small datasets (under 100 records) but it's unusable on large datasets. I'm thinking that I should somehow apply the CTE recursively on each PERSONKEY, ROOMKEY grouping in my ROOMUSAGE table, but I am not sure how to do that. Any help would be much appreciated, cheers!
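
    Before restructuring the query, it is often worth checking that the recursive join is supported by an index; a minimal sketch (the index name is made up), assuming the join really is on PERSONKEY, ROOMKEY and ROW as above:

        CREATE INDEX IX_ROOMUSAGE_Person_Room_Row
            ON ROOMUSAGE (PERSONKEY, ROOMKEY, ROW)
            INCLUDE (CHECKIN, CHECKOUT);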

    Read the article

  • How to write to a varchar(max) column using ODBC

    - by andyjohnson
    Summary: I'm trying to write a text string to a column of type varchar(max) using ODBC and SQL Server 2005. It fails if the length of the string is greater than 8000. Help!

    I have some C++ code that uses ODBC (SQL Native Client) to write a text string to a table. If I change the column from, say, varchar(100) to varchar(max) and try to write a string with length greater than 8000, the write fails with the following error:

        [Microsoft][ODBC SQL Server Driver]String data, right truncation

    So, can anyone advise me on whether this can be done, and how? Some example (not production) code that shows what I'm trying to do:

        SQLHENV hEnv = NULL;
        SQLRETURN iError = SQLAllocEnv(&hEnv);

        HDBC hDbc = NULL;
        SQLAllocConnect(hEnv, &hDbc);
        const char* pszConnStr = "Driver={SQL Server};Server=127.0.0.1;Database=MyTestDB";
        UCHAR szConnectOut[SQL_MAX_MESSAGE_LENGTH];
        SWORD iConnectOutLen = 0;
        iError = SQLDriverConnect(hDbc, NULL, (unsigned char*)pszConnStr, SQL_NTS,
                                  szConnectOut, (SQL_MAX_MESSAGE_LENGTH-1),
                                  &iConnectOutLen, SQL_DRIVER_COMPLETE);

        HSTMT hStmt = NULL;
        iError = SQLAllocStmt(hDbc, &hStmt);
        const char* pszSQL = "INSERT INTO MyTestTable (LongStr) VALUES (?)";
        iError = SQLPrepare(hStmt, (SQLCHAR*)pszSQL, SQL_NTS);
        char* pszBigString = AllocBigString(8001);
        iError = SQLSetParam(hStmt, 1, SQL_C_CHAR, SQL_VARCHAR, 0, 0,
                             (SQLPOINTER)pszBigString, NULL);
        iError = SQLExecute(hStmt); // Returns SQL_ERROR if pszBigString len > 8000

    The table MyTestTable contains a single column defined as varchar(max). The function AllocBigString (not shown) creates a string of arbitrary length. I understand that previous versions of SQL Server had an 8000-character limit on varchars, but I don't understand why this is happening in SQL Server 2005. Thanks, Andy

    Read the article

  • visual studio 2005 is deleting the .svn folder in the bin\Debug directory - how to prevent this?

    - by M K Saravanan
    For some reason I need to check in a couple of files in the bin\Debug directory. For the past few weeks, I have noticed strange behaviour from VS2005: every time I recompile the code, it deletes the .svn folder in the bin\Debug directory, and hence svn shows an "obstructed" error. Even svn cleanup doesn't help because of the missing .svn folder. Is there any setting in VS2005 to prevent this? And why is it deleting the .svn folder in the first place? This thread, http://svn.haxx.se/tsvnusers/archive-2008-10/0019.shtml, discusses it but offers no useful solution to prevent this from happening. Any other suggestions?

    Read the article

  • SQL Server 2005. Full Text Search. Need Thesaurus working with NEAR/AND/OR keywords

    - by user305924
    Hi, does anyone know if it's possible to do a thesaurus search together with the NEAR or AND/OR keywords? Here is an example of the type of query I want to run:

        SELECT Title, RANK
        FROM Item
        INNER JOIN CONTAINSTABLE(Item, Title, 'FORMSOF(Thesaurus, "red" NEAR "wine")') AS KEY_TBL
            ON Item.ItemID = KEY_TBL.[KEY]
        ORDER BY RANK DESC

    But I get the error message:

        Syntax error near 'NEAR' in the full-text search condition 'FORMSOF(Thesaurus, "red" NEAR "wine")'.
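
    For what it's worth, a sketch of the closest form I would expect to parse: FORMSOF generation terms can be combined with AND/OR in a CONTAINS search condition, but (at least in SQL Server 2005) NEAR only accepts simple or prefix terms, not FORMSOF terms:

        SELECT Title, KEY_TBL.RANK
        FROM Item
        INNER JOIN CONTAINSTABLE(Item, Title,
            'FORMSOF(THESAURUS, "red") AND FORMSOF(THESAURUS, "wine")') AS KEY_TBL
            ON Item.ItemID = KEY_TBL.[KEY]
        ORDER BY KEY_TBL.RANK DESC;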

    Read the article

  • Disable auto-indent certain items in Visual Studio 2005?

    - by Jakobud
    I don't mind most of the ways that VS2005 auto-indents (or auto-formats) my C++ code, but there are certain items I don't want it to indent automatically, like #define statements for example. It takes them and shoves them all the way to the left side of the screen, no matter how deep into my scope I type them. That's really, really annoying. Is there some way to alter this behavior, besides completely disabling auto-indent/format?

    Read the article

  • In MS SQL Server, is there a way to "atomically" increment a column being used as a counter?

    - by Dan P
    Assuming a Read Committed Snapshot transaction isolation setting, is the following statement "atomic" in the sense that you won't ever "lose" a concurrent increment?

        update mytable set counter = counter + 1

    I would assume that in the general case, where this update statement is part of a larger transaction, it wouldn't be. For example, I think this scenario is possible:

    1. update the counter within transaction #1
    2. do some other stuff in transaction #1
    3. update the counter within transaction #2
    4. commit transaction #2
    5. commit transaction #1

    In this situation, wouldn't the counter end up being incremented by only 1? Does it make a difference if that is the only statement in a transaction? How does a site like stackoverflow handle this for its question view counter? Or is the possibility of "losing" some increments just considered acceptable?
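
    As a side note, when the new value is also needed, one pattern reads and increments in the same atomic statement using SQL Server 2005's OUTPUT clause (table and column names taken from the question):

        update mytable
        set counter = counter + 1
        output inserted.counter;   -- returns the post-increment value written by this statement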

    Read the article

  • Return type from DAL class (Sql ce, Linq to Sql)

    - by bretddog
    Hi, I'm using VS2008 and SQL CE 3.5, preferably with LINQ to SQL. I'm learning databases and am unsure about the return types of DAL methods and how/where to map the data over to my business objects; I don't want direct UI binding. A business object class UserData, and a class UserDataList (Inherits List(Of UserData)), are represented in the database by the table "Users". I use SQL Compact and run SqlMetal, which creates the dbml/designer.vb file. This gives me a class with a TableAttribute:

        <Table()> _
        Partial Public Class Users

    I'm unsure how to use this class. Should my business object know about this class, such that the DAL can return the type Users, or List(Of Users)? So, for example, the UserDataService class is part of the DAL and would have, for example, the functions GetAll and GetById. Is this correct?

        Public Class UserDataService

            Public Function GetAll() As List(Of Users)
                Dim ctx As New MyDB(connection)
                Dim q As List(Of Users) = From n In ctx.Users Select n
                Return q
            End Function

            Public Function GetById(ByVal id As Integer) As Users
                Dim ctx As New MyDB(connection)
                Dim q As Users = (From n In ctx.Users Where n.UserID = id Select n).Single
                Return q
            End Function

    And then, would I perhaps have a method, say in the UserDataList class, like:

        Public Class UserDataList
            Inherits List(Of UserData)

            Public Sub LoadFromDatabase()
                Me.Clear()
                Dim database As New UserDataService
                Dim users As List(Of Users)
                users = database.GetAll()
                For Each u In users
                    Dim newUser As New UserData
                    newUser.Id = u.Id
                    newUser.Name = u.Name
                    Me.Add(newUser)
                Next
            End Sub
        End Class

    Is this a sensible approach? I would appreciate any suggestions/alternatives, as this is my first attempt at a database DAL. Cheers!

    Read the article

  • What does really happen when we do a BEGIN TRAN in SQL Server 2005?

    - by Misnomer
    Hi all, I came across this issue (or maybe something I just hadn't realized): I ran a BEGIN TRAN with some code inside it and never issued a COMMIT or ROLLBACK because I forgot about it. That caused many of the database queries, even a simple SELECT TOP 1000, to just sit there loading. It probably took out locks on the tables, I guess, since it did not let me query them, but I just wanted to know what exactly happened and what practices should be followed here.
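
    For anyone diagnosing the same situation, a few quick checks (all standard SQL Server 2005 commands) show the forgotten transaction and what it is holding:

        SELECT @@TRANCOUNT;        -- run in the suspect session: > 0 means an open transaction
        DBCC OPENTRAN;             -- oldest active transaction in the current database
        SELECT request_session_id, resource_type, request_mode, request_status
        FROM sys.dm_tran_locks;    -- locks currently held or waited on, per session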

    Read the article
