Search Results

Search found 28900 results on 1156 pages for 'sql 2005'.


  • SQL RDBMS : one query or multiple calls

    - by None None
    After looking around the internet, I decided to create DAOs that return objects (POJOs) to the calling business-logic method. For example, a Customer object with an Address reference would be split in the RDBMS into two tables, Customer and ADDRESS. The CustomerDAO is in charge of joining the data from the two tables, creating both an Address POJO and a Customer POJO, adding the address to the customer object, and finally returning the full Customer POJO. Simple. However, I am now at a point where I need to join three or four tables, each representing an attribute or a list of attributes for the resulting POJO. The SQL will include a GROUP BY, but I will still get multiple rows for the same POJO, because some of the joins are across one-to-many relationships. My app code now has to loop through all the rows, working out whether a row is the same entity with different attribute values or should become a new POJO. Should I continue to build my DAOs with this technique, or break the POJO creation into multiple DB calls to make the code easier to understand and maintain?
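    A hedged sketch of the row multiplication being described, using hypothetical Customer/Address/Phone tables (none of these names come from the question):

        -- One-to-one joins keep one row per customer; the one-to-many join fans out:
        SELECT  c.CustomerID, c.Name,
                a.Street, a.City,        -- one address per customer
                p.PhoneNumber            -- zero or more phone numbers per customer
        FROM    Customer c
        JOIN    Address  a ON a.CustomerID = c.CustomerID
        LEFT JOIN Phone  p ON p.CustomerID = c.CustomerID
        ORDER BY c.CustomerID;
        -- A customer with three phone numbers comes back as three rows, so the DAO
        -- either keys on CustomerID while looping (new POJO only when the key changes)
        -- or issues a second, smaller query just for the child collection.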

    Read the article

  • SQL 2008/2005 Hosting :: Error - “Named Pipes Provider, error: 40 – Could not open a connection to SQL Server”

    - by mbridge
    When setting up a Microsoft Windows Server 2008 system, I went through the motions to set up IIS, MS SQL Server 2008, and Visual Studio 2010 to use as a test bed. One of the immediate benefits of setting up such a system is that most development can be done remotely: MS SQL Server Management Studio, Visual Studio’s web development suite, as well as file shares, remote desktop, etc., make for a great way to remotely develop in ‘pristine’ conditions. But there are drawbacks, too, such as needing to deal with firewall issues, not being able to penetrate past a router, or the requirement of setting up a VPN. One of the problems I encountered when trying to remote into the MS SQL Server 2008 instance that I’d set up was the following error: Named Pipes Provider, error: 40 – Could not open a connection to SQL Server. I followed the steps below, and was able to connect to the server after just a few moments of tinkering: 1. From the server in question, surf to this Microsoft article, and download and install the firewall rules modification program. Never drop your firewall, even on a development machine, unless you have a really good reason to. 2. Launch SQL Server Configuration Manager. Navigate to SQL Server Network Configuration, then Protocols for your server name. Enable TCP/IP and Named Pipes by right-clicking and choosing Enable for each given protocol name. 3. Restart the SQL Server service from Services (or, from the command line, run “net stop mssqlserver” then “net start mssqlserver”). 4. Try your remote connection once more, and you should be able to connect. It’s not a terribly difficult concept, but one of the more challenging tasks developers face is dealing with environment setup. And while there is a certain blurred-line overlap between software development and server administration, sometimes the latter is daunting, especially given that you might set up only a handful of servers during your career.
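    A hedged follow-up check once the remote connection works, to confirm which protocol it is actually using (sys.dm_exec_connections is available in SQL Server 2005 and later):

        SELECT session_id, net_transport, auth_scheme
        FROM   sys.dm_exec_connections
        WHERE  session_id = @@SPID;
        -- net_transport should report TCP for the remote session; if it still says
        -- Named pipe, the client is not hitting the TCP endpoint that was just enabled.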

    Read the article

  • Getting results in a result set from dynamic SQL in Oracle

    - by msorens
    This question is similar to a couple of others I have found on StackOverflow, but the differences are significant enough to me to warrant a new question, so here it is: I want to obtain a result set from dynamic SQL in Oracle and then display it as a result set in a SqlDeveloper-like tool, just as if I had executed the dynamic SQL statement directly. This is straightforward in SQL Server, so to be concrete, here is an example from SQL Server that returns a result set in SQL Server Management Studio or Query Explorer:

        EXEC sp_executesql N'select * from countries'

    Or more properly:

        DECLARE @stmt nvarchar(100)
        SET @stmt = N'select * from countries'
        EXEC sp_executesql @stmt

    The question "How to return a resultset / cursor from a Oracle PL/SQL anonymous block that executes Dynamic SQL?" addresses the first half of the problem: executing dynamic SQL into a cursor. The question "How to make Oracle procedure return result sets" provides a similar answer. Web search has revealed many variations of the same theme, all addressing just the first half of my question. I found this post explaining how to do it in SqlDeveloper, but that relies on SqlDeveloper-specific functionality. I am actually using a custom query tool, so I need the solution to be self-contained in the SQL code. This custom query tool similarly does not have the capability to show the output of print (dbms_output.put_line) statements; it only displays result sets. Here is yet one more possible avenue using 'execute immediate...bulk collect', but this example again renders the results with a loop of dbms_output.put_line statements. This link attempts to address the topic, but the question never quite got answered there either. Assuming this is possible, I will add one more condition: I would like to do this without having to define a function or procedure (due to limited DB permissions). That is, I would like to execute a self-contained PL/SQL block containing dynamic SQL and return a result set in SqlDeveloper or a similar tool. So to summarize:
    - I want to execute an arbitrary SQL statement (hence dynamic SQL).
    - The platform is Oracle.
    - The solution must be a PL/SQL block with no procedures or functions.
    - The output must be generated as a canonical result set; no print statements.
    - The output must render as a result set in SqlDeveloper without using any SqlDeveloper special functionality.
    Any suggestions?

    Read the article

  • How to deploy SQL Reporting 2005 when Data Sources are locked?

    - by spoulson
    The DBAs here maintain all SQL Server and SQL Reporting servers. I have a custom-developed SQL Reporting 2005 project in Visual Studio that runs fine on my local SQL Database and Reporting instances. I need to deploy to a production server, so I had a folder created on a SQL Reporting 2005 server with permissions to upload files. Normally, a deploy from within Visual Studio is all that is needed to upload the report files. However, for security purposes, data sources are maintained explicitly by the DBAs and stored in a separate, locked-down common folder on the reporting server. I had them create the data source for me. When I attempt to deploy from VS, it gives me the error "The item '/Data Sources' already exists." I get this whether I'm deploying the whole project or just a single report file. I already set OverwriteDataSources=false in the project properties. The TargetServer URL and folder are verified correct. I suppose I could copy the files manually, but I'd like to be able to deploy from within VS. What could I be doing wrong?

    Read the article

  • Timeout Expired error Using LINQ

    - by Refracted Paladin
    I am going to sum up my problem first and then offer the details and what I have already tried.

    Summary: I have an internal WinForms app that uses LINQ to SQL to connect to a local SQL Express database. Each user has their own DB, and the DBs stay in sync through merge replication with a central DB. All DBs are SQL 2005 (SP2 or SP3). We have been using this app for over 5 months now, but recently our users are getting "Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding."

    Detailed: The strange part is that they get it in two different locations (two different LINQ methods) and only the first time they fire in a given time period (~5 mins). One LINQ method pulls all records that match a FK ID and then manipulates them to form a hierarchy view for a TreeView. The second pulls all records that match a FK ID and dumps them into a DataGridView. The only things I can find in common between the two are that the first is an IEnumerable and the second converts itself from IQueryable to IEnumerable to DataTable... I looked at the queries in Profiler and they 'seemed' normal. They are not very complicated queries. They are only pulling back 10 - 90 records, from one table. Any thoughts, suggestions, hints, whatever would be greatly appreciated. I am at my wit's end on this....

        public IList<CaseNoteTreeItem> GetTreeViewDataAsList(int personID)
        {
            var myContext = MatrixDataContext.Create();
            var caseNotesTree = from cn in myContext.tblCaseNotes
                                where cn.PersonID == personID
                                orderby cn.ContactDate descending, cn.InsertDate descending
                                select new CaseNoteTreeItem
                                {
                                    CaseNoteID = cn.CaseNoteID,
                                    NoteContactDate = Convert.ToDateTime(cn.ContactDate).ToShortDateString(),
                                    ParentNoteID = cn.ParentNote,
                                    InsertUser = cn.InsertUser,
                                    ContactDetailsPreview = cn.ContactDetails.Substring(0, 75)
                                };
            return caseNotesTree.ToList<CaseNoteTreeItem>();
        }

    AND THIS ONE

        public static DataTable GetAllCNotes(int personID)
        {
            using (var context = MatrixDataContext.Create())
            {
                var caseNotes = from cn in context.tblCaseNotes
                                where cn.PersonID == personID
                                orderby cn.ContactDate
                                select new
                                {
                                    cn.ContactDate,
                                    cn.ContactDetails,
                                    cn.TimeSpentUnits,
                                    cn.IsCaseLog,
                                    cn.IsPreEnrollment,
                                    cn.PresentAtContact,
                                    cn.InsertDate,
                                    cn.InsertUser,
                                    cn.CaseNoteID,
                                    cn.ParentNote
                                };
                return caseNotes.ToList().CopyLinqToDataTable();
            }
        }
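    A hedged first diagnostic for intermittent timeouts like this is to check, at the moment a call hangs, whether the query is being blocked (for example by a replication agent touching the same tables); sys.dm_exec_requests exists in SQL Server 2005:

        SELECT session_id, blocking_session_id, wait_type, wait_time, command
        FROM   sys.dm_exec_requests
        WHERE  blocking_session_id <> 0;
        -- Rows here while the LINQ call is stuck suggest a blocking/locking problem
        -- rather than anything in the LINQ methods themselves.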

    Read the article

  • PHP mssql_query double quotes cannot be used

    - by Nilesh
    Hi all, in Java/JDBC, I can easily run the following SQL (NOTE the double quotes around column and table names): Select cus."customer_id" , cus."organisation_or_person" , cus."organisation_name" , cus."first_name" , cus."last_name" , cus."date_became_customer" , cus."other_customer_details" From "Contact_Management"."dbo"."Customers" cus But the same query in PHP errors out with an invalid syntax message: "Warning: mssql_query() [function.mssql-query]: message: Incorrect syntax near 'customer_id'. (severity 15)" But if I remove all the double quotes, the query works fine with no errors. The query is ported from a Java application, so I would like to keep the double quotes and the SQL as it is. Any alternative solutions? Thank you Nilesh
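    A hedged guess at the cause: connections made through the legacy mssql extension (DB-Library style) typically run with QUOTED_IDENTIFIER OFF, so a double-quoted name is parsed as a string literal rather than an identifier. Two sketches of workarounds:

        -- Option 1: turn the setting on for the session before running the query
        SET QUOTED_IDENTIFIER ON;

        -- Option 2: use SQL Server's bracket quoting instead of double quotes
        SELECT cus.[customer_id], cus.[organisation_or_person], cus.[organisation_name]
        FROM   [Contact_Management].[dbo].[Customers] AS cus;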

    Read the article

  • SSIS - How do I see/set the field types in a Recordset?

    - by thursdaysgeek
    I'm looking at an inherited SSIS package, and a stored procedure is sending records to a recordset called USER:NEW_RECORDS. It's of type Object, and the value is System.Object. It is then used to insert that data into a SQL table. We're getting an error because it seems that the numeric results of the stored procedure are being put into a DT_WSTR field, and then failing when they are put into a decimal field in the database. Most of the records are working, but one, which happens to have a larger number of decimal digits, is failing. I want to see exactly what my SSIS recordset field types are, and probably change them, so I can force the data to be truncated properly and copied. Or perhaps I'm not even looking at this correctly. The data is put into the recordset using an Execute SQL Task that runs the stored procedure.

    Read the article

  • Add Web Reference prompts for credentials with "Discovery Credential" dialog but won't accept valid

    - by Eden
    I'm attempting to add a web reference to my ASP.Net 3.0 project. This is a reference to the SQL Server Reporting Services web service. I have verified the service is up and running, but when I try to add the web reference in my project, I am prompted for my credentials, which I enter, and then I'm prompted again and again and again. I have to hit cancel to stop the vicious cycle. When I do that, the service definition comes up in the window, but the "Add Reference" button is disabled, so I can't add the reference to my project. I have limited knowledge of SQL Reporting Services configurations. If anyone knows how to resolve this problem I'd really appreciate it.

    Read the article

  • Fastest way to compress a database or .bak file and transfer it

    - by Nai
    As per the question title. I wonder if there are special programmes or commands that make zipping up a .bak file and transferring it super quick. I read about xp_cmdshell here but I'm not sure about the speed. My .bak file is about 12 gigs at the moment. Related to this is the possibility of using Red Gate's SQL Data Compare to just transfer the differential data across the network pipeline, but I have never used SQL Data Compare before and I'm not sure how it goes about doing INSERTs on tables with primary keys and such. Also, not sure about the speed. Does anyone have any experience with this programme or similar programmes? Cheers!
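    A hedged note: if the server is SQL Server 2008 (Enterprise) or later, native backup compression removes the separate zip step entirely, and the compressed .bak is what you copy across; a minimal sketch with placeholder names and paths:

        BACKUP DATABASE [YourDb]
        TO DISK = N'D:\Backups\YourDb.bak'
        WITH COMPRESSION, INIT, CHECKSUM, STATS = 10;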

    Read the article

  • Database migration

    - by graham.reeds
    We are migrating a client's own database schema to our own (both SQL Server). Most of the mappings from their schema to ours have been identified, and rules have been agreed on if the columns don't exactly align (default values etc.). Previously, depending on who was assigned the task, this has been done with either a mixture of SQL scripts or one-off VB apps. I was thinking there must be an application (commercial or otherwise) where you can define these mappings/rules and have it stream the data across. Surely the setup and configuration of such a tool would be less work than the creation of ad-hoc scripts... Is there such an app? Apart from the obvious 'be careful', any tips to mitigate the stress of a non-DBA porting one schema to another?

    Read the article

  • Search and Browse Database Objects with Oracle SQL Developer

    - by thatjeffsmith
    I was tempted to throw in another Dora the Explorer map reference here, but I came to my senses. Having trouble finding something? Maybe you’re just getting older? I know I am. But still, it’d be nice if my favorite database tool could help me out a bit. Hmmm, what’s this ‘Find Database Object’ thing over here... sounds like a search mechanism of some sort? You can access this panel from the ‘View’ menu. It’s a good bit down the screen, so I don’t blame you if you haven’t seen it before. It makes finding ‘stuff’ in your database so much easier. Let’s say I want to find my ‘beer’ objects. I simply need to type my search string and the context (in this case I want it to search EVERYTHING), and hit enter. The search results are listed below, and clicking on an object automatically opens it! I know it seems very simple, but I get asked this question a LOT. It will even search through your PL/SQL code! Finding too much? Be sure to toggle off the ‘%’ wildcard check box before doing a search. Working on a project? I bet you use common column names, or codes, throughout your tables. You could take advantage of this knowledge and use the Find Database Object panel as a substitute connection tree or schema browser. Working on your HR project and want to look at your employee objects? Do a column search for your column ID/key. Sometimes thinking outside the box actually works! Don’t be afraid to tackle a problem from a weird angle, or re-purpose your tools. I do it all the time. And I drive the developers nuts trying to do things with the tools they were never designed to do. But I digress. Back to your coding!

    Read the article

  • SQLOS and Cloud Infrastructure sessions at PASS Summit 2012

    - by SQLOS Team
    The SQL PASS Summit 2012, the largest yet, is in full swing. Here's a summary of the sessions this week on cloud infrastructure and SQLOS topics. Some of these were today, and you can catch the recordings. One more session takes place on Friday covering SQL Server solution patterns in Windows Azure VMs... Also, catch Thursday's keynote with Quentin Clark, which will feature a cool IaaS demo!

    SQL Server in Windows Azure VM Sessions

    CLD-309-A SQLCAT: Best Practices and Lessons Learned on SQL Server in an Azure VM
    Steve Howard, Arvind Ranasaria - Wednesday 11/6 10:15
    This session looked at some best practices to optimize networking, memory, disk IO and high availability, based on lessons learned during SQLCAT work with customer deployments. Well worth catching the recording.

    SQL Server in Azure VM patterns: Hybrid Disaster Recovery, data movement and BI
    Guy Bowerman, Peter Saddow, Michael Washam, Ross LoForte - Friday 11/9 9:45 Rm 613
    [Note: In the guides this has an outdated title.] This session has a focus on SQL Server Azure VM solutions. Starting with the basics and then going deeper into:
    - New features in the Microsoft Assessment and Planning Toolkit 8.0 to help plan and size SQL VM migrations.
    - A look at a Windows Azure VM SQL Server app making use of load balancing and SQL Server high availability features.
    - A BI case study running SQL BI components in Azure VMs and making use of Windows 8 tiles.
    - A training class in a VM case study.

    SQLOS Sessions

    DBA-500-HD Inside SQLOS 2012 (half-day session)
    Bob Ward - Wednesday 11/6 1:30pm
    Bob Ward from CSS applies his wealth of experience to look at the internals of SQLOS and what's changed in the various SQL 2012 components, including memory, resource governor, and the scheduler.

    DBA-403-M: SQLCAT: Memory Manager Changes in SQL Server 2012
    Gus Apostol, Jerome Halmans - 1:30pm
    Covers the redesigned SQLOS memory manager in SQL Server 2012, including the new page allocator for any size pages (and all that implies), DMVs, and demos. Not sure why this was placed at the same time as the SQLOS half-day session, but since it's recorded it's available for catch-up.

    - Guy

    Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

  • SQL Server 2008 Compression

    - by Peter Larsson
    Hi! Today I am going to talk about compression in SQL Server 2008. The data warehouse I currently design and develop holds historical data back to 1973. The data warehouse will get another blog post later due to its complexity. However, the server has 60 GB of memory (of which 48 is dedicated to the SQL Server service), so not all of the data fit in memory, and the SAN is not the fastest one around. So I decided to give compression a go, since we use Enterprise Edition anyway. This is the code I use to compress all tables with PAGE compression.

        DECLARE @SQL VARCHAR(MAX)

        DECLARE curTables CURSOR FOR
            SELECT 'ALTER TABLE ' + QUOTENAME(OBJECT_SCHEMA_NAME(object_id))
                   + '.' + QUOTENAME(OBJECT_NAME(object_id))
                   + ' REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE)'
            FROM   sys.tables

        OPEN curTables

        FETCH NEXT FROM curTables INTO @SQL

        WHILE @@FETCH_STATUS = 0
            BEGIN
                IF @SQL IS NOT NULL
                    RAISERROR(@SQL, 10, 1) WITH NOWAIT

                FETCH NEXT FROM curTables INTO @SQL
            END

        CLOSE      curTables
        DEALLOCATE curTables

    Copy and paste the result to a new code window and execute the statements. One thing I noticed when doing this is that the database grows by the same size as the table being rebuilt. If the database cannot grow by this size, the operation fails. For me, I first ended up with an orphaned connection. Not good. And this is the code I use to create the index compression statements.

        DECLARE @SQL VARCHAR(MAX)

        DECLARE curIndexes CURSOR FOR
            SELECT   'ALTER INDEX ' + QUOTENAME(name)
                     + ' ON '
                     + QUOTENAME(OBJECT_SCHEMA_NAME(object_id))
                     + '.'
                     + QUOTENAME(OBJECT_NAME(object_id))
                     + ' REBUILD PARTITION = ALL WITH (FILLFACTOR = 100, DATA_COMPRESSION = PAGE)'
            FROM     sys.indexes
            WHERE    OBJECTPROPERTY(object_id, 'IsMSShipped') = 0
                     AND OBJECTPROPERTY(object_id, 'IsTable') = 1
            ORDER BY CASE type_desc
                         WHEN 'CLUSTERED' THEN 1
                         ELSE 2
                     END

        OPEN curIndexes

        FETCH NEXT FROM curIndexes INTO @SQL

        WHILE @@FETCH_STATUS = 0
            BEGIN
                IF @SQL IS NOT NULL
                    RAISERROR(@SQL, 10, 1) WITH NOWAIT

                FETCH NEXT FROM curIndexes INTO @SQL
            END

        CLOSE      curIndexes
        DEALLOCATE curIndexes

    When this was done, I noticed that the 90 GB database was now only 17 GB. And most importantly, the complete database could now reside in memory! After this I took care of the administrative tasks, backups. Here I copied the code from Management Studio because I didn't want to give too much time to this. The code looks like this (notice the compression option).

        BACKUP DATABASE [Yoda]
        TO DISK = N'D:\Fileshare\Backup\Yoda.bak'
        WITH NOFORMAT,
             INIT,
             NAME = N'Yoda - Full Database Backup',
             SKIP,
             NOREWIND,
             NOUNLOAD,
             COMPRESSION,
             STATS = 10,
             CHECKSUM
        GO

        DECLARE @BackupSetID INT

        SELECT @BackupSetID = Position
        FROM   msdb..backupset
        WHERE  database_name = N'Yoda'
               AND backup_set_id = (SELECT MAX(backup_set_id) FROM msdb..backupset WHERE database_name = N'Yoda')

        IF @BackupSetID IS NULL
            RAISERROR(N'Verify failed. Backup information for database ''Yoda'' not found.', 16, 1)

        RESTORE VERIFYONLY
        FROM DISK = N'D:\Fileshare\Backup\Yoda.bak'
        WITH FILE = @BackupSetID,
             NOUNLOAD,
             NOREWIND
        GO

    After running the backup, the file size was even more reduced due to the zip-like compression algorithm used in SQL Server 2008. The file size? Only 9 GB. //Peso
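    A hedged aside for anyone trying this: SQL Server 2008 also ships a stored procedure that estimates the savings per table before you rebuild anything (object names below are placeholders):

        EXEC sp_estimate_data_compression_savings
             @schema_name      = N'dbo',
             @object_name      = N'YourLargeTable',
             @index_id         = NULL,
             @partition_number = NULL,
             @data_compression = N'PAGE';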

    Read the article

  • Configurable tables in sql database

    - by dot
    I have the following tables in my database:

        Config Table:
        ======================================
        Start_Range | End Range | Config_id
        10          | 15        | 1
        ======================================

        Available_UserIDs
        ==========================
        ID | UserID | Used_YN |
        1  | 10     | t       |
        1  | 11     | f       |
        1  | 12     | f       |
        1  | 13     | f       |
        1  | 14     | f       |
        1  | 15     | f       |
        ==========================

        Users
        ==========================
        UserId | FName | LName |
        10     | John  | Doe   |
        ==========================

    This is used in a reservation system of sorts... which lets an administrator specify a range of numbers that will be assigned to users in the config table. Once the range has been defined, the system then populates the Available_UserIDs table with all the numbers in the range, and sets the Used_YN flag to false. As users sign up, they grab the next UserID number that's not in use... and reserve it. Then the system adds a record to the Users table. Once the admin has specified a range, it is possible that they can change it. For example, they can start with 10-15... and then when the range is used up, they should be able to specify another range like 16-99. I've put a unique constraint on the Available_UserIDs table, as well as on the Users table, to ensure that UserIDs can't be duplicated. My questions are as follows:

    1. What's the best way to prevent the admins from using a range that's already in use? I thought of the following options:
       -- check the Users table to see if the start-range or end-range numbers are being used. If they are, assume that all the numbers in between are in use too, and reject the range.
       -- let them specify whatever they want and try to populate the Available_UserIDs table. If there are duplicates, just ignore that specific error message from the database and continue on.

    2. How do I find gaps in the number ranges? For example, if they specify 10-15, and then 20-25, it'd be nice to be able to somehow suggest on my web page that 16-19 is currently available. I found this article: http://stackoverflow.com/questions/1312101/how-to-find-a-gap-in-running-counter-with-sql But it only seems to return the first available number... so in my example above, it would only return the number 16.

    I'm sure there's a simpler way to do things that I'm overlooking!
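    A hedged sketch for question 2, finding every free gap (not just the first), assuming the UserID column in Available_UserIDs holds the reserved numbers:

        SELECT  a.UserID + 1 AS gap_start,
                (SELECT MIN(b.UserID) - 1
                 FROM   Available_UserIDs b
                 WHERE  b.UserID > a.UserID) AS gap_end
        FROM    Available_UserIDs a
        WHERE   NOT EXISTS (SELECT 1 FROM Available_UserIDs x WHERE x.UserID = a.UserID + 1)
          AND   EXISTS     (SELECT 1 FROM Available_UserIDs y WHERE y.UserID > a.UserID);
        -- With ranges 10-15 and 20-25 loaded, this returns one row: gap_start = 16, gap_end = 19.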

    Read the article

  • User Provisioning Tool for SQL Server 2008?

    - by Rob Sanders
    Yesterday I moved my machine from one domain to another, foolishly forgetting the implications for my local instance of SQL Server! Mixed Mode authentication is not enabled, and the only local account login has only "public" permissions. SQL Server 2005 Service Pack 2 had a tool called the User Provisioning Tool for Windows Vista (sqlprov.exe), which allowed you to add domain users to a local SQL 2005 instance (it doesn't work against SQL 2008, by the way). My question is: is there a similar tool for SQL Server 2008, or am I going to have to do a reinstall? Also let me know if you think this belongs on StackOverflow.
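    A hedged alternative to the old provisioning tool: once you can get an administrative connection (for example by starting the instance with the -m single-user switch and connecting as a local Administrator), the new domain account can be granted access with plain T-SQL (names below are hypothetical):

        CREATE LOGIN [NEWDOMAIN\YourUser] FROM WINDOWS;
        EXEC sp_addsrvrolemember @loginame = N'NEWDOMAIN\YourUser', @rolename = N'sysadmin';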

    Read the article

  • Error when restoring database (Windows 7 test environment)

    - by Undh
    I have a Windows 7 operating system as a test environment. I have SQL Server EE installed with two instances, named test and production. I took a full backup of the AdventureWorks database from the test instance and tried to restore it into the production instance:

        RESTORE DATABASE [testikanta]
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak'
        WITH FILE = 1, NOUNLOAD, REPLACE, STATS = 10
        GO

    I got an error saying:

        Msg 3634, Level 16, State 1, Line 1
        The operating system returned the error '32(failed to retrieve text for this error. Reason: 15105)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Data.mdf'.
        Msg 3156, Level 16, State 8, Line 1
        File 'AdventureWorks_Data' cannot be restored to 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Data.mdf'. Use WITH MOVE to identify a valid location for the file.
        Msg 3634, Level 16, State 1, Line 1
        The operating system returned the error '32(failed to retrieve text for this error. Reason: 15105)' while attempting 'RestoreContainer::ValidateTargetForCreation' on 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Log.ldf'.
        Msg 3156, Level 16, State 8, Line 1
        File 'AdventureWorks_Log' cannot be restored to 'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\DATA\AdventureWorks_Log.ldf'. Use WITH MOVE to identify a valid location for the file.
        Msg 3119, Level 16, State 1, Line 1
        Problems were identified while planning for the RESTORE statement. Previous messages provide details.
        Msg 3013, Level 16, State 1, Line 1
        RESTORE DATABASE is terminating abnormally.

    Where's the problem? I'm running these instances as the local machine administrator (the SQL Server services are running under the same account).
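    A hedged sketch of what the error is asking for: OS error 32 is a sharing violation, meaning the restore is trying to write over the .mdf/.ldf files that the test instance still has open. Restoring into the production instance's own data directory with WITH MOVE avoids the collision (target paths below are assumptions; confirm the logical names with RESTORE FILELISTONLY first):

        RESTORE FILELISTONLY
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak';

        RESTORE DATABASE [testikanta]
        FROM DISK = N'C:\Program Files\Microsoft SQL Server\MSSQL10.SQL2008TESTI\MSSQL\Backup\AdventureWorks.bak'
        WITH FILE = 1, REPLACE, STATS = 10,
             MOVE N'AdventureWorks_Data' TO N'C:\Program Files\Microsoft SQL Server\MSSQL10.<PROD_INSTANCE>\MSSQL\DATA\testikanta_Data.mdf',
             MOVE N'AdventureWorks_Log'  TO N'C:\Program Files\Microsoft SQL Server\MSSQL10.<PROD_INSTANCE>\MSSQL\DATA\testikanta_Log.ldf';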

    Read the article

  • SQL Server 2008 Hardware Recommendation;

    - by Jay
    Hi, I work for a large Fortune 500 company. We have several SQL 2005 servers running on Dell PowerEdge 2950s with 8 GB RAM and 4 CPUs. Storage is DMX RAID 10. We are in the process of migrating to SQL 2008. We are planning on consolidating multiple SQL 2005 instances into a single SQL 2008 server. If anyone can suggest hardware, I would appreciate it. We have looked at the Dell R710; I was wondering if there are other servers that are good for running SQL 2008. Thanks

    Read the article

  • SQL Server 2008 Remote Access

    - by Timothy Strimple
    I'm having problems connecting to my SQL Server 2008 database from my computer. I have enabled remote connections as described in this answer (http://serverfault.com/questions/7798/how-to-enable-remote-connections-for-sql-server-2008). And I have added the ports listed on the microsoft support page to our Cisco Asa firewall and I'm still unable to connect. The error I'm getting from the SQL Management Studio is: A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: TCP Provider, error: 0 - A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond.) (Microsoft SQL Server, Error: 10060) Once again, I have double and triple checked that remote connections are enabled under the database properties and that TCP is enabled on the configuration page. I've added tcp ports 135, 1433, 1434, 2382, 2383, and 4022 as well as udp 1434 to the firewall. I've also checked to make sure that 1433 is the static port that is set in the tcp section of the database server configuration. The ports should be configured correctly in the firewall since http/https and rdp are all working and the sql server ports are setup the same way. What am I missing here? Any help you could offer would be greatly appreciated. Edit: I can connect to the server via TCP on the internal network. The servers are colocated in a datacenter and I can connect from my production box to my development box and vice versa. To me that indicates a firewall issue, but I've no idea what else to open. I've even tried allowing all tcp ports to that server without success.

    Read the article

  • SQL Performance Problem IA64

    - by Vendoran
    We’ve got a performance problem in production. QA and DEV environments are 2 instances on the same physical server: Windows 2003 Enterprise SP2, 32 GB RAM, 1 quad 3.5 GHz Intel Xeon X5270 (4 cores, x64), SQL 2005 SP3 (9.0.4262), SAN drives. Prod: Windows 2003 Datacenter SP2, 64 GB RAM, 4 dual-core 1.6 GHz Intel Family 80000002, Model 6 Itanium (8 cores, IA64), SQL 2005 SP3 (9.0.4262), SAN drives, Veritas cluster. I am seeing excessive signal wait percentages ( 250%), and Page Reads/s (50) and Page Writes/s (25) are both high occasionally. I did test this query on both QA and PROD, and it has the same execution plan and even the same stats:

        SELECT TOP 40000000 * INTO dbo.tmp_tbl FROM dbo.tbl
        GO

        Scan count 1, logical reads 429564, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    As you can see, it’s just logical reads; however: QA: 0:48, Prod: 2:18. So it seems like a processor-related issue, but I’m not sure where to go next. Any ideas? Thanks, Aaron
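    A minimal sketch of how signal waits (time spent waiting for a scheduler rather than for a resource) are usually measured, using sys.dm_os_wait_stats:

        SELECT CAST(100.0 * SUM(signal_wait_time_ms) / SUM(wait_time_ms) AS DECIMAL(5,2)) AS signal_wait_pct
        FROM   sys.dm_os_wait_stats;
        -- A consistently high percentage points at CPU pressure (runnable tasks queuing
        -- for the schedulers) rather than I/O, which would fit the slower IA64 cores.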

    Read the article

  • Help Installing SQL Server 2008 Express Edition

    - by Jordan S
    OK, I am running Windows 7, 64-bit. I cleaned SQL Server 2005 completely off my system, leaving only SQL Compact Edition. I went here http://www.microsoft.com/downloads/details.aspx?FamilyID=01af61e6-2f63-4291-bcad-fd500f6027ff&displaylang=en and installed SQL Server 2008 Express Edition Service Pack 1. After the install, under my Start menu all I have for SQL configuration tools are the Configuration Manager, Error and Usage Reporting, and the Install Center. I don't have SQL Server Management Studio. So I went here http://www.microsoft.com/downloads/details.aspx?FamilyID=08e52ac2-1d62-45f6-9a4a-4b76a8564a2b&displaylang=en and downloaded SQL Server 2008 Management Studio Express, but when I try to install it I get a warning that says this program has known compatibility issues and that I need to install SQL Server 2008 Service Pack 1. I thought that is what I installed. So I tried to continue running the install, but I then get an error message that says "Invoke or BeginInvoke cannot be called on a Form before it is opened"... How can I check whether Service Pack 1 is installed or not? What should I do? Also, I rebooted my system and checked for Windows Updates, and it says that Windows is up to date.
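    A hedged way to check exactly which build and service pack level the database engine itself is running (this answers the "is SP1 installed?" question directly):

        SELECT SERVERPROPERTY('ProductVersion') AS ProductVersion,  -- e.g. 10.0.2531.0 for 2008 SP1
               SERVERPROPERTY('ProductLevel')   AS ProductLevel,    -- 'RTM', 'SP1', ...
               SERVERPROPERTY('Edition')        AS Edition;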

    Read the article

  • SQL 2008 Replication over Internet

    - by Akash Kava
    We have decided to put our servers in data centers on the east and west coasts of the US, to keep a high level of redundancy. After evaluating a number of replication options, it seems that apart from a VPN there is no other way to do replication for SQL Server. We are investigating VPN, but I have the following questions. 1. Our large DB consists of media (pictures/movies/audio/PDF) etc., so we are not very concerned about security because it is not financially sensitive data. 2. Does SQL 2005 support, or can it be configured to support, replication over the internet? If yes, should we downgrade to 2005? 3. If a SQL 2008 publisher is configured for web synchronization, can we write an automatic program (a C# Windows service) to act as a pull subscriber, run on the subscriber server, and replicate the subscriber database? Or are there any APIs available in SQL Server with which we can write our own program to do replication in a very generic way? (In a nutshell, can we write our own C# Windows-service-based subscriber program?)

    Read the article

  • SQL Server Offsite Backups

    - by Eric Maibach
    We have about 1 TB of SQL Server databases, and these databases generate about 200 GB of data changes each day. Up to this point we have been doing weekly full backups, daily diff backups, and hourly transaction log backups. The full and diff backups are backed up to tape and taken offsite each day. We have been trying to move away from tapes, and our IT department purchased a Barracuda Backup device that backs up data and then sends it offsite using our internet connection. I have been trying to get this to work for our SQL Server backups and have run into a number of problems. I normally like to just use SQL Server to perform backups instead of trying to use an agent, so that is what I tried first. However, the Barracuda device was not able to dedup these files very well, so it ended up being too much data to try to send offsite and to archive. I then tried installing the Barracuda agent and using it to back up the SQL Server databases. However, the problem I am having there is that on some of the database servers I also have files that need to be backed up, and I cannot find a way to create separate backup schedules for the file backups and the SQL Server backups. Barracuda only does full or transaction log backups. So if I want to do hourly transaction log backups, I end up doing a file system backup every hour (which is not good), or if I only schedule the backups to run once a night, I either have to do a full backup every night or only do a transaction backup once a day. None of these scenarios are good options. My question is, how is everyone else getting their large SQL Server database backups offsite? Are you just using tape, or have you found an offsite backup device that works well? Is anybody else using Barracuda to back up their SQL Server databases? If you do, how do you have it set up?

    Read the article

  • SQL Server 2000, large transaction log, almost empty, performance issue?

    - by Mafu Josh
    This is for a company whose database I have been helping troubleshoot. It's SQL Server 2000, and the database is about 120 GB. Something caused the transaction log to grow MUCH larger than normal, to over 100 GB: some hung transaction that didn't commit or roll back for a few days. That has been resolved, and it now stays around 1% full or less, due to its hourly transaction log backups. It IS my understanding that a GROWING transaction log file can cause performance issues. But what I am a little paranoid about is the size. Although mainly empty, MIGHT it be having a negative effect on performance? I haven't found any documentation that suggests this is true. I did find this link: http://www.bigresource.com/MS_SQL-Large-Transaction-Log-dramatically-Slows-down-processing-any-idea-why--2ahzP5wK.html but in this post I can't tell if their log was full or empty, and there aren't any replies to the post. So I am guessing it is not a problem; does anyone know for sure?
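    A hedged sketch of what is usually worth checking in this situation: the main measurable cost of a log that ballooned and was left large tends to be an excessive number of virtual log files (VLFs), which slows recovery and log-heavy operations. Both commands below exist in SQL Server 2000 (the logical log file name is an assumption; check sysfiles for the real one):

        USE YourDb
        DBCC LOGINFO                        -- one row per VLF; thousands of rows is a red flag
        DBCC SHRINKFILE (YourDb_Log, 4096)  -- shrink the log to ~4 GB, then let it grow in sensible increments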

    Read the article

  • FullText SQL Server Clustered Service Move

    - by Steve
    I need to move the I drive from one SAN vendor to another. Microsoft states that you can't move the full-text catalog (http://support.microsoft.com/default.aspx?scid=KB;EN-US;Q304282&) and that the only way would be to remove the cluster and re-install the cluster. Does anyone know of any other way? Would swapping the old I drive for a new I drive work? Currently on the I drive I just have the full-text directory and the SQL Server log directory to move. The latter I know how to handle. Any help would be greatly appreciated. Windows 2003, SQL Server 2005 32-bit; and yes, this is one of the few 2005 instances that we have left. Steve

    Read the article
