Search Results

Search found 4908 results on 197 pages for 'ssas 2005'.

Page 29/197

  • Log Shipping breaking daily on SQL Server 2005

    - by IT2
    I am facing a somewhat serious problem with Log Shipping on SQL Server 2005 and I am having trouble correcting it, so I will ask SF's experts for some help. I have a Windows 2003 server (PROD) that ships transaction log backups to two other servers: STAND1, a Windows 2003 server with SQL Server 2005, and STAND2, a Windows 2008 R2 server with SQL Server 2005. The problem is that Log Shipping to STAND2 breaks for ~90 minutes at certain times of the day and then recovers without intervention. The breaks occur at times when the backup file is larger (after reindexing, etc.). I can see the message below logged on the COPY job: *** Error: The specified network name is no longer available. The copy agent was breaking dozens of times a day, only to the STAND2 server, and after the changes below it "only" breaks about twice a day: The frequency of the backup job was changed from 5 minutes to 10 minutes. Instead of backing up the 4 databases to the same folder, the log backups are now saved in separate folders for each database. The backup job no longer runs 24 hours a day, only for the 14 hours a day when people are working on the database. I configured the SQL Server instances on the three servers to limit their memory, leaving more memory for the OS. Now I don't know what to do. Any help will be much appreciated! Thanks!

    Read the article

  • How to prevent Project ASP.NET Configuration and Team Foundation Server from fighting

    - by Brian
    So, I am using Visual Studio 2005 (and Team Explorer 2005) with TFS 2008. I have installed both Visual Studio 2005 SP1 and VS80sp1-KB932544-X86-ENU.exe. I perform the following steps: Select Project-ASP.NET Configuration within Visual Studio 2005. Within Visual Studio 2005, attempt to perform either a check-in or a checkout. The following happens: The local server started by Visual Studio starts closing itself. I suspect it is crashing; the systray icons are not properly disposed of. It then reopens itself. It does this over and over again, maybe once every second or two. The TFS progress meter doesn't even budge; it just sits there. Canceling out of the checkout does not work; it says it is cancelling and does nothing. Any suggestions?

    Read the article

  • Preserving Language across inline Calculated Members in SSAS

    - by Tullo
    Problem: I need to retrieve the language of a given cell from the cube. The cell is defined by code-generated MDX, which can have an arbitrary level of indirection as far as calculated members and sets go (defined in the WITH clause). SSAS appears to ignore the Language of the specified members when you declare a calculated member inline in the query. Example: The cube's default locale is 1033 (en-US). The cube contains a calculated measure called [Net Pounds], which is defined as [Net Amt] with language=2057 (en-GB). The query requests this measure alongside an inline calculated measure which is simply an alias for [Net Pounds]. When used directly, the measure is formatted in the en-GB locale, but when aliased, the measure falls back to using the cube default of en-US. Here's what the query looks like:
    WITH MEMBER [Measures].[Pounds Indirect] AS [Measures].[Net Pounds]
    SELECT { [Measures].[Pounds Indirect], [Measures].[Net Pounds] } ON AXIS(0)
    FROM [Cube]
    CELL PROPERTIES language, value, formatted_value
    The query returns the expected two cells, but only applies the [Net Pounds] locale where the measure is used directly. Is there an option or switch somewhere in SSAS that will allow locale information to be visible in calculated members? I realise that it is possible to declare the inline calculated member in a particular locale, but this would involve extracting the locale from the tuple first, which (since the cube's member is isolated in the application's query schema) is unknown.
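    The poster mentions that declaring the inline calculated member in a particular locale is possible; as a hedged sketch only, this is what that looks like if the generating code already knew the target locale. The value 2057 below is hard-coded purely for illustration, which is precisely the piece of information the application cannot determine up front:
    WITH MEMBER [Measures].[Pounds Indirect] AS
        [Measures].[Net Pounds],
        LANGUAGE = 2057  -- en-GB; hard-coded for illustration only
    SELECT { [Measures].[Pounds Indirect], [Measures].[Net Pounds] } ON AXIS(0)
    FROM [Cube]
    CELL PROPERTIES language, value, formatted_value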

    Read the article

  • Columnstore Case Study #2: Columnstore faster than SSAS Cube at DevCon Security

    - by aspiringgeek
    Preamble
    This is the second in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014. Many of these can be found in my big deck along with details such as internals, best practices, caveats, etc. The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative. See also Columnstore Case Study #1: MSIT SONAR Aggregations.
    Why Columnstore?
    As stated previously, if we’re looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we’re asking a question which by design needs to hit lots of rows—DW, reporting, aggregations, grouping, scans, etc.—SQL Server has never had a good mechanism—until columnstore. Columnstore indexes were introduced in SQL Server 2012. However, they're still largely unknown. Some adoption blockers existed; yet columnstore was nonetheless a game changer for many apps. In SQL Server 2014, potential blockers have been largely removed & they're going to profoundly change the way we interact with our data. The purpose of this series is to share the performance benefits of columnstore & to document why columnstore is a compelling reason to upgrade to SQL Server 2014.
    The Customer
    DevCon Security provides home & business security services & has been in business for 135 years. I met DevCon personnel while speaking to the Utah County SQL User Group on 20 February 2012. (Thanks to TJ Belt (b|@tjaybelt) & Ben Miller (b|@DBADuck) for the invitation, which serendipitously coincided with the height of ski season.)
    The App: DevCon Security Reporting: Optimized & Ad Hoc Queries
    DevCon users interrogate a SQL Server 2012 Analysis Services cube via SSRS. In addition, the SQL Server 2012 relational back end is the target of ad hoc queries; this DW back end is refreshed nightly during a brief maintenance window via conventional table partition switching.
    SSRS, SSAS, & MDX
    Conventional relational structures were unable to provide adequate performance for user interaction with the SSRS reports. An SSAS solution was implemented, requiring personnel to ramp up technically, including learning enough MDX to satisfy requirements.
    Ad Hoc Queries
    Even though the fact table is relatively small—only 22 million rows & 33GB—the table was a typical DW table in terms of its width: 137 columns, any of which could be the target of ad hoc interrogation. As is common in DW reporting scenarios such as this, it is often nearly impossible to optimize for such queries using conventional indexing. DevCon DBAs & developers attended PASS 2012 & were introduced to the marvels of columnstore in a session presented by Klaus Aschenbrenner (b|@Aschenbrenner).
    The Details
    Classic vs. columnstore before-&-after metrics are impressive.
    Scenario         Conventional Structures             Columnstore      Δ
    SSRS via SSAS    10 - 12 seconds                     1 second         >10x
    Ad Hoc           5 - 7 minutes (300 - 420 seconds)   1 - 2 seconds    >100x
    Here are two charts characterizing this data graphically. The first is a linear representation of Report Duration (in seconds) for Conventional Structures vs. Columnstore Indexes. As is so often the case when we chart such significant deltas, the linear scale doesn’t expose some of the dramatically improved values corresponding to the columnstore metrics. Just to make it fair, here’s the same data represented logarithmically; yet even here the values corresponding to 1 - 2 seconds aren’t visible.
    The Wins
    Performance: Even prior to the columnstore implementation, at 10 - 12 seconds canned report performance against the SSAS cube was tolerable. Yet the 1 second performance afterward is clearly better. As significant as that is, imagine the user experience re: ad hoc interrogation. The difference between several minutes vs. one or two seconds is a game changer, literally changing the way users interact with their data—no mental context switching, no wondering when the results will appear, no preoccupation with the spinning mind-numbing hurry-up-&-wait indicators. As we’ve commonly found elsewhere, columnstore indexes here provided performance improvements of one, two, or more orders of magnitude.
    Simplified Infrastructure: Because in this case a nonclustered columnstore index on a conventional DW table was faster than an Analysis Services cube, the entire SSAS infrastructure was rendered superfluous & was retired.
    PASS Rocks: Once again, the value of attending PASS is proven out. The trip to Charlotte combined with eager & enquiring minds led directly to this success story. Find out more about the next PASS Summit here, hosted this year in Seattle on November 4 - 7, 2014. DevCon BI Team Lead Nathan Allan provided this unsolicited feedback: “What we found was pretty awesome. It has been a game changer for us in terms of the flexibility we can offer people that would like to get to the data in different ways.”
    Summary
    For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing. I have documented here, in the second of a series of reports on columnstore implementations, results from DevCon Security, a live customer production app for which performance increased by factors of 10x to 100x for all report queries, including canned queries, and for which ad hoc query times dropped from 5 - 7 minutes to 1 - 2 seconds. As a result of columnstore performance, the customer retired their SSAS infrastructure. I invite you to consider leveraging columnstore in your own environment. Let me know if you have any questions.
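    The post doesn't show the index DDL, but a minimal sketch of the kind of object involved might look like the following; the fact table and column names are invented for illustration. (In SQL Server 2012 a nonclustered columnstore index also makes its table read-only, which fits the nightly partition-switch refresh described above.)
    -- Hypothetical DW fact table; real names are not given in the post.
    CREATE NONCLUSTERED COLUMNSTORE INDEX NCCI_FactSecurityEvent
    ON dbo.FactSecurityEvent
    (
        DateKey,
        CustomerKey,
        SiteKey,
        EventTypeKey,
        DurationSeconds
    );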

    Read the article

  • Microsoft VC++ 2005 SP1 and 2008 SP1 Redistributable Package

    - by ewlung
    Dear developers, I have an application that requires the Microsoft VC++ 2005 (SP1) Redistributable Package. I know that I can just download it and install it. The problem is, on our server, the Microsoft VC++ 2008 (SP1) Redistributable Package is already installed. Now, do I still need to install the 2005 (SP1) version? Or is the 2008 (SP1) version "backward compatible" with 2005 (SP1)? Thank you.

    Read the article

  • Manage network database with SQL Management Studio 2005 express

    - by johntotetwoo
    Hi there, mates :) I'm new to databases and database handling, and I was wondering whether it is possible to manage a network database with SQL Server Management Studio Express 2005. I have two PCs in my home connected via a router. PC 1 has SQL Server 2005 Express and SQL Server Management Studio 2005 Express, while PC 2 has only SQL Server 2005 Express. Is it possible for PC1's Management Studio to manage PC2's server? I'd be glad if you could help me with this. Happy New Year!

    Read the article

  • Query Logging in Analysis Services

    - by MikeD
    On a project I work on, we capture the queries that get executed on our Analysis Services instance (SQL Server 2008 R2) and use the table to help us build aggregations; we also aggregate the query log daily into a data warehouse of operational data so we can track usage of our Analysis databases by users over time. We've learned a couple of helpful things about this logging that I'd like to share here.
    First off, the query log table automatically gets cleaned out by SSAS under a few conditions - schema changes to the Analysis database and even regular data and aggregation processing can delete rows in the table. We like to keep these logs longer than that, so we have a trigger on the table that copies all rows into another table with the same structure. Here is our trigger code:
    CREATE TRIGGER [dbo].[SaveQueryLog] ON [dbo].[OlapQueryLog] AFTER INSERT AS
        INSERT INTO dbo.[OlapQueryLog_History] (MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration)
        SELECT MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration FROM inserted
    Second, the query logging process is "best effort" - if SSAS cannot connect to the database listed in the QueryLogConnectionString in the Analysis Server properties, it just stops logging - it doesn't generate any errors to the client at all, which is a good thing. Once it stops logging, it doesn't retry later - an hour, a day, a week, or even a month later - so long as the service doesn't restart.
    That has burned us a couple of times, when we have made changes to the service account that is used for SSAS and that account doesn't have access to the database we want to log to. The last time this happened, we noticed a while later that no logging was taking place, and I determined that the service account didn't have sufficient permissions, so I made the necessary changes to give that service account access to the logging database. I first tried just the db_datawriter role and that wasn't enough, so I granted the service account membership in the db_owner role. Yes, that's a much bigger set of permissions, but I didn't want to search out the specific permissions at the time. Once I determined that the service account had the appropriate permissions, I wanted to get query logging restarted from SSAS, and I wondered how to do that. Having just used a larger hammer than necessary with the db_owner role membership, I considered just restarting SSAS to get it logging again. However, this was a production server, it was in the middle of business hours, and there were active users connecting to that SSAS instance, so I thought better of it.
    As I considered the options, I remembered that the first time I set up query logging, by putting a valid connection string into the QueryLogConnectionString server property, logging started immediately after I saved the properties. I wondered if I could make some other change to the connection string so that the query logging would start again without restarting the service. I went into the connection string dialog, went to the All page, and looked at the properties I could change that wouldn't affect the actual connection. Aha! The Application Name property would do just nicely - I set it to "SSAS Query Logging" (it was previously blank) and saved the changes to the server properties. And the query logging started up right away.
    If I need to get this running again in the future, I could just make a small change in the Application Name property again, save it, and even change it back again if I wanted to. The other nice side effect of setting the Application Name property is that now I can see (and possibly filter for or filter out) the SQL activity in that database that is related to the query logging process in Profiler.
    To sum up: The SSAS Query Logging process will automatically delete rows from the QueryLog table, so if you want to keep them longer, put a trigger on the table to copy the rows to another table. The SSAS service account requires more than db_datawriter role membership (and probably less than db_owner) in the database specified in the QueryLogConnectionString server property to successfully insert log rows into the QueryLog table. Query logging will stop quietly whenever it encounters an error. Make a change to the QueryLogConnectionString server property (such as the Application Name attribute) to get query logging to restart, and you won't have to restart the service.
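    For reference, a minimal sketch of the companion history table the trigger above writes to; the column types here are assumptions and should be copied from your own OlapQueryLog table rather than taken from this example:
    -- Same columns as OlapQueryLog; the types below are assumptions.
    CREATE TABLE dbo.OlapQueryLog_History
    (
        MSOLAP_Database    nvarchar(255)  NULL,
        MSOLAP_ObjectPath  nvarchar(4000) NULL,
        MSOLAP_User        nvarchar(255)  NULL,
        Dataset            nvarchar(4000) NULL,
        StartTime          datetime       NULL,
        Duration           bigint         NULL
    );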

    Read the article

  • SQL Server Express Uninstall / Enterprise Install Issue

    - by user19049
    I need help installing SQL Server 2005 Enterprise Edition. I really need to remove the current SQL Server 2005 installation, which is no longer on my Add/Remove Programs list yet is still installed on the machine. I tried to uninstall SQL Server Express / Developer Edition, but it only removed it from my Add/Remove Programs list. It returned immediately but did NOT actually remove the product. (I'm now in a bad state.) I tried to install SQL Server 2005 Enterprise and it says I'm blocked because all components are already installed - but they are not. How can I remove all instances of the previous installation and install a clean Enterprise Edition on my server? Thanks

    Read the article

  • sqlserver.exe uses 100% CPU

    - by Markus
    I've created an application (ASP.NET) that once a day syncs an entire database through XML files. The sync first creates a transaction, then clears the database's tables, and then starts to parse the files and insert the new rows into the database. When all the parsing is complete it commits the transaction. This works fine on SQL Server 2005 (on another machine), but on SQL Server 2005 Express the process starts to use 100% CPU after a while, and as I log the inserts being made I can see that it just stops inserting. No exception, it just stops inserting. Anyone got any idea what this may be? I've previously run the synchronization on another SQL 2005 Express instance (also on another computer), and that worked. The server has only 2 GB of RAM; could this be the problem?

    Read the article

  • Cannot open SQL 2005 database in SQL server Management Studio 2008 R2 on Windows 7

    - by Darryl Lawrence
    I have Windows 7 64-bit as my OS with SQL Server 2008 R2 installed. I can connect to SQL 2008 databases, but cannot connect to a SQL 2005 database. I can, however, connect to the 2005 SQL database from a PC that has Windows XP as the OS and also has SQL Server 2008 R2 installed. So it seems that it works fine on XP but not on Windows 7 (32- or 64-bit). Is this an operating system issue? Error message: Cannot connect to OMRSQLV016\PRODSQL002. =================================== A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (.Net SqlClient Data Provider) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=-1&LinkId=20476 Error Number: -1 Severity: 20 State: 0

    Read the article

  • Weird Windows 2003 MSDTC and SQL 2005 issue

    - by seagull surfer
    Scenario: Windows 2003 SP2 x64 Enterprise Edition, SQL 2005 SP2 CU9 x64 Enterprise Edition. After restarting the resource groups on a two-node active-active cluster, 3 SQL 2005 instances start up fine. The 4th one starts up but starts throwing the following error: "Enlist operation failed: 0x8004d00e (XACT_E_NOTRANSACTION). SQL Server could not register with Microsoft Distributed Transaction Coordinator (MS DTC) as a resource manager for this transaction. The transaction may have been stopped by the client or the resource manager." MSDTC is fine, since the other 3 function normally. The only way to "fix" it is to take the 4th instance offline and bring it online again. Is there any way to fix this enlistment without restarting?

    Read the article

  • Uninstall SQL Server 2005 Express after Demoting the DC

    - by Walter Aman
    A Windows Server 2003 SP2 machine hosting a now-orphaned installation of SQL 2005 Workgroup was pressed into service as a DC in a disaster recovery scenario. It has since been demoted. The server also hosts legacy apps for which we lack reinstallation resources; thus our desire to preserve it as close to intact as possible while removing the orphaned roles. All efforts to remove SQL 2005 through Control Panel and ARPWrapper /remove fail with error 29528. Should I abandon this and leave the orphaned SQL installation dormant, or is it reasonable to remove it post-demote?

    Read the article

  • Publish database between two open database connections (Visual Studio 2005)

    - by danielswe
    I have two data connections, one to a local database and one to a remote database. How do I copy the local database schema to the remote one? The reason I don't use "Publish to provider" is that I'm not sure I have all the information necessary to do so. I have the database name, server, username, and password, but not the "web service address" or "web service password". I work in Visual Studio 2005. The server is an MSSQL 2005 server. I have tried using queries, but I only get errors doing so.

    Read the article

  • SSAS: distribution of measures over percentage

    - by Alex
    Hi there, I am running an SSAS cube that stores facts about HTTP requests. There is a column "Time Taken" that stores the milliseconds a particular HTTP request took. Like...
    RequestID   Time Taken
    --------------------------
    1           0
    2           10
    3           20
    4           20
    5           2000
    I want to provide a report through Excel that shows the distribution of those timings by percentage of requests. A statement like "90% of all requests took less than 20 milliseconds". Analysis:
    100% < 2000
     80% < 20
     60% < 20
     40% < 10
     20% <= 0
    I am pretty much lost as to what would be the right approach to designing aggregations, calculations, etc. to offer this analysis through Excel. Any ideas? Thanks, Alex
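    One way to prototype the desired numbers before designing the cube calculation is to compute the cumulative bands relationally against the underlying fact table. A rough T-SQL sketch, where dbo.FactHttpRequest and TimeTakenMs are assumed names standing in for the real fact table and column:
    -- Splits requests into five equal-sized bands by duration and reports
    -- the longest duration within each cumulative 20% band.
    WITH Ranked AS
    (
        SELECT TimeTakenMs,
               NTILE(5) OVER (ORDER BY TimeTakenMs) AS Band
        FROM dbo.FactHttpRequest
    )
    SELECT Band * 20        AS PercentOfRequests,
           MAX(TimeTakenMs) AS MaxMillisecondsInBand
    FROM Ranked
    GROUP BY Band
    ORDER BY Band;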

    Read the article

  • SSAS/SSRS remove parameter from cube report destroys report

    - by Jon
    Group, we built a data cube using SSAS and are now building SSRS reports off of that cube. Not sure if anyone has come across this, but when you build the report using the wizard and include parameters, all looks fine. However, if you go back into the report after the wizard is complete and decide you want to remove one of the parameters you created, it breaks the report, and the only way to get it back is to re-create the whole report. Is there any way to remove or add parameters after the initial build without destroying your report? Thanks in advance for the help! I love this forum!

    Read the article

  • Managing Team Development with SSAS, TFS, & BIDS

    - by Kevin D. White
    I am currently the sole BI developer for a corporate data warehouse and cube. I use SQL Server 2008, SSAS, and SSIS as my basic toolkit. I use Visual Studio + BIDS and TFS for my IDE and source control. I am about to take on multiple projects with an offshore vendor and I am worried about managing change. My major concern is managing merges and changes between me and the offshore team. Merging and managing changes to SQL & XML is bad enough for just one person, but with multiple developers it seems like a nightmare. Any thoughts on how best to structure development, knowing that sometimes there is no way to avoid multiple individuals making changes to the same file?

    Read the article

  • SSAS 2008/Excel 2007 - Can't see cube

    - by Brian
    I have created a cube in SSAS 2008. In BIDS and SSMS I can see it fine. However, I cannot connect through Excel. I have tried both Excel 2003 and Excel 2007. I must support both, and neither works. I can see the database, but the cubes do not show up. I created a dummy cube in the project using the wizard and deployed that to the same database. In Excel 2003, I can see and connect to the dummy cube. Excel 2007 can't even find the server/instance after adding this cube. I have used another computer and received the same results with Excel 2003. Does anyone have any suggestions? Thanks for any help you can give me. -Brian

    Read the article
