Search Results

Search found 34476 results on 1380 pages for 'sql blog'.


  • Windows Azure Virtual Machine Readiness and Capacity Assessment for SQL Server

    - by SQLOS Team
    Windows Azure Virtual Machine Readiness and Capacity Assessment for Windows Server Machines Running SQL Server

    With the release of MAP Toolkit 8.0 Beta, we have added a new scenario to assess your Windows Azure Virtual Machine readiness. The MAP 8.0 Beta performs a comprehensive assessment of Windows Servers running SQL Server to determine your level of readiness to migrate an on-premises physical or virtual machine to Windows Azure Virtual Machines. The MAP Toolkit then offers suggested changes to prepare the machines for migration, such as upgrading the operating system or SQL Server.

    MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback are very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Now, let's walk through the MAP Toolkit tasks for completing the Windows Azure Virtual Machine assessment and capacity planning. The tasks include the following:

    - Perform an inventory
    - View the Windows Azure VM Readiness results and report
    - Collect performance data to determine VM sizing
    - View the Windows Azure Capacity results and report

    Perform an inventory:

    1. To perform an inventory against a single machine or across a complete environment, choose Perform an Inventory to launch the Inventory and Assessment Wizard as shown below:

    2. After the Inventory and Assessment Wizard launches, select either the Windows computers or SQL Server scenario to inventory Windows machines. HINT: If you don't need to completely inventory a machine, just select the SQL Server scenario. Click Next to continue.

    3. On the Discovery Methods page, select how you want to discover computers and then click Next to continue. Description of discovery methods:

    - Use Active Directory Domain Services -- This method allows you to query a domain controller via the Lightweight Directory Access Protocol (LDAP) and select computers in all or specific domains, containers, or OUs. Use this method if all computers and devices are in AD DS.
    - Windows networking protocols -- This method uses the WIN32 LAN Manager application programming interfaces to query the Computer Browser service for computers in workgroups and Windows NT 4.0-based domains. If the computers on the network are not joined to an Active Directory domain, use only the Windows networking protocols option to find computers.
    - System Center Configuration Manager (SCCM) -- This method enables you to inventory computers managed by System Center Configuration Manager (SCCM). You need to provide credentials for the System Center Configuration Manager server in order to inventory the managed computers. When you select this option, the MAP Toolkit will query SCCM for a list of computers and then MAP will connect to these computers.
    - Scan an IP address range -- This method allows you to specify the starting address and ending address of an IP address range. The wizard will then scan all IP addresses in the range and inventory only those computers. Note: This option can perform poorly if many IP addresses aren't being used within the range.
    - Manually enter computer names and credentials -- Use this method if you want to inventory a small number of specific computers.
    - Import computer names from a file -- Using this method, you can create a text file with a list of computer names that will be inventoried.

    4. On the All Computers Credentials page, enter the accounts that have administrator rights to connect to the discovered machines. This does not need to be a domain account, but it needs to be a local administrator. I have entered my domain account that is an administrator on my local machine. Click Next after one or more accounts have been added.

    NOTE: The MAP Toolkit primarily uses Windows Management Instrumentation (WMI) to collect hardware, device, and software information from the remote computers. In order for the MAP Toolkit to successfully connect to and inventory computers in your environment, you have to configure your machines for inventory through WMI and also allow your firewall to enable remote access through WMI. The MAP Toolkit also requires remote registry access for certain assessments. In addition to enabling WMI, you need accounts with administrative privileges to access desktops and servers in your environment.

    5. On the Credentials Order page, select the order in which you want the MAP Toolkit to connect to the machine and SQL Server. Generally just accept the defaults and click Next.

    6. On the Enter Computers Manually page, click Create to bring up a dialog to enter one or more computer names.

    7. On the Summary page, confirm your settings and then click Finish. After clicking Finish, the inventory process will start, as shown below:

    Windows Azure VM Readiness results and report

    After the inventory process has completed, you can review the results under the Database scenario. On the tile, you will see the number of Windows Server machines with SQL Server that were analyzed, the number of machines that are ready to move without changes, and the number of machines that require further changes.

    If you click the Azure VM Readiness tile, you will see additional details and can generate the Windows Azure VM Readiness Report. After the report is generated, select View | Saved Reports and Proposals to view the location of the report. Open the WindowsAzureVMReadiness* report in Excel. On the Windows tab, you can see the results of the assessment. This report has a column for the Operating System and SQL Server assessment and provides a recommendation on how to resolve the issue if a component is not supported.

    Collect performance data

    Launch the Performance Wizard to collect performance information for the Windows Server machines that you would like the MAP Toolkit to suggest a Windows Azure VM size for.

    Windows Azure Capacity results and report

    After the performance metrics are collected, the Azure VM Capacity tile will display the number of Virtual Machine sizes that are suggested for the Windows Server and Linux machines that were analyzed. You can then click on the Azure VM Capacity tile to see the capacity details and generate the Windows Azure VM Capacity Report. Within this report, you can view the performance data that was collected and the suggested Virtual Machine sizes.

    MAP Toolkit 8.0 Beta is available for download here. Your participation and feedback are very important to make the MAP Toolkit work better for you. We encourage you to participate in the beta program and provide your feedback at [email protected] or through one of our surveys.

    Useful references:
    - Windows Azure Homepage
    - How-to guides for Windows Azure Virtual Machines
    - Provisioning a SQL Server Virtual Machine on Windows Azure
    - Windows Azure Pricing

    Peter Saddow
    Senior Program Manager – MAP Toolkit Team

    Read the article

  • SQL Server Backup File Significantly Smaller After Table Recreation

    - by userx
    We run automated weekly backups of our SQL Server. The database in question is configured for simple recovery. We back up using full backups, not differentials. Recently, we had to re-create one of our tables with data in it (making 2 varchar fields a couple of characters longer). This required running a script which created a new table, copied the data over, and then dropped the old one. This worked correctly. Oddly though, our weekly backup files then SHRANK by over 75%! The tables don't have large indexes. All data was copied over correctly (and verified). I've verified that we are doing full and not incremental backups. The new files restore just fine. I can't seem to figure out why the backup files would have shrunk so much. I've also noticed that they get about 10 MB larger every week, even though less than that amount of data is being added. I'm guessing that I'm simply not understanding something. Any insight would be appreciated.
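
    Since a full backup copies only allocated extents, not the free space inside the data file, rebuilding a wide table can legitimately shrink the backup if the old table had many half-empty or fragmented pages. A minimal diagnostic sketch to compare reserved vs. used space (the table name is a placeholder):

        -- Whole-database reserved vs. used space
        EXEC sp_spaceused;

        -- Per-table breakdown for the rebuilt table (replace dbo.YourTable)
        EXEC sp_spaceused N'dbo.YourTable';

        -- Pages allocated vs. actually used, largest objects first
        -- (covers in-row/row-overflow allocation units; LOB units join via hobt_id)
        SELECT o.name, au.type_desc, au.total_pages, au.used_pages
        FROM sys.allocation_units AS au
        JOIN sys.partitions AS p ON au.container_id = p.partition_id
        JOIN sys.objects AS o ON p.object_id = o.object_id
        WHERE o.is_ms_shipped = 0
        ORDER BY au.total_pages DESC;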

    Read the article

  • SQL Server Analysis Services, DNS, AD, Kerberos, Connection Issues

    - by ScaleOvenStove
    Running into a very weird issue. Converting servers to Windows 2008/SQL 2008. Have a server, SERVER_A, brand new, set up with Win2k8/SQL2k8 - works. Have a server, SERVER_B, running Windows 2003/SQL 2005. I want to migrate from SERVER_B to SERVER_A. I have all DBs, cubes, etc. set up on SERVER_A and it is mimicking functionality. Since users are using Excel to connect to SSAS, the connection string has SERVER_B in it. What I want to do is change DNS on the network to point SERVER_B (by name) at the IP of SERVER_A. I have successfully done this with another server, SERVER_C, but I need to do it with SERVER_B. What I found with SERVER_C is that after changing DNS, I had to remove SERVER_C from AD and then it worked. I could connect to SERVER_C (DB), SERVER_C (SSAS default instance) and SERVER_C (SSAS named instance), and it all was actually connecting to SERVER_A. I tried to do the same with SERVER_B, and no luck. Changed DNS, removed it from AD, and it wouldn't connect. Found out that there were some SPNs set up in AD, so removed those and tried again. I then could connect to SERVER_B (DB) and SERVER_B (SSAS named instance), but not SERVER_B (SSAS default instance). I could connect to SERVER_B (SSAS default instance WITH the port #), but I need to be able to connect without the port number. I am at a loss as to why I can't connect to the default instance without a port #. Not sure if it is SPNs in AD, or another AD issue, or something else. Pretty sure it isn't something on the server (because SERVER_C works!). Any insight or suggestions would be greatly helpful!!

    Read the article

  • SQL Server Instance login issue

    - by reallyJim
    I've just brought up a new installation of SQL Server 2008. I installed the default instance as well as one named instance. I'm having a problem connecting to the named instance from anywhere besides the server itself with any user besides 'sa'. I am running in mixed mode. I have a login/user with a known username. Using that user/login, I can connect properly when directly on the server. When I attempt to log in from anywhere else, I receive a "Login failed for user ''" error, with Error 18456. In the log file on the server, I see a reason that doesn't seem to help: "Reason: Could not find a login matching the name provided." However, that user/login DOES exist, as I can use it locally. There are no further details about the error. Where can I start to find something to help me with this? I've tried deleting and recreating the user, as well as just creating a new one from scratch - same result, locally fine, remotely an error. EDIT: Partially resolved. I'm now past the base issue - the clients were trying to connect via the default instance. I don't know why. So, once proper ports were opened in the firewall, and a static port was assigned to the named instance, I can now connect - BUT ONLY if I specify the connection as Server,Port. SQL Browser is apparently not helping/working in this case. I've verified it IS running, and done a stop/restart after my config changes, but no difference yet.
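
    To pin down which instance and security mode a client actually reached, a quick check from the suspect connection can help; a small sketch (xp_readerrorlog is undocumented but widely used - the parameters shown follow the usual convention of log number, log type, search string):

        -- Which instance did this connection land on, and is mixed mode enabled?
        SELECT @@SERVERNAME AS server_name,
               SERVERPROPERTY('InstanceName') AS instance_name,
               SERVERPROPERTY('IsIntegratedSecurityOnly') AS windows_auth_only;  -- 0 = mixed mode

        -- Recent failed logins, including the state code that pinpoints the 18456 cause
        EXEC xp_readerrorlog 0, 1, N'Login failed';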

    Read the article

  • SQL Saturday 47 Phoenix February 2011

    - by billramo
    Today I presented data collection strategies for SQL Server 2008 at Phoenix SQL Saturday 47. I've attached my deck to this post so that you can get the links and references that I presented. To learn more about the Data Collector, check out these links:
    - SQL Server 2008 Data Collector Proof of Concept – How to get started with Data Collector in your organization
    - SQL Server Query Hash Statistics – Replacement for the shipping Query Statistics collection set
    - Writing Reports Against the Management Data Warehouse – This is part 1 of the series on MDW reports. You can see all the articles through this link.
    Be sure to click on the attachment link to download the slides.
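
    For anyone who prefers scripts to the Data Collector UI, the system collection sets live in msdb and can be inspected and started with the syscollector objects; a minimal sketch (assumes the built-in "Query Statistics" set is present, as it is once the Management Data Warehouse is configured):

        USE msdb;
        GO
        -- List the collection sets installed on this instance and their state
        SELECT collection_set_id, name, is_running
        FROM dbo.syscollector_collection_sets;
        GO
        -- Start the built-in Query Statistics collection set by name
        EXEC dbo.sp_syscollector_start_collection_set @name = N'Query Statistics';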

    Read the article

  • How Parallelism Works in SQL Server

    - by Paul White
    You might have noticed that January was a quiet blogging month for me. Part of the reason was that I was working on a series of articles for Simple Talk, examining how parallel query execution really works. The first part is published today at: http://www.simple-talk.com/sql/learn-sql-server/understanding-and-using-parallelism-in-sql-server/ . This introductory piece is not quite as deeply technical as my SQLblog posts tend to be, but I hope there will be enough interesting material to make...(read more)

    Read the article

  • SQL Server Data Tools–BI for Visual Studio 2013 Re-released

    - by Greg Low
    Customers used to complain that the tooling for creating BI projects (Analysis Services MD and Tabular, Reporting Services, and Integration Services) was based on earlier versions of Visual Studio than the ones they were using for their other work in Visual Studio (such as C#, VB, and ASP.NET projects). To alleviate that problem, the shipment of those tools has been decoupled from the shipment of the SQL Server product. In SQL Server 2014, the BI tooling isn't even included in the released version of SQL Server. This allows the team to keep up to date with the releases of Visual Studio. A little while back, I was really pleased to see that the Visual Studio 2013 update for SSDT-BI (SQL Server Data Tools for Business Intelligence) had been released. Unfortunately, it then had to be withdrawn. The good news is that it's back, and you can get the latest version from here: http://www.microsoft.com/en-us/download/details.aspx?id=42313

    Read the article

  • Creating a Simple PHP Blog in Azure

    - by Josh Holmes
    In this post, I want to walk through creating a simple Azure application that will show a few pages, leverage Blob storage, Table storage and generally get you started doing PHP on Azure development. In short, we are going to write a very simple PHP Blog engine for Azure. To be very clear, this is not a pro blog engine and I don’t recommend using it in production. It’s a » read more.

    Read the article

  • The results are in: I'm an okay speaker.

    - by AaronBertrand
    Last weekend I spoke at SQL Saturday #60 in Cleveland, Ohio. I had a great time catching up with some existing friends and colleagues, and met a bunch of new people too. I presented two sessions: What's New in Denali, and T-SQL: Bad Habits to Kick. Yesterday the organizers passed along the scanned-in speaker evaluations (this was the first SQL Saturday event where I found folks to be quite motivated to fill out the forms, since it was how they drew the door prizes). And being the ADD person I am,...(read more)

    Read the article

  • SQL 2008 disk layout on a budget (this is for database mirroring)

    - by user22215
    Guys, I'm rolling out a SQL database server that will be used to back SharePoint 2007. Right now I need some advice on my disk layout. I have two Dell servers that are configured a little differently in terms of storage. The principal server will be using a combination of local storage and SAN storage. I have to work with what I have; the organization is currently all allocated on SAN storage, and it was like pulling teeth to even get what I have to work with now. My disk setup on the principal is as follows:
    - RAID 1 for OS
    - RAID 10 for logs
    - RAID 10 fiber on SAN for high-I/O databases
    - RAID 10 SATA on SAN for content databases
    My question in regards to the principal server is: where should I place tempdb? I thought about placing it on the fiber RAID 10, which will be hosting my high-I/O SharePoint SSP databases; my only other choice is to move it to the RAID 1 OS partition, which I'm sure you guys will be against. Now let's talk about the mirror server: it is not connected to the SAN; it is all local, 6 15k SAS drives. Now my question is the same: do I put tempdb on the OS partition, or do I leave the OS partition alone and use a single RAID 10 for everything? Any help you can provide is much appreciated.
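
    For what it's worth, relocating tempdb later is cheap - it's a metadata change that takes effect at the next service restart - so the decision isn't permanent. A minimal sketch, assuming the default logical file names (paths are placeholders):

        -- Point tempdb's files at the chosen volume; takes effect on restart
        ALTER DATABASE tempdb
        MODIFY FILE (NAME = tempdev, FILENAME = N'T:\TempDB\tempdb.mdf');
        ALTER DATABASE tempdb
        MODIFY FILE (NAME = templog, FILENAME = N'T:\TempDB\templog.ldf');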

    Read the article

  • Writing a SQL Azure Book - Notes

    - by Herve Roggero
    Over the last few months I have had the opportunity to ramp up significantly on SQL Azure.  In fact I will be the co-author of Pro SQL Azure, published by Apress. This is going to be a book on how to best leverage SQL Azure, both from a technology and design standpoint. Talking about design, one of the things I realized is that understanding the key limitations and boundary parameters of Azure in general, and more specifically SQL Azure, will play an important role in making sound design decisions that both meet reasonable performance requirements and minimize the costs associated with running a cloud computing solution.   The book touches on many design considerations including link encryption, pricing model, design patterns, and also some important performance techniques that need to be leveraged when developing in Azure, including Caching, Lazy Properties and more.   Finally I started working with shards and how to implement them in Azure to ensure database scalability beyond the current size limitations. Implementing shards is not simple, and the book will address how to create a shard technology within your code to provide a scale-out mechanism for your SQL Azure databases.   As you can see, there are many factors to consider when designing a SQL Azure database. While we can think of SQL Azure as a cloud version of SQL Server, it is best to look at it as a new platform to make sure you don't make any assumptions on how to best leverage it.
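
    Purely as an illustration of the shard-directory idea (this is not the book's implementation), the routing layer can be as simple as a lookup table mapping key ranges to member databases, which the application consults before opening a connection:

        -- Hypothetical shard directory mapping customer-key ranges to member databases
        CREATE TABLE dbo.ShardMap
        (
            ShardId      int     NOT NULL PRIMARY KEY,
            RangeStart   int     NOT NULL,  -- inclusive
            RangeEnd     int     NOT NULL,  -- exclusive
            ServerName   sysname NOT NULL,  -- e.g. 'myserver.database.windows.net'
            DatabaseName sysname NOT NULL
        );

        -- The application resolves the target database for a given customer key
        DECLARE @CustomerId int = 42;
        SELECT ServerName, DatabaseName
        FROM dbo.ShardMap
        WHERE @CustomerId >= RangeStart AND @CustomerId < RangeEnd;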

    Read the article

  • New Blog Site

    - by Miguel A. Castro
    Effective immediately, my blog is hosted on www.dotnetdude.com. This blog will no longer be updated so please check out my new one and update your RSS feed with this link. Miguel

    Read the article

  • How to detach a SQL Server 2008 database that is not in the database list?

    - by Amir
    I installed SQL Server 2008 on Windows 7. Then I created a database. After 2 days I reinstalled Windows and SQL Server. Now I am trying to attach my database file, but I have encountered the error below. I think the files are like an attached file and I can't attach them. What is the difference between an attached file and a non-attached file? How can I attach this file? Please help me. Error text: TITLE: Microsoft SQL Server Management Studio Attach database failed for Server 'AMIR-PC'. (Microsoft.SqlServer.Smo) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=10.50.1600.1+((KJ_RTM).100402-1540+)&EvtSrc=Microsoft.SqlServer.Management.Smo.ExceptionTemplates.FailedOperationExceptionText&EvtID=Attach+database+Server&LinkId=20476 ------------------------------ ADDITIONAL INFORMATION: An exception occurred while executing a Transact-SQL statement or batch. (Microsoft.SqlServer.ConnectionInfo) Unable to open the physical file "F:\Company.mdf". Operating system error 5: "5(Access is denied.)". (Microsoft SQL Server, Error: 5120) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&ProdVer=10.50.1600&EvtSrc=MSSQLServer&EvtID=5120&LinkId=20476
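
    Operating system error 5 here is a Windows file-permission problem, not a damaged database: after the reinstall, the new SQL Server service account has no rights on F:\Company.mdf. Granting that account full control on the file (or, as a quick test, running Management Studio elevated) and then attaching normally resolves it; a hedged sketch:

        -- After granting the SQL Server service account full control on the file(s):
        CREATE DATABASE Company
        ON (FILENAME = N'F:\Company.mdf')
        FOR ATTACH_REBUILD_LOG;  -- rebuilds a missing .ldf if the database was shut down cleanly
        -- If the original log file is also available, use FOR ATTACH and list both files instead.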

    Read the article

  • To disallow indexing the category and tag listings in a blog

    - by Mert Nuhoglu
    Mark Wilson says that category and tag listings in a blog should be disallowed in order to prevent duplicate content. I understand this. However, I want to put internal links on keywords in the blog posts pointing to the tag and category pages, so that readers can find more relevant content. I wonder whether internal links to category/tag pages that are disallowed in robots.txt still count as useful from the perspective of SEO internal linking.
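
    For reference, the disallow rules under discussion are typically just a few lines of robots.txt; this sketch assumes WordPress-style /category/ and /tag/ permalink paths:

        User-agent: *
        Disallow: /category/
        Disallow: /tag/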

    Read the article

  • Columnstore Case Study #2: Columnstore faster than SSAS Cube at DevCon Security

    - by aspiringgeek
    Preamble

    This is the second in a series of posts documenting big wins encountered using columnstore indexes in SQL Server 2012 & 2014.  Many of these can be found in my big deck along with details such as internals, best practices, caveats, etc.  The purpose of sharing the case studies in this context is to provide an easy-to-consume quick-reference alternative. See also Columnstore Case Study #1: MSIT SONAR Aggregations.

    Why Columnstore?

    As stated previously, if we're looking for a subset of columns from one or a few rows, given the right indexes, SQL Server can do a superlative job of providing an answer. If we're asking a question which by design needs to hit lots of rows - DW, reporting, aggregations, grouping, scans, etc. - SQL Server has never had a good mechanism - until columnstore. Columnstore indexes were introduced in SQL Server 2012. However, they're still largely unknown. Some adoption blockers existed; yet columnstore was nonetheless a game changer for many apps.  In SQL Server 2014, potential blockers have been largely removed & they're going to profoundly change the way we interact with our data.  The purpose of this series is to share the performance benefits of columnstore & to document that columnstore is a compelling reason to upgrade to SQL Server 2014.

    The Customer

    DevCon Security provides home & business security services & has been in business for 135 years. I met DevCon personnel while speaking to the Utah County SQL User Group on 20 February 2012. (Thanks to TJ Belt (b|@tjaybelt) & Ben Miller (b|@DBADuck) for the invitation which serendipitously coincided with the height of ski season.)

    The App: DevCon Security Reporting: Optimized & Ad Hoc Queries

    DevCon users interrogate a SQL Server 2012 Analysis Services cube via SSRS. In addition, the SQL Server 2012 relational back end is the target of ad hoc queries; this DW back end is refreshed nightly during a brief maintenance window via conventional table partition switching.

    SSRS, SSAS, & MDX

    Conventional relational structures were unable to provide adequate performance for user interaction with the SSRS reports. An SSAS solution was implemented, requiring personnel to ramp up technically, including learning enough MDX to satisfy requirements.

    Ad Hoc Queries

    Even though the fact table is relatively small - only 22 million rows & 33GB - the table was a typical DW table in terms of its width: 137 columns, any of which could be the target of ad hoc interrogation. As is common in DW reporting scenarios such as this, it is often nearly impossible to optimize for such queries using conventional indexing. DevCon DBAs & developers attended PASS 2012 & were introduced to the marvels of columnstore in a session presented by Klaus Aschenbrenner (b|@Aschenbrenner).

    The Details

    Classic vs. columnstore before-&-after metrics are impressive:

    Scenario        Conventional Structures          Columnstore    Δ
    SSRS via SSAS   10 - 12 seconds                  1 second       >10x
    Ad Hoc          5 - 7 minutes (300 - 420 sec)    1 - 2 seconds  >100x

    Here are two charts characterizing this data graphically.  The first is a linear representation of Report Duration (in seconds) for Conventional Structures vs. Columnstore Indexes.  As is so often the case when we chart such significant deltas, the linear scale doesn't expose some of the dramatically improved values corresponding to the columnstore metrics.  Just to make it fair, here's the same data represented logarithmically; yet even here the values corresponding to 1 - 2 seconds aren't visible.

    The Wins

    Performance: Even prior to columnstore implementation, at 10 - 12 seconds, canned report performance against the SSAS cube was tolerable. Yet the 1 second performance afterward is clearly better. As significant as that is, imagine the user experience re: ad hoc interrogation. The difference between several minutes vs. one or two seconds is a game changer, literally changing the way users interact with their data - no mental context switching, no wondering when the results will appear, no preoccupation with the spinning mind-numbing hurry-up-&-wait indicators.  As we've commonly found elsewhere, columnstore indexes here provided performance improvements of one, two, or more orders of magnitude.

    Simplified Infrastructure: Because in this case a nonclustered columnstore index on a conventional DW table was faster than an Analysis Services cube, the entire SSAS infrastructure was rendered superfluous & was retired.

    PASS Rocks: Once again, the value of attending PASS is proven out. The trip to Charlotte combined with eager & enquiring minds led directly to this success story. Find out more about the next PASS Summit here, hosted this year in Seattle on November 4 - 7, 2014.

    DevCon BI Team Lead Nathan Allan provided this unsolicited feedback: "What we found was pretty awesome. It has been a game changer for us in terms of the flexibility we can offer people that would like to get to the data in different ways."

    Summary

    For DW, reports, & other BI workloads, columnstore often provides significant performance enhancements relative to conventional indexing.  I have documented here, in the second of a series of reports on columnstore implementations, results from DevCon Security, a live customer production app for which performance increased by factors of 10x to 100x for all report queries, including canned queries, as well as reducing time-to-results for ad hoc queries from 5 - 7 minutes to 1 - 2 seconds. As a result of columnstore performance, the customer retired their SSAS infrastructure. I invite you to consider leveraging columnstore in your own environment. Let me know if you have any questions.
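
    For context, a nonclustered columnstore index like the one that displaced the cube here is a single DDL statement. A minimal sketch against a hypothetical fact table (table and column names are invented for illustration; note that in SQL Server 2012 the base table becomes read-only while a nonclustered columnstore index exists, which is why the nightly partition-switching refresh matters):

        -- Hypothetical wide DW fact table; include the columns users interrogate
        CREATE NONCLUSTERED COLUMNSTORE INDEX ncci_FactSecurityEvents
        ON dbo.FactSecurityEvents
        (
            EventDate, CustomerId, SiteId, EventType, ResponseSeconds
        );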

    Read the article

  • Tagged: 5 things SQL Server should drop

    - by AaronBertrand
    I was tagged by Paul Randal ( blog | twitter ) last night in his latest blog post, entitled, " What 5 things should SQL Server get rid of? " His top 5 pretty much coincide with my top 5, so I'll have to dig a little deeper. In no particular order: Syntax inconsistencies This isn't really a specific thing that Microsoft should get rid of, but rather an attitude and overall approach to SQL Server's long-term development. Every time they add a feature or option to SQL Server, it seems to be implemented...(read more)

    Read the article

  • My blog has now moved to http://dataidol.com/tonyrogerson

    - by tonyrogerson
    I've decided to implement a new blogging infrastructure based on WordPress; the Community Server stuff I'm using has got a bit tired, and WordPress offers a wealth of plug-ins to do pretty much everything I want. So, see http://dataidol.com/tonyrogerson for my new blog; it's not just a move - I have broadened my coverage to encompass all things data. The first couple of posts talk about short-stroking hard disks (something I've presented on a number of times now), and I've also got some Erlang content. Anyway, enjoy! T

    Read the article

  • Fresh install of SQL Server 2008 doesn't install Management Studio. Help!

    - by Jordan S
    OK, I am running Windows 7, 64-bit. I cleaned SQL Server 2005 completely off my system, leaving only SQL Compact Edition. I went here http://www.microsoft.com/downloads/details.aspx?FamilyID=01af61e6-2f63-4291-bcad-fd500f6027ff&displaylang=en and installed SQL Server 2008 Express Edition Service Pack 1. After the install, under my Start menu all I have for SQL configuration tools are the Configuration Manager, Error and Usage Reporting, and the Install Center. I don't have SQL Server Management Studio. So I went here http://www.microsoft.com/downloads/details.aspx?FamilyID=08e52ac2-1d62-45f6-9a4a-4b76a8564a2b&displaylang=en and downloaded SQL Server 2008 Management Studio Express, but when I try to install it I get a warning that says this program has known compatibility issues and that I need to install SQL Server 2008 Service Pack 1. I thought that is what I installed. So, I tried to continue running the install, but I then get an error message that says "Invoke or BeginInvoke cannot be called on a Form before it is opened"... How can I check whether Service Pack 1 is installed or not? What should I do?
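
    To answer the service-pack question directly: the database engine reports its own patch level, so a quick query shows whether SP1 is applied (ProductLevel reads 'RTM' before the service pack and 'SP1' after):

        SELECT SERVERPROPERTY('ProductVersion') AS product_version,  -- e.g. 10.0.2531.0 once 2008 SP1 is applied
               SERVERPROPERTY('ProductLevel')   AS product_level,    -- 'RTM' or 'SP1'
               SERVERPROPERTY('Edition')        AS edition;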

    Read the article

  • TechEd 2010 Day Two – No SQL Server in Sight

    - by BuckWoody
    Today I worked the booth at TechEd 2010, manning the new “Surface” computer, which is just the coolest object on the planet. After that I didn’t attend a single SQL Server session – instead I’ve been frequenting SharePoint, Microsoft Office, and even the High-Performance Computing sessions. The reason is that I get really high quality SQL Server presentations at PASS, SQL Saturdays, and online from Microsoft and other vendors. While there are SQL Server sessions here (after all, I’m giving one of them!) I tend to try and see things that I don’t normally get to learn about. And the cross-pollination between those technologies and mine is fantastic. I’ve even managed to go to an Entity Framework presentation for the developers. I actually have (a little) more respect for that technology – and I’ve modified my presentation to encompass more of that information. So whenever you have the chance, take a walk outside your comfort zone. Even at PASS and SQL Saturdays (and certainly online) you can investigate technologies other than the ones you know best.

    Read the article

  • High Availability blog posts on TechNet UK

    - by Testas
    Back in April I started a blog series on SQL Server High Availability BI. These are currently hosted on TechNet UK:
    Part I: http://blogs.technet.com/b/uktechnet/archive/2012/05/03/guest-post-seven-worlds-will-collide-high-availability-bi-is-not-such-a-distant-sun.aspx
    Part II: http://blogs.technet.com/b/uktechnet/archive/2012/05/30/guest-post-providing-the-foundation-for-a-high-availability-bi-infrastructure.aspx
    Part III: http://blogs.technet.com/b/uktechnet/archive/2012/07/16/guest-post-part-3-highly-available-bi-me-myself-and-i.aspx
    The final part will be released via TechNet in a couple of weeks.
    Thanks,
    Chris

    Read the article

  • SQL 2008 Backups to UNC Share Failing 0xC002F210

    - by Matty Brown
    This problem is driving me NUTS!! We take backups of all of our production databases to a network share, which are then backed up to tape nightly.
    - 8pm Mon-Fri: full backup, followed by a log backup
    - 7am-7pm Mon-Fri, at half-hour intervals: log backup
    Our backups have been working in this manner since we migrated from SQL Server 2000 Standard to 2008, 3 years ago. Recently, the first log backup on Mondays has been failing. Not every time, but almost every time! The rest of the week, we've had no problems. I guess the issue may have something to do with the size of the log backup that's attempted after a weekend of no backups. Now onto the issue I need a fix for... All this week, every full backup on our two biggest databases has failed (both backups < 1GB compressed). There's plenty of disk space on the source and destination servers. I'm guessing the issue has to do with the amount of time it takes to complete the backups of these databases, and/or the size of the backup files required to complete them. Changing the backup destination to local storage works fine (and very, very fast in comparison). From the job history, I can find a few hints as to what the problem could be...
    Code: 0xC002F210 (always this code, but a mix of the following descriptions...)
    - "The operating system returned the error '64(failed to retrieve text for this error. Reason: 1815)' while attempting 'SetEndOfFile' on '\\drserver\SQLBackups\Database.bak'. BACKUP DATABASE is terminating abnormally."
    - "The operating system returned the error '64(failed to retrieve text for this error. Reason: 1815)' while attempting 'FlushFileBuffers' on '\\drserver\SQLBackups\Database.bak'. BACKUP DATABASE is terminating abnormally."
    Please help save my hair and sanity!!
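
    Error 64 commonly maps to "The specified network name is no longer available", which points at the UNC path dropping mid-write rather than at SQL Server itself. While the network side is investigated, two backup options worth trying are compression (less data crossing the wire) and explicit buffer/transfer sizing; a hedged sketch reusing the path from the question:

        -- Compressed, checksummed backup with explicit I/O sizing over the UNC path
        BACKUP DATABASE [Database]
        TO DISK = N'\\drserver\SQLBackups\Database.bak'
        WITH COMPRESSION,          -- requires SQL Server 2008 Enterprise (Standard from 2008 R2)
             CHECKSUM,
             BUFFERCOUNT = 16,     -- fewer outstanding buffers over a flaky link
             MAXTRANSFERSIZE = 1048576,
             STATS = 10;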

    Read the article

  • SQLRally Nordic gets underway

    - by Rob Farley
    PASS is becoming more international, which is great. The SQL community has always been international – it's not as if data is only generated in North America. And while it's easy for organisations to have a North American focus, PASS is taking steps to become international. Regular readers will be aware that I'm one of three advisors to the PASS Board of Directors, with a focus on developing PASS as a more global organisation. With this in mind, it's great that today is Day 1 of SQLRally Nordic, being hosted in Sweden – not only a non-American country, but one that doesn't have English as its major language. The event has been hosted by the amazing Johan Åhlén and Raoul Illyés, two guys who I met earlier this year, but the thing that amazes me is the incredible support that this event has from the SQL community. It's been sold out for a long time, and when you see the list of speakers, it's not surprising. Some of the industry's biggest names from Microsoft have turned up, including Mark Souza (who is also a PASS Director), Thomas Kejser and Tobias Thernström. Business Intelligence experts such as Jen Stirrup, Chris Webb, Peter Myers, Marco Russo and Alberto Ferrari are there, as are some of the most awarded SQL MVPs such as Itzik Ben-Gan, Aaron Bertrand and Kevin Kline. The sponsor list is also brilliant, with names such as HP, FusionIO, SQL Sentry, Quest and SolidQ complemented by Swedish companies like Cornerstone, Informator, B3IT and Addskills. As someone who is interested in PASS becoming global, I'm really excited to see this event happening, and I hope it's a launch-pad into many other international events hosted by the SQL community. If you have the opportunity, thank Johan and Raoul for putting this event on, and the speakers and sponsors for helping support it. The noise from Twitter is that everything is going fantastically well, and everyone involved should be thoroughly congratulated! @rob_farley

    Read the article

  • TechEd 2010 Followup

    - by AllenMWhite
    Last week I presented a couple of sessions at TechEd NA in New Orleans. It was a great experience, even though my demos didn't always work out as planned. Here are the sessions I presented:
    DAT01-INT Administrative Demo-Fest for SQL Server 2008
    SQL Server 2008 provides a wealth of features aimed at the DBA. In this demofest of features we'll see ways to make administering SQL Server easier and faster, such as Centralized Data Management, Performance Data Warehouse, Resource Governor, Backup Compression...(read more)

    Read the article

  • Blog Commenting - Is it Really Useful in SEO?

    The easy answer to the question is: yes, blog commenting is incredibly useful as it pertains to your search engine optimization. The relevant question is how is it useful? Furthermore, how can you make sure that anyone commenting on a blog that is related to a service that you provide gives you the traffic you deserve?

    Read the article

  • Google Adsense threshold estimation

    - by Wladimir Ivanov
    I have an electronic music blog with traffic mainly from North America, Western Europe and Russia. Daily I get about 100 unique visitors with 150-200 pageviews. Should I start AdSense, or do I need to work on increasing the traffic stats first? Can you suggest another appropriate monetizing option for this case? How much time would it take me to hit the $100 AdSense payout threshold with the given traffic statistics? Thanks in advance.

    Read the article
