Search Results

Search found 93496 results on 3740 pages for 'email server'.

Page 24/3740 | < Previous Page | 20 21 22 23 24 25 26 27 28 29 30 31  | Next Page >

  • An XEvent a Day (18 of 31) – A Look at Backup Internals and How to Track Backup and Restore Throughput (Part 2)

    - by Jonathan Kehayias
    In yesterday’s blog post A Look at Backup Internals and How to Track Backup and Restore Throughput (Part 1), we looked at what happens when we back up a database in SQL Server.  Today, we are going to use the information we captured to perform some analysis of the Backup information in an attempt to find ways to decrease the time it takes to back up a database.  When I began reviewing the data from the Backup in yesterday’s post, I realized that I had made a mistake in the process and left...(read more)
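
    As a rough companion to the Extended Events approach the post describes (this is not the author’s method, just a quick sanity check), backup throughput can also be estimated from the backup history SQL Server keeps in msdb. The sketch below assumes the standard msdb.dbo.backupset columns and simply computes MB/sec per full backup.

        USE msdb
        GO
        -- Estimate throughput for recent full backups (type 'D') from backup history.
        -- backup_size is in bytes; duration is derived from start/finish times.
        SELECT TOP (20)
               bs.database_name,
               bs.backup_start_date,
               bs.backup_finish_date,
               CAST(bs.backup_size / 1048576.0 AS DECIMAL(18, 2)) AS backup_size_mb,
               DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date) AS duration_sec,
               CAST(bs.backup_size / 1048576.0
                    / NULLIF(DATEDIFF(SECOND, bs.backup_start_date, bs.backup_finish_date), 0)
                    AS DECIMAL(18, 2)) AS mb_per_sec
        FROM dbo.backupset AS bs
        WHERE bs.type = 'D'
        ORDER BY bs.backup_finish_date DESC;
        GO

    Comparing mb_per_sec across backups taken with different settings is a simple way to see whether a change actually moved the needle.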

    Read the article

  • An XEvent a Day (3 of 31) – Managing Event Sessions

    - by Jonathan Kehayias
    Yesterday’s post, Querying the Extended Events Metadata, showed how to discover the objects available for use in Extended Events.  In today’s post, we’ll take a look at the DDL commands that are used to create and manage Event Sessions based on the objects available in the system.  Like other objects inside of SQL Server, there are three DDL commands that are used with Extended Events: CREATE EVENT SESSION, ALTER EVENT SESSION, and DROP EVENT SESSION.  The command names are self...(read more)
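
    To make the three DDL commands concrete, here is a minimal sketch of an Event Session lifecycle. The session name, the sql_statement_completed event, and the ring_buffer target are illustrative choices, not taken from the post.

        -- Create a session that captures completed statements into the ring_buffer target
        CREATE EVENT SESSION [DemoSession] ON SERVER
        ADD EVENT sqlserver.sql_statement_completed
        ADD TARGET package0.ring_buffer;
        GO

        -- Start the session (STATE = STOP would stop it again)
        ALTER EVENT SESSION [DemoSession] ON SERVER
        STATE = START;
        GO

        -- Remove the session definition entirely
        DROP EVENT SESSION [DemoSession] ON SERVER;
        GO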

    Read the article

  • SQL SERVER – Online Session on What is New in Denali – Today Online

    - by pinaldave
    I will be presenting today on the subject Inside of Next Generation SQL Server – Denali online at Zeollar.com. These sessions are really fun as they are online, downloadable, and 100% demo oriented. I will be using SQL Server ‘Denali’ CTP 1 to present on the subject of What is New in Denali. The webcast will start at 12:30 PM sharp and will end at 1 PM India Time. It will be 100% demo oriented with no slides. I will be covering the following topics in the session:
    - SQL SERVER – Denali Feature – Zoom Query Editor
    - SQL SERVER – Denali – Improvement in Startup Options
    - SQL SERVER – Denali – Clipboard Ring – CTRL+SHIFT+V
    - SQL SERVER – Denali – Multi-Monitor SSMS Windows
    - SQL SERVER – Denali – Executing Stored Procedure with Result Sets
    - SQL SERVER – Performance Improvement with Executing Stored Procedure with Result Sets in Denali
    - SQL SERVER – ‘Denali’ – A Simple Example of Contained Databases
    - SQL SERVER – Denali – ObjectID in Negative – Local TempTable has Negative ObjectID
    - SQL SERVER – Server Side Paging in SQL Server Denali – A Better Alternative
    - SQL SERVER – Server Side Paging in SQL Server Denali Performance Comparison
    - SQL SERVER – Denali – SEQUENCE is not IDENTITY
    - SQL SERVER – Denali – Introduction to SEQUENCE – Simple Example of SEQUENCE
    If time permits we will cover a few more topics as well. The session will be recorded. My earlier session on the topic of Best Practices Analyzer is also available to watch online here: SQL SERVER – Video – Best Practices Analyzer using Microsoft Baseline Configuration Analyzer. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
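
    Since SEQUENCE appears on the topic list above, here is a minimal sketch of the syntax as it shipped in SQL Server ‘Denali’ (SQL Server 2012); the object name is made up for illustration.

        -- A sequence is a standalone object, unlike IDENTITY which is bound to a column
        CREATE SEQUENCE dbo.DemoSequence
            START WITH 1
            INCREMENT BY 1;
        GO

        -- Pull the next value explicitly wherever it is needed
        SELECT NEXT VALUE FOR dbo.DemoSequence AS NextValue;
        GO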

    Read the article

  • An XEvent a Day (13 of 31) – The system_health Session

    - by Jonathan Kehayias
    Today’s post was originally planned for this coming weekend, but seems I’ve caught whatever bug my kids had over the weekend so I am changing up today’s blog post with one that is easier to cover and shorter.  If you’ve been running some of the queries from the posts in this series, you have no doubt come across an Event Session running on your server with the name of system_health.  In today’s post I’ll go over this session and provide links to references related to it. When Extended Events...(read more)
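
    If you want to confirm the session is present and running on your own instance, a quick check against the Extended Events DMVs looks something like the sketch below (the column choice is illustrative).

        -- List the running system_health session; it is created and started
        -- automatically on SQL Server 2008 and later
        SELECT s.name,
               s.create_time,
               t.target_name
        FROM sys.dm_xe_sessions AS s
        JOIN sys.dm_xe_session_targets AS t
            ON s.address = t.event_session_address
        WHERE s.name = 'system_health';
        GO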

    Read the article

  • An XEvent a Day (15 of 31) – Tracking Ghost Cleanup

    - by Jonathan Kehayias
    If you don’t know anything about Ghost Cleanup, I recommend highly that you go read Paul Randal’s blog posts Inside the Storage Engine: Ghost cleanup in depth , Ghost cleanup redux , and Turning off the ghost cleanup task for a performance gain .  To my knowledge Paul’s posts are the only things that cover Ghost Cleanup at any level online. In this post we’ll look at how you can use Extended Events to track the activity of Ghost Cleanup inside of your SQL Server.  To do this, we’ll first...(read more)

    Read the article

  • An XEvent a Day (16 of 31) – How Many Checkpoints are Issued During a Full Backup?

    - by Jonathan Kehayias
    This wasn’t my intended blog post for today, but last night a question came across #SQLHelp on Twitter from Varun (Twitter): “#sqlhelp how many checkpoints are issued during a full backup?” The question was answered by Robert Davis (Blog|Twitter) as: “Just 1, at the very start. RT @1sql: #sqlhelp how many checkpoints are issued during a full backup?” This seemed like a great thing to test out with Extended Events so I ran through the available Events in SQL Server 2008, and the only Event related...(read more)
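
    One way to test the answer yourself is an Event Session on the checkpoint events. The sketch below is not the exact session from the post; it assumes the sqlserver.checkpoint_begin and sqlserver.checkpoint_end events are exposed in your build, and the database_id value is a placeholder.

        -- Capture checkpoint activity while a BACKUP DATABASE command runs
        CREATE EVENT SESSION [TrackCheckpoints] ON SERVER
        ADD EVENT sqlserver.checkpoint_begin
            (WHERE (sqlserver.database_id = 5)),   -- replace 5 with your database_id
        ADD EVENT sqlserver.checkpoint_end
            (WHERE (sqlserver.database_id = 5))
        ADD TARGET package0.ring_buffer;
        GO

        ALTER EVENT SESSION [TrackCheckpoints] ON SERVER STATE = START;
        GO
        -- Run the full backup, inspect the ring_buffer target, then drop the session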

    Read the article

  • An XEvent a Day (1 of 31) – An Overview of Extended Events

    - by Jonathan Kehayias
    First introduced in SQL Server 2008, Extended Events provided a new mechanism for capturing information about events inside the Database Engine that was both highly performant and highly configurable. Designed from the ground up with performance as a primary focus, Extended Events may seem a bit odd at first look, especially when you compare it to SQL Trace. However, as you begin to work with Extended Events, you will most likely change how you think about tracing problems, and will find the power...(read more)

    Read the article

  • An XEvent a Day (6 of 31) – Targets Week – asynchronous_file_target

    - by Jonathan Kehayias
    Yesterday’s post, Targets Week - ring_buffer , looked at the ring_buffer Target in Extended Events and how it outputs the raw Event data in an XML document.  Today I’m going to go over the details of the other Target in Extended Events that captures raw Event data, the asynchronous_file_target. What is the asynchronous_file_target? The asynchronous_file_target holds the raw format Event data in a proprietary binary file format that persists beyond server restarts and can be provided to another...(read more)
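
    For reference, the files this target produces are read back with sys.fn_xe_file_target_read_file. The paths below are placeholders, and the metadata-file argument applies to SQL Server 2008 (later versions accept NULL there).

        -- Read raw event data back out of the target files as XML
        SELECT CAST(event_data AS XML) AS event_data_xml
        FROM sys.fn_xe_file_target_read_file(
                 'C:\XEvents\DemoSession*.xel',   -- log file path/pattern (placeholder)
                 'C:\XEvents\DemoSession*.xem',   -- metadata file, required on SQL Server 2008
                 NULL,                            -- initial file name
                 NULL);                           -- initial offset
        GO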

    Read the article

  • SQL SERVER – Introduction to PERCENTILE_DISC() – Analytic Functions Introduced in SQL Server 2012

    - by pinaldave
    SQL Server 2012 introduces the new analytic function PERCENTILE_DISC(). Books Online gives the following definition of this function: Computes a specific percentile for sorted values in an entire rowset or within distinct partitions of a rowset in Microsoft SQL Server 2012 Release Candidate 0 (RC 0). For a given percentile value P, PERCENTILE_DISC sorts the values of the expression in the ORDER BY clause and returns the value with the smallest CUME_DIST value (with respect to the same sort specification) that is greater than or equal to P. If that definition is already clear to you, there is no need to read further. If you got lost, here it is in simple words – find the value of the column whose CUME_DIST is equal to or greater than the percentile you pass in. Before you continue reading this blog I strongly suggest you read about the CUME_DIST function here: Introduction to CUME_DIST – Analytic Functions Introduced in SQL Server 2012. Now let’s run the following query:

        USE AdventureWorks
        GO
        SELECT SalesOrderID, OrderQty, ProductID,
            CUME_DIST() OVER (PARTITION BY SalesOrderID ORDER BY ProductID) AS CDist,
            PERCENTILE_DISC(0.5) WITHIN GROUP (ORDER BY ProductID)
                OVER (PARTITION BY SalesOrderID) AS PercentileDisc
        FROM Sales.SalesOrderDetail
        WHERE SalesOrderID IN (43670, 43669, 43667, 43663)
        ORDER BY SalesOrderID DESC
        GO

    The above query gives us the following result: You can see that I have used PERCENTILE_DISC(0.5) in the query, which is similar to finding the median, but not exactly the same. The PERCENTILE_DISC() function takes a percentile as its parameter and returns the value whose CUME_DIST is equal to or greater than the percentile passed in. For example, above we are passing 0.5 into the PERCENTILE_DISC() function. It goes through the result set and identifies which rows have a CUME_DIST value equal to or greater than 0.5. In the first partition it finds two rows whose CUME_DIST equals 0.5, and the ProductID from those rows is the answer of PERCENTILE_DISC(). In the third windowed result set there is only a single row, with a CUME_DIST() value of 1, which is certainly higher than 0.5, making it the answer. To make sure this is completely clear, here is one more example where I am passing 0.6 as the percentile. Now let’s run the following query:

        USE AdventureWorks
        GO
        SELECT SalesOrderID, OrderQty, ProductID,
            CUME_DIST() OVER (PARTITION BY SalesOrderID ORDER BY ProductID) AS CDist,
            PERCENTILE_DISC(0.6) WITHIN GROUP (ORDER BY ProductID)
                OVER (PARTITION BY SalesOrderID) AS PercentileDisc
        FROM Sales.SalesOrderDetail
        WHERE SalesOrderID IN (43670, 43669, 43667, 43663)
        ORDER BY SalesOrderID DESC
        GO

    The above query gives us the following result: The result of PERCENTILE_DISC(0.6) is the ProductID whose CUME_DIST() is equal to or greater than 0.6. This means that for SalesOrderID 43670 the row with a CUME_DIST() of 0.75 is the qualifying row, giving 773 as the ProductID answer. I hope this explanation makes it clearer. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Function, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Intermittent silent failures when receiving email

    - by s t
    I’ve had a company host my website and email for the last six months or so, and I’m having intermittent and silent failures where emails sent to me are not received.
    - The sender never receives a “delivery notification failure” message.
    - This has happened from multiple sending domains (@gmail.com, @microsoft.com).
    - I’ve experienced it first hand when I sent a mail to myself from another account, but I was unable to reproduce the error.
    - It’s very rare (one in every 300 emails or so).
    - The mails are not routed to my junk folder :)
    Obviously I’m worried about the effect this has on my business – but what can I do? I don’t believe I have enough information to diagnose the problem (neither did my hosting company when I presented them with the same information) – should I switch to another host?

    Read the article

  • SQL SERVER – Thinking about Deprecated, Discontinued Features and Breaking Changes while Upgrading to SQL Server 2012 – Guest Post by Nakul Vachhrajani

    - by pinaldave
    Nakul Vachhrajani is a Technical Specialist and systems development professional with iGATE, with more than 7 years of total IT experience. Nakul is an active blogger with BeyondRelational.com (150+ blogs), and can also be found on forums at SQLServerCentral and BeyondRelational.com. Nakul has also been a guest columnist for SQLAuthority.com and SQLServerCentral.com. Nakul presented a webcast on the “Underappreciated Features of Microsoft SQL Server” at the Microsoft Virtual Tech Days Exclusive Webcast series (May 02-06, 2011) on May 06, 2011. He is also the author of a research paper on database upgrade methodologies, which was published in a CSI journal with nationwide circulation. In addition to his passion for SQL Server, Nakul also contributes to academia out of personal interest. He visits various colleges and universities as an external faculty member to judge project activities carried out by the students. Disclaimer: The opinions expressed herein are his own personal opinions and do not represent his employer’s view in any way. Blog | LinkedIn | Twitter | Google+

    Let us hear the thoughts of Nakul in first person –

    Those who have been following my blogs would be aware that I am currently running a series on the database engine features that have been deprecated in Microsoft SQL Server 2012. Based on the response that I have received, I was quite surprised to learn that most of the audience found these to be breaking changes, when in fact they are not! It was then that I decided to write a little piece on how to plan your database upgrade so that it works with the next version of Microsoft SQL Server. Please note that the recommendations made in this article are high-level markers and are intended to help you think through the specific steps that you would need to take to upgrade your database.

    Refer to the documentation – understand the terms
    Change is the only constant in this world. Therefore, whenever customer requirements, newer architectures and designs require software vendors to make a change to keywords, functions, etc., they ensure that they provide their end users sufficient time to migrate over to the new standards before dropping the old ones. Microsoft does that too with its Microsoft SQL Server product. Whenever a new SQL Server release is announced, it comes with lists of the following kinds of features:
    - Breaking changes: changes that would break your currently running applications, scripts or functionalities that are based on an earlier version of Microsoft SQL Server. These are mostly features whose behavior has been changed keeping in mind the newer architectures and designs. Lesson: these are the changes that you need to be most worried about!
    - Discontinued features: features that are no longer available in the associated version of Microsoft SQL Server. These features used to be “deprecated” in the prior release. Lesson: without addressing these, your database would not be compliant with, or may not work with, the version of Microsoft SQL Server under consideration.
    - Deprecated features: features that are still available in the current version of Microsoft SQL Server, but are scheduled for removal in a future version. They may be removed in either the next version or any other future version of Microsoft SQL Server. The features listed for deprecation will compose the list of discontinued features in the next version of SQL Server. Lesson: plan to make the changes required to remove or replace usage of the deprecated features with the latest recommended replacements.
    Once a feature appears on the list, it moves from bottom to top, i.e. it is first marked as “Deprecated” and then “Discontinued”. A “Breaking change” comes later on in the product life cycle. What this means is that if you want to know which features would not work with SQL Server 2012 (and you are currently using SQL Server 2008 R2), you need to refer to the list of breaking changes and discontinued features in SQL Server 2012.

    Use the tools!
    There are a lot of tools and technologies around us, but it is rare that I find teams using these tools religiously and to the best of their potential. Below are the top two tools, from Microsoft, that I use every time I plan a database upgrade.
    - The SQL Server Upgrade Advisor: Ever since SQL Server 2005 was announced, Microsoft has provided a small, very lightweight tool called the SQL Server Upgrade Advisor. The Upgrade Advisor analyzes installed components from earlier versions of SQL Server, and then generates a report that identifies issues to fix either before or after you upgrade. The analysis examines objects that can be accessed, such as scripts, stored procedures, triggers, and trace files. Upgrade Advisor cannot analyze desktop applications or encrypted stored procedures. Refer to the links towards the end of the post to see how to get the Upgrade Advisor.
    - The SQL Server Profiler: Another great tool that you can use is the one most SQL Server developers and administrators use often – the SQL Server Profiler. SQL Server Profiler provides functionality to monitor the “Deprecation” event, which contains: Deprecation announcement – equivalent to features to be deprecated in a future release of SQL Server; and Deprecation final support – equivalent to features to be deprecated in the next release of SQL Server. You can learn more using the links towards the end of the post.

    A basic checklist
    There are a lot of finer points that need to be taken care of when upgrading your database, but it is worthwhile to identify a few basic steps in order to make your database compliant with the next version of SQL Server:
    - Monitor the current application workload (on a test bed) via the Profiler in order to identify usage of features marked as Deprecated. If none appear, you are all set! (This almost never happens.) Note down all the offending queries and feature usages.
    - Run analysis sessions using the SQL Server Upgrade Advisor on your database.
    - Based on the inputs from the analysis report and Profiler trace sessions, incorporate solutions for the breaking changes first, then incorporate solutions for the discontinued features.
    - Revisit and document the upgrade strategy for your deployment scenarios.
    - Revisit the fall-back, i.e. rollback, strategies in case the upgrades fail. Because some programming changes are dependent upon the SQL Server version, this may need to be done in consultation with the development teams.
    - Before any other enhancements are incorporated by the development team, send the database changes out to QA. The QA strategy should involve a comparison between an environment running the old version of SQL Server and one running the new version. Because minimal application changes have gone in (essential changes for SQL Server version compliance only), this comparison is possible.
    - As an ongoing activity, keep incorporating changes recommended by the deprecated features list.
    - As a DBA, update your coding standards to ensure that developers are using ANSI-compliant code – this code will require a change only if the ANSI standard changes.
    Remember this: change management is a continuous process. Keep revisiting the product release notes and incorporate recommended changes to stay prepared for the next release of SQL Server. May the power of SQL Server be with you!

    Links referenced in this post
    - Breaking changes in SQL Server 2012: Link
    - Discontinued features in SQL Server 2012: Link
    - Get the Upgrade Advisor from the Microsoft Download Center at: Link
    - Upgrade Advisor page on MSDN: Link
    - Profiler: Review T-SQL code to identify objects no longer supported by Microsoft: Link
    - Upgrading to SQL Server 2012 by Vinod Kumar: Link
    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Upgrade
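
    Alongside the Profiler “Deprecation” event Nakul describes, a quick way to see whether deprecated features are being hit at all is the Deprecated Features performance counter object; the sketch below assumes a default (non-named) instance, where the object name starts with “SQLServer:”.

        -- Usage counts accumulate since the last SQL Server restart
        SELECT instance_name AS deprecated_feature,
               cntr_value    AS usage_count
        FROM sys.dm_os_performance_counters
        WHERE object_name LIKE '%Deprecated Features%'
          AND cntr_value > 0
        ORDER BY cntr_value DESC;
        GO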

    Read the article

  • Email Service or CRM

    - by MG1
    I am creating a process for a client who runs a chapel. They have people who sign up to receive notifications of a death anniversary. I exported a CSV from the db, imported it into Mailchimp, and I was about to launch a Mailchimp automation based on a date. Now I realize that there are many instances where the same person signed up for multiple death reminders. Mailchimp doesn't allow duplicate email addresses in one list. Is there another service or application that I can use for this?
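
    If the source database happens to be SQL Server (the question doesn't say), a quick way to gauge how widespread the duplicates are before choosing a service is a simple aggregate; the table and column names below are invented for illustration.

        -- Hypothetical sign-up table: one row per reminder requested
        SELECT email_address,
               COUNT(*) AS reminders_requested
        FROM dbo.DeathAnniversaryReminders
        GROUP BY email_address
        HAVING COUNT(*) > 1
        ORDER BY reminders_requested DESC;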

    Read the article

  • SQL SERVER – Iridium I/O – SQL Server Deduplication that Shrinks Databases and Improves Performance

    - by Pinal Dave
    Database performance is a common problem for SQL Server DBAs.  It seems like we spend more time on performance than just about anything else.  In many cases, we use scripts or tools that point out performance bottlenecks, but we don’t have any way to fix them.  For example, what do you do when you need to speed up a query that is already tuned as well as possible?  Or what do you do when you aren’t allowed to make changes to a database supporting a purchased application? Iridium I/O for SQL Server was originally built at Confio Software (makers of Ignite) because DBAs kept asking for a way to actually fix performance instead of just pointing out performance problems. The technology is certified by Microsoft and was so promising that it was spun out into a separate company that is now run by the Confio founder/CEO and technology management team. Iridium uses deduplication technology to both shrink the databases and boost I/O performance.  It is intriguing to see it work.  It will deduplicate a live database as it is running transactions.  You can watch the database get smaller while user queries are running. Iridium is a simple tool to use. After installing the software, you click an “Analyze” button, which will spend a minute or two on each database and estimate both your storage and performance savings.  Next, you click an “Activate” button to turn on Iridium I/O for your selected databases.  You don’t need to reboot the operating system or restart the database during any part of the process. As part of my test, I also wanted to see if there would be an impact on my databases when Iridium was removed.  The ‘revert’ process (bringing the files back to their SQL Server native format) was executed by a simple click of a button, and completed while the databases were available for normal processing. I was impressed, enjoyed playing with the software, and encourage all of you to try it out.  Here is the link to the website to download Iridium for free. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Performance, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

  • An XEvent a Day (25 of 31) – The Twelve Days of Christmas

    - by Jonathan Kehayias
    In the spirit of today’s holiday, a couple of people have been posting SQL-related renditions of holiday songs.  Tim Mitchell posted his 12 Days of SQL Christmas, and Paul Randal and Kimberly Tripp went as far as to record themselves singing SQL carols in their blog post Our Christmas Gift To You: Paul and Kimberly Singing!   For today’s post on Extended Events I give you the 12 days of Christmas, Extended Events style (all of these are based on true facts about Extended Events in SQL Server)....(read more)

    Read the article

  • An XEvent a Day (23 of 31) – How it Works – Multiple Transaction Log Files

    - by Jonathan Kehayias
    While working on yesterday’s blog post The Future – fn_dblog() No More? Tracking Transaction Log Activity in Denali I did a quick Google search to find a specific blog post by Paul Randal to use it as a reference, and in the results returned another blog post titled, Investigating Multiple Transaction Log Files in SQL Server caught my eye so I opened it in a new tab in IE and went about finishing the blog post.  It probably wouldn’t have gotten my attention if it hadn’t been on the SqlServerPedia...(read more)
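
    For context on how a database ends up with more than one transaction log file in the first place, the DDL is a plain ALTER DATABASE. The database, file name, path, and sizes below are placeholders; this is background rather than anything from Jonathan's investigation.

        -- Add a second transaction log file to an existing database (placeholder names)
        ALTER DATABASE [DemoDB]
        ADD LOG FILE
        (
            NAME = DemoDB_log2,
            FILENAME = 'C:\SQLData\DemoDB_log2.ldf',
            SIZE = 512MB,
            FILEGROWTH = 256MB
        );
        GO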

    Read the article

  • How to attach WAR file in email from jenkins

    - by birdy
    We have a case where a developer needs to access the last successfully built WAR file from jenkins. However, they can't access the jenkins server. I'd like to configure jenkins such that on every successful build, jenkins sends the WAR file to this user. I've installed the ext-email plugin and it seems to be working fine. Emails are being received along with the build.log. However, the WAR file isn't being received. The WAR file lives on this path in the server: /var/lib/jenkins/workspace/Ourproject/dist/our.war So I configured it under Post build actions like this: The problem is that emails are sent but the WAR file isn't being attached. Do I need to do something else?

    Read the article

  • SQL Server PowerShell Provider And PowerShell Version 2 Get-Command Issue

    - by BuckWoody
    The other day I blogged that the version of the SQL Server PowerShell provider (sqlps) follows the version of PowerShell. That’s all goodness, but it appears to cause an issue for PowerShell 2.0: the Get-Command PowerShell cmdlet returns an error (Object reference not set to an instance of an object) if you are using PowerShell 2.0 and sqlps. It’s a known bug, and I’m happy to report that it is fixed in SP2 for SQL Server 2008 – something that will be released soon. You can read more about this issue here: http://connect.microsoft.com/SQLServer/feedback/details/484732/sqlps-and-powershell-v2-issues

    Read the article

  • Admin form that generates an email confirmation ends up in SPAM [duplicate]

    - by PJD Creative
    This question already has an answer here: How can I prevent my mail from being classified as spam? (10 answers) I have an admin form that I have set up for a client, which generates an email confirmation from a template I have designed. It works really well, but it ends up in spam some of the time, and this is really frustrating as it is just confirming some details for the customer about what they have just booked – not at all spam – and it is accessed via a page where the admin requires login. Any suggestions as to why this may end up in spam? It does have dollar signs ($$) as it is confirming a price; I'm assuming this is one problem. The rest of it is just general dates and info about the confirmation. Are there any suggestions on how to keep this out of spam? Thanks in advance

    Read the article

  • Cannot log on to POP server on my VPS or receive emails

    - by Andy
    I recently purchased an unmanaged VPS to host my business websites; however, I am struggling to get the email accounts working as I have only ever had experience with shared hosting. The VPS is running CentOS and I have Webmin/Virtualmin installed. I have added my domain, which is lyke.org.uk, and that is working OK. However, when I've added a user and tried to access their email account using Apple Mail, I've been able to establish an SMTP connection but I've not been able to log in using POP. Furthermore, I've set up SquirrelMail and I can send an email to any email address from there, but I haven't received any that I have sent to that email address from other accounts. I would very much appreciate any help or suggestions as I am completely new to VPS and web hosting without Plesk or cPanel.

    Read the article

  • Outlook 2007 - Right Click Email > Move To {Folder Name}

    - by HK1
    I know it seems like an elementary question. What's the simplest and fastest way to move a read/completed email to a different folder in Outlook 2007 (connected to Exchange 2007)? I have a particular user that is challenged by technology. Using keyboard shortcuts is not an option. Dragging and dropping things - forget it. And too many clicks is frustrating to him. He keeps his email inbox completely clean (OCD=True) but he does that by deleting every single email as quickly as he's done with it. If an email can't be resolved in a day or two it almost drives him to insanity. As far as he's concerned, there's only one right thing to do with an email - reply to it and then delete it. He's being asked to save emails unless they are clearly trash. I'm trying to figure out what the simplest method is to move an email to a "Saved Emails" or "Archived" folder (don't confuse "folder" with .PST file, that's irrelevant for this discussion). I envisioned that I could possibly hi-jack every delete and put the email in his Saved folder. But I don't like this option because some emails are truly trash and I don't want him saving those. What I'd really like to do is something like this: Right Click Email in List > Move To {Folder Name} Is there a simple way to do this? Maybe someone has another suggestion on how to handle this situation.

    Read the article

  • MS Server 2008 R2: DNS Redirection on second server for website

    - by Alain
    We have a website on a secondary server, and we want this website to be accessible from the Internet as www.mywebsite.com. With the domain name provider of www.mywebsite.com, we registered our two DNS names, dns1.company.ch and dns2.company.ch, and our static IP address. The system is set up as follows:
    MS Server 2008 R2 N°1 (main server, in AD):
    - IP 192.168.1.100
    - DNS zone dns1.company.ch
    - DNS secondary zone from server N°2: dns2.company.ch
    - DNS secondary zone from server N°2: mywebsite.com (zone transfer is on)
    MS Server 2008 R2 N°2 (secondary server, not in AD):
    - IP 192.168.1.101
    - DNS zone dns2.company.ch
    - DNS zone mywebsite.com with host: 192.168.1.101
    - The website under IIS with bindings www.mywebsite.com:80 and mywebsite.com:80
    All traffic for ports 80 (http) and 53 (dns) from the Internet goes to server N°1. How can we redirect all traffic for www.mywebsite.com from the Internet to our secondary server so that the corresponding website can be displayed on the Internet? Note: Under DNS on server N°1, we also tried to use a conditional forwarder for mywebsite.com (192.168.0.101), but it was working only on the intranet. Thank you, Alain

    Read the article

  • My server is slower than the average user's computer, should I still offload Access queries to SQL Server? [closed]

    - by andrewb
    Possible Duplicate: How do you do Load Testing and Capacity Planning for Databases
    I have a database set up with MS Access 2007 front ends and an SQL Server 2005 back end. At the moment, all the queries are saved in the front end, as I've only recently moved to an SQL Server back end. I'm wondering how many of those queries I should save as stored procedures/views on SQL Server.
    About the system: The number of concurrent users is only a handful, though it could be as high as 25 at one time (very unlikely). The average computer has an Intel i3-2120 CPU running at 3.3 GHz, which gets a PassMark score of 3,987, whilst the server has an Intel Xeon E5335 running at 2.0 GHz, which gets a PassMark score of 2,637. Always an awkward situation when an i3 outperforms a Xeon... though the i3 is from Q1 2011 and the Xeon is from Q2 2009. There is potential for a server upgrade in the future, though it wouldn't come easily.
    I'm inclined to move the queries to the back end, as they are beginning to take noticeable time and I figure that is a better way of doing things. I like the idea of throwing everything at the server, then pushing for a server upgrade. It makes more sense in my mind to be upgrading one server rather than 30 PCs. Or am I being overzealous?
    Why my question isn't a duplicate: It seems that my question has been misinterpreted and labelled a duplicate of quite a different question, one about testing and capacity planning. I'll try to explain how my question is very different from the linked question. The crux of my question is something like "Even though my server is technically slower, is it better to have it doing more of the queries?" There are two ways that people could answer this:
    - I agree the server is going to be slower, but the extra benefits of such and such (like the less Access the better) mean you should move most queries to the server anyway (or: no, it doesn't outweigh the benefit, keep them in Access).
    - Actually the server will be faster because of such and such.
    I'm hoping that people out there can provide some answers like this, and the question in the dupe link doesn't really provide either of these answers. OK sure, I suppose I could do extensive performance testing to compare Access queries running on a local machine to SQL Server queries running on the server, but that sounds like a very hard task (particularly performance testing of Access) compared to someone giving some quick general guidance, and again, my question is looking for a lot more than immediate performance benefit.
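
    To make "moving a saved Access query to the back end" concrete, the usual shape is a view for row-returning queries and a stored procedure for parameterized ones, which the Access front end then uses as a linked object or pass-through query; the object, table, and column names below are invented for illustration.

        -- A saved Access SELECT query becomes a server-side view
        CREATE VIEW dbo.vwOpenOrders
        AS
        SELECT o.OrderID, o.CustomerID, o.OrderDate
        FROM dbo.Orders AS o
        WHERE o.ShippedDate IS NULL;
        GO

        -- A parameterized Access query becomes a stored procedure
        CREATE PROCEDURE dbo.usp_OrdersByCustomer
            @CustomerID INT
        AS
        BEGIN
            SET NOCOUNT ON;
            SELECT o.OrderID, o.OrderDate
            FROM dbo.Orders AS o
            WHERE o.CustomerID = @CustomerID;
        END
        GO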

    Read the article

  • Can I Send An Email From An Email Account That Has Email Forwarding Set Up?

    - by mickburkejnr
    Stick with me on this one, I couldn't think of a correct way to ask the question! I have a domain registered, but it isn't attached to a hosting package or e-mail account. It will be soon, but not at the moment. Right now, I would like to set up an e-mail forwarding scenario. So, I could have the email [email protected] forward to [email protected]. I know this much is possible. What I would like to do, using the Gmail account, is to keep the forwarded e-mail address when replying. So when I reply to emails sent to Gmail through my e-mail forwarding, I would like any reply to say it came from [email protected] instead of [email protected]. Is this possible? If so, how would I go about it?

    Read the article

  • Baseline / Benchmark Physical and virtual server performance

    - by EyeonTech
    I am setting up a new server and there are some options. I want to perform some benchmarks, and I need your help in determining the best tools and, if possible, pre-configured benchmarks designed for SQL servers on Windows Server 2008/2012.
    Step 1. Run a performance monitor on the current live SQL server (Windows Server 2008 virtual machine running on ESXi).
    New server hardware rundown:
    - Intel® Server System R1304BTLSHBN – 1U Rack, LGA1155 (http://ark.intel.com/products/53559/Intel-Server-System-R1304BTLSHBN)
    - Intel Xeon E3-1270V2
    - 2x Intel SSD 330 Series 240GB 2.5in SATA 6Gb/s 25nm
    - 1x WD 2TB WD2002FAEX 2TB 64M SATA3 CAVIAR BLACK
    - 4x 8GB 1333MHz DDR3 ECC CL9 DIMM
    There are several options for configurations, and I want to benchmark some of them and share the results.
    Option 1. Configure the 2x SSDs as RAID 0. Install Windows Server 2008 directly to the 2TB WD Caviar HDD. Store database files on the RAID 0 volume. Benchmark the OS directly on the hardware as an SQL Server. Store SQL backup databases on the 2TB WD Caviar HDD.
    Option 2. Configure the 2x SSDs as RAID 0. Install Windows Server 2012 directly to the 2TB WD Caviar HDD. Install Hyper-V. Install the SQL Server (Server 2008) as a virtual machine. Store the virtual hard disks on the SSDs.
    Option 3. Configure the 2x SSDs as RAID 0. Install VMware ESXi on a partition of the 2TB WD Caviar HDD. Install the SQL Server (Server 2008) as a virtual machine. Store the virtual hard disks on the SSDs.
    I have a few tools in mind from http://technet.microsoft.com/en-us/library/cc768530(v=bts.10).aspx. Any tools with pre-configured tests would be fantastic – specifically, if there are pre-configured perfmon sets available. Any opinions on the setup to gain the best results are welcome. Thanks in advance.

    Read the article
