Search Results

Search found 24220 results on 969 pages for 'performance tools'.

Page 111/969 | < Previous Page | 107 108 109 110 111 112 113 114 115 116 117 118  | Next Page >

  • OBIEE 11.1.1 - How to Enable Caching in Internet Information Services (IIS) 7.0+

    - by Ahmed A
    Follow these steps to configure static file caching and content expiration if you are using the IIS 7.0+ Web Server with Oracle Business Intelligence. Tip: install IIS URL Rewrite, which enables web administrators to create powerful outbound rules. Following are the steps to set up static file caching for the IIS 7.0+ Web Server:

    1. In the "web.config" file for the OBIEE static files virtual directory (ORACLE_HOME/bifoundation/web/app), add the following outbound rule for caching:

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
            <system.webServer>
                <urlCompression doDynamicCompression="true" />
                <rewrite>
                    <outboundRules>
                        <rule name="header1" preCondition="FilesMatch" patternSyntax="Wildcard">
                            <match serverVariable="RESPONSE_CACHE_CONTROL" pattern="*" />
                            <action type="Rewrite" value="max-age=604800" />
                        </rule>
                        <preConditions>
                            <preCondition name="FilesMatch">
                                <add input="{RESPONSE_CONTENT_TYPE}" pattern="^text/css|^text/x-javascript|^text/javascript|^image/gif|^image/jpeg|^image/png" />
                            </preCondition>
                        </preConditions>
                    </outboundRules>
                </rewrite>
            </system.webServer>
        </configuration>

    2. Restart IIS.
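
    To confirm the rule is taking effect, request one of the static files and check the Cache-Control response header (after the change it should report max-age=604800). Browser developer tools do this fine; the C# snippet below is only an illustrative sketch, and the URL is a placeholder for a static resource on your own OBIEE server.

        using System;
        using System.Net;

        class CheckCacheHeader
        {
            static void Main()
            {
                // Placeholder: point this at a CSS/JS/image file served from the OBIEE virtual directory.
                var url = "http://your-obiee-server/analytics/res/example.css";

                var request = (HttpWebRequest)WebRequest.Create(url);
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    // After the IIS outbound rule is applied this should print max-age=604800.
                    Console.WriteLine("Cache-Control: " + response.Headers["Cache-Control"]);
                }
            }
        }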

    Read the article

  • Silly Developers, VirtualBox Is For Sysadmins!

    - by rickramsey
    That's one of my favorite bumper stickers. (Well, along with the sticker placed upside down on Jeep windows that says "If you can read this, roll me over.") I don't object to the "silly boys" sticker because, in my humble opinion, girls look much cuter in Jeeps than guys do. But as Ginny Henningsen points out, a similar sentiment can be applied to Oracle VM VirtualBox. While writing her other sysadmin-related articles for OTN, Ginny horsed around with VirtualBox so much that she fell in love with it. Not as a developer, but as a sysadmin. Read why she thinks it's such a great sysadmin tool: My New Favorite Sysadmin Tool: Oracle VM VirtualBox Here are some of Ginny's other articles: How I Simplified Oracle Database Installation on Oracle Linux Best Way to Update Software With IPS Best Way to Automate ZFS Snapshots and Track Software Updates Best Way to Update Software in Zones - Rick Ramsey Website Newsletter Facebook Twitter

    Read the article

  • IBM System x3850 X5 TPC-H Benchmark

    - by jchang
    IBM just published a TPC-H SF 1000 result for their x3850 X5, a 4-way Xeon 7560 system featuring a special MAX5 memory expansion board to support 1.5TB memory. In Dec 2010, IBM also published a TPC-H SF 1000 for their Power 780 system, 8-way, quad-core (4 logical processors per physical core). In Feb 2011, Ingres published a TPC-H SF 100 on a 2-way Xeon 5680 for their VectorWise column-store engine (plus enhancements for memory architecture, SIMD and compression). The table below shows TPC-H...(read more)

    Read the article

  • Demantra USA Based Companies and SOX Compliance

    - by user702295
    A USA-based company is assessing Demantra Trade Promotion Management (TPM) capability. It appears that SOX compliance is necessary in their case due to the nature of what TPM does and the need for auditability. Do we have any detail on SOX compliance for Demantra?

    Answer
    ------
    SOX compliance with regard to IT:
    1. Requires auditing of data changes: by whom, what, and when.
       a. Audit trail profiles can be set up for key financial series, and you can view them in audit trail reports.
       b. One piece of functionality we do not have, which is typically asked for, is user login history. We have only active sessions; history is not available.
    2. Segregation of duties.
       a. With respect to TPM, the deduction and financial analyst for settlement can be different from the promotion creator, promotion approver, or sales team.
       b. The budget approver for funds can be different from the funds consumer.
       c. The promotion creator can be different from the promotion approver.
       d. For a US customer you may have to write some custom scripts to capture promotion status changes and produce an external report as part of compliance.
    One additional requirement is transparency of forward commitments entered into with retailers/distributors for trade spending and promotions. Outside of Demantra: Consumer Goods Trade Funds Analytics.

    Read the article

  • Advanced donut caching: using dynamically loaded controls

    - by DigiMortal
    Yesterday I solved a caching problem with a local community portal. I enabled the output cache on SharePoint Server 2007 to make the site faster. Although caching works fine, I needed to do some additional work because there are some controls that show different content to different users. In this example I will show you how to use "donut caching" with user controls – a powerful way to drive some content around the cache.

    About donut caching

    Donut caching means that although you are caching your content, you have some holes in it so you can still affect the output that goes to the user. For example, you can cache the front page of your site and still show a welcome message that contains the correct user name. To get a better idea about donut caching I suggest you read ScottGu's posting Tip/Trick: Implement "Donut Caching" with the ASP.NET 2.0 Output Cache Substitution Feature. Basically, donut caching uses the ASP.NET substitution control. In the output, this control is replaced by the string you return from the static method bound to the substitution control. Again, take a look at the ScottGu blog posting I referred to above.

    Problem

    If you look at Scott's example, its output is pretty plain and simple. All it does is write out the current user name as a string. Here are examples of my login area for anonymous and authenticated users (screenshots not reproduced here). It is clear that outputting the mark-up for these views as a string built up in code is pretty lame. Every little change in the design would end up requiring a new version of the controls library, because some parts of the design would "live" there.

    Solution: using user controls

    I worked out an easy solution to my problem: I used cache substitution and user controls together. I have three user controls:
    - LogInControl – the proxy control that checks which "real" control to load.
    - AnonymousLogInControl – template and logic for the anonymous users' login area.
    - AuthenticatedLogInControl – template and logic for the authenticated users' login area. This is the control we render for each user separately, because it contains the user name and the user's profile fill percentage.
    The anonymous control is not very interesting because it only keeps the mark-up in a separate file. The interesting parts are LogInControl and AuthenticatedLogInControl.

    Creating the proxy control

    The first thing was to create a control that has a substitution area where the "real" control is loaded. This proxy control is also the place where we decide which control to load. The definition of the control is very simple:

        <%@ Control EnableViewState="false" Inherits="MyPortal.Profiles.LogInControl" %>
        <asp:Substitution runat="server" MethodName="ShowLogInBox" />

    The code, however, is a little bit tricky. Based on the current user instance we decide which login control to load. Then we create a Page instance and load our control through it. When the control is loaded we call its DataBind() method, in which we evaluate all fields in the loaded control (this was the best choice, as Load and other events will not be fired). Take a look at the code.

        public static string ShowLogInBox(HttpContext context)
        {
            var user = SPContext.Current.Web.CurrentUser;
            string controlName;

            if (user != null)
                controlName = "AuthenticatedLogInControl.ascx";
            else
                controlName = "AnonymousLogInControl.ascx";

            var path = "~/_controltemplates/" + controlName;
            var output = new StringBuilder(10000);

            using (var page = new Page())
            using (var ctl = page.LoadControl(path))
            using (var writer = new StringWriter(output))
            using (var htmlWriter = new HtmlTextWriter(writer))
            {
                ctl.DataBind();
                ctl.RenderControl(htmlWriter);
            }

            return output.ToString();
        }

    When the control is bound to data, we ask it to render its contents to the StringBuilder. Now we have the output of the control as a string and we can return it from our method. Of course, notice how careful I am with disposing resources. :) The method that returns the contents for the substitution control is a static method with no connection to any control instance, because when the page is read from the cache there are no control instances available.

    Conclusion

    As you saw, it was not very hard to use donut caching with user controls. Instead of building up the controls' mark-up as a string inside the static method bound to the substitution control, we can still use our user controls.

    Read the article

  • Using VS12 to create and manage an Azure-SQL DB (simple tasks)

    - by Konrad Viltersten
    On occasion, I'm in a project where I need to store some information in an external DB. Usually, I create one in Azure and run some scripts that I adapt (the usual create table, create login, etc.). It just struck me that there might (and definitely should) be a tool in VS that allows me to create a project for my DB, pull out some boxes to create a model of a DB schema, execute a script or two on it (possibly virtual or temporary) and then somehow push it up to the cloud. I haven't found such a tool. Is there one, and how do I get to it? NB. I'm not looking for an optimized or well-structured schema (that's what the DB pros are for at a later stage). I'm not a DB guy nor do I aspire to become one (too old, hehe). I'll probably be satisfied with a Q&D approach.

    Read the article

  • Downloads killing internet on my home network

    - by Travis
    I am currently having a problem with my wireless. Whenever I try to download anything it kills the internet for every other application (tabs within the same browser, browsers on other computers on the same network) except the process doing the download. This occurs with everything from downloading updates to ISOs. I am not using a torrent. It happens with upgrade downloads, browser downloads, or anything else. This problem does not occur when I use Windows 7 on the same computer, and the internet comes back for the other computers if I stop the download or shut Ubuntu down. I am using an ASUS G74SX laptop running Ubuntu 12.10 with GNOME 3.6. My wireless card is an Intel Corporation Centrino Wireless-N + WiMAX 6150 (rev 67). Thanks!

    Read the article

  • 62 miles up

    - by fatherjack
    RedGate are known for being a software company with a big personality and having a huge presence in the SQL community. They run the annual Exceptional DBA competition, and held a party at the PASS Summit last night to celebrate this year's winner - Jeff Moden. They have also got a great attitude towards their staff, as demonstrated on their website. Today, just after the PASS Summit keynote speech, they made an announcement that is literally going to give one lucky winner the ride of their life....(read more)

    Read the article

  • Feature pack for SQL Server 2005 SP4 - collection of standalone packages

    - by ssqa.net
    With the release of SQL 2005 SP4, an additional task is essential for DBAs and developers to avoid any compatibility issues with existing code against an SP4 instance. The Feature Pack for SQL Server 2005 SP4 is available to download and contains standalone packages such as SQL Native Client, ADOMD, OLAPDM, etc. As it states, the feature pack is built on the latest versions of add-on and backward-compatibility components for SQL Server 2005. The above link provides an individual file to download for each environment...(read more)

    Read the article

  • Big label generator

    - by jamiet
    Sometimes I write blog posts mainly so that I can find stuff when I need it later. This is such a blog post. Of late I have been writing lots of deployment scripts, and I am a fan of putting big labels into deployment scripts (which, these days, reside in SSDT) so one can easily see what's going on as they execute. Here's such an example from my current project (the screenshots aren't reproduced here): a big text banner that gets displayed when the script is run. In case you care, PM_EDW is the name of one of our databases. I'm almost embarrassed to admit that I spent about half an hour crafting that and a few others for my current project, because a colleague has just alerted me to a website that would have done it for me, and given me lots of options for how to present it too: http://www.patorjk.com/software/taag/#p=testall&f=Banner3&t=PM__EDW Very useful indeed. Nice one! And yes, I'm sure there are a myriad of sites that do the same thing - I'm a latecomer, ok? @Jamiet

    Read the article

  • Which framework would you recommend to use to add "social networking" components to a website?

    - by blueberryfields
    Given a website which already enables users to create and publish content, is there a service or tool which can add the standard suite of social networking components? Specifically, I'm looking to quickly add functionality which allows users to friend each other, vote on/like/rank content on the site, send each other links to parts they find interesting, and chat and send offline messages to each other. There's no specific limitation on the technology used for these components - as long as it's been proven to work and scales without issue. I'd slightly prefer a solution which is offered as a service rather than one that I have to install. Edit: Some additional commenter-requested clarifications - there are no restrictions that the site imposes on user identification or authentication. Feel free to assume that portion of the work is not relevant to the answers.

    Read the article

  • Sprite sheet generator

    - by Andrea Tucci
    I need to generate a sprite sheet with square sprites for a 2D game. How can I generate a sprite sheet where each frame has x = y? The only thing I have to do is "insert" some blank space around the sprites (in cases where y and x differ in the original sprite). Is there any program that I can use to transform "irregular" sprite sheets into "square" sprite sheets? An example of a non-square sprite sheet: http://spriters-resource.com/gameboy_advance/khcom/sheet/1138
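
    If no existing tool fits, padding each frame to a square is only a few lines of code. The sketch below is an illustration I'm adding, not a known tool; it assumes System.Drawing and that the sheet has already been cut into one PNG per frame. It centres every frame on a transparent square canvas sized to the larger of its two dimensions.

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;
        using System.IO;

        class SquareSprites
        {
            static void Main(string[] args)
            {
                // args[0] is a folder containing one PNG per frame.
                foreach (var file in Directory.GetFiles(args[0], "*.png"))
                {
                    using (var frame = new Bitmap(file))
                    {
                        int side = Math.Max(frame.Width, frame.Height);
                        using (var square = new Bitmap(side, side, PixelFormat.Format32bppArgb))
                        using (var g = Graphics.FromImage(square))
                        {
                            // Centre the original frame on a transparent square canvas.
                            g.DrawImage(frame, (side - frame.Width) / 2, (side - frame.Height) / 2,
                                        frame.Width, frame.Height);
                            square.Save(Path.ChangeExtension(file, null) + "_sq.png", ImageFormat.Png);
                        }
                    }
                }
            }
        }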

    Read the article

  • Two Free Training Webcasts Open for Registration

    - by KKline
    We've got two sessions that you need to sign up for right away. The upcoming webcast for Oracle-oriented folks has huge registration numbers. So get in while you still can before we hit the limit of what LiveMeeting can handle. Pain of the Week: SQL Server for the Oracle DBA Webcast: SQL Server for the Oracle DBA Date: Thursday, May 27, 2010 (Just a couple days hence!) Time: 8 a.m. Pacific / 11 a.m. Eastern / 4 p.m. United Kingdom / 5 p.m. Central Europe Duration: 45-60 minutes Cost: FREE In enterprise...(read more)

    Read the article

  • DBCC MEMUSAGE in 2005/8 ?

    - by steveh99999
    I used to like using the undocumented command DBCC MEMUSAGE in SQL 2000 to see which tables were using space in the SQL data cache. In SQL 2005, this command is no longer present. Instead a DMV – sys.dm_os_buffer_descriptors – can be used to display data cache contents, but this doesn't quite give you the same output as DBCC MEMUSAGE. I'm also aware that you can use Quest's Spotlight tool to view a summary of data cache contents. Using this post by Umachandar Jayachandran of Microsoft, I was able to create the following equivalent for SQL 2005/8. I've wrapped Umachandar's original query in a CTE to produce summary information:

        ;WITH memusage_CTE AS
        (
            SELECT bd.database_id, bd.file_id, bd.page_id, bd.page_type,
                   COALESCE(p1.object_id, p2.object_id) AS object_id,
                   COALESCE(p1.index_id, p2.index_id) AS index_id,
                   bd.row_count, bd.free_space_in_bytes,
                   CONVERT(TINYINT, bd.is_modified) AS 'DirtyPage'
            FROM sys.dm_os_buffer_descriptors AS bd
            JOIN sys.allocation_units AS au ON au.allocation_unit_id = bd.allocation_unit_id
            OUTER APPLY (
                SELECT TOP(1) p.object_id, p.index_id
                FROM sys.partitions AS p
                WHERE p.hobt_id = au.container_id AND au.type IN (1, 3)
            ) AS p1
            OUTER APPLY (
                SELECT TOP(1) p.object_id, p.index_id
                FROM sys.partitions AS p
                WHERE p.partition_id = au.container_id AND au.type = 2
            ) AS p2
            WHERE bd.database_id = DB_ID()
              AND bd.page_type IN ('DATA_PAGE', 'INDEX_PAGE')
        )
        SELECT TOP 20 DB_NAME(database_id) AS 'Database',
               OBJECT_NAME(object_id, database_id) AS 'Table Name',
               index_id,
               COUNT(*) AS 'Pages in Cache',
               SUM(dirtyPage) AS 'Dirty Pages'
        FROM memusage_CTE
        GROUP BY database_id, object_id, index_id
        ORDER BY COUNT(*) DESC

    I'm not 100% happy with the results of the above query, however. I've noticed that on a busy BizTalk MessageBox database it will return information on pages that contain GHOST rows – i.e. where data has already been deleted but has yet to be cleaned up by a background process – and I need to investigate further why the cache on this server apparently contains so much GHOST data. For more information on the background ghost cleanup process, see this article by Paul Randal. However, I think the results of this query should still be of interest to a DBA. I have another post to come shortly regarding an example I encountered where this information proved useful to me. I notice that in SQL 2008, sys.dm_os_buffer_descriptors gained an extra column – numa_node – and I'm interested to see how this is populated and how useful it can be on a NUMA-enabled system. I'm assuming that in theory you could use this column to help analyse how your tables are spread across the NUMA-enabled data cache?

    Read the article

  • How do you find all the links to disavow for a Google reconsideration request? [duplicate]

    - by QF_Developer
    This question already has an answer here: How to identify spammy domains giving backlinks to my site (to submit in disavow links in WMT)

    A few months ago I received the following notification on Google Webmaster for a website I look after:

    "Unnatural links to your site—impacts links. Google has detected a pattern of unnatural, artificial, deceptive, or manipulative links pointing to pages on this site. Some links may be outside of the webmaster's control, so for this incident we are taking targeted action on the unnatural links instead of on the site's ranking as a whole. Learn more."

    The question here is, should we actively attempt to disavow these links given that the action is seemingly targeted to just a bunch of keywords? I've downloaded the inbound links sample from Google Webmaster and so far I've been through the disavow and reconsideration request process 6 times, each taking 2-3 weeks, only to be supplied just 2 more links that Google don't approve of. At this rate it will take me the rest of my natural life to clean up all these spammy links! It seems disavowing is futile, as they haven't implemented broad actions against the website as a whole and (from what I can gather) have already nullified the value of those offending links. Under the quoted statement above, however, is a reconsideration request button that seems to imply I should be actively doing something here?

    UPDATE 14th October -- I have since created a small .NET application that you can feed the CSV sample links file into from Google Webmaster. What this tool does is crawl all the links and look for specific linking patterns as per some configurable match strings. I realised that many of the links that Google are taking issue with were created by a rogue SEO firm we hired several years ago. All the links are appended with 1 of 5 different descriptions. The application I built uses some regexes to isolate any link sources with these matching appendages and automatically builds the disavow txt file. In the end it had to come down to an algorithm, as manually disavowing links on this scale would take weeks! I will post the app here once I've cleaned it up.
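
    The poster's application isn't published here, but the core idea (scan the sampled link sources, match the known footprint strings, and emit a disavow file) fits in a short sketch. The snippet below is a hypothetical illustration, not the poster's tool: the CSV file name, footprint strings, and output name are placeholders, the CSV is assumed to contain the linking URL in its first column, and simple substring matching stands in for the poster's regexes.

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;
        using System.Net;

        class DisavowBuilder
        {
            // Hypothetical footprint strings left by the rogue SEO firm; configure to taste.
            static readonly string[] Footprints = { "footprint one", "footprint two" };

            static void Main()
            {
                var badDomains = new SortedSet<string>(StringComparer.OrdinalIgnoreCase);

                // "links.csv" is the sample file downloaded from Google Webmaster (header row skipped).
                foreach (var line in File.ReadLines("links.csv").Skip(1))
                {
                    var url = line.Split(',')[0].Trim();
                    Uri uri;
                    if (!Uri.TryCreate(url, UriKind.Absolute, out uri)) continue;

                    try
                    {
                        using (var client = new WebClient())
                        {
                            var html = client.DownloadString(uri);
                            // Flag the whole domain if the linking page carries any known footprint.
                            if (Footprints.Any(f => html.IndexOf(f, StringComparison.OrdinalIgnoreCase) >= 0))
                                badDomains.Add(uri.Host);
                        }
                    }
                    catch (WebException) { /* dead link - skip it */ }
                }

                // Google's disavow format accepts one "domain:" entry per line.
                File.WriteAllLines("disavow.txt", badDomains.Select(d => "domain:" + d));
            }
        }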

    Read the article

  • What's the deal with URLs for Yandex.Metrica not prepended with "http"?

    - by sharptooth
    The description of Yandex.Metrica explicitly says that URLs like //mc.yandex.ru/metrika/watch.js (no http: in front) that the web site owner has to insert into his pages are not erroneous. So for example this code: <img src="//mc.yandex.ru/watch/00000" style="position:absolute; left:-9999px;" alt="" /> is claimed to be okay. However the code validator thinks such URLs are not okay, and I'd rather make the validator happy so that no one breaks the code later trying to "fix" it. Why are these URLs not prepended with http:? What happens if I actually prepend them with http:?

    Read the article

  • Idea to develop a caching server between IIS and SQL Server

    - by John
    I work on a few high traffic websites that all share the same database and that are all heavily database driven. Our SQL Server is maxed out and, although we have already implemented many changes that have helped, the server is still working too hard. We employ some caching in our website, but the type of queries we use negates using SQL dependency caching. We tried SQL replication to try and kind of load balance, but that didn't prove very successful because the replication process is quite demanding on the servers too, and it needed to be done frequently as it is important that data is up to date. We do use a Varnish web caching server (Linux based) to take a bit of the load off both the web and database server, but as a lot of the sites are customised based on the user we can only do so much. Anyway, the reason for this question... Varnish gave me an idea for a possible application that might help in this situation. Just like Varnish sits between a web browser and the web server and caches responses from the web server, I was wondering about the possibility of creating something that sits between the web server and the database server. Imagine that all SQL queries go through this SQL caching server. If it's a first-time query then it will get recorded, the result requested from the SQL server, and stored locally on the cache server. If it's a repeat request within a set time then the result gets retrieved from the local copy without the query being sent to the SQL server. The caching server could also take advantage of SQL dependency caching notifications. This seems like a good idea in theory. There's still the same amount of data moving back and forth from the web server, but the SQL server is relieved of the work of processing the repeat queries. I wonder how difficult it would be to build a service that sort of emulates requests and responses from SQL Server, whether SQL Server's own caching is doing enough of this already that this wouldn't be a benefit, or even if someone has done this before and I haven't found it? I would welcome any feedback or any references to any relevant projects.
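
    A very rough sketch of the idea, applied at the application layer rather than as a separate proxy server: wrap read-only query execution in a time-boxed cache keyed on the query text. This is my illustration of the concept described above, not an existing product; MemoryCache comes from System.Runtime.Caching (requires a reference to that assembly, .NET 4+), and the connection string, key scheme and expiry are placeholder choices.

        using System;
        using System.Data;
        using System.Data.SqlClient;
        using System.Runtime.Caching;

        static class CachedQuery
        {
            static readonly MemoryCache Cache = MemoryCache.Default;

            // Returns a cached DataTable for the given query if one was stored within
            // the last 'ttl' period; otherwise runs the query and caches the result.
            public static DataTable Execute(string connectionString, string sql, TimeSpan ttl)
            {
                string key = "sql:" + sql;          // include parameter values in the key in real use
                var cached = Cache.Get(key) as DataTable;
                if (cached != null) return cached;

                var table = new DataTable();
                using (var conn = new SqlConnection(connectionString))
                using (var cmd = new SqlCommand(sql, conn))
                using (var adapter = new SqlDataAdapter(cmd))
                {
                    adapter.Fill(table);            // the only round trip to SQL Server
                }

                Cache.Set(key, table, DateTimeOffset.Now.Add(ttl));
                return table;
            }
        }

    Calling Execute for the same statement within the TTL serves the stored result without touching SQL Server, which is essentially the "repeat request within a set time" behaviour described above; the hard parts anticipated in the question (invalidation via SQL dependency notifications, and doing this transparently for unmodified applications) are exactly what this sketch leaves out.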

    Read the article

  • Who Tests the Tester?

    It is scarcely surprising that it can take up to five years to release a new version of SQL Server when one understands the extent of the effort required to test it. When enterprises depend on the reliability of an application or tool such as SQL Backup, the contribution of the tester is of paramount importance. It is an interesting and enjoyable role as well, as Andrew Clarke found out by chatting to testers at Red Gate.

    Read the article

  • T4Toolbox and Visual Studio 2010

    - by Ben Griswold
    I’ve been using the T4Toolbox to help generate my ASP.NET MVC models and scaffolding for a while now.  Another developer tried using my generator project last week and ran into trouble due to a breaking change around the RenderCore() and TransformText() methods in support of VS 2010.  If you upgraded to the latest version of T4Toolbox and receive a build error similar to the following, you are probably in the same boat: GeneratedTextTransformation.[Template].RenderCore(): no suitable method found to override We took the easy way out.  I had him uninstall the latest version of T4Toolbox and install version 9.7.25.1, which my templates were initially coded against.  For now, that worked great, but it sounds like I’ll be doing some rework of the 20+ templates in my project to support Visual Studio 2010 when we migrate later this month.

    Read the article

  • SVN Export or Recursively Remove .SVN Folders

    - by Ben Griswold
    I shared this script with a coworker yesterday. It doesn’t do much; it recursively deletes .svn folders from a source tree.  It comes in handy if you want to share your codebase or you get in a terrible spot with SVN and you just want to start all over. Just blow away all svn artifacts and use your mulligan. It’s true. You can nearly get the same result using the SVN export command which copies your source sans the .svn folders to an alternate location.  The catch is an export only includes those files/folders which exist under version control.  If you want a clean copy of your source – versioned or not – export just might not do. The contents of the .cmd file include the following:

        for /f "tokens=* delims=" %%i in ('dir /s /b /a:d *.svn') do ( rd /s /q "%%i" )

    Just download and drop the unzipped “SVN Cleanup.cmd” file into the root of the project, execute and away you go.  If you search around enough, I know you can find similar scripts and approaches elsewhere, but I’m still uploading my script for completeness and future reference. Download SVN Cleanup
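
    For what it's worth, the same cleanup can be done with a few lines of .NET if you'd rather avoid cmd quoting quirks. This is just a sketch of the equivalent idea, not part of the original script; note that SVN marks some files under .svn as read-only, so the attribute is cleared before deleting.

        using System;
        using System.IO;

        class SvnCleanup
        {
            static void Main(string[] args)
            {
                var root = args.Length > 0 ? args[0] : Directory.GetCurrentDirectory();

                // Find every .svn folder under the root.
                foreach (var dir in Directory.GetDirectories(root, ".svn", SearchOption.AllDirectories))
                {
                    // SVN keeps read-only copies under .svn; clear the flag so Delete succeeds.
                    foreach (var file in Directory.GetFiles(dir, "*", SearchOption.AllDirectories))
                        File.SetAttributes(file, FileAttributes.Normal);

                    Directory.Delete(dir, true);
                    Console.WriteLine("Removed " + dir);
                }
            }
        }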

    Read the article

  • Query Tuning Mastery at PASS Summit 2012: The Video

    - by Adam Machanic
    An especially clever community member was kind enough to reverse-engineer the video stream for me, and came up with a direct link to the PASS TV video stream for my Query Tuning Mastery: The Art and Science of Manhandling Parallelism talk, delivered at the PASS Summit last Thursday. I'm not sure how long this link will work, but I'd like to share it for my readers who were unable to see it in person or live on the stream. Start here. Skip past the keynote, to the 149-minute mark. Enjoy!...(read more)

    Read the article

  • Host your own private git repository via SSH

    - by kerry
    If you are like me you have tons of projects you would like to keep private but track with git, but do not want to pay a git host for a private plan. One of the problems is that most hosts scale their plans by project instead of by user. Luckily, it is easy to host your own git repositories on any el cheapo host that provides SSH access. In the interest of full disclosure, I learned this trick from this blog post. I decided to recreate it in case the source material vanishes for some reason. To set up your host, log in via SSH and run the following commands:

        mkdir ~/git/yourprojectname.git
        cd ~/git/yourprojectname.git
        git --bare init

    Then in your project directory (on your local machine):

        # set up your user info
        git config --global user.name "Firstname Lastname"
        git config --global user.email "you@example.com"

        # initialize the workspace
        git init
        git add .
        git commit -m "initial commit"
        git remote add origin ssh://user@yourhost.com/~/git/yourprojectname.git
        git push origin master

    It's that easy! To keep from entering your password every time, add your public key to the server: generate your key with 'ssh-keygen -t rsa' on your local machine, then add the contents of the generated file to ~/.ssh/authorized_keys on your server.

    Read the article

  • Visual Web Developer 2010 Express, automated testing, and SVN

    - by Mr. Jefferson
    We have an HTML designer who is not a developer but needs to modify .aspx files from our ASP.NET 2.0 projects from time to time in order to get CSS to work properly with them. Currently, this involves giving her the .aspx page by itself, which she opens and edits via Visual Studio 2008 (her computer used to be a developer's). I'm considering getting her set up with Visual Web Developer 2010 Express and Subversion access so she can be more independent, but I wanted to make sure VS Express will work properly with what we do. So: Does VWD 2010 Express support automated tests? If no to the above, what happens when it opens a solution file that includes a test project, modifies it, and saves it? Are there any potential snags with setting up AnkhSVN with VWD 2010 Express?

    Read the article

  • Looking for WAMP Benchmarking (my current WAMP is very slow, so are other solutions)

    - by therobyouknow
    I'm running the ZWAMP WAMP stack on my local development machine. However I have found it to be very slow at serving pages from a Drupal site I have set up. By contrast, my live production site on shared hosting is reasonably quick. For me the goal with a local WAMP stack was to develop offline and send completed work to the live production site. I liked ZWAMP because it didn't require adjustments to User Account Control or other permissions. I've looked at the Drupal Acquia development stack but found it too restrictive: only one site instance/doc root can be installed. I've looked at other DAMP stacks and heard reports of them being slow. My local development machine that I am running the WAMP stack on is a dual-core 2.6GHz hyperthreaded Intel i7 with 4GB RAM and a 7200rpm hard disk, running 64-bit Windows Professional. Surely this is fast enough. So I'm looking for:
    - Causes of the slowness of the WAMP stack and how to improve the speed
    - Benchmark data for various WAMP stacks

    Read the article

< Previous Page | 107 108 109 110 111 112 113 114 115 116 117 118  | Next Page >