Search Results

Search found 2182 results on 88 pages for 'grant smith'.


  • How Can I Test My Computer’s Power Supply?

    - by Jason Fitzpatrick
    You’re concerned your computer troubles stem from a failing (or outright fried) power supply unit. How can you test the unit to be sure that it’s the source of your hardware headaches? Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

    The Question: SuperUser reader Sam Hoice has some PSU concerns: My computer powered off the other day on its own, and now when I push the power button, nothing happens. My assumption would naturally be that the power supply is done (possibly well done), but is there any good way to test this before I buy a new one? How can Sam test things without damaging his current computer or other hardware?

    The Answer: SuperUser contributor Grant writes:

      1. Unplug the power supply from all of the components inside the computer (or just remove it from the computer completely). USE CAUTION HERE (though you’d only be shocked with a maximum of 24 volts).
      2. Plug the power supply into the wall.
      3. Find the big 24-ish pin connector that connects to the motherboard.
      4. Connect the GREEN wire with the adjacent BLACK wire.
      5. The power supply’s fan should start up. If it doesn’t, the supply is dead. If the fan starts up, then it could be the motherboard that’s dead. You can use a multimeter to check whether there is power output from the power supply.

    Adrien offers a solution for readers who may not be comfortable jamming wires into their power supply unit’s motherboard connector: most well-stocked geek stores sell a “power-supply tester” that has all the appropriate connectors to plug each part of your PSU into, with spiffy LEDs indicating the status of the various rails, connectors for IDE/SATA/floppy power cables, etc. They run ~$20 US. With a little careful shopping you can even find a highly-rated PSU tester for a measly $6.

    Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.

    Read the article

  • SQL Server Scripts I Use

    - by Bill Graziano
    When I get to a new client I usually find myself using the same set of scripts for maintenance and troubleshooting.  These are all drop in solutions for various maintenance issues. Reindexing.  I use Michelle Ufford’s (SQLFool) re-indexing script.  I like that it has a throttle and only re-indexes when needed.  She also has a variety of other interesting scripts on her blog too. Server Activity.  Adam Machanic is up to version 10 of sp_WhoIsActive.  It’s a great replacement for the sp_who* stored procedures and does so much more.  If a server is acting funny this is one of the first tools I use. Backups.  Tara Kizer has a great little T-SQL script for SQL Server backups.  Wait Stats.  Paul Randal has a great script to display wait stats.  The biggest benefit for me is that his script filters out at least three dozen wait stats that I just don’t care about (for example LAZYWRITER_SLEEP). Update Statistics.  I didn’t find anything I liked so I wrote a simple script to update stats myself.  The big need for me was that it had to run inside a time window and update the oldest statistics first.  Is there a better one? Diagnostic Queries.  Glenn Berry has a huge collection of DMV queries available.  He also just highlighted five of them including two I really like dealing with unused indexes and suggested indexes. Single Use Query Plans.  Kim Tripp has a script that counts the number of single-use query plans.  This should guide you in whether to enable the Optimize for Adhoc Workloads option in SQL Server 2008. Granting Permissions to Developers.  This is one of those scripts I didn’t even know I needed until I needed it.  Kendra Little wrote it to grant a login read-only permission to all the databases.  It also grants view server state and a few other handy permissions.   What else do you use?  What should I add to my list?
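
    Regarding the statistics script mentioned above: for readers curious what such a script might look like, here is a minimal T-SQL sketch (not Bill's actual script) that walks statistics from oldest to newest and stops once a time window has elapsed. The @TimeLimitMinutes parameter and the cursor-based approach are assumptions made purely for illustration.

        -- Sketch: update the most out-of-date statistics first, within a time budget
        DECLARE @TimeLimitMinutes int = 60;          -- assumed maintenance window
        DECLARE @StartTime datetime = GETDATE();
        DECLARE @SchemaName sysname, @TableName sysname, @StatName sysname, @sql nvarchar(max);

        DECLARE stats_cursor CURSOR LOCAL FAST_FORWARD FOR
            SELECT sch.name, obj.name, st.name
            FROM sys.stats AS st
            JOIN sys.objects AS obj ON st.object_id = obj.object_id
            JOIN sys.schemas AS sch ON obj.schema_id = sch.schema_id
            WHERE obj.is_ms_shipped = 0
            ORDER BY STATS_DATE(st.object_id, st.stats_id);  -- oldest statistics first

        OPEN stats_cursor;
        FETCH NEXT FROM stats_cursor INTO @SchemaName, @TableName, @StatName;
        WHILE @@FETCH_STATUS = 0
          AND DATEDIFF(MINUTE, @StartTime, GETDATE()) < @TimeLimitMinutes
        BEGIN
            SET @sql = N'UPDATE STATISTICS ' + QUOTENAME(@SchemaName) + N'.'
                     + QUOTENAME(@TableName) + N' ' + QUOTENAME(@StatName) + N';';
            EXEC sys.sp_executesql @sql;
            FETCH NEXT FROM stats_cursor INTO @SchemaName, @TableName, @StatName;
        END
        CLOSE stats_cursor;
        DEALLOCATE stats_cursor;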

    Read the article

  • Silverlight Cream for April 04, 2010 -- #830

    - by Dave Campbell
    In this Issue: Michael Washington, Hassan, David Anson, Jeff Wilcox, UK Application Development Consulting, Davide Zordan, Victor Gaudioso, Anoop Madhusudanan, Phil Middlemiss, and Laurent Bugnion. Shoutouts: Josh Smith has a good-read post up: Design-time data is still data Shawn Hargreaves reported his MIX demo released From SilverlightCream.com: Silverlight MVVM: Enabling Design-Time Data in Expression Blend When Using Web Services Michael Washington has a tutorial up on MVVM and using a web service to get design-time data that works in Blend also... lots of information and screenshots. WP7 Transition Animation Hassan has a new WP7 tutorial up that demonstrates playing media and adding transition animation between pages. Tip: For a truly read-only custom DependencyProperty in Silverlight, use a read-only CLR property instead David Anson's latest tip is in response to comments on his previous post and details one by Dr. WPF who points out that a read-only DependencyProperty doesn't actually need to be a DependencyProperty as long as the class implements INotifyPropertyChanged. Template parts and custom controls (quick tip) Jeff Wilcox has posted a set of tips and recommendations to use when developing control development in Silverlight ... this is a post to bookmark. Flexible Data Template Support in Silverlight The UK Application Development Consulting details a 'problem' in Silverlight that doesn't exist in WPF and that is data templates that vary by type... and discusses a way around it. Multi-Touch enabling Silverlight Simon using Blend behaviors and the Surface sample for Silverlight Davide Zordan brought Multi-Touch to the Silverlight Simon game on CodePlex using Blend Behaviors. New Video Tutorial: How to Use a Behavior to Fire Methods from Objects in Styles Victor Gaudioso has a video tutorial up responding to a question from a developer. He demonstrates development of a Behavior that can be attached to objects in or out of Styles that allows you to specify what Method they need to fire. Creating a Silverlight Client for @shanselman ’s Nerd Dinner, using oData and Bing Maps Anoop Madhusudanan took Scott Hanselman's post on an OData API for StackOverflow, and has created a Silverlight client for Nerd Dinner, including BingMaps. A Chrome and Glass Theme - Part 2 Phil Middlemiss has the next part of his Chrome and Glass Theme up. In this one he creates a very nice chrome-look button with visual state changes. MVVM Light Toolkit V3 SP1 for Windows Phone 7 Laurent Bugnion has released a new version of MVVM Light for WP7. Included is an installation manual and information about what was changed. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • SQL SERVER – guest User and MSDB Database – Enable guest User on MSDB Database

    - by pinaldave
    I have written a few articles recently on the subject of the guest account. Here’s a quick list of these articles: SQL SERVER – Disable Guest Account – Serious Security Issue; SQL SERVER – Force Removing User from Database – Fix: Error: Could not drop login ‘test’ as the user is currently logged in; SQL SERVER – Detecting guest User Permissions – guest User Access Status. One piece of advice I gave in all three blog posts was: disable the guest user in user-created databases. Additionally, I mentioned that one should leave the guest user enabled in the MSDB database. I got many questions asking if there is any specific reason why it should be kept enabled, questions like, “What is the reason that the MSDB database needs the guest user?” Honestly, I did not know that the concept of the guest user would create so much interest among readers. So now let’s turn this blog post into question-and-answer format.
    Q: What will happen if the guest user is disabled in the MSDB database?
    A: Lots of bad things will happen, starting with Error 916 – Logins can connect to this instance of SQL Server but they do not have specific permissions in a database to receive the permissions of the guest user.
    Q: How can I determine if the guest user is enabled or disabled for any specific database?
    A: There are many ways to do this. Make sure that you run each of these methods in the context of the database in question. For example, for the msdb database you can run the following code: USE msdb; SELECT name, permission_name, state_desc FROM sys.database_principals dp INNER JOIN sys.database_permissions dpm ON dp.principal_id = dpm.grantee_principal_id WHERE name = 'guest' AND permission_name = 'CONNECT' There are many other methods to detect the guest user status. Read them here: Detecting guest User Permissions – guest User Access Status.
    Q: What is the default status of the guest user account in a database?
    A: Enabled in master, tempdb, and msdb. Disabled in the model database.
    Q: Why is the default status of the guest user disabled in the model database?
    A: It is not recommended to enable the guest user in user databases, as it can introduce a serious security threat. It can seriously damage the database if configured incorrectly. Read more here: Disable Guest Account – Serious Security Issue.
    Q: How do I disable the guest user?
    A: REVOKE CONNECT FROM guest
    Q: How do I enable the guest user?
    A: GRANT CONNECT TO guest
    Did I miss any critical question in the list? Please leave your question as a comment and I will add it to this list. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL, Technology
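
    As a follow-up to the detection question above, the same check can be run against every database at once. The sketch below is merely illustrative and relies on the undocumented sp_MSforeachdb procedure, so treat it as a convenience rather than a supported approach:

        -- List each database where the guest user currently holds CONNECT permission
        EXEC sp_MSforeachdb '
        USE [?];
        SELECT DB_NAME() AS database_name,
               dp.name   AS user_name,
               dpm.permission_name,
               dpm.state_desc
        FROM sys.database_principals AS dp
        INNER JOIN sys.database_permissions AS dpm
            ON dp.principal_id = dpm.grantee_principal_id
        WHERE dp.name = ''guest''
          AND dpm.permission_name = ''CONNECT'';';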

    Read the article

  • ASP.NET 4 Hosting :: How to Debug Your ASP.NET Applications

    - by mbridge
    Remote debugging of a process is a privilege, and like all privileges, it must be granted to a user or group of users before its operation is allowed. The Microsoft .NET Framework and Microsoft Visual Studio .NET provide two mechanisms to enable remote debugging support: The Debugger Users group and the "Debug programs" user right. Debugger Users Group When you debug a remote .NET Framework-based application, the Debugger on your computer must communicate with the remote computer using DCOM. The remote server must grant the Debugger access, and it does this by granting access to all members of the Debugger Users group. Therefore, you must ensure that you are a member of the Debugger Users group on that computer. This is a local security group, meaning that it is visible to only the computer where it exists. To add yourself or a group to the Debugger Users group, follow these steps: 1. Right-click the My Computer icon on the Desktop and choose Manage from the context menu. 2. Browse to the Groups node, which is found under the Local Users and Groups node of System Tools. 3. In the right pane, double-click the Debugger Users group. 4. Add your user account or a group account of which you are a member. Debug Programs User Right To debug programs that run under an account that is different from your account, you must be granted the "Debug programs" user right on the computer where the program runs. By default, only the Administrators group is granted this user right. You can check this by opening Local Security Policy on the computer. To do so, follow these steps: 1. Click Start, Administrative Tools, and then Local Security Policy. 2. Browse to the User Rights Assignment node under the Local Policies node. 3. In the right pane, double-click the "Debug programs" user right. 4. Add your user account or a group account of which you are a member.

    Read the article

  • What was missing from the Content Strategy Forum?

    - by Roger Hart
    In April, Paris hosted the first ever Content Strategy Forum. The event's website proudly proclaims: 170 attendees, 18 nationalities, 17 speakers, 1 volcano... Content Strategy Forum 2010 rocked the world! The volcano was in Iceland, and the closest we came to rocking the world was a cursory mention in the Huffington Post, but I'll grant the event was awesome. One thing missing from that list, however, is "94 companies" (Plus a couple of universities and freelancers, and what have you). A glance through the attendees directory reveals a fairly wide organisational turnout - 24 students from two Parisian universities, countless design and marketing agencies, a series of tech firms, small and large. Two delegates from IBM, two from ARM, an appearance from RIM, Skype, and Facebook; twelve from the various bits of eBay. Oh, and, err, nobody from Google, Microsoft, Yahoo, Amazon, Play, Twitter, LinkedIn, Craigslist, the BBC, no banks I noticed, and I didn't spot a newspaper. You get the idea. Facebook notwithstanding, you have to scroll through a few pages to Alexa rankings to find company names from the attendee list. I find this interesting, and I'm not wholly sure what to make of it. Of the large, web-centric, content-rich organizations conspicuously absent, at least one of two things is true: They didn't know about the event They didn't care about the event Maybe these guys all have content strategy completely sorted, and it's an utterly naturalised part of their business process. Maybe nobody at say, Apple or Play.com ever publishes a single piece of content that isn't neatly tailored to their (clearly defined, of course) user and business goals. Wouldn't that be lovely? The thing is, in that rosy and beatific world, there's still a case for those folks to join the community. There are bound to be other perspectives, and things to learn. You see, the other thing achingly conspicuous by its absence was case studies. In her keynote address, Kristina Halvorson made the point that what content strategy really needs is some big, loud success stories. A point I'd firmly second as a content strategist working within an organisation. Sarah Cancilla's presentation on content strategy at Facebook included some very neat, specific examples, and was richer for it. It didn't hurt that the example was Facebook - you're getting impressively big numbers off base. What about the other big boys? Is there anybody out there with a perspective? Do we all just look very silly to you, fretting away over text and images and users and purposes? Is content validation and maintenance so accustomed a part of your business that calling attention to it is like sniffing the air and saying "Hmm, a lot of nitrogen about today."? And if it is, do you have any wisdom to share?

    Read the article

  • SQL SERVER – Quiz and Video – Introduction to SQL Server Security

    - by pinaldave
    This blog post is inspired by Beginning SQL Joes 2 Pros: The SQL Hands-On Guide for Beginners – SQL Exam Prep Series 70-433 – Volume 1. [Amazon] | [Flipkart] | [Kindle] | [IndiaPlaza] This is a follow-up to my earlier blog post on the same subject – SQL SERVER – Introduction to SQL Server Security – A Primer. In that article we discussed the basic terminology of security. The article further covers the following important security concepts: Granting Permissions, Denying Permissions, and Revoking Permissions. These three are the most important concepts related to security and SQL Server. There are many more things one has to learn, but without the beginner fundamentals one can’t learn the advanced concepts. Let us have a small quiz and check how many of you get the fundamentals right.
    Quiz
    1) If you granted Phil control to the server, but denied his ability to create databases, what would his effective permissions be? Phil can do everything. Phil can do nothing. Phil can do everything except create databases.
    2) If you granted Phil control to the server and revoked his ability to create databases, what would his effective permissions be? Phil can do everything. Phil can do nothing. Phil can do everything except create databases.
    3) You have a login named James who has Control Server permission. You want to eliminate his ability to create databases without affecting any other permissions. What SQL statement would you use? ALTER LOGIN James DISABLE DROP LOGIN James DENY CREATE DATABASE To James REVOKE CREATE DATABASE To James GRANT CREATE DATABASE To James
    Now make sure that you write down all the answers on a piece of paper. Watch the following video and read the earlier article over here. If you want to change an answer you still have a chance. Solution: 1) 3 2) 1 3) 3 Now let us check the answers; compare your answers to the ones above. I am very confident you will get them correct. Available at USA: Amazon India: Flipkart | IndiaPlaza Volume: 1, 2, 3, 4, 5 Please leave your feedback in the comment area for the quiz and video. Did you know all the answers of the quiz? Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Joes 2 Pros, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
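
    To see the difference between DENY and REVOKE that the quiz is testing, here is a minimal T-SQL sketch. The login name and password are made up for illustration, and note that at server scope the relevant permission is named CREATE ANY DATABASE:

        USE master;
        -- Hypothetical login used only for this demonstration
        CREATE LOGIN Phil WITH PASSWORD = 'Str0ng!Passw0rd#2012';
        GRANT CONTROL SERVER TO Phil;

        -- DENY always wins over a broader GRANT:
        -- Phil can now do everything except create databases (quiz question 1).
        DENY CREATE ANY DATABASE TO Phil;

        -- REVOKE only removes an existing GRANT or DENY entry; it does not block anything.
        -- After this, CONTROL SERVER still covers database creation,
        -- so Phil can once again do everything (quiz question 2).
        REVOKE CREATE ANY DATABASE FROM Phil;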

    Read the article

  • Paying by Cash

    - by David Dorf
    I'll grant you paying by cash in the context of stores isn't particularly interesting, but in my quest to try new payment methods I decided to pay by cash at an online store. Using a credit card means I have to hoist myself off the couch, find the card, and enter all those digits. Google Checkout certainly makes that task easier by storing my credit card information, but what happens to all those people that don't have a credit card? What about the ones that are afraid to use credit cards over the internet. There are three main options for cash payment, not all of which are accepted by every merchant. The most popular is PayPal. The issue I have with them is that returns and disputes have to be handled with PayPal, not the merchant. I once used PayPal at a shady online store and lost my money. Yeah, my bad but they wouldn't help me at all. PayPal was purchased by eBay in 2002. BillMeLater is best for larger purchases, because at checkout they actually run a credit check to make sure you're credit worthy. Assuming you are, they pay the merchant on your behalf and mail you a bill, which you better pay quickly or interest will start to accrue. That's nice for the merchant because they get paid right away, and I presume there's no charge-backs. BillMeLater was purchased by eBay in 2008. Last night I tried eBillMe for the first time. After checkout, they send you a bill via email and expect you to pay either via online banking (they provide the instructions to set everything up) or walk-in locations across the US (typically banks). The process was quick and easy. The merchant doesn't ship the product until the bill is paid, so there's a day or two delay. For the merchant there are no charge-backs, and the fees are less than credit cards. For the shopper, they provide buyer protection similar to that offered by credit cards, and 1% cashback on purchases. Once the online bill-pay is setup, its easy to reuse in the future. Seems like a win-win for merchants and shoppers.

    Read the article

  • Sites To Download Free eBooks For Kindle

    - by Gopinath
    Amazon Kindle is the top-selling gadget of this holiday season, and many of you would have received one as a gift. For those who got an Amazon Kindle, here are a few websites that offer free eBooks to fulfil your reading appetite at no cost.
    1. Free Kindle Books – Amazon Website – This page on Amazon lists a nice collection of free books available for Kindle, including Serial by Jack Kiborn, The Wild’s Call by Jeri Smith, Star Wars by John Jackson Miller, and several other books from a list of 40 books.
    2. Project Gutenberg: This site has 33,000+ free books that not only let you read on Kindle but also on iPad, PCs, and smart phones. This site is very popular for free ebooks.
    3. Google E-Bookstore: Google’s eBookstore has thousands of free ebooks for Kindle in their free books section.
    4. Internet Archive: Here you find millions of rare print works that are especially useful for academic research. Books in multiple languages are also available for Kindle.
    5. Open Library: This site is a sort of Wikipedia for eBooks, with over 20 million user-contributed books and magazines. They are all Kindle friendly.
    6. ManyBooks.net: Nearly 30,000 titles, many of which have been pulled from Project Gutenberg. Has a good collection of little-known Creative Commons works.
    7. Freebooks.com – the public domain section of this site contains many free ebooks that are perfect for your Kindle.
    8. freecomputerbooks.com, freetechbooks.com and onlinecomputerbooks.com – if you are a geek looking for technology books, these are the sites you should visit to grab free books.
    Image credit: bike/flickr This article titled, Sites To Download Free eBooks For Kindle, was originally published at Tech Dreams. Grab our rss feed or fan us on Facebook to get updates from us.

    Read the article

  • Plan Caching and Query Memory Part I – When not to use stored procedure or other plan caching mechanisms like sp_executesql or prepared statement

    - by sqlworkshops
      The most common performance mistake SQL Server developers make: SQL Server estimates memory requirement for queries at compilation time. This mechanism is fine for dynamic queries that need memory, but not for queries that cache the plan. With dynamic queries the plan is not reused for different set of parameters values / predicates and hence different amount of memory can be estimated based on different set of parameter values / predicates. Common memory allocating queries are that perform Sort and do Hash Match operations like Hash Join or Hash Aggregation or Hash Union. This article covers Sort with examples. It is recommended to read Plan Caching and Query Memory Part II after this article which covers Hash Match operations.   When the plan is cached by using stored procedure or other plan caching mechanisms like sp_executesql or prepared statement, SQL Server estimates memory requirement based on first set of execution parameters. Later when the same stored procedure is called with different set of parameter values, the same amount of memory is used to execute the stored procedure. This might lead to underestimation / overestimation of memory on plan reuse, overestimation of memory might not be a noticeable issue for Sort operations, but underestimation of memory will lead to spill over tempdb resulting in poor performance.   This article covers underestimation / overestimation of memory for Sort. Plan Caching and Query Memory Part II covers underestimation / overestimation for Hash Match operation. It is important to note that underestimation of memory for Sort and Hash Match operations lead to spill over tempdb and hence negatively impact performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note, with Hash Match operations, overestimation of memory can actually lead to poor performance.   To read additional articles I wrote click here.   In most cases it is cheaper to pay for the compilation cost of dynamic queries than huge cost for spill over tempdb, unless memory requirement for a stored procedure does not change significantly based on predicates.   The best way to learn is to practice. To create the below tables and reproduce the behavior, join the mailing list by using this link: www.sqlworkshops.com/ml and I will send you the table creation script. Most of these concepts are also covered in our webcasts: www.sqlworkshops.com/webcasts   Enough theory, let’s see an example where we sort initially 1 month of data and then use the stored procedure to sort 6 months of data.   Let’s create a stored procedure that sorts customers by name within certain date range.   --Example provided by www.sqlworkshops.com create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as begin       declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime       select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c             where c.CreationDate between @CreationDateFrom and @CreationDateTo             order by c.CustomerName       option (maxdop 1)       end go Let’s execute the stored procedure initially with 1 month date range.   set statistics time on go --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-01-31' go The stored procedure took 48 ms to complete.     The stored procedure was granted 6656 KB based on 43199.9 rows being estimated.       
The estimated number of rows, 43199.9 is similar to actual number of rows 43200 and hence the memory estimation should be ok.       There was no Sort Warnings in SQL Profiler.      Now let’s execute the stored procedure with 6 month date range. --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-06-30' go The stored procedure took 679 ms to complete.      The stored procedure was granted 6656 KB based on 43199.9 rows being estimated.      The estimated number of rows, 43199.9 is way different from the actual number of rows 259200 because the estimation is based on the first set of parameter value supplied to the stored procedure which is 1 month in our case. This underestimation will lead to sort spill over tempdb, resulting in poor performance.      There was Sort Warnings in SQL Profiler.    To monitor the amount of data written and read from tempdb, one can execute select num_of_bytes_written, num_of_bytes_read from sys.dm_io_virtual_file_stats(2, NULL) before and after the stored procedure execution, for additional information refer to the webcast: www.sqlworkshops.com/webcasts.     Let’s recompile the stored procedure and then let’s first execute the stored procedure with 6 month date range.  In a production instance it is not advisable to use sp_recompile instead one should use DBCC FREEPROCCACHE (plan_handle). This is due to locking issues involved with sp_recompile, refer to our webcasts for further details.   exec sp_recompile CustomersByCreationDate go --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-06-30' go Now the stored procedure took only 294 ms instead of 679 ms.    The stored procedure was granted 26832 KB of memory.      The estimated number of rows, 259200 is similar to actual number of rows of 259200. Better performance of this stored procedure is due to better estimation of memory and avoiding sort spill over tempdb.      There was no Sort Warnings in SQL Profiler.       Now let’s execute the stored procedure with 1 month date range.   --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-01-31' go The stored procedure took 49 ms to complete, similar to our very first stored procedure execution.     This stored procedure was granted more memory (26832 KB) than necessary memory (6656 KB) based on 6 months of data estimation (259200 rows) instead of 1 month of data estimation (43199.9 rows). This is because the estimation is based on the first set of parameter value supplied to the stored procedure which is 6 months in this case. This overestimation did not affect performance, but it might affect performance of other concurrent queries requiring memory and hence overestimation is not recommended. This overestimation might affect performance Hash Match operations, refer to article Plan Caching and Query Memory Part II for further details.    Let’s recompile the stored procedure and then let’s first execute the stored procedure with 2 day date range. exec sp_recompile CustomersByCreationDate go --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-01-02' go The stored procedure took 1 ms.      The stored procedure was granted 1024 KB based on 1440 rows being estimated.      There was no Sort Warnings in SQL Profiler.      Now let’s execute the stored procedure with 6 month date range. 
--Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-06-30' go   The stored procedure took 955 ms to complete, way higher than 679 ms or 294ms we noticed before.      The stored procedure was granted 1024 KB based on 1440 rows being estimated. But we noticed in the past this stored procedure with 6 month date range needed 26832 KB of memory to execute optimally without spill over tempdb. This is clear underestimation of memory and the reason for the very poor performance.      There was Sort Warnings in SQL Profiler. Unlike before this was a Multiple pass sort instead of Single pass sort. This occurs when granted memory is too low.      Intermediate Summary: This issue can be avoided by not caching the plan for memory allocating queries. Other possibility is to use recompile hint or optimize for hint to allocate memory for predefined date range.   Let’s recreate the stored procedure with recompile hint. --Example provided by www.sqlworkshops.com drop proc CustomersByCreationDate go create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as begin       declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime       select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c             where c.CreationDate between @CreationDateFrom and @CreationDateTo             order by c.CustomerName       option (maxdop 1, recompile)       end go Let’s execute the stored procedure initially with 1 month date range and then with 6 month date range. --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-01-30' exec CustomersByCreationDate '2001-01-01', '2001-06-30' go The stored procedure took 48ms and 291 ms in line with previous optimal execution times.      The stored procedure with 1 month date range has good estimation like before.      The stored procedure with 6 month date range also has good estimation and memory grant like before because the query was recompiled with current set of parameter values.      The compilation time and compilation CPU of 1 ms is not expensive in this case compared to the performance benefit.     Let’s recreate the stored procedure with optimize for hint of 6 month date range.   --Example provided by www.sqlworkshops.com drop proc CustomersByCreationDate go create proc CustomersByCreationDate @CreationDateFrom datetime, @CreationDateTo datetime as begin       declare @CustomerID int, @CustomerName varchar(48), @CreationDate datetime       select @CustomerName = c.CustomerName, @CreationDate = c.CreationDate from Customers c             where c.CreationDate between @CreationDateFrom and @CreationDateTo             order by c.CustomerName       option (maxdop 1, optimize for (@CreationDateFrom = '2001-01-01', @CreationDateTo ='2001-06-30'))       end go Let’s execute the stored procedure initially with 1 month date range and then with 6 month date range.   --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-01-30' exec CustomersByCreationDate '2001-01-01', '2001-06-30' go The stored procedure took 48ms and 291 ms in line with previous optimal execution times.    The stored procedure with 1 month date range has overestimation of rows and memory. This is because we provided hint to optimize for 6 months of data.      The stored procedure with 6 month date range has good estimation and memory grant because we provided hint to optimize for 6 months of data.       
Let’s execute the stored procedure with 12 month date range using the currently cashed plan for 6 month date range. --Example provided by www.sqlworkshops.com exec CustomersByCreationDate '2001-01-01', '2001-12-31' go The stored procedure took 1138 ms to complete.      2592000 rows were estimated based on optimize for hint value for 6 month date range. Actual number of rows is 524160 due to 12 month date range.      The stored procedure was granted enough memory to sort 6 month date range and not 12 month date range, so there will be spill over tempdb.      There was Sort Warnings in SQL Profiler.      As we see above, optimize for hint cannot guarantee enough memory and optimal performance compared to recompile hint.   This article covers underestimation / overestimation of memory for Sort. Plan Caching and Query Memory Part II covers underestimation / overestimation for Hash Match operation. It is important to note that underestimation of memory for Sort and Hash Match operations lead to spill over tempdb and hence negatively impact performance. Overestimation of memory affects the memory needs of other concurrently executing queries. In addition, it is important to note, with Hash Match operations, overestimation of memory can actually lead to poor performance.   Summary: Cached plan might lead to underestimation or overestimation of memory because the memory is estimated based on first set of execution parameters. It is recommended not to cache the plan if the amount of memory required to execute the stored procedure has a wide range of possibilities. One can mitigate this by using recompile hint, but that will lead to compilation overhead. However, in most cases it might be ok to pay for compilation rather than spilling sort over tempdb which could be very expensive compared to compilation cost. The other possibility is to use optimize for hint, but in case one sorts more data than hinted by optimize for hint, this will still lead to spill. On the other side there is also the possibility of overestimation leading to unnecessary memory issues for other concurrently executing queries. In case of Hash Match operations, this overestimation of memory might lead to poor performance. When the values used in optimize for hint are archived from the database, the estimation will be wrong leading to worst performance, so one has to exercise caution before using optimize for hint, recompile hint is better in this case. I explain these concepts with detailed examples in my webcasts (www.sqlworkshops.com/webcasts), I recommend you to watch them. The best way to learn is to practice. To create the above tables and reproduce the behavior, join the mailing list at www.sqlworkshops.com/ml and I will send you the relevant SQL Scripts.     Register for the upcoming 3 Day Level 400 Microsoft SQL Server 2008 and SQL Server 2005 Performance Monitoring & Tuning Hands-on Workshop in London, United Kingdom during March 15-17, 2011, click here to register / Microsoft UK TechNet.These are hands-on workshops with a maximum of 12 participants and not lectures. For consulting engagements click here.     Disclaimer and copyright information:This article refers to organizations and products that may be the trademarks or registered trademarks of their various owners. Copyright of this article belongs to R Meyyappan / www.sqlworkshops.com. You may freely use the ideas and concepts discussed in this article with acknowledgement (www.sqlworkshops.com), but you may not claim any of it as your own work. 
This article is for informational purposes only; you use any of the suggestions given here entirely at your own risk.   R Meyyappan [email protected] LinkedIn: http://at.linkedin.com/in/rmeyyappan
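
    As a companion to the note earlier in this article about preferring DBCC FREEPROCCACHE (plan_handle) over sp_recompile, here is a minimal sketch of how one might look up the plan handle for the stored procedure used in the examples and evict only that plan. It assumes SQL Server 2008 or later and that the procedure lives in the dbo schema:

        -- Evict only the cached plan for dbo.CustomersByCreationDate
        DECLARE @plan_handle varbinary(64);

        SELECT @plan_handle = ps.plan_handle
        FROM sys.dm_exec_procedure_stats AS ps
        WHERE ps.object_id = OBJECT_ID(N'dbo.CustomersByCreationDate')
          AND ps.database_id = DB_ID();

        IF @plan_handle IS NOT NULL
            DBCC FREEPROCCACHE (@plan_handle);  -- removes just this plan, not the whole cache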

    Read the article

  • Some of my favourite Visual Studio 2012 things&ndash;Teams

    - by Aaron Kowall
    Getting the balance right for when and how many team projects to create has always been a bit of a balancing act. On large initiatives, there are often teams who work toward a common system. These teams often have quite a bit of autonomy, but need to roll up to some higher-level initiative. In TFS 2010, people were often tempted to create separate Team Projects for each of the sub-teams and then do some magic with reporting and cross-team queries to get the consolidated view. My recommendation was always to use Areas as a means of separating work across the team, but that always resulted in a large number of queries that needed to be maintained and just seemed confusing. When doing anything you had to remember to filter the query or view by Area in order to get correct results. Along with the awesome web access portal that comes in TFS 2012 (which I will cover details of in another post), the product group has introduced the concept of Teams. A team is a sub-group within a TFS 2012 Team Project which allows us to more easily divide work along team boundaries. Technically, a Team is defined by an Area Path and a TFS Group, both of which could be done in TFS 2010. However, by allowing for creation of a ‘Team’ in TFS 2012, the web portal is able to do a bunch of ‘magic’ for us. We can view the project site (backlog, taskboard, etc.) for the team, we can assign items to the team, and we can view the burndown for the team. Basically, all the stuff that we used to prepare manually is now created and managed for us with a nice UI. When you create a Team Project in TFS 2012, a ‘Default’ team is created with the same name as the Team Project. So, if you only have 1 team working on the project, you are set. If you want to divide the work into additional teams, you can create teams by using the Team Web Client. Teams are created using the ‘Administer Server’ icon in the top right of the web site. You can select the team site by using the team chooser: Once you have selected a team, the Product Backlog, TaskBoard, Burndown Charts, etc. are all filtered to that team. NOTE: You always have the ability to choose the ‘Default’ team to see items for the entire project. PS: It’s been a long while since I shared on this blog. To help with that I’m in a blogging challenge with some other developer and agilist friends. Please check out their blogs as well: Steve Rogalsky: http://winnipegagilist.blogspot.ca Dylan Smith: http://www.geekswithblogs.net/optikal Tyler Doerkson: http://blog.tylerdoerksen.com David Alpert: http://www.spinthemoose.com Dave White: http://www.agileramblings.com Technorati Tags: TFS 2012,Agile,Team

    Read the article

  • links for 2011-03-15

    - by Bob Rhubart
    Dr. Frank Munz: Resize AWS EC2 Cloud Instances Dr Munz says: "You cannot dynamically resize a running cloud instance. E.g. there is no API call to ask for 2.2 GHz CPU speed instead of 1.8 GHz or to dynamically add another 3.5 GB of RAM." (tags: oracle cloud amazon ec2) Roddy Rodstein: Oracle VM Manager Architecture and Scalability Rodstein says: "Oracle VM Manager can be installed in an all-in-one configuration using the default Oracle 10g Express Database or in a more traditional two tier architecture with an OC4J web tier and a 10 or 11g database tier." (tags: oracle otn virtualization oraclevm) Mark Nelson: Getting started with Continuous Integration for SOA projects Nelson says: "I am exploring how to use Maven and Hudson to create a continuous integration capability for SOA and BPM projects. This will be the first post of several on this topic, and today we will look at setting up some simple continuous integration for a single SOA project." (tags: oracle maven hudson soa bpm) 5 New Java Champions (The Java Source) Tori Wieldt shares the big news. Congratulations to new Java Champs Jonas Bonér, James Strachan, Rickard Oberg, Régina ten Bruggencate, and Clara Ko. (tags: oracle java) Alert for Forms customers running Oracle Forms 10g (Grant Ronald's Blog) Ronald says: "While you might have been happily running your Forms 10g applications for about 5 years or so now, the end of premier support is creeping up and you need to start planning for a move to Oracle Forms 11g." (tags: oracle oracleforms) Brenda Michelson: Enterprise Architecture Rant #4,892 "I’m increasingly concerned about the macro-direction of our field, as we continue to suffer ivory tower enterprise architecture punditry, rigid frameworks and endless philosophical waxing." - Brenda Michelson (tags: entarch enterprisearchitecture ivorytower) Amitabh Apte: Enterprise Architecture - Different Perspectives "Business does not need Enterprise Architecture," says Apte, "it needs value and outcomes from the EA function." (tags: entarch enterprisearchitecture) First Ever MySQL on Windows Online Forum - March 16, 2011 (Oracle's MySQL Blog) Monica Kumar shares the details. (tags: oracle mysql mswindows) Jeff Davies: Running Multiple WebLogic and OSB Domains "There is a small 'gotcha' if you want to create multiple domains on a devevelopment machine," says Jeff Davies. But don't worry - there's a solution. (tags: oracle soa osb weblogic servicebus) The Arup Nanda Blog: Good Engineering "Engineering is not about being superficially creative," Nanda says, "it's about reliability and trustworthiness." (tags: oracle engineering software technology) Welcome to the SOA & E2.0 Partner Community Forum (SOA Partner Community Blog) (tags: ping.fm)

    Read the article

  • Roll your own free .NET technical conference

    - by Brian Schroer
    If you can’t get to a conference, let the conference come to you! There are a ton of free recorded conference presentations online… Microsoft TechEd Let’s start with the proverbial 800 pound gorilla. Recent TechEds have recorded the majority of presentations and made them available online the next day. Check out presentations from last month’s TechEd North America 2012 or last week’s TechEd Europe 2012. If you start at http://channel9.msdn.com/Events/TechEd, you can also drill down to presentations from prior years or from other regional TechEds (Australia, New Zealand, etc.) The top presentations from my “View Queue”: Damian Edwards: Microsoft ASP.NET and the Realtime Web (SignalR) Jennifer Smith: Design for Non-Designers Scott Hunter: ASP.NET Roadmap: One ASP.NET – Web Forms, MVC, Web API, and more Daniel Roth: Building HTTP Services with ASP.NET Web API Benjamin Day: Scrum Under a Waterfall NDC The Norwegian Developer Conference site has the most interesting presentations, in my opinion. You can find the videos from the June 2012 conference at that link. The 2011 and 2010 pages have a lot of presentations that are still relevant also. My View Queue Top 5: Shay Friedman: Roslyn... hmmmm... what? Hadi Hariri: Just ‘cause it’s JavaScript, doesn’t give you a license to write rubbish Paul Betts: Introduction to Rx Greg Young: How to get productive in a project in 24 hours Michael Feathers: Deep Design Lessons ØREDEV Travelling on from Norway to Sweden... I don’t know why, but the Scandinavians seem to have this conference thing figured out. ØREDEV happens each November, and you can find videos here and here. My View Queue Top 5: Marc Gravell: Web Performance Triage Robby Ingebretsen: Fonts, Form and Function: A Primer on Digital Typography Jon Skeet: Async 101 Chris Patterson: Hacking Developer Productivity Gary Short: .NET Collections Deep Dive aspConf - The Virtual ASP.NET Conference Formerly known as “mvcConf”, this one’s a little different. It’s a conference that takes place completely on the web. The next one’s happening July 17-18, and it’s not too late to register (It’s free!). Check out the recordings from February 2011 and July 2010. It’s two years old and talks about ASP.NET MVC2, but most of it is still applicable, and Jimmy Bogard’s Put Your Controllers On a Diet presentation is the most useful technical talk I have ever seen. CodeStock Videos from the 2011 edition of this Tennessee conference are available. Presentations from last month’s 2012 conference should be available soon here. I’m looking forward to watching Matt Honeycutt’s Build Your Own Application Framework with ASP.NET MVC 3. UserGroup.tv User Group.tv was founded in January of 2011 by Shawn Weisfeld, with the mission of providing User Group content online for free. You can search by date, group, speaker and category tags. My View Queue Top 5: Sergey Rathon & Ian Henehan: UI Test Automation with Selenium Rob Vettor: The Repository Pattern Latish Seghal: The .NET Ninja’s Toolbelt Amir Rajan: Get Things Done With Dynamic ASP.NET MVC Jeffrey Richter: .NET Nuggets – Houston TechFest Keynote

    Read the article

  • Inverted schedctl usage in the JVM

    - by Dave
    The schedctl facility in Solaris allows a thread to request that the kernel defer involuntary preemption for a brief period. The mechanism is strictly advisory - the kernel can opt to ignore the request. Schedctl is typically used to bracket lock critical sections. That, in turn, can avoid convoying -- threads piling up on a critical section behind a preempted lock-holder -- and other lock-related performance pathologies. If you're interested, see the man pages for schedctl_start() and schedctl_stop() and the schedctl.h include file. The implementation is very efficient. schedctl_start(), which asks that preemption be deferred, simply stores into a thread-specific structure -- the schedctl block -- that the kernel maps into user-space. Similarly, schedctl_stop() clears the flag set by schedctl_start() and then checks a "preemption pending" flag in the block. Normally, this will be false, but if it is set schedctl_stop() will yield to politely grant the CPU to other threads. Note that you can't abuse this facility for long-term preemption avoidance, as the deferral is brief. If your thread exceeds the grace period the kernel will preempt it and transiently degrade its effective scheduling priority. Further reading: US05937187 and various papers by Andy Tucker.

    We'll now switch topics to the implementation of the "synchronized" locking construct in the HotSpot JVM. If a lock is contended then on multiprocessor systems we'll spin briefly to try to avoid context switching. Context switching is wasted work and inflicts various cache and TLB penalties on the threads involved. If context switching were "free" then we'd never spin to avoid switching, but that's not the case. We use an adaptive spin-then-park strategy. One potentially undesirable outcome is that we can be preempted while spinning. When our spinning thread is finally rescheduled the lock may or may not be available. If not, we'll spin and then potentially park (block) again, thus suffering a 2nd context switch. Recall that the reason we spin is to avoid context switching. To avoid this scenario I've found it useful to enable schedctl to request deferral while spinning, and to arrange for the spin loop to periodically poll the "preemption pending" flag. If that flag is found set, we simply abandon our spinning attempt and park immediately. This avoids the double context-switch scenario above.

    One annoyance is that the schedctl blocks for the threads in a given process are tightly packed on special pages mapped from kernel space into user-land. As such, writes to the schedctl blocks can cause false sharing on adjacent blocks. Hopefully the kernel folks will make changes to avoid this by padding and aligning the blocks to ensure that one cache line underlies at most one schedctl block at any one time.

    Read the article

  • Before the Summit of 2012

    - by Ajarn Mark Caldwell
    Today, Monday, was the first day of the PASS Summit Preconference training events, but instead I spent the day at the free SQL in the City event put on by Red Gate. For me this was not a financial decision (pre-con sessions cost extra above the general Summit registration) but rather a matter of interest.  I had already included money for pre-cons in this year’s training budget, but none of them really stood out to me, so even if the Red-Gate event were not going on at the same time, I probably would not have gone to any pre-cons this year.  However, the topics being presented at the SQL in the City event were of great interest to me.  There promised to be good information on Continuous Integration and automated deployment of database changes, which lately has been a real hot topic at my work.  And indeed, Red-Gate announced the release of a new tool (still in Early Access Program…a.k.a. Beta) which is called the Deployment Manager.  Since we are in the middle of a TFS implementation project, it will be interesting to see how this plays out and compares to what we put together with the automated builds in TFS.  But, as I understand it, the primary focus of Deployment Manager is not to be the Build process (Red Gate uses JetBrains’ Team City for that in their shop) but rather to aid in the deployment of those build packages, as well as providing easy rollback and a good visualization of which versions of software are in which environments.  It looks promising and I’ve already downloaded the installer package to play with it later. Overall, I was quite impressed with the SQL in the City event.  Having heard many current and past members of the PASS Board of Directors describe the challenges of putting on a large conference, and the growing pains that the PASS Summit has gone through, I am even more impressed that the Red Gate event ran as smoothly as it did.  And it is quite impressive the amount of money that Red Gate must have spent given that this was a no-charge event to attend, they had a very nice hot lunch, and the after-event drinks celebration.  Well done, folks! Of course it was great to hear from a variety of speakers.  Today I listened to some folks from Red Gate like Grant Fritchey (blog | @GFritchey) and David Atkinson (Product Manager for SQL Source Control and now the Deployment Manager tool set); and also Brent Ozar (blog | @BrentO) and Buck Woody (blog | @BuckWoody).  By the way, if you have never seen either Brent or Buck speak, you really should.  Different styles, but both are very entertaining and educational at the same time.  I love Buck’s sense of humor (here’s a tip…don’t be late to Buck’s session or you’ll become part of the presentation) and I praise Brent’s slides.  Brent’s style very much reminds me of that espoused by Garr Reynolds on his Presentation Zen blog (and book) and I am impressed that he can make a technical presentation so engaging. It was a great day, a great way to kick off the week, and I am excited to get into the full Summit!

    Read the article

  • The Red Gate Guide to SQL Server Team based Development Free e-book

    - by Mladen Prajdic
    After about 6 months of work, the new book I've coauthored with Grant Fritchey (Blog|Twitter), Phil Factor (Blog|Twitter) and Alex Kuznetsov (Blog|Twitter) is out. They're all smart folks I talk to online, and this book is packed with good ideas backed by years of experience. The book contains a good deal of information about things you need to think of when doing any kind of multi-person database development. Although it's meant for SQL Server, the principles can be applied to any database platform out there. In the book you will find information on: writing readable code, documenting code, source control and change management, deploying code between environments, unit testing, reusing code, and searching and refactoring your code base. I've written chapter 5 about database testing and chapter 11 about SQL refactoring. In the database testing chapter (chapter 5) I cover why you should test your database; why it is a good idea to have a database access interface composed of stored procedures, views and user-defined functions; and what and how to test. I talk about the many testing methods, like black and white box testing, unit and integration testing, and error and stress testing, and why and how you should do all of those. Sometimes you have to convince management to go for testing in the development lifecycle, so I give some pointers and tips on how to do that. Testing databases is a bit different from testing object-oriented code in that, to have independent unit tests, you need to roll back your changes after each test. The chapter shows you ways to do this and also how to avoid it. At the end I show how to test various database objects and how to test access to them. In the SQL refactoring chapter (chapter 11) I cover why to refactor and where to even begin refactoring. I also show you a way to achieve a set-based mindset for solving SQL problems, which is crucial to good set-based SQL programming, and a few commonly seen problems to refactor. These problems include: using functions on columns in the WHERE clause, SELECT * problems, long stored procedures with many input parameters, one subquery per condition in the SELECT statement, the "cursors are good for anything" problem, using too-large data types everywhere, and the "using your data in code for business logic" anti-pattern. You can read more about it and download it here: The Red Gate Guide to SQL Server Team-based Development. Hope you like it, and send me feedback if you wish to.
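
    As a small taste of the kind of refactoring chapter 11 discusses, here is a hedged sketch of the "function on a column in the WHERE clause" problem; the Orders table and OrderDate column are invented for illustration. Wrapping the column in a function defeats index seeks, while rewriting the predicate as a range keeps it sargable:

        -- Before: the function on OrderDate forces a scan of every row
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE YEAR(OrderDate) = 2010;

        -- After: an equivalent range predicate can use an index on OrderDate
        SELECT OrderID, OrderDate
        FROM dbo.Orders
        WHERE OrderDate >= '20100101'
          AND OrderDate <  '20110101';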

    Read the article

  • OTN Latinoamérica Tour 2012

    - by Dana Singleterry
    Better late than never. Sorry for the delay on getting this content up for all of you and thanks again for your attendance. A number of excellent questions came out of the sessions I delivered and herein I'm providing you with the content, in pdf format, for those sessions. I'm also providing pointers to Forms to ADF integration/migration as well as some details around OAF as used in E-Business Suite and ADF. Here's the sessions delivered by location. Click on any of the links to download the session content in pdf format. Montevideo Uruguay: Is Oracle ADF Simpler than Oracle Forms? Understanding the Fusion Development Platform Building Web Data Dashboards Without Coding Buenos Aires, Argentina: Is Oracle ADF Simpler than Oracle Forms? Developing Cross Device Mobile Applications Sao Paulo, Brazil Understanding the Fusion Development Platform Is Oracle ADF Simpler than Oracle Forms? A brief note on Form Integration & Migration: Does your organization have an Oracle Forms application that you'd like to migrate to ADF? Or, perhaps you're an Oracle Forms Developer and want to modernize your application development skills? If so, you've come to the right place! This section will strive to answer common questions that arise as you move from Forms to ADF. Our Oracle Forms Statement of Direction points out that Oracle is committed to the long-term support of Oracle Forms and Reports. However, many customers feel they are outgrowing their Forms applications. Users are demanding more sophisticated and interactive users interfaces. Executives are requiring SOA-enabled applications that integrate with peripheral services. Development leads are encouraging a more modern approach to application development, including adherence to design patterns like MVC. So even as Oracle still supports Forms, the list of reasons to move off of it is becoming more compelling and is only gaining further momentum by the fact that Oracle's own Fusion Applications are using ADF. Developers and organizations looking to align with both the technology stack and look-and-feel of Fusion Applications are choosing ADF, and thus reaping the benefits of years of best practices in enterprise application development that are baked into the ADF framework. So, if you decide to migrate off of Forms for any of these reasons, ADF is the way to go. Grant Ronald has published a video of our position on the subject, along with an ODTUG article explaining our direction. These materials explain that there are other migration tools/frameworks/paths, but the best choice is usually to follow Gartner's recommendation that if you are going to migrate off of Oracle Forms, ADF is the least risky and least costly migration path. Please visit the Oracle Forms page here. For details around OAF as used in E-Business Suite (EBS) and when to use ADF with EBS you can review the following blogs from Shay Shmeltzer. To ADF or to OAF? or Can I use ADF with Oracle E-Business Suite?

    Read the article

  • Most Unprofessional Workplace

    - by TehGrumpyCoder
    I've worked lots of places in lots of roles: Delivery truck driver, Boilermaker, antenna rigger, Professional Musician, Electronic Technician, Electrical Engineer, and for most of my career: Software Turkey. I want to say this large company is the most unprofessional place I've ever worked, but then I think about other jobs such as TTI that stiffed us all for 10 months salary -- or had us work 2-1/2 years at 66% however you want to look at it, or maybe NeoPlanet with a cast from a bad sitcom running the show, I could go on, but I digress (as usual). So maybe this place isn't the *most* unprofessional, but the personnel rank up there. I'm in a small room off a factory. There are 3 managerial offices, and 36 common-folk of various skill-sets in a variety of single to quad cubicles. No matter where you sit though, because of the layout and location, you've got a hard wall as one wall of your cubicle. Because of that hard wall, everything echoes. I get off the phone, and the guy in the next cubicle makes a comment in response to my phone conversation... I hate that it can be heard and I hate that they do that! These people have no problem yelling from cube to cube to carry on running conversations some of which are actually work-related. There's a lady two cubes away that talks so loud I can clearly hear every phone conversation she has... all work-related but still... Then the one in the next cubicle must have been raised on a farm because there's only one volume setting: LOUD... "HEY MARGE, CAN I GET IN FOR A QUICK APPOINTMENT AFTER WORK TONIGHT?" ... sigh Also that cube is the 'party cube' so that's where all the candy, cake, donuts, and leftovers sits. Anything MzLoud brings in has to have a verbal recipe associated with it at least 10 times during the day, and of course at volume. I've had running conversations over the top of my cube from people in the next one on each side. The weird thing is... the boss sits with an open door closer to this whole fiasco than me. So I wear a pair of Bose noise-cancelling headphones, and crank up Kenny Burrell, Herb Ellis, Wes Montgomery, or Jimmy Smith to the point I can't hear the racket... what the heck, I already have a hearing loss from playing guitar.

    Read the article

  • Six Unusual Blogs I Like

    - by Bill Graziano
    I subscribe to and read over 100 SQL Server blogs every day.  I link to posts that I think are interesting.  I also read a fair number of non-SQL Server blogs.  Here are a few that I think are interesting. danah boyd. She is a researcher with Microsoft and writes about privacy, social media and teenagers.  I discovered her blog while looking for strategies to keep my personal and professional life separate.  (I haven’t found a good solution to that yet.)  Her stories of how teenagers use Facebook and other social media tools are fascinating. Clayton’s Web Snacks.  Steve Clayton works at Microsoft and has a variety of blogs out there.  This one focuses on … hmmm.  His latest posts are on graffiti, infographics, paper tweets, cartoons and slow motion videos.  It’s mostly visual and you never really know what you’ll get.  It’s always interesting though and I like what he posts.  It’s good creative stuff. Seth Godin.  Seth writes about Marketing.  I read him for motivation to get off my butt and get things done.  He’s a great motivator who encourages you to think big.  And do something! Ask the Pilot.  Patrick Smith is a commercial airline pilot writing about the airline industry.  He’s a great debunker of myths (no they don’t reduce oxygen in the cabin to keep you docile).  My favorite topics include the TSA, flying myths, airport reviews and flight delays. My old favorite flight blog used to be enplaned.  No one knew who wrote it.  It focused on the economics of the airline industry.  It was fascinating stuff.  One day it was gone.  The entire blog was deleted.  Someone tracked down some partial archives and put them online. The Agent’s Journal.  Jack Bechta is an NFL agent.  He writes about the business side of the NFL, the draft and free agency.  Lately he’s been writing about the potential lockout.  He has a distinct lack of hype which I find very refreshing.  xkcd.  I call this the comic for smart people.  A little math, some IT and internet privacy thrown in all make an unusual comic. Funny and intelligent.

    Read the article

  • Comparison of Extreme Programming (XP) to Traditional Programming Methodologies

    A comparison of extreme programming (XP) to traditional programming methodologies finds parallels in the biblical battle between David and Goliath. Goliath of Gath was a Philistine warrior renowned for his size, strength and battle-tested skills. Much like Goliath, traditional methodologies are known to be cumbersome because of the large amounts of documentation they require, and time consuming because of the time needed to gather all the information. However, traditional methodologies have been widely accepted by the software development community for years because of their attention to detail regarding project development and maintenance. David was a male Israelite teenager: small, fearless, and untrained in any type of formal combat. In a similar fashion, extreme programming favors code over documentation, so that time is spent developing the project rather than on cumbersome documentation of it. Typically, project managers and developers are fearless when they start this type of project because they usually start with little to no documentation, and they expect to be given changes to implement at the start of every new project iteration. Because of the lack of need or desire for documentation in extreme programming projects, they can appear to have no formal process behind their development. This is a misconception: because of the consistent development iterations and interaction with clients and users, the project quickly takes form, as each iteration allows it to be refined while the customer's needs and desires change. Ravikant Agarwal and David Umphress documented a new approach to extreme programming, called personal extreme programming (PXP), at the ACM Southeast Regional Conference in 2008. PXP is the application of extreme programming core concepts in a single-developer environment. PXP focuses on how the main concepts and practices of extreme programming, which typically center on a group environment, can be adjusted to benefit a single developer. Suzanne Smith and Sara Stoecklin, writing in the Journal of Computing Sciences in Colleges, are both advocates of extreme programming; in fact, they feel it should receive more attention in introductory programming classes to allow students to better understand the software development process. Reasons why extreme programming is a good thing:
    Developers get to do more of what they love: develop. Traditional software development methodologies tend to place additional demands on a project by requiring all requirements and project specifications to be fully defined before the implementation phase of a project begins.
    A standard 40-hour work week. Limiting the work week to 40 hours prevents developers from getting burned out on projects.

    Read the article

  • ArchBeat Top 10 for November 18-24, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared on the OTN ArchBeat Facebook page for the week of November 18-24, 2012.

    One-Stop Shop for over 200 On-Demand Oracle Webcasts
    Webcasts can be a great way to get information about Oracle products without having to go cross-eyed reading yet another document off your computer screen. Oracle's new Webcast Center offers selectable filtering to make it easy to get to the information you want. Yes, you have to register to gain access, but that process is quick, and with over 200 webcasts to choose from you know you'll find useful content.

    Oracle SOA Suite 11g PS 5 introduces BPEL with conditional correlation for aggregation scenarios | Lucas Jellema
    An extensive, detailed technical post from Oracle ACE Director Lucas Jellema.

    Oracle Utilities Application Framework V4.2.0.0.0 Released | Anthony Shorten
    Principal Product Manager Anthony Shorten shares an overview of the changes implemented in the new release.

    Fault Handling and Prevention - Part 1 | Guido Schmutz and Ronald van Luttikhuizen
    In this technical article, part one of a four-part series, Oracle ACE Directors Guido Schmutz and Ronald van Luttikhuizen guide you through an introduction to fault handling in a service-oriented environment using Oracle SOA Suite and Oracle Service Bus.

    Oracle BPM Process Accelerators and process excellence | Andrew Richards
    "Process Accelerators are ready-to-deploy solutions based on best practices to simplify process management requirements," says Capgemini's Andrew Richards. "They are considered to be 'product grade,' meaning they have been designed, engineered, documented and tested by Oracle themselves to a level that they can be deployed as-is for a solution to a problem or extended as appropriate for a particular scenario."

    Videos: Getting Started with Java Embedded | The Java Source
    Interested in Java Embedded? You'll want to check out these videos provided by Tori Weildt, including interviews with Oracle's James Allen and Kevin Smith, recorded at ARM TechCon.

    JPA SQL and Fetching tuning ( EclipseLink ) | Edwin Biemond
    Oracle ACE Edwin Biemond's post illustrates how to "use the department and employee entity of the HR Oracle demo schema to explain the JPA options you have to control the SQL statements and the JPA relation Fetching."

    Devoxx 2012 Trip Report - clouds and sunshine | Markus Eisele
    Oracle ACE Director Markus Eisele shares an extensive and entertaining account of his experience at Devoxx 2012.

    Towards Ultra-Reusability for ADF - Adaptive Bindings | Duncan Mills
    "The task flow mechanism embodies one of the key value propositions of the ADF Framework," says Duncan Mills. "However, what if we could do more? How could we make task flows even more re-usable than they are today?" As you might expect, Duncan has answers for those questions.

    Java Specification Requests in Numbers | Markus Eisele
    Oracle ACE Director Markus Eisele shares some interesting data culled from the Java Community Process site.

    Thought for the Day
    "You can't have great software without a great team, and most software teams behave like dysfunctional families." — Jim McCarthy
    Source: SoftwareQuotes.com

    Read the article

  • The Windows 8 and Ubuntu 12.04 Dual Boot Nightmare

    - by Steve
    I have done some research as to how to go about this dual boot, and I am close, but I need some guidance with booting into Windows 8 (Ubuntu is installed). I have a Lenovo Ideapad y510p. I will go over what I have done to dual-boot this laptop, which came with Windows 8 pre-installed, with Ubuntu 12.04:

    1. I followed every instruction to the letter for the 97-vote response here, and everything worked fine up until after the repair boot section: Installing on a Pre-Installed Windows 8 System (UEFI Supported)
    2. I ran into the following error upon restarting after the repair boot section: error: invalid arch independent elf magic. This error (a grub issue) prevented me from booting into Ubuntu :(
    3. After a little googling, I followed the instructions in the reactivating grub 2 section to resolve the error: http://kb.acronis.com/content/1686
    4. I found a possible solution to fixing the Windows 8 boot issue, and tried it: http://webcache.googleusercontent.com/search?q=cache:i9JMyXzzRpYJ:askubuntu.com/questions/279275/dual-boot-problem-windows-8-ubuntu-12-04+&cd=1&hl=en&ct=clnk&gl=us&client=ubuntu
    5. I thought the above solution worked, but when I attempt to boot into Windows 8, I get the following missing-file error:
    File: \Boot\BCD
    Status: 0xc000000e
    Info: The Boot Configuration Data for your PC is missing or contains errors.

    Here is some other information that may be useful: I have 3 partitions devoted to Ubuntu. The first, sda8, has a bios_grub flag (1049 kB). The second, sda9, is where everything else lives (96.6 GB). The last, sda10, is for swap (8299 MB).

    My question is: how do I fix the boot configuration for Windows 8? Any help would be greatly appreciated :)

    Update 1: When I attempt to boot into UEFI mode, I get the following error: invalid arch independent elf magic (the same error I saw in step 2).

    Update 2: A useful link I found: Dual booting Ubuntu 12.04: UEFI and Legacy. So, this is my 4th time installing Ubuntu on the laptop, and it looks like I need to install it in UEFI mode. Should I scrap it all again and reinstall? Or is there ANY way of salvaging my installation? At this point, I can't even boot into Windows (although I have an installation CD to fix the Windows boot issue, using it would ultimately screw over Ubuntu).

    Update 3: After doing a little more browsing around, I found a cool way around this messy grub stuff, using rEFInd. Rod Smith's post here saved me! Installing ubuntu 12.04.02 in uefi mode. Now I am able to dual-boot Windows 8 and Ubuntu and boot into both operating systems :) I have another issue (relating to the boot configuration in the BIOS) that I will post as a separate question :)
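    For anyone retracing these steps, here is a minimal check, assuming an Ubuntu live session with the efibootmgr package installed (an assumption, not something from the original post), to confirm whether the machine booted in UEFI or legacy mode before deciding which boot loader to repair:

    # Check whether the live session was started in UEFI or legacy (BIOS/CSM) mode;
    # /sys/firmware/efi exists only when the kernel was booted through UEFI firmware.
    if [ -d /sys/firmware/efi ]; then
        echo "Booted in UEFI mode"
        sudo efibootmgr -v    # list the firmware boot entries, including any Windows Boot Manager entry
    else
        echo "Booted in legacy (BIOS/CSM) mode"
    fi
    # Show the partition tables, including any EFI System Partition on a GPT disk.
    sudo parted -l

    If the live session reports legacy mode while Windows 8 was installed in UEFI mode, the two installations cannot share a boot menu, which is exactly the mismatch the rEFInd approach in Update 3 works around.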

    Read the article

  • GParted detects entire disk as UNALLOCATED SPACE + hd0 out of disk

    - by msPeachy
    Good day to everyone. I hope someone can help me with my problem. I have a dual-boot Windows and Ubuntu system. I recently encountered an hd0 out of disk error and wasn't able to boot Ubuntu, so I booted into Windows. After 2 or 3 rounds of booting and rebooting Windows, I tried booting Ubuntu again, but I still get the same hd0 out of disk error. I decided to run Ubuntu from a live USB to try to fix my Ubuntu partition using GParted, but when I run GParted, it shows my entire disk as UNALLOCATED SPACE! The strange thing is that Nautilus still shows and mounts my partitions. Also, every time I boot into Windows, my partitions exist and I am able to read and write to them. I have no idea what is wrong. Please help! I can't stand using Windows, since most of the tools I use are in Ubuntu. I don't mind reinstalling Ubuntu. In fact, I already tried reinstalling from the live USB, but since GParted and the Ubuntu installer do not recognize my partitions and show the entire disk as unallocated space, I decided not to continue. I am currently running Ubuntu from the live USB. Here's the output of sudo fdisk -l:

    Disk /dev/sda: 320.1 GB, 320072933376 bytes
    255 heads, 63 sectors/track, 38913 cylinders, total 625142448 sectors
    Units = sectors of 1 * 512 = 512 bytes
    Sector size (logical/physical): 512 bytes / 512 bytes
    I/O size (minimum/optimal): 512 bytes / 512 bytes
    Disk identifier: 0xb30ab30a

    Device Boot      Start        End     Blocks  Id  System
    /dev/sda1   *     2048   104869887   52433920  83  Linux
    /dev/sda2    104869888  105074687     102400   7  HPFS/NTFS/exFAT
    /dev/sda3    105074688  156149759   25537536   7  HPFS/NTFS/exFAT
    /dev/sda4    156151800  625153409  234500805   f  W95 Ext'd (LBA)
    /dev/sda5    156151808  169156591    6502392  82  Linux swap / Solaris
    /dev/sda6    169158656  294991871   62916608   7  HPFS/NTFS/exFAT
    /dev/sda7    294993920  471037944   88022012+  7  HPFS/NTFS/exFAT
    /dev/sda8    471041928  625121152   77039612+  7  HPFS/NTFS/exFAT

    When I run sudo parted -l, I get this error message:

    ubuntu@ubuntu:~$ sudo parted -l
    Error: Can't have a partition outside the disk!

    UPDATE: I think I might know the problem. The disk has 625142448 sectors in total, but the extended partition (sda4) ends at sector 625153409 — that is, 10961 sectors past the end of the disk. Now, my question is: how do I fix this, or modify the extended partition (sda4) to match the total number of sectors? Anyone, please???

    UPDATE: I was able to fix the unallocated-space issue with the help of Rod Smith's tool called fixparts. I am now able to view my partitions via GParted in the live USB. But the error hd0 out of disk. Press any key to continue... still persists on reboot. I still can't boot into Ubuntu. Can someone help me please???
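    For readers hitting the same "Can't have a partition outside the disk!" complaint, here is a minimal sketch of the fixparts route, assuming an Ubuntu live session and that fixparts is available from Rod Smith's gdisk package (the package name is an assumption, not from the original post):

    # Install fixparts in the live session (it ships with the gdisk/gptfdisk tools).
    sudo apt-get update
    sudo apt-get install gdisk
    # Run it against the affected disk. fixparts reports MBR problems such as a
    # partition that extends past the last sector of the disk, and lets you adjust
    # the table interactively; nothing is written until you explicitly save.
    sudo fixparts /dev/sda

    Because fixparts only writes changes when you tell it to, running it just to read its report is a low-risk first step before touching the partition table.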

    Read the article

  • Aptronyms: fitting the profession to the name

    - by Tony Davis
    Writing a recent piece on the pains of index fragmentation, I found myself wondering why, in SQL Server, you can’t set the equivalent of a fill factor on a heap table. I scratched my head…who might know? Phil Factor, of course! I approached him with a due sense of optimism only to find that not only did he not know, he also didn’t seem to care much either. I skulked off thinking how this may be the final nail in the coffin of nominative determinism. I’ve always wondered if there was anything in it, though. If your surname is Plumb or Leeks, is there even a tiny, extra percentage chance that you’ll end up fitting bathrooms? Some examples are quite common. I’m sure we’ve all met teachers called English or French, or lawyers called Judge or Laws. I’ve also known a Doctor called Coffin, a Urologist called Waterfall, and a Dentist called Dentith. Two personal favorites are Wolfgang Wolf, who ended up managing the German soccer team Wolfsburg, and Edmund Akenhead, a Crossword Editor for The Times newspaper. Having forgiven Phil his earlier offhandedness, I asked him if he knew of any notable examples. He had met the famous Dr. Batty and Dr. Nutter, both Psychiatrists, knew undertakers called Death and Stiff, had read a book by Frederick Page-Turner, and suppressed a giggle at the idea of a feminist called Gurley-Brown. He even managed to better my Urologist example, citing the article on incontinence in the British Journal of Urology (vol.49, pp.173-176, 1977) by A. J. Splatt and D. Weedon. What, however, if you were keen to gently nudge your child down the path to a career in IT? What name would you choose? Subtlety probably doesn’t really work, although in a recent interview, Rodney Landrum did congratulate PowerShell MVP Max Trinidad on being named after a SQL function. Grant “The Memory” Fritchey (OK, I made up that nickname) doesn’t do badly either. Some surnames seem to offer a natural head start, although I know of no members of the Page-Reid clan in the profession. There are certainly families with the Table surname, although sadly, Little Bobby Tables was merely a legend by xkcd. A member of the well-known Key family would need to name their son Primary, or maybe live abroad, to make their mark. Nominate your examples of people seemingly destined, by name, for their chosen profession (extra points for IT). The best three will receive a prize. Cheers, Tony.

    Read the article

  • Web Experience Management: Segmentation & Targeting - Chalk Talk with John

    - by Michael Snow
    Today's post comes from our WebCenter friend, John Brunswick. Having trouble getting your arms around the differences between Web Content Management (WCM) and Web Experience Management (WEM)? Told through story, the video below outlines the differences in an easy-to-understand manner. By following the journey of Mr. and Mrs. Smith on their adventure to find the best amusement park in two neighboring towns, we can clearly see what an impact context and relevancy play in our decision making within online channels. Just as when we search to connect with the best products and services for our needs, the Smiths have their grandchildren coming to visit next week, and finding the best park is essential to guarantee a great family vacation. One town effectively Segments and Targets visitors to enhance their experience, reducing the effort needed to learn about their park. Have a look below to join the Smiths in their search.

    Learn MORE about how you might measure up:
    Deliver Engaging Digital Experiences
    Drive Digital Marketing Success
    Access Free Assessment Tool

    Read the article
