Search Results

Search found 1078 results on 44 pages for 'bill parson'.


  • Why did my Google links disappear after a redesign?

    - by Bill
    I recently did a complete redesign of my site. As soon as Google picked up the changes (I could tell because the excerpt in the search results was brought up to date), I noticed that my traffic slowed by about 30%. I started to investigate, ran a "link:" query on my site and saw only two links there. I know there are many more links to my site, mostly from reputable sources like magazines and large blogs. Why aren't these links showing up anymore? There's nothing even remotely spammy about my site, so I don't see why there would be weirdness going on.

    Read the article

  • PASS: FY10 Actuals Posted

    - by Bill Graziano
    Earlier this year we published preliminary fiscal year 2010 financials to the Governance page on the PASS web site.  Please remember that FY10 runs from July 1st, 2009 through June 30th, 2010 and includes the November 2009 Summit.  We do our fiscal year this way so that the Summit falls earlier in the fiscal year.  The financials we had posted were P&L numbers at the portfolio level.  Prior to this we had posted our detailed budget but only posted the auditor's report at the end of each year.  Today we updated our published financials to include:

    - Pre-audit actuals from FY10 at the same level as our budget.  The document has both actuals and budget for FY10 side by side.  This is over 20 pages of detailed financial information covering hundreds of line items.
    - A letter describing key differences between our budget and actuals.  I walked through each line item where the difference was greater than $25,000 and explained what happened and why.
    - An updated financial graph going back to 2003 that now includes FY10.

    This update should “close the loop” on our financials.  You can now start with the published budget and compare it to the finished financials at the same level of detail.  We also plan to publish the auditor’s report when that is completed -- as we do every year.

    Overall I’m very happy with how FY10 turned out.  Keep in mind that this was the November 2009 Summit so we were still facing economic challenges.  With all that we were roughly break-even, showing a $15,000 profit on $3.9 million of revenue.  I didn’t find anything shocking in reviewing our actual vs. budget but there were a few things that needed explanation.  You can see those in the letter on the governance page.

    Please keep in mind that these are the actuals from our operating financials.  The auditor may have us make adjustments for depreciation or other financial transactions.  We may also account for certain transactions differently for tax purposes than we do for financial reporting purposes.  I feel these financial statements give you the clearest picture of how our organization spends its money.

    We were late publishing these this year.  We were working through some tax issues and that delayed our ability to file our final tax forms, which delayed this process.  In hindsight I should have published these documents as soon as we had them and not waited for the tax issues.  We’ll do this better in the future.

    And on a final note, you don’t need to log in to view these documents.  If you have any questions you can post them here.  If we get more than a few questions we may see about creating some forums for financial issues on the PASS web site.

    Read the article

  • ClearTrace Performance on 170GB of Trace Files

    - by Bill Graziano
    I’ve always worked to make ClearTrace perform well.  That’s probably because I spend so much time watching it work.  I’m often going through two or three gigabytes of trace files but I rarely get the chance to run it on a really large set of files.

    One of my clients wanted to run a full trace for a week and then analyze the results.  At the end of that week we had 847 200MB trace files for a total of nearly 170GB.

    I regularly use 200MB trace files when I monitor production systems.  I usually get around 300,000 statements in a file that size if it’s mostly stored procedures.  So those 847 trace files contained roughly 250 million statements.  (That’s 730 bytes per statement if you’re keeping track.  Newer trace files have some compression in them but I’m not exactly sure what they’re doing.)  On a system running 1,000 statements per second I get a new file every five minutes or so.

    It took 27 hours to process these files on an older development box.  That works out to 1.77MB/second.  That means ClearTrace processed about 2,654 statements per second.

    You can query the data while you’re loading it but I’ve found it works better to use a second instance of ClearTrace to do this.  I’m not sure why yet but I think there’s still some dependency between the two processes.

    ClearTrace is almost always CPU bound.  It’s really just a huge, ugly collection of regular expressions.  It only writes a summary to its database at the end of each trace file so that usually isn’t a bottleneck.  At the end of this process, the executable was using roughly 435MB of RAM.  Certainly more than when it started but I think that’s acceptable.

    The database where all this is stored started out at 100MB.  After processing 170GB of trace files the database had grown to 203MB.  The space savings are due to the “datawarehouse-ish” design and only storing a summary of each trace file.

    You can download ClearTrace for SQL Server 2008 or test out the beta version for SQL Server 2012.  Happy Tuning!
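
    Since the post describes ClearTrace as “a huge, ugly collection of regular expressions,” here is a minimal C# sketch of that general technique — regex-based statement normalization so repeated executions summarize as one statement.  This is an illustration only, not ClearTrace’s actual code, and the patterns and the 1MB guard are assumptions:

        using System.Text.RegularExpressions;

        static class StatementNormalizer   // hypothetical helper, not part of ClearTrace
        {
            // Collapse literals so "WHERE Id = 42" and "WHERE Id = 99" group as one statement.
            static readonly Regex StringLiteral = new Regex(@"'[^']*'", RegexOptions.Compiled);
            static readonly Regex NumberLiteral = new Regex(@"\b\d+\b", RegexOptions.Compiled);
            static readonly Regex Whitespace    = new Regex(@"\s+",     RegexOptions.Compiled);

            public static string Normalize(string sql)
            {
                // Guard against pathological statements (e.g. a multi-megabyte XML INSERT).
                if (sql == null || sql.Length > 1000000) return "{oversized statement}";

                string s = StringLiteral.Replace(sql, "{str}");
                s = NumberLiteral.Replace(s, "{n}");
                return Whitespace.Replace(s, " ").Trim();
            }
        }

    Running a couple dozen expressions like these over every statement is why the work ends up CPU bound rather than disk bound.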

    Read the article

  • ClearTrace for SQL Server 2012

    - by Bill Graziano
    I’ve updated the ClearTrace beta to support SQL Server 2012.  This requires SQL Server 2012 to be installed on the computer where ClearTrace is running.  It will read traces from SQL Server 2008 R2, SQL Server 2008 and SQL Server 2005.  It includes some minor improvements in performance and in handling large SQL statements.  It should also give better error messages.  If you do find any errors, please report them in the support forum.

    Read the article

  • SQL Saturday 300 BBQ Crawl

    - by Bill Graziano
    SQL Saturday #300 is coming up right here in Kansas City on September 13th, 2014.  This is our fifth SQL Saturday which means it's the fifth anniversary of our now infamous BBQ Crawl.  We get together on Friday afternoon before the event and visit a few local joints.  We've done nice places and we've done dives.  We haven’t picked the venues yet but I promise you’ll be well fed! And if you’re thinking about the BBQ crawl you should think about submitting a session.  Our call for speakers closes Tuesday, July 15th so you just have time!  If you’re going to be at the event, contact me and I’ll get you added to the list.

    Read the article

  • Gnome 3 freezes when switching workspaces

    - by Bill Cheatham
    I have found an odd problem with Gnome 3 on Ubuntu recently. Under certain circumstances, when switching between two workspaces the whole desktop and user interface will 'freeze' for anything up to 2 minutes; during this time I cannot click anything or interact using the keyboard. Often, the screen will show a 'half-completed' animation of the workspace swapper. This is on Ubuntu 12.04.1 running on a 2-year old Macbook Pro, and it only seems to occur when using an external monitor. This does not seem to be a processor or memory issue, and does not occur every time I switch workspaces. Is this a known problem? What can I do in the short or long term to fix it?

    Read the article

  • Install Ubuntu 12.04 on Drive "D"?

    - by Bill Jones
    I have a Dell Inspiron 531S that originally came loaded with Windows Vista. A couple years ago I purchased a copy of Windows 7, formatted the hard drive and installed the updated operating system. In the process I formatted the 10 GB recovery partition on drive D: as it was no longer needed for Windows Vista. I would really like to install Ubuntu 12.04 LTS alongside Windows 7 using the empty 10 GB drive D:. I have two questions. (1) Can Ubuntu be installed on that separate partition, away from the boot sector on drive C:? (2) If so, would GRUB be installed in the boot sector and correctly let me choose between Windows 7 on drive C: and Ubuntu 12.04 on drive D:?

    Read the article

  • Speaking in Omaha: December 7, 2011

    - by Bill Graziano
    I’m presenting in Omaha on Writing Faster SQL at 6PM on December 7th.  You can find meeting details on the Omaha SQL Server User Group page. The meeting location requires an RSVP so building security has a list of attendees. The presentation is a series of suggestions on improving performance.  It ranges from simple things like comparing indexed columns to scalar values up to tips for reducing query compiles and asynchronous processing patterns.  Nearly all of these come from specific issues I’ve encountered working on poorly performing SQL Servers.

    Read the article

  • Android 2.3 (Gingerbread) officially released

    - by Bill Osuch
    Google today officially released their latest version of the Android OS - 2.3, Gingerbread. It won't hit a phone (the Nexus S) until 12/16, but developers can start working with it today. Some of the new features include:

    - Enhancements for game development
    - Rich multimedia
    - New forms of communication
    - Simplified debug builds
    - Integrated ProGuard support
    - HierarchyViewer improvements
    - Preview of new UI Builder

    See the complete details at http://developer.android.com/sdk/android-2.3.html

    Read the article

  • App Engine charges in November 2011 [closed]

    - by broiyan
    I had a billing-enabled test application on Google App Engine left over from early 2011. I have not received a bill in many months because I have not been hitting the URL and, according to the activity monitor, nobody has. Then unexpectedly in November 2011, I received 2 bills in as many weeks for quite minimal amounts. Checking the monitor it looks like nobody has been hitting the URL and, according to the SQL-like search, there is nothing in the Datastore. I know that GAE has left its "preview" status in recent weeks but I am not sure how that would affect what is essentially a dormant application with no Datastore objects. Has GAE started charging for completely unused applications in recent weeks? Edit: Most of my applications were already disabled and I have just disabled the only one that was enabled but unused the past several months. If I get another bill next week that should be informative.

    Read the article

  • ClearTrace Supports SQL Server 2008 R2

    - by Bill Graziano
    It was a long time coming and I hope worth the wait.  If you have SQL Server 2008 R2 installed on the same box as ClearTrace and you download the latest ClearTrace build (36) you’ll be able to read SQL Server 2008 R2 traces.  I also fixed a bug handling very, very large SQL statements.  I encountered an INSERT statement that was 12MB in size.  It was storing XML in a database.  ClearTrace uses regular expressions to clean up the SQL it finds.  Running two dozen regular expressions over this 12MB string caused the application to crash.  In this build I’m just skipping any statement over 1MB.  I’ll do something a little nicer in the next build. (And if you don’t want to download ClearTrace you can test the online version at www.TraceTune.com)

    Read the article

  • What does it mean if a job requires a "Bachelor's degree in Computer Science or related field"?

    - by Bill
    Specifically, what is meant by "related field"? I'm in the process of pursuing an IT Infrastructure B.A.S. from the U of M (Twin Cities), but have been playing around with the idea of just doing the CSCI B.S. I don't want to be a hardcore programmer, but would having the CSCI degree, instead of the ITI degree, open more doors to whatever profession within the IT world I end up setting my sights on?

    Read the article

  • An online version of ClearTrace

    - by Bill Graziano
    When I visit clients for the first time and conduct a performance review I introduce them to ClearTrace. It’s still the best way I know to identify exactly which queries are consuming the most resources.  The downside is that it needs to be downloaded and it creates a database to store the results.  I finally decided it would be easier if I could just upload a trace immediately. You can find the online version of ClearTrace at TraceTune.com.  It provides a simple way to upload a trace file and see exactly which stored procedures or SQL statements consume the most CPU and disk.  This is still a work in progress as I try to determine exactly which features from ClearTrace are important.  I’ve also limited the file upload to 10MB in this beta release.  That might not sound like much but I get over 20,000 events using this stored procedure to generate the trace. If you’re looking for something to do on a Friday, I’d suggest a little performance tuning.  Generating 10MB of trace data doesn’t take long at all and in a short time you’ll see exactly which SQL statements you need to tune first.
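
    The report TraceTune produces boils down to grouping events by statement and ranking them.  A rough C# sketch of that aggregation is below; the class and field names are assumptions for illustration, not TraceTune’s actual schema:

        using System.Collections.Generic;
        using System.Linq;

        class TraceEvent          // hypothetical shape of one parsed trace row
        {
            public string NormalizedText;
            public long Cpu;      // CPU milliseconds
            public long Reads;    // logical reads
        }

        static class TraceReport
        {
            // Group identical (normalized) statements and rank them by total CPU.
            public static IEnumerable<string> TopStatements(IEnumerable<TraceEvent> events, int take)
            {
                return events
                    .GroupBy(e => e.NormalizedText)
                    .Select(g => new { Text = g.Key, Cpu = g.Sum(e => e.Cpu), Reads = g.Sum(e => e.Reads) })
                    .OrderByDescending(x => x.Cpu)   // or by Reads, depending on what you’re tuning
                    .Take(take)
                    .Select(x => string.Format("{0} ms CPU, {1} reads: {2}", x.Cpu, x.Reads, x.Text));
            }
        }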

    Read the article

  • Windows 7 and 11.10 side by side from scratch what am I doing wrong?

    - by Bill.Caffery
    I have tried everything I can think of at this point and I'm at a total loss. I have a 2TB drive on which I want to install Windows 7 and 11.10 (both are 64 bit) side by side, but once I install Windows it seems to need GPT to work or something, because no matter how many times I've removed it, it always returns. If I use gdisk and set the disk as MBR, Windows won't load. This last time I ran gdisk and tried to boot into Windows before I had installed Ubuntu all the way, and ended up at a grub rescue prompt. Any help please? Also, let me say I have been searching and reading for days and have tried everything I can find to make this work, so this isn't a one-time event, it's continual.

    Read the article

  • Another Update to SQL Server Configuration Scripting Utility

    - by Bill Graziano
    I’ve been gradually adding features to my utility that scripts the configuration of a SQL Server.  Since my last post I’ve added the following features:

    - Skip any encrypted object in a database
    - Script alerts, alert notifications and operators
    - Script audits
    - Always script model, master and msdb to capture any user-defined objects in those databases
    - Logins are now scripted so that everything for a login is grouped together.  There’s a second section in the logins that handles default databases.  In many cases a login’s default database is a mirror target and can’t be set.  This is now handled gracefully.  It also includes a separate section for all default databases so those can be quickly set in the event of a disaster.
    - Script credentials
    - Script proxy accounts
    - Script database mail

    My goal is still to get everything outside a database scripted.  This release is enough that I can keep my mirror target servers in sync with their principals.
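
    For readers wondering what server-level scripting like this looks like in code, here is a minimal SMO sketch that scripts just the logins on an instance.  It illustrates the general approach, not the utility’s actual implementation, and the instance name is a placeholder:

        using System;
        using Microsoft.SqlServer.Management.Smo;

        class ScriptLogins
        {
            static void Main()
            {
                // Connect to the instance; SMO exposes server-level objects as collections.
                Server server = new Server("localhost");   // placeholder instance name

                foreach (Login login in server.Logins)
                {
                    // Script() returns the CREATE LOGIN batch for this login.
                    foreach (string line in login.Script())
                        Console.WriteLine(line);
                    Console.WriteLine("GO");
                }
            }
        }

    The same pattern extends to alerts, operators, credentials and the rest: enumerate the collection, script each object, and write the batches out in a stable order so the output diffs cleanly in version control.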

    Read the article

  • transfer Thunderbird (17) Profile on Win7 to Ubuntu 12.04

    - by William Curran
    I want to transfer a Thunderbird profile on Win7 to Thunderbird (17) on Ubuntu 12.04. I already copied the profile folder from Windows to Ubuntu and modified profiles.ini on the Ubuntu machine to include:

        [Profile]
        Name=Bill
        IsRelative=1
        Path=(the name of the transferred profile folder)

    I think the problem is that the Win TB profile content (files and folder structure) looks VERY different from that of the Ubuntu TB profile that was created on installation. The Ubuntu install is new whereas the Win TB has undergone many updates. It seems the system for profile storage has changed drastically. I tried to start TB in safe mode but couldn't get the path correct to start TB in the terminal with the -safe-mode switch. What can I do? Bill

    Read the article

  • TraceTune shows Reads graphically

    - by Bill Graziano
    TraceTune now shows a graphical view of logical reads for each SQL statement in a trace file.  The width of the colored bar in the screen capture below is the percentage of logical reads for that statement.  The absolute number of reads is shown to the right. Any statement that has a user entered comment is shown in bold.  If you hover over the statement it will show the most recent comment for that statement.
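
    The bar is just a proportional width; a minimal C# sketch of that calculation is below (the names are illustrative, not TraceTune's actual code):

        using System;

        static class ReadsBar   // hypothetical helper
        {
            // Width of the colored bar for one statement, as its share of all logical reads shown.
            public static int Width(long statementReads, long totalReads, int maxPixels)
            {
                if (totalReads <= 0) return 0;
                double fraction = (double)statementReads / totalReads;
                return (int)Math.Round(fraction * maxPixels);
            }
        }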

    Read the article

  • SQL Server Configuration Scripting Utility Release 9

    - by Bill Graziano
    There’s another update to my little utility to script a SQL Server’s configuration.  I use this for two purposes.  First, I use it to keep my database mirroring servers up to date.  Second, I capture the output in a version control system and keep that for historical reference. In release 3.0.9 I made the following changes:

    - Rewrote the encrypted trigger scripting.  It will now list the encrypted triggers in a comment in the table script but can’t actually script them.
    - It now scripts any server event notifications.
    - You can script a single database using the /scriptdb flag.  Please note that it will also script the instance and system databases when it does this.
    - It will script any user-defined endpoints.  This will capture your mirroring endpoints and more importantly any service broker endpoints.
    - It will gracefully skip database mail on the Express Edition.

    It still doesn’t support SQL Server 2012.  I think that’s the next feature to add though.

    Read the article

  • How do I make a dialog box? [on hold]

    - by bill
    By dialog box I mean that when the player talks to someone, a box shows up with text on it. I haven't found much about this topic online, so I created a basic dialog box:

        // the dialog box has only two methods
        public void createBox(int x, int y, int width, int height, String txt) {
            this.x = x;
            this.y = y;
            this.width = width;
            this.height = height;
            this.txt = txt;
        }

        // draw the dialog box
        public void draw(Graphics2D g) {
            if (txt != null) {
                g.setColor(Color.red);
                g.drawRect(x, y, width, height);
                g.setColor(Color.black);
                g.fillRect(x, y, width, height);
                g.setColor(Color.white);
                g.drawString(txt, x + 10, y + 10);
            }
        }

    I wanted to know: how can I make this better?

    Read the article

  • What version(s) of Android should I be targeting?

    - by Bill Osuch
    If you're wondering about what screen sizes and flavors of Android are currently out in the wild, the Android Developers site has a handy little chart of both (the chart images are omitted here; see the linked page). So, as far as platforms go, there aren't many people still out there running less than 2.X... and you'd probably be safe not worrying about Small or Low Density screens. I'd be curious to know if any of the other app stores (Amazon, Archos, etc.) offer similar stats...

    Read the article

  • Writing files to an Airport Extreme using afp

    - by Bill Oldroyd
    Using Nautilus I can connect from Ubuntu 12.04 (64-bit) to my Apple Airport Extreme using a user name & password without a problem. I can read, browse folders and delete files. However I cannot write files: the file is created, but the contents of the file are not transferred. The transfer fails with the error message "kFPMiscErr", which I think means that "authentication has already been established"? I have tried the command line tools for AFP access but these do not work either. Is there a solution to this problem?

    Read the article

  • How to tell if Microsoft Works is 32 or 64 bit? Please Help!

    - by Bill Campbell
    Hi, I am trying to convert one of our apps to run on Win7 64-bit from XP 32-bit. One of the things it uses is Excel to import files. It's a little complicated since it was using Microsoft.Jet.OLEDB.4.0 (Excel). I found Office 14 (2010) has a 64-bit version I can download. I downloaded the Office 2010 Beta but it didn't seem to install Microsoft.ACE.OLEDB.14.0. I found that I could download the 2010 Office System Driver Beta: Data Connectivity Components, which has ACE.OLEDB.14 in it, but when I try to install it, the installer tells me "You cannot install the 64-bit version of Access Database engine for Microsoft Office 2010 because you currently have 32-bit Office products installed". How do I determine what 32-bit Office products this is referring to? My Dell came with Microsoft Works installed. I don't know if this is 32 or 64 bit. Is there any way to tell? I don't want to uninstall this if it's not the problem and I'm not sure what else might be the problem. Any help would be appreciated! thanks, Bill
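
    For context, the Jet/ACE Excel import in question usually looks something like the C# sketch below; the key constraint is that the ACE/Jet provider's bitness must match the process (a 64-bit process cannot load a 32-bit provider, and vice versa).  The file path and sheet name are placeholders, and the released ACE provider is typically registered as Microsoft.ACE.OLEDB.12.0:

        using System;
        using System.Data;
        using System.Data.OleDb;

        class ExcelImport
        {
            static void Main()
            {
                // ACE is the successor to Jet; install the 64-bit ACE redistributable
                // for a 64-bit process, or the 32-bit one for a 32-bit process.
                string connStr =
                    "Provider=Microsoft.ACE.OLEDB.12.0;" +
                    "Data Source=C:\\data\\import.xlsx;" +                 // placeholder path
                    "Extended Properties='Excel 12.0 Xml;HDR=YES'";

                using (var conn = new OleDbConnection(connStr))
                using (var adapter = new OleDbDataAdapter("SELECT * FROM [Sheet1$]", conn))  // placeholder sheet
                {
                    var table = new DataTable();
                    adapter.Fill(table);
                    Console.WriteLine("Rows imported: " + table.Rows.Count);
                }
            }
        }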

    Read the article

  • WPF TextBlock Binding to DependencyProperty

    - by Bill Campbell
    Hi, I have what I believe to be about one of the most simple cases of attempting to bind a view to a DependencyProperty in the view model. It seems that the initial changes are reflected in the view but subsequent changes to the DP do not update the view's TextBlock. I'm probably just missing something simple but I just can't see what it is. Please take a look... My XAML has a status bar on the bottom of the window. I want to bind to the DP "VRAStatus".

        <StatusBar x:Name="sbar" Grid.Column="0" Grid.Row="2" Grid.ColumnSpan="2"
                   VerticalAlignment="Bottom" Background="LightBlue" Opacity="0.4"
                   DockPanel.Dock="Bottom">
            <StatusBarItem>
                <TextBlock x:Name="statusBar" Text="{Binding VRAStatus}" />
            </StatusBarItem>
            <StatusBarItem>
                <Separator Style="{StaticResource StatusBarSeparatorStyle}"/>
            </StatusBarItem>
        </StatusBar>

    My view model has the DP defined:

        public string VRAStatus
        {
            get { return (string)GetValue(VRAStatusProperty); }
            set { SetValue(VRAStatusProperty, value); }
        }

        // Using a DependencyProperty as the backing store for VRAStatus.
        public static readonly DependencyProperty VRAStatusProperty =
            DependencyProperty.Register("VRAStatus", typeof(string),
                typeof(PenskeRouteAssistViewModel), new PropertyMetadata(string.Empty));

    Then, in my code I set the DP: VRAStatus = "Test Message..."; Is there something obvious here that I am missing? In the constructor for my view model I set the DP like this: VRAStatus = "Ready"; I never get the Test Message to display. Please help. Thanks in advance! Bill
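
    As an aside, the more common WPF pattern for a view model property like this is INotifyPropertyChanged rather than a DependencyProperty, with the view model assigned to the view's DataContext so {Binding VRAStatus} resolves against it.  A minimal sketch with a hypothetical class name is below; it is not the code from the question:

        using System.ComponentModel;

        public class StatusViewModel : INotifyPropertyChanged   // hypothetical view model
        {
            private string vraStatus = "Ready";

            public string VRAStatus
            {
                get { return vraStatus; }
                set
                {
                    if (vraStatus == value) return;
                    vraStatus = value;
                    OnPropertyChanged("VRAStatus");   // raising this is what refreshes the bound TextBlock
                }
            }

            public event PropertyChangedEventHandler PropertyChanged;

            private void OnPropertyChanged(string name)
            {
                var handler = PropertyChanged;
                if (handler != null) handler(this, new PropertyChangedEventArgs(name));
            }
        }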

    Read the article

  • Operation is not valid due to the current state of the object?

    - by Bill
    I am programming in C#; the code was working about a week ago, however it now throws an exception and I don't understand at all what could be wrong with it.

        var root = new CalculationNode();   // -> throws the exception

    In the call stack that's the only thing listed. I've been told that it could be that I need a clean build, but I am open to any ideas or suggestions. Thanks, -Bill

    Update: Exception detail:

        System.InvalidOperationException was unhandled by user code
          Message=Operation is not valid due to the current state of the object.
          Source=Calculator.Logic
          StackTrace:
            at ~.Calculator.Logic.MyBaseExpressionParser.Parse(String expression) in ~\Source\Calculator.Logic\MyBaseExpressionParser.cs:line 44
            at ~.Calculator.Logic.Tests.MyBaseCalculatorServiceTests.BasicMathDivision() in ~\Projects\Tests\Calculator.Logic.Tests\MyBaseCalculatorServiceTests.cs:line 60
          InnerException:

    CalculationNode's code:

        public sealed class CalculationNode
        {
            public CalculationNode()
            {
                this.Left = null;
                this.Right = null;
                this.Element = new CalculationElement();
            }

            public CalculationNode Left { get; set; }
            public CalculationNode Right { get; set; }
            public CalculationElement Element { get; set; }
        }

    CalculationElement's code:

        public sealed class CalculationElement
        {
            public CalculationElement()
            {
                Value = string.Empty;
                IsOperator = false;
            }

            public string Value { get; set; }
            public bool IsOperator { get; set; }
        }

    Read the article

  • Best way to associate data files with particular tests in RSpec / Ruby

    - by Bill T
    For my RSpec tests I would like to automatically associate data files with each test. To clarify: if my tests each require an XML file as input data and then some XPath statements to validate the responses they get back, I would like to externalize the XML and XPath as files and have the testing framework easily associate them with the particular test being run, by using the unique ID of the test as the file(s) name. I tried to get this behavior but the solution isn't very clean. I wrote a helper method that takes the value of "description" and combines it with __FILE__ to create a unique identifier which is set into a global variable that other utilities can access. The unique identifier is used to associate the data files I need. I have to call this helper method as the first line of every test, which is ugly. If I have an RSpec example that looks like this:

        describe "Basic functions of this server I'm testing" do
          it "should give me back a response" do
            # Sets a global var to: "my_tests_spec.rb_should_give_me_back_a_response"
            TestHelper::who_am_i __FILE__, description
            ...
          end
        end

    Is there some better/cleaner/slicker way I can get a unique ID for each test that I could use to associate data files with? Perhaps something built into RSpec I'm unaware of? Thank you, -Bill

    Read the article
