Search Results

Search found 16807 results on 673 pages for 'david given'.

  • IE9 and the Mystery of the Broken Video Tag

    - by David Wesst
    I was very excited when Microsoft released the Internet Explorer 9 Release Candidate. As far as I was concerned, this was another nail in the coffin for IE6 and a step in the right direction for us .NET web developers, as our base camp was finally starting to support the latest and greatest future-web standards. Unfortunately, my celebration was short-lived, as I soon hit a snag while loading up an HTML5 site I was building in Visual Studio 2010.

    The Mystery

    After updating Internet Explorer, I ran my HTML5 site that had the oh-so-lovely HTML5 video tag showing a video. Even though this worked in IE9 Beta, it appeared that IE9 RC could not load the same file. I figured that it was the video codec; maybe IE9 RC no longer supported the codec I used to encode my video. Here's the code I used:

      <video width="854" height="480" id="myOtherVideo" autoplay="" controls="">
        <source src="/DemoSite1/Media/big_buck_bunny.mp4"/>
        <div>
          <p>Your browser does not support HTML5 Video.</p>
        </div>
      </video>

    As you can see from the code, I had the "fail-safe" code inside the video tag, the idea being that if the video tag, or the video files themselves, are not supported by the browser, my video should fail gracefully. What was even more strange was the fact that it worked in all the other HTML5 browsers that supported video.

    The Investigation

    Whoa! DJ, stop the music. How can any of that make sense? Would the IE team really take such huge strides forward only to forget to include a feature that was already in the beta? I don't think so. I did plenty of searching and asking around on the web, but could not seem to find anyone else having the same problem. Eventually I came across this post talking about declaring the MIME type in the .htaccess file. That got me thinking: does my web server support the video MIME type? I was using VS2010, so how do I know what kinds of MIME types are supported by default? Still, my page hosted in Cassini (the web development server in VS2010) worked in the other browsers. Why wouldn't it work with IE9 RC? To answer that, it was time to open up the upgraded toolbox known as the Developer Tools in IE9 and use the new Network tab.

    The Conclusion

    If you take a closer look at the results displayed in the Network tab, you can see that IE9 RC has interpreted the video file as text/html rather than video/mp4. To make this work, I decided to use IIS to debug my HTML5 web application by setting the web project's properties. Then, I added the MIME types that I want to support (i.e. video/mp4, video/ogg, video/webm). Et voila! The Mystery of the Broken Video Tag is solved.

    After Thoughts

    After solving the mystery, I still had the question of why my site worked in Chrome, Safari, and Firefox 3.6. After asking around, the best answer I received was from my colleague Tyler Doerksen. He said that IE9 likely depends on the server telling it what kind of file it is downloading, rather than trying to read the metadata about the data it is downloading before doing anything. I have no facts to back this up, but it makes sense to me. In a browser war where milliseconds can make your browser fall back a few places in the race for supremacy, maybe the IE team opted to depend on the server knowing what kind of content it is serving up. In any case, that is just an educated guess. If you have any comments, feel free to post them below. This post also appears at http://david.wes.st
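    (Editor's note: for anyone hitting the same wall, the fix amounts to registering the video MIME types on the web server. A minimal sketch of how that can look in an IIS 7+ web.config follows; the file extensions and layout are illustrative assumptions, not the author's exact configuration.)

      <configuration>
        <system.webServer>
          <staticContent>
            <!-- Serve HTML5 video with the correct Content-Type headers -->
            <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
            <mimeMap fileExtension=".ogv" mimeType="video/ogg" />
            <mimeMap fileExtension=".webm" mimeType="video/webm" />
          </staticContent>
        </system.webServer>
      </configuration>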

    Read the article

  • Resources for a fighting game

    - by David
    As the title says, I need resources for a 2D fighting game for the PC. The game is being made by me and two close friends. I'm thinking of using the FlatRedBall engine and either Allegro Sprite Editor or Amiga DPaint for the sprites, but I don't know if there is anything better for a more-or-less beginner in video game making. So my questions are as follows: what would be the best engine to use so that we could also sell the game later on (I don't really care what language I'd have to use), and what would be the best tool to use for sprite creation? I would really appreciate any help given.

    Read the article

  • Equation / formula to determine an object's position on an elliptical path

    - by David Murphy
    I'm making a space game, and as such I need objects to follow an elliptical path (orbit). I've worked out how to calculate all the important aspects of my orbits; the only remaining thing is how to have an object follow one. My Orbit class contains the major and minor (and by extension semi-major and semi-minor) axis lengths, and even the focal radius, area and circumference. What is the equation to determine an object's x/y position (I only need 2D) on an ellipse, moving at a certain speed, after a period of time? Basically, every frame I want to update the position based on the amount of elapsed time. I would also like the speed along the path to speed up and slow down according to the distance from the object it's orbiting, but I'm not sure how to factor this in, given that at any point in time the speed has changed from its previous value. EDIT: I can't answer my own question, but I found that the question and answer are already on Stack Exchange: Kepler orbit: get position on the orbit over time
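    (Editor's note: for the simple constant-rate case, the parametric form of the ellipse gives the position directly. Below is a minimal Java sketch of a per-frame update; the class and member names are illustrative, not from the original post. For the physically correct orbit, where speed varies with distance from the focus, solve Kepler's equation as in the Stack Exchange answer linked above.)

      import java.awt.geom.Point2D;

      public class EllipsePath {
          private final double semiMajor; // a
          private final double semiMinor; // b
          private double theta;           // current parameter angle, in radians

          public EllipsePath(double semiMajor, double semiMinor) {
              this.semiMajor = semiMajor;
              this.semiMinor = semiMinor;
          }

          // Advance by dt seconds at a constant angular rate (radians/second).
          // Positions are relative to the ellipse's centre, not its focus.
          // Note: a constant theta-rate is an approximation; a real Kepler orbit
          // sweeps faster near the focus (solve for the eccentric anomaly instead).
          public Point2D.Double update(double angularSpeed, double dt) {
              theta += angularSpeed * dt;
              return new Point2D.Double(semiMajor * Math.cos(theta),
                                        semiMinor * Math.sin(theta));
          }
      }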

    Read the article

  • Silverlight Cream for March 26, 2010 -- #821

    - by Dave Campbell
    In this Issue: Max Paulousky, Christian Schormann, John Papa, Phani Raj, David Anson(-2-, -3-), Brad Abrams(-2-), and Jeff Wilcox(-2-, -3-).

    Shoutouts:
    Jeff Wilcox posted his material from MIX and some preview TestFramework bits: Unit Testing Silverlight & Windows Phone Applications – talk now online.
    At MIX10, Jeff Wilcox demo'd an app called "Peppermint"... here's the bleeding-edge demo: "Peppermint" MIX demo sources.
    Erik Mork and Co. have put out their weekly This Week In Silverlight 3.25.2010.
    Brad Abrams has all his materials posted for his MIX10 session Mix2010: Search Engine Optimization (SEO) for Microsoft Silverlight... including a play-by-play of the demo and all source.
    Do you use Rooler? Well you should! Watch a video Pete Brown did with Pete Blois on Expression Blend, Windows Phone, and Rooler.
    Interested in Silverlight and XNA for WP7? Me too! Michael Klucher has a post outlining the two: Silverlight and XNA Framework Game Development and Compatibility.

    From SilverlightCream.com:

    Modularity in Silverlight Applications - An Issue With ModuleInitializeException
    Max Paulousky has a truly ugly error trace listed, caused by not having a reference listed, and the obvious simple solution. Next time he'll talk about the difficult situations.

    Using SketchFlow to Prototype for Windows Phone
    Christian Schormann has a tutorial up on using Expression Blend to develop for WP7... who better than Christian for that task??

    Silverlight TV 18: WCF RIA Services Validation
    John Papa held forth with Nikhil Kothari on WCF RIA Services and validation just prior to MIX10; it was posted yesterday.

    Building SL3 applications using OData client Library with VS 2010 RC
    Phani Raj walks through building an OData consumer in SL3, the first problem you're going to hit, and the easy solution to it.

    Tip: When creating a DependencyProperty, follow the handy convention of "wrapper+register+static+virtual"
    David Anson has a couple more of his 'Tips' up... this first is about Dependency Properties again... having a good foundation for all your Dependency Properties is a great way to avoid problems.

    Tip: Do not assign DependencyProperty values in a constructor; it prevents users from overriding them
    In the next post, David Anson talks about not assigning Dependency Property values in a constructor and gives one of the two ways to get around doing so.

    Tip: Set DependencyProperty default values in a class's default style if it's more convenient
    In his latest post, David Anson gives the second way to avoid setting a Dependency Property value in the constructor.

    Silverlight 4 + RIA Services - Ready for Business: Search Engine Optimization (SEO)
    Brad Abrams adds SEO to the tutorial series he's doing. He begins with his PDC09 session material on the subject and then takes off on a great detailed tutorial, all with source.

    Silverlight 4 + RIA Services - Ready for Business: Localizing Business Application
    Brad Abrams then discusses localization and Silverlight in another detailed tutorial with all code included.

    Silverlight Toolkit and the Windows Phone: WrapPanel, and a few others
    Jeff Wilcox has a few WP7 posts I'm going to push today. This first is from earlier this week and is about using the Toolkit in WP7; better than that, he includes the bits you need if all you want is the WrapPanel.

    Data binding user settings in Windows Phone applications
    In the next one, from yesterday, Jeff Wilcox demonstrates saving some user info in Isolated Storage to improve the user experience, and shares all the necessary plumbing files and other external links as well.

    Displaying 2D QR barcodes in Windows Phone applications
    In a post from today, Jeff Wilcox ported his Silverlight 2D QR Barcode app from last year to WP7... just very cool... get the source and display your Microsoft Tag.

    Stay in the 'Light!

    Read the article

  • Silverlight Cream for May 13, 2010 -- #861

    - by Dave Campbell
    In this Issue: Sigurd Snørteland, Jeff Prosise, DaveDev, Joe Zhou, Chris Eargle, John Papa(-2-, -3-), and David Anson(-2-).

    Shoutouts:
    In with the links I've listed below, Sigurd Snørteland also sent a link to this app he's working on, which is actually pretty cool to see: ZuneLight. The code is not yet available. He also has a no-code demo of a Silverlight Media Center.
    Pieter Voloshyn, Luiz Thadeu, and Jhun Iti have a very nice Silverlight image editor up: Thumba.

    From SilverlightCream.com:

    WP7 - Silverlight on mobile
    Sigurd Snørteland submitted some links for me that have been translated to English from his blog. I hope the pages come out well because he's got a lot of good stuff on there. This one has a link to a presentation he did, and 4 projects you can load up in the emulator that he's converted to the phone: weather, world clock, coverflow, and solitaire... pretty cool... thanks for the links, Sigurd!

    Understanding Page Orientation in Silverlight for Windows Phone
    Jeff Prosise has a really nice post up on page orientation in WP7... what it means to your app, how to detect it, and example code for what to do then... also love a quote by Jeff: "Silverlight for Windows Phone is the hottest thing since color TV".

    Why you should check out Expression Blend Behaviors when using Silverlight
    DaveDev has a post up describing Behaviors and why we should use them, plus tons of external links to resources, blogs, videos... all good stuff...

    Fiddler inspector for WCF Silverlight Polling Duplex and WCF RIA
    Joe Zhou announces and provides a link to a new Fiddler inspector that understands the framing in Polling Duplex and also raw binary XML and binary SOAP.

    Windows Phone Controls v0.7
    Chris Eargle reports the release of Version 0.7 of the Windows Phone Controls project on CodePlex... this includes a Pivot Control and a Panorama Control... both very nicely done.

    Binding to Silverlight ComboBox and Using SelectedValue, SelectedValuePath and DisplayMemberPath
    John Papa responds to a user question with a nice post about binding to a ComboBox and then going from the selected item to some other property... code included.

    No More Boxes! Exploring the PathListBox (Silverlight TV #25)
    Silverlight TV 25 went up on Tuesday... thought it was going to be Thursday?? Anyway... John Papa and Adam Kinney are discussing the PathListBox and looking at some cool demos thereof.

    Exposing SOAP, OData, and JSON Endpoints for RIA Services (Silverlight TV 26)
    Since today IS Thursday, we have a new Silverlight TV, number 26, and John Papa is chatting with Deepesh Mohnani of the WCF RIA Services team about exposing all sorts of endpoints... should be something in there for everybody :)

    Workaround for a Silverlight data binding bug affecting various scenarios - including DataGrid+ContextMenu
    David Anson details the rabbit trail he and others on the team followed in response to a problem reported via Twitter where the binding on a DataGrid seemed off by a row(!)... weird but true, validated, and SL3/4 are bug-for-bug compatible with this too! But David wouldn't leave us there... he also has a workaround.

    Sharing the code for a simple Silverlight 4 REST-based cloud-oriented file management app for Azure and S3
    David Anson had an opportunity to build an app he's wanted to build for a while and shares it with us: Blobstore -- a small, lightweight Silverlight 4 application that acts as a basic front-end for the Windows Azure Simple Data Storage and the Amazon Simple Storage Service (S3) -- and remember, I said he shared the source :)

    Stay in the 'Light!

    Read the article

  • How to restore your production database without needing additional storage

    - by David Atkinson
    Production databases can get very large. This in itself is to be expected, but when a copy of the database is needed, the database must be restored, requiring additional and costly storage. For example, if you want to give each developer a full copy of your production server, you'll need n times the storage cost for your n-developer team. The same is true for any test databases that are created during the course of your project lifecycle.

    If you've read my previous blog posts, you'll be aware that I've been focusing on the database continuous integration theme. In my CI setup I create a "production"-equivalent database directly from its source control representation, and use this to test my upgrade scripts. Despite this being a perfectly valid and practical thing to do as part of a CI setup, it's not the exact equivalent of running the upgrade script on a copy of the actual production database.

    So why shouldn't I instead simply restore the most recent production backup as part of my CI process? There are two reasons why this would be impractical.

    1. My CI environment isn't an exact copy of my production environment. Indeed, this would be the case in a perfect world, and it is strongly recommended as a good practice if you follow Jez Humble and David Farley's "Continuous Delivery" teachings, but in practical terms this might not always be possible, especially where storage is concerned. It may just not be possible to restore a huge production database on the environment you've been allotted.

    2. It's not just about the storage requirements; it's also the time it takes to do the restore. The whole point of continuous integration is that you are alerted as early as possible whether the build (yes, the database upgrade script counts!) is broken. If I have to run an hour-long restore each time I commit a change to source control, I'm just not going to get the feedback quickly enough to react.

    So what's the solution? Red Gate has a technology, SQL Virtual Restore, that is able to restore a database without using up additional storage. Although this sounds too good to be true, the explanation is quite simple (although I'm sure the technical implementation details under the hood are quite complex!). Instead of restoring the backup in the conventional sense, SQL Virtual Restore will effectively mount the backup using its HyperBac technology. It creates a data file and a log file, .vmdf and .vldf, that become the delta between the .bak file and the virtual database. This means that both read and write operations are permitted on a virtual database, as from SQL Server's point of view it is no different from a conventional database. Instead of doubling the storage requirements upon a restore, there are no 'duplicate' storage requirements, other than the trivially small virtual log and data files (see illustration below). The benefit is magnified the more databases you mount from the same backup file. This technique could be used to provide a large development team a full development instance of a large production database.

    It is also incredibly easy to set up. Once SQL Virtual Restore is installed, you simply run a conventional RESTORE command to create the virtual database. This is what I have running as part of a nightly "release test" process triggered by my CI tool.

      RESTORE DATABASE WidgetProduction_Virtual
      FROM DISK=N'C:\WidgetWF\ProdBackup\WidgetProduction.bak'
      WITH
        MOVE N'WidgetProduction' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_WidgetProduction_Virtual.vmdf',
        MOVE N'WidgetProduction_log' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_log_WidgetProduction_Virtual.vldf',
        NORECOVERY, STATS=1, REPLACE
      GO

      RESTORE DATABASE WidgetProduction_Virtual WITH RECOVERY

    Note the only change from what you would do normally is the naming of the .vmdf and .vldf files. SQL Virtual Restore intercepts this by monitoring the extension and applies its magic, ensuring the 'virtual' restore happens rather than the conventional storage-heavy restore. My automated release test then applies the upgrade scripts to the virtual production database and runs some validation tests, giving me confidence that were I to run this on production for real, all would go smoothly.

    For illustration, the original post shows screenshots of my 8GB production database, its corresponding backup file, and the .vldf and .vmdf files, which represent the only additional storage used for the new database following the virtual restore.

    The beauty of this product is its simplicity. Once it is installed, the interaction with the backup and virtual database is exactly the same as before, as the clever stuff is being done at a lower level. SQL Virtual Restore can be downloaded as a fully functional 14-day trial.

    Read the article

  • Installing Catalyst 11.6 for an ATI HD 6970

    - by David Oliver
    Ubuntu Maverick 10.10 is displaying the desktop okay (though limited to 1600x1200) after installing my new HD 6970 card, so I'm now trying to install the proprietary driver (I understand the open-source one requires a more recent kernel than that in Maverick). The proprietary driver under 'Additional Drivers' resulted in a black screen on boot, so I deactivated it and am trying to follow the manual install instructions at the cchtml Ubuntu Maverick Installation Guide. When I try to create the .deb packages with:

      sh ati-driver-installer-11-6-x86.x86_64.run --buildpkg Ubuntu/maverick

    I get:

      david@skipper:~/catalyst11.6$ sh ati-driver-installer-11-6-x86.x86_64.run --buildpkg Ubuntu/maverick
      Created directory fglrx-install.oLN3ux
      Verifying archive integrity... All good.
      Uncompressing ATI Catalyst(TM) Proprietary Driver-8.861.........................
      =====================================================================
       ATI Technologies Catalyst(TM) Proprietary Driver Installer/Packager
      =====================================================================
      Generating package: Ubuntu/maverick
      Package build failed!
      Package build utility output:
      ./packages/Ubuntu/ati-packager.sh: 396: debclean: not found
      dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor):
      dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions
      dpkg-buildpackage: source package fglrx-installer
      dpkg-buildpackage: source version 2:8.861-0ubuntu1
      dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html>
      dpkg-source --before-build fglrx.64Vzxk
      dpkg-buildpackage: host architecture amd64
      debian/rules build
      Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507.
      dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1
      Cleaning in directory .
      /usr/bin/fakeroot: line 176: debian/rules: Permission denied
      debuild: fatal error at line 1319: couldn't exec fakeroot debian/rules:
      dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor):
      dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions
      dpkg-buildpackage: source package fglrx-installer
      dpkg-buildpackage: source version 2:8.861-0ubuntu1
      dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html>
      dpkg-source --before-build fglrx.QEmIld
      dpkg-buildpackage: host architecture amd64
      debian/rules build
      Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507.
      dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1
      Cleaning in directory .
      Can't exec "debian/rules": Permission denied at /usr/bin/debuild line 1314.
      debuild: fatal error at line 1313: couldn't exec debian/rules: Permission denied
      dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor):
      dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions
      dpkg-buildpackage: source package fglrx-installer
      dpkg-buildpackage: source version 2:8.861-0ubuntu1
      dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html>
      dpkg-source --before-build fglrx.xtY6vC
      dpkg-buildpackage: host architecture amd64
      debian/rules build
      Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507.
      dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1
      Cleaning in directory .
      /usr/bin/fakeroot: line 176: debian/rules: Permission denied
      debuild: fatal error at line 1319: couldn't exec fakeroot debian/rules:
      dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor):
      dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2
      dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions
      dpkg-buildpackage: source package fglrx-installer
      dpkg-buildpackage: source version 2:8.861-0ubuntu1
      dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html>
      dpkg-source --before-build fglrx.oYWICI
      dpkg-buildpackage: host architecture amd64
      debian/rules build
      Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507.
      dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1
      Removing temporary directory: fglrx-install.oLN3ux

    I've installed devscripts, which has debclean in it. I've tried running the command with and without sudo. I'm not experienced with installing from downloads/source, but it seems like the file debian/rules isn't being set executable when it needs to be. If I extract only, without using the package builder command, debian/rules is 744. As to what to do next, I'm stumped. Many thanks.

    Read the article

  • Waiting for a subset of threads in a Java ThreadPool

    - by David Semeria
    Let's say I have a thread pool containing X items, and a given task employs Y of these items (where Y is much smaller than X). I want to wait for all of the threads of a given task (Y items) to finish, not the entire thread pool. If the thread pool's execute() method returned a reference to the employed thread, I could simply join() each of these Y threads, but it doesn't. Does anyone know of an elegant way to accomplish this? Thanks.
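    (Editor's note: one common way to do this is to keep the Future that submit() returns for each of the Y work items and block on those, leaving the rest of the pool untouched. A minimal sketch, where pool size and item count are illustrative:)

      import java.util.ArrayList;
      import java.util.List;
      import java.util.concurrent.ExecutorService;
      import java.util.concurrent.Executors;
      import java.util.concurrent.Future;

      public class WaitForSubset {
          public static void main(String[] args) throws Exception {
              ExecutorService pool = Executors.newFixedThreadPool(10); // the X-thread pool

              // Submit the Y work items of this task and keep their Futures.
              List<Future<?>> futures = new ArrayList<>();
              for (int i = 0; i < 3; i++) {
                  final int item = i;
                  futures.add(pool.submit(() -> System.out.println("work item " + item)));
              }

              // Block until just these Y items finish; other tasks in the pool are unaffected.
              for (Future<?> f : futures) {
                  f.get(); // also rethrows any exception thrown by the work item
              }
              System.out.println("task finished");

              pool.shutdown();
          }
      }

    (ExecutorService.invokeAll() gives the same blocking behaviour in a single call if the work items are Callables; a CountDownLatch also works if keeping the Futures is inconvenient.)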

    Read the article

  • How to block websites using Ruckus ZoneDirector

    - by David A. Moody
    In my school we use Ruckus ZoneDirector to control our entire network. I have separate WLANs for faculty, elementary, and secondary. The elementary and secondary networks are set to go offline during recess/lunch breaks and after school hours. This is working fine. What I need to be able to do is block YouTube access for students while leaving it accessible to teachers (on the faculty WLAN). Is it possible to do this? Thanks in advance. David.

    Read the article

  • Forum software advice needed

    - by David Thompson
    Hello all... we want to migrate our site's current forum (proprietary-built) to a newer, more modern (feature-rich) platform. I've been looking around at the available options and have narrowed it down to vBulletin, Vanilla or Phorum (unless you have another suggestion?). I hope someone here can give me some feedback on their experiences either migrating to a new forum or working deeply with one. The current forum we have has approx 2.2 million threads in it and is contained in a MySQL database. Data migration is obviously the first issue; is one of the major forum vendors better or worse in this regard? The software needs to be able to be clustered and cached to ensure availability and performance. We want it to be PHP-based and store its data in MySQL. The code needs to be open to allow us to highly customise the software, both to strip out a lot of stuff and to integrate our site's features. A lot of the forums I've looked at duplicate many features of our main site, in particular member management, profiles etc. I realise we'll have to do a good bit of development in removing these and tying it all back to the main site, so we want to find a platform that makes this kind of integration as easy as possible. Finally, I guess, there's 'future-proofing' the forum (as best as possible) given the above. Which platform will allow us to customise it but also allow us to keep in step with upgrades? Which forum software has the best track record for bringing new features online in a timely manner? And so on. I know it's a big question, but if anyone here has any experience in some or all of the above I'd be very grateful.

    Read the article

  • It’s On! The Oracle OpenWorld 2012 Call for Papers is Open

    - by David Hope-Ross
    Oracle OpenWorld is among the world's largest industry events for good reason. It offers a vast array of learning and networking opportunities in one of the planet's great cities. And one of the key reasons for its popularity among procurement and supply chain professionals is the prominence of presentations by customers. If you'd like to deliver a presentation based on your experience, now is the time to submit your abstract for review by the selection panel. The competition is strong: roughly 18% of entries are accepted each year from more than 3,000 submissions. Review panels are made up of experts both internal and external to Oracle. Successful submissions often (but not exclusively) focus on customer successes, how-tos, or best practices. What's in it for you? Recognition, for one thing. Accepted sessions are publicized in the content catalog, which goes live in mid-June, and sessions given by external speakers often prove the most popular. Plus, accepted speakers get a complimentary pass to Oracle OpenWorld with access to all sessions and networking events - that could save you up to $2,595! Be sure to designate your session for inclusion in the correct track by selecting "APPLICATIONS: Supply Chain Management" or "APPLICATIONS: Sourcing and Procurement" from the Primary Track drop-down menu. We look forward to seeing you in San Francisco!

    Read the article

  • eBay Leads Mobile Commerce

    - by David Dorf
    For the first time, more smartphones were shipped than PCs. This important milestone helps reinforce that retailers need a strong mobile commerce strategy. IDC reported that for the 4th quarter of 2010, manufacturers shipped 100.9 million devices versus 92.1 million PCs. One early adopter in the retail industry is eBay, the popular online auction and shopping site. In July 2008 they released their first mobile app and have increased investments ever since. In 2002 they bought PayPal for use with their online channel, but it's becoming a force in the mobile world as well. In June 2010 they acquired RedLaser, the popular barcode-scanning mobile app. Both pieces of technology enhance the mobile experience, and are available to other retailers as well. More recently, in December 2010, they acquired Critical Path Software, the developer of their eBay, StubHub, and Shopping.com mobile applications. Taking their mobile development in-house was a clear signal that mobile commerce is important to their strategy.

    Pop on over to eBay Inc.'s mobile commerce stats page to see just how well they are doing. You can use the animated map to see where people are using the app on any given day, and you can compare sales of the different categories. eBay's hottest category is Cars & Trucks, garnering 16.5% of the total $2B (yes, billion) in mobile sales in 2010. To understand why that category is so large, let's look at the top 10 most expensive cars sold on eBay mobile in 2010:

    $240,001 Mercedes-Benz: SLR McLaren
    $209,888 Lamborghini: Gallardo
    $208,500 Ferrari: 430
    $199,900 Lamborghini: Gallardo
    $189,000 Lamborghini: Murcielago
    $185,000 Ferrari: 430
    $175,000 Porsche: 911
    $170,000 Ferrari: 550
    $160,000 Bentley: Continental GT
    $159,900 Lamborghini: Gallardo

    eBay claims they sell 3-4 Ferraris on their mobile app each month. Yes, mobile commerce is not limited to small items. While I would wait to get home and fire up the PC, the current generation that has grown up with mobile phones has no issue satisfying their impulses. Dave Sikora of Digby told me he's seen people buy furniture sets, mattresses, and diamonds via their mobile phones. I guess mobile commerce is rapidly becoming the norm.

    Read the article

  • How to make your File Adapter pick only one file at a time from a location

    - by anirudh.pucha(at)oracle.com
    In SOA 11g, you use the File Adapter to read files from a given location. With its default read operation it picks up all the files at once. You may instead want to configure the File Adapter so that it picks up one file at a time from the given location, at the given polling interval.

    Solution: Set the "SingleThreadModel" and "MaxRaiseSize" properties for your File Adapter. Edit the adapter's .jca file and add the following properties:

      <property name="SingleThreadModel" value="true"/>
      <property name="MaxRaiseSize" value="1"/>

    You can also set these properties through JDeveloper, by opening composite.xml, selecting the adapter, and then changing the properties through the properties panel.
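    (Editor's note: for context, these properties live inside the inbound activation-spec of the generated .jca file. A rough sketch of the surrounding structure is below; the adapter name, directory, file filter, and polling frequency are illustrative assumptions, and your generated file may differ.)

      <adapter-config name="ReadFile" adapter="File Adapter"
                      xmlns="http://platform.integration.oracle/blocks/adapter/fw/metadata">
        <connection-factory location="eis/FileAdapter"/>
        <endpoint-activation portType="Read_ptt" operation="Read">
          <activation-spec className="oracle.tips.adapter.file.inbound.FileActivationSpec">
            <property name="PhysicalDirectory" value="/tmp/in"/>
            <property name="IncludeFiles" value=".*\.txt"/>
            <property name="PollingFrequency" value="60"/>
            <!-- One file raised per polling cycle, single poller thread -->
            <property name="SingleThreadModel" value="true"/>
            <property name="MaxRaiseSize" value="1"/>
          </activation-spec>
        </endpoint-activation>
      </adapter-config>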

    Read the article

  • What Would Google Do?

    - by David Dorf
    Last year I read Jeff Jarvis' book What Would Google Do? and found it very interesting. Jeff is a long-time journalist who has been studying technology, and more specifically the internet. He used his skills to reverse-engineer Google into a list of "Google rules," then goes on to describe a futuristic world driven by these rules. It's an interesting look at crowd-sourcing, openness, and collaboration across many industries, including retail (Google Shops). This year Jeff Jarvis will be a keynote speaker at CrossTalk, Oracle's user conference dedicated to the retail industry. This year's theme is... Retail Redefined: Redesign. Reinvigorate. Reimagine. I think that's pretty appropriate given the massive changes the industry has undergone during the last three years. The thing I really like about this conference is that we try to let the retailers do most of the talking. I'm very interested in hearing about the innovative projects they've got brewing, and where they think our industry is heading. I'll be speaking, but I'm not sure about what, so let me know of any interesting topics.

    Read the article

  • Database continuous integration step by step

    - by David Atkinson
    This post will describe how to set up basic database continuous integration using TeamCity to initiate the build process, SQL Source Control to put your database under source control, and the SQL Compare command line to keep a test database up to date. In my example I will be using Subversion as my source control repository. If you wish to follow my steps verbatim, please make sure you have TortoiseSVN, SQL Compare and SQL Source Control installed.

    Downloading and Installing TeamCity

    TeamCity (http://www.jetbrains.com/teamcity/index.html) is free for up to three agents, so it is a great no-risk tool you can use to experiment with.

    1. Download the latest version from the JetBrains website. For some reason the TeamCity executable didn't download properly for me, stalling frustratingly at 99%, so I tried again with the zip file download option (see screenshot below), which worked flawlessly.

    2. Run the installer using the defaults. This results in a set-up with the server component and agent installed on the same machine, which is ideal for getting started with ease.

    3. Check that the build agent is pointing to the server correctly. This has caught me out a few times before. This setting is in C:\TeamCity\buildAgent\conf\buildAgent.properties and for my installation is serverUrl=http\://localhost\:80. If you need to change this value, for example if you've had to install the server console on a different port number, the TeamCity Build Agent Service will need to be restarted for the change to take effect.

    4. Open the TeamCity admin console on http://localhost, and specify your own designated username and password at first startup.

    Putting your database in source control using SQL Source Control

    5. Assuming you've got SQL Source Control installed, select a development database in the SQL Server Management Studio Object Explorer and select Link Database to Source Control.

    6. For the Link step you can either create your own empty folder in source control, or you can select Just Evaluating, which just creates a local Subversion repository for you behind the scenes.

    7. Once linked, note that your database turns green in the Object Explorer. Visit the Commit tab to do an initial commit of your database objects by typing in an appropriate comment and clicking Commit.

    8. There is a hidden feature in SQL Source Control that opens up TortoiseSVN (provided it is installed) pointing to the linked repository. Keep Shift depressed and right-click on the text to the right of 'Linked to'; in the example below, it's the red Evaluation Repository text. Select Open TortoiseSVN Repo Browser. This screen should give you an idea of how SQL Source Control manages the object files behind the scenes.

    Back in the TeamCity admin console, we'll now create a new project to monitor the above repository location and to trigger a 'build' each time the repository changes.

    9. In TeamCity Administration, select Create Project and give it a name, such as "My first database CI", and click Create.

    10. Click on Create Build Configuration, and name it something like "Integration build".

    11. Click VCS settings and then Create And Attach new VCS root. This is where you will tell TeamCity about the repository it should monitor.

    12. In my case, since I'm using the Just Evaluating option in SQL Source Control, I should select Subversion.

    13. In the URL field paste your repository location. In my case this is file:///C:/Users/David.Atkinson/AppData/Local/Red Gate/SQL Source Control 3/EvaluationRepositories/WidgetDevelopment/WidgetDevelopment

    14. Click on Test Connection to ensure that you can communicate with your source control system. Click Save.

    15. Click Add Build Step, and select Runner Type: Command Line. Should you be familiar with the other runner types, such as NAnt, MSBuild or PowerShell, you can opt for these, but for the sake of keeping it simple I will pick the simplest option.

    16. If you have installed SQL Compare in the default location, set the Command Executable field to: C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe

    17. Flip back to SSMS briefly and add a new database to your server. This will be the database used for continuous integration testing.

    18. Set the command parameters according to your server and the name of the database you have created. In my case I created database RedGateCI on server .\sql2008r2, so my parameters are: /scripts1:. /server2:.\sql2008r2 /db2:RedGateCI /sync /verbose
    Note that if you pick a server instance that isn't on your local machine, you'll need the TCP/IP protocol enabled in SQL Server Configuration Manager, otherwise the SQL Compare command line will not be able to connect.

    19. Save and select Build Triggering / Add New Trigger / VCS Trigger. This is where you tell TeamCity when it should initiate a build. Click Save.

    20. Now return to SQL Server Management Studio and make a schema change (e.g. add a new object) to your linked development database. A blue indicator will appear in the Object Explorer. Commit this change, typing in an appropriate check-in comment. All being good, within 60 seconds (a TeamCity default that can be changed) a build will be triggered.

    21. Click on Projects in TeamCity to get back to the overview screen. The build log will show you the console output, which is useful for troubleshooting any issues.

    That's it! You now have continuous integration on your database. In future posts I'll cover how you can generate and test the database creation script, the database upgrade script, and run database unit tests as part of your continuous integration script. If you have any trouble getting this up and running please let me know, either by commenting on this post, or by emailing me directly using the email address below.
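    (Editor's note: for reference, the executable from step 16 and the parameters from step 18 combine into a single command along these lines, using the names from this walkthrough:)

      "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe" /scripts1:. /server2:.\sql2008r2 /db2:RedGateCI /sync /verbose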

    Read the article

  • Lenovo DVD Drive Disabled After Windows 7 Install

    - by David Lacher
    Upgraded hard drive in Lenovo T61P; decided to start fresh with Windows 7 Pro. Windows installed, so DVD drive was working. All of a sudden, driver is not recognized. Device is "HL-DT-ST DVDRAM GSA-U10N ATA Device". It appears on device manager but with the yellow tag; have tried uninstalling, searching for drivers, everything I can think of. Cannot even start over with Windows 7 installation disk because disk spins but then stops and My Computer does not recognize the drive. Help please. thank you. David Lacher

    Read the article

  • Master Data Management and Cloud Computing

    - by david.butler(at)oracle.com
    Cloud Computing is all the rage these days. There are many reasons why this is so. But like its predecessor, Service Oriented Architecture, it can fall on hard times if the underlying data is left unmanaged. Master Data Management is the perfect Cloud companion. It can materially increase the chances for successful Cloud initiatives. In this blog, I'll review the nature of the Cloud and show how MDM fits in.

    Here's the National Institute of Standards and Technology Cloud definition: cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

    Cloud architectures have three main layers: applications or Software as a Service (SaaS), Platforms as a Service (PaaS), and Infrastructure as a Service (IaaS). SaaS generally refers to applications that are delivered to end users over the Internet. Oracle CRM On Demand is an example of a SaaS application. Today there are hundreds of SaaS providers covering a wide variety of applications, including Salesforce.com, Workday, and Netsuite. Oracle MDM applications are located in this layer of Oracle's On Demand enterprise Cloud platform; we call it Master Data as a Service (MDaaS). PaaS generally refers to an application deployment platform delivered as a service. These are often built on a grid computing architecture and include database and middleware. Oracle Fusion Middleware is in this category and includes the SOA and Data Integration products used to connect SaaS applications, including MDM. Finally, IaaS generally refers to computing hardware (servers, storage and network) delivered as a service. This typically includes the associated software as well: operating systems, virtualization, clustering, etc.

    Cloud Computing benefits are compelling for a large number of organizations. These include significant cost savings, increased flexibility, and fast deployments. Cost advantages include paying for just what you use. This is especially critical for organizations with variable or seasonal usage. Companies don't have to invest to support peak computing periods. Costs are also more predictable and controllable. Increased agility includes access to the latest technology and experts without making significant up-front investments.

    While Cloud Computing is certainly very alluring with a clear value proposition, it is not without its challenges. An IDC survey of 244 IT executives/CIOs and their line-of-business (LOB) colleagues identified a number of issues:

    Security - 74% identified security as an issue involving data privacy and resource access control.
    Integration - 61% found that it is hard to integrate Cloud apps with in-house applications.
    Operational Costs - 50% are worried that On Demand will actually cost more given the impact of poor data quality on the rest of the enterprise.
    Compliance - 49% felt that compliance with required regulatory, legal and general industry requirements (such as PCI, HIPAA and Sarbanes-Oxley) would be a major issue. When control is lost, the ability of a provider to directly manage how and where data is deployed, used and destroyed is negatively impacted.

    There are others, but I singled out these four top issues because Master Data Management, properly incorporated into a Cloud Computing infrastructure, can significantly ameliorate all of these problems. Cloud Computing can literally rain raw data across the enterprise.

    According to fellow blogger Mike Ferguson, "the fracturing of data caused by the adoption of cloud computing raises the importance of MDM in keeping disparate data synchronized." David Linthicum, CTO of Blue Mountain Labs, blogs that "the lack of MDM will become more of an issue as cloud computing rises. We're moving from complex federated on-premise systems, to complex federated on-premise and cloud-delivered systems."

    Left unmanaged, non-standard, inconsistent, ungoverned data of questionable quality can pollute analytical systems, increase operational costs, and reduce the ROI in Cloud and On-Premise applications. As cloud computing becomes more relevant, and more data, applications, services, and processes are moved out to cloud computing platforms, the need for MDM becomes ever more important. Oracle's MDM suite is designed to deal with all four of the Cloud issues listed in the IDC survey above:

    Security - MDM manages all master data attribute privacy and resource access control issues.
    Integration - MDM pre-integrates Cloud apps with each other and with On-Premise applications at the data level.
    Operational Costs - MDM significantly reduces operational costs by increasing data quality, thereby improving enterprise business process efficiency.
    Compliance - MDM, with its built-in Data Governance capabilities, ensures that the data is governed according to organizational standards. This facilitates rapid and accurate reporting for compliance purposes.

    Oracle MDM creates governed, high-quality master data. A unified, cleansed and standardized data view is produced. The Oracle Customer Hub creates a single view of the customer. The Oracle Product Hub creates high-quality product data designed to support all go-to-market processes. Oracle Supplier Hub dramatically reduces the chances of 'supplier exceptions'. Oracle Site Hub masters locations. And Oracle Hyperion Data Relationship Management masters financial reference data and manages enterprise hierarchies across operational areas from ERP to EPM and CRM to SCM. Oracle Fusion Middleware connects Cloud and On-Premise applications to MDM hubs and brings high-quality master data to your enterprise business processes.

    An independent analyst once said, "Poor data quality is like dirt on the windshield. You may be able to drive for a long time with slowly degrading vision, but at some point, you either have to stop and clear the windshield or risk everything." Cloud Computing has the potential to significantly degrade data quality across the enterprise over time. Deploying a Master Data Management solution prior to, or in conjunction with, a move to the Cloud can ensure that the data flowing into the enterprise from the Cloud is clean and governed. This will in turn ensure that expected returns on the investment in Cloud Computing are realized.

    Oracle MDM has proven its mettle in this area and has the customers to back that up. In fact, I will be hosting a webcast on Tuesday, April 10th at 10 am PT with one of our top Cloud customers, the Church Pension Group. They have moved all mainline applications to a hosted model and use Oracle MDM to ensure the master data is managed and cleansed before it is propagated to other cloud and internal systems. I invite you to join Martin Hossfeld, VP, IT Operations, and Danette Patterson, Enterprise Data Manager, as they review business drivers for MDM and hosted applications, how they did it, the benefits achieved, and lessons learned. You can register for this free webcast here. Hope to see you there.

    Read the article

  • What is the difference between DLNA and UPnP?

    - by David Michel
    Hi All, can someone tell me what the difference is between DLNA and UPnP? I can see that some devices such as NAS have their specifications mentioning both (e.g. Iomega StorCenter) or only DLNA (e.g. Netgear Stora). Are these synonyms for the same thing, or are they actually two different protocols? Are they compatible, i.e. if a media server uses DLNA and the streaming device uses UPnP, will it work? I looked around but could not find any clear answer... Many thanks David

    Read the article

  • Firefox 3.6.3 on Snow Leopard 10.6.3 - symbolic link to command line binary doesn't work?

    - by David Watson
    I have Firefox 3.6.3 installed on Mac OS X Snow Leopard from the DMG. I can run Firefox from the terminal using /Applications/Firefox.app/Contents/MacOS/firefox-bin. However, if I create a symbolic link: sudo ln -s /Applications/Firefox.app/Contents/MacOS/firefox-bin /bin/firefox then it refuses to run, or at least to display. When I issue "firefox" from the terminal, I can see the process in top, but the GUI never appears. :/ $ ls -lr /bin/firefox lrwxr-xr-x 1 root wheel 52 May 5 15:19 /bin/firefox -> /Applications/Firefox.app/Contents/MacOS/firefox-bin Any ideas? Thanks, David
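    One workaround often suggested for this kind of symptom (a sketch, not verified against this exact setup) is to replace the symlink with a small wrapper script, since some builds of firefox-bin locate their resources relative to the path they were launched from and can get confused by a symlink. The /usr/local/bin location below is just an illustrative choice:

        #!/bin/sh
        # Hypothetical wrapper, e.g. saved as /usr/local/bin/firefox and made
        # executable with: chmod +x /usr/local/bin/firefox
        # Assumes the stock install location of Firefox.app.

        # Option 1: exec the real binary directly, so no symlink is involved:
        # exec /Applications/Firefox.app/Contents/MacOS/firefox-bin "$@"

        # Option 2: let Launch Services open the .app bundle, which reliably
        # attaches the GUI session ("--args" passes arguments through on 10.6+):
        exec open -a Firefox --args "$@"

    Alternatively, running open /Applications/Firefox.app directly from the terminal should confirm whether the GUI itself launches correctly.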

    Read the article

  • Facebook Stories for Retailers

    - by David Dorf
    Getting people to "like" a brand is important because it opens the door to a possible B2C relationship. Once a person likes a brand, the brand can post to their newsfeed with promotions, announcements, and surveys. At least for me, I "hide" the noisy brands and just monitor the ones that keep posts under 4 times a week. I see lots of people, especially with fashion brands, comment on postings, at which point the posting is seen by their network. A metric I've heard (but not verified) is that for every person that comments, ten of their friends see the original posting. That's a pretty cheap way to communicate to potential customers in a viral way. Over at mainstreet.com they compiled a list of the top liked retailers on Facebook as of Feb 1, 2011:

    19,414,892 Starbucks
    11,302,939 Victoria's Secret
    7,925,184 Zara
    7,032,398 McDonald's
    6,117,222 H&M
    5,400,586 Taco Bell
    4,665,760 Subway
    4,494,849 Lacoste
    4,185,570 Hollister
    3,973,181 Forever 21

    So I guess the public likes their fast food and fashion. To take this to the next level, Facebook is now displaying Sponsored Stories, which I saw for the first time on my page this weekend. I found a picture at the Wall Blog that depicts Sponsored Stories very well. Over on the right-hand column of a person's page, where they see advertisements and such, Facebook will post stories involving their network of friends and their interaction with sponsored brands. Now their "likes" can suddenly become your ads. "Jessica and Philip like Starbucks. What are you waiting for?" This is another great way to take messages viral by accessing social graphs. As usual there will be a certain level of outcry from privacy advocates, but given the other, more iniquitous issues, I believe this will fall by the wayside. Retailers should consider using Sponsored Stories to increase their Likes, and thus increase their voice in the social world.

    Read the article

  • Agile Testing Days 2012 – Day 3 – Agile or agile?

    - by Chris George
    Another early start for my last Lean Coffee of the conference, and again it was not wasted. We had some really interesting discussions around how to determine what test automation is useful, why we do agile if it is not faster, and a rather existential discussion on whether unicorns exist!

    The first keynote of the day was entitled "Fast Feedback Teams" by Ola Ellnestam. Again this relates nicely to the releasing-faster talk on day 2, and to something that we are looking at and some teams are actively trying. Introducing the notion of feedback, Ola described a game he wrote for his eldest child. It was a simple game where every time he clicked a button, it displayed "You've Won!". He then changed it to a Win-Lose-Win-Lose pattern and watched the feedback from his son, who twigged the pattern and got his younger brother to play, alternating turns… genius! (Must do that with my children.) The idea behind this was that you need that feedback loop to learn and progress. If you are not getting the feedback, you need to close that loop. An interesting point Ola made was to solve problems BEFORE writing software. It may be that you don't have to write anything at all; perhaps it's a communication/training issue, or perhaps the problem can be solved another way. Writing software, although it's the business we are in, is expensive, and this should be taken into account. He again mentioned frequent releases, and how they should be made as soon as stuff is ready to be released: don't leave stuff on the shelf, because it's not earning you anything, money or data. I totally agree with this and it's something that we will be aiming for moving forwards.

    "Exceptions, Assumptions and Ambiguity: Finding the truth behind the story" by David Evans started off promisingly by making references to it being 'grim up North', referring to the north of England. Not sure it was appreciated by most of the audience, but it made me laugh! David explained how there are always risks associated with exceptions, giving the example of a one-way road near where he lives, with an exception sign giving coaches the right to go the wrong way. Therefore you could merrily swing around the corner of the one-way road straight into a coach! David showed the danger in making assumptions with lyrical quotes from Lola by The Kinks ("I'm glad I'm a man, and so is Lola") and with a picture of a toilet flush that needed instructions to operate the full and half flush. With this particular flush, you pulled the handle all the way down for half flush, and halfway down for full flush! Hmmm, a bit of a crappy user experience, methinks! Then, through a clever use of a passage from the Jabberwocky, David went on to show how mistranslation/ambiguity can completely distort the original meaning of something, and this is a real enemy of software development. This was all helping to demonstrate that the term Story is often heavily overloaded in the Agile world, and should really be stripped back to what it is for: stating a business problem and offering a technical solution. Therefore a story could be worded as "In order to {make some improvement}, we will {do something}". The first 'in order to' statement is stakeholder-neutral, and states the problem by requesting an improvement to the software/process etc. The second part of the story is the verb, the doing bit. So to achieve the 'improvement', which is not currently true, we will do something to make it true in the future.
    My PM is very interested in this, and he's observed some of the problems of overloading stories, so I'm hoping between us we can use some of David's suggestions to help clarify our stories better.

    The second keynote of the day (and our last) proved to be the most entertaining and exhausting of the conference for me: "The ongoing evolution of testing in agile development" by Scott Barber. I've never had the pleasure of seeing Scott before… OMG, I would love to have even half of the energy he has! What struck me during this presentation was Scott's explanation of how testing has become the role/job that it is (largely) today, and how this has led to the need for 'methodologies' to make dev and test work! The argument that we should be trying to converge the roles again is a very valid one, and one that a couple of the teams at work are actively pursuing with great results. Making developers as responsible for quality as testers is something that has been lost over the years, but something that we are now striving to achieve. The idea that we (testers) should be testing experts/specialists, not testing 'union members', supports this, so the entire team works on all aspects of a feature/product, with the 'specialists' taking the lead and advising/coaching the others. This leads to better propagation of information around the team and a greater holistic understanding of the project, and it allows the team to continue functioning if some of its members are off sick, for example.

    Feeling somewhat drained from Scott's keynote (but at the same time excited that a lot of the points he raised supported actions we are taking at work), I headed into my last presentation of Agile Testing Days 2012 before having to make my way to Tegel to catch the flight home.

    "Thinking and working agile in an unbending world" with Pete Walen was a talk I was not going to miss! Having spoken to Pete several times during the past few days, I was looking forward to hearing what he had to say, and I was not disappointed. Pete started off by trying to separate the definitions of 'Agile' as in the methodology and 'agile' as in the adjective, by pronouncing them the 'English' and 'American' ways: Agile pronounced "Ajyle", and agile pronounced "ajul". There was much confusion around what the hell he was talking about, although I thought it was quite clear.

    Agile – a software development methodology.
    agile – marked by a ready ability to move with quick, easy grace; having a quick, resourceful and adaptable character.

    Anyway, that aside (although it provided a few laughs during the presentation), the point was that many teams claim to be 'Agile' but are not, in fact, 'agile' by nature. Implementing 'Agile' methodologies that are so prescriptive actually goes against the very nature of Agile development, where a team should anticipate, adapt and explore. Pete made a valid point that very few companies intentionally put up roadblocks to impede work, so if work is being blocked/delayed, why? This is where being agile as a team pays off, because the team can inspect what's going on, explore options and adapt their processes. It is through experimentation (and that means trying and failing as well as trying and succeeding) that a team will improve and grow, leading to a focus on what really needs to be done to achieve X.

    So, that was it, the last talk of our conference.
    I was gutted that we had to miss the closing keynote from Matt Heusser, as Matt was another person I had spoken to a few times during the conference, but the flight would not wait, and just as well we left when we did because the traffic was a nightmare!

    My Takeaway Triple from Day 3:
    Release often and release small – don't leave stuff on the shelf.
    Keep the meaning of the word 'agile' in mind when working in 'Agile'.
    Look at testing as more of a skill than a role.

    Read the article
