Search Results

Search found 803 results on 33 pages for 'greg mcguffey'.


  • SQLSaturday 33 Observations

    - by Geoff N. Hiten
    Along with a lot of my colleagues, I went to SQLSaturday #33 in Charlotte this last weekend.  Overall a really good event, especially for a first-time organizer.  There is some controversy over certain events where my name got mentioned, so I thought I would clear the air. Before I get to the core controversy, let's get the details out of the way.  The Microsoft Offices in Charlotte were an excellent venue for this event.  I really appreciated the Microsoft employees who helped out by letting us in and out of normally secure areas.  This is definitely above and beyond on their part. Thanks to the organizers (especially Greg and Peter) for the great hospitality they showed to the speakers.  Now for the specifics.  Like most events of this type, there was a raffle at the end for some cool swag.  As a speaker I got raffle tickets just like any other attendee.  The raffle was clearly promoted as "must be present to win".  The problem is that for various reasons, the raffle kicked off immediately after the last speaker finished in the largest room.  That room was across the parking lot from all the other rooms for the event.  I happened to have one of the last sessions of the day, and not in the main room.  I also ran long since the audience was very interactive and there were a lot of follow-up questions.  (BTW, thanks to everyone who came and stayed for my session.  Sorry it cost you the chance to win too.)  My name was drawn for a very nice piece of swag (iPod Touch if you insist).  Since I wasn't there, I didn't win. Several folks mentioned I was still speaking and was "here" (as in at the event) just not "here in the room". Yes, I was mad when I found out about it. I think that was handled poorly.  I personally lost out as did my audience (dunno if anyone specific lost anything, but it is the idea that counts).  It was a mistake. Mistakes happen.  Nobody acted maliciously.  Heck, the guys running the event who made the decision are my friends and remain so.  I got over my mad.  We talked about this privately and we are all OK with what happened.  I am not going to let a gadget get in the way of a couple of good friendships. I think the mistake was mostly due to a lack of unity between the venue buildings.  Pam Shaw had a similar challenge in Tampa a few weeks ago, including a speaker who ran long on the last session (not me that time).  She had a couple of teenage volunteers to act as gofers/runners.  They counted heads in sessions, pointed people to last-minute room and session changes, and generally helped connect the organizers to what was actually happening.  Note that this was not Pam's first SQLSaturday event.  She knew, but the knowledge had not been institutionalized.  We (the SQL community in general and SQLSaturday organizers in particular) now know how essential gofers are to success. I know I spent most of this post focusing on the controversy, but I wanted to clear everything up.  I don't want to let a minor mistake, made in good faith, overshadow what was a tremendously good event for the community. As for the iPod Touch, someone in the SQL community is enjoying it, so it is not a total loss.  And if losing out on it is the price I pay so we can learn this, then that is what a community leader does.  Consider it a gift.  Besides, I really wanted a Zune 120 :)

    Read the article

  • A Forming Repository of Script Samples for Automating Windows Server 2012 and Windows 8

    - by Jialiang
    Compared with Windows Server 2008 R2, which provides about 230 cmdlets, Windows Server 2012 beats that by a factor of more than ten, shipping roughly 2,430 cmdlets.  You can automate almost every aspect of the server.   The new PowerShell 3.0, like Windows Server 2012, has a ton of new features.  As part of this script-centric automation push, the Microsoft All-In-One Script Framework (AIOSF) is ready to support IT Pros with many new services and offerings coming this year.  We sincerely hope that the IT community will benefit from the effort. Here is the first of our new services and offerings:  The team is preparing a large set of Windows 8 / Windows Server 2012 script samples based on frequently asked IT tasks that we collect in TechNet forums and support calls to Microsoft.   Because the script topics come from frequently asked IT tasks, we hope that these script samples can be helpful to many IT Pros worldwide.   With the General Availability of Windows Server 2012, we release the first three Windows Server 2012 / Windows 8 script samples today.    Get Network Adapter Properties in Windows Server 2012 and Windows 8 (PowerShell) http://gallery.technet.microsoft.com/scriptcenter/Get-Network-Adapter-37c5a913 Description: This script can be used to get network adapter properties and advanced properties in Windows Server 2012 and Windows 8. It combines the outputs of Get-NetAdapter and Get-NetAdapterAdvancedProperty. It can generate a report of network adapter configuration settings. Use Scenarios: In the real world, IT administrators are required to check the configuration of network adapters after the deployment of new servers. One typical example is the duplex setting of network adapters. IT administrators also need to maintain a server list that contains network adapter configuration settings on a regular basis. Before Windows Server 2012, IT administrators often found these tasks difficult to handle. Acknowledgement: Thanks to Greg Gu from AIOSF for collecting this script topic and writing the script sample.  Thanks to James Adams (Microsoft Premier Field Engineer) for reviewing the script sample and ensuring its quality.   How to batch create virtual machines in Windows Server 2012 (PowerShell) http://gallery.technet.microsoft.com/scriptcenter/How-to-batch-create-9efd1811 Description: This PowerShell script illustrates how to batch create multiple virtual machines based on a comma-delimited file by using PowerShell 3.0 in Windows Server 2012. Use Scenarios: IT admins need to batch create virtual machines in Windows Server 2012, but may only be comfortable with a few commands due to limited scripting experience. Although Windows PowerShell includes a set of Hyper-V cmdlets, IT admins are often reluctant to use them beyond the few simple commands that are widely used. Acknowledgement: Thanks to Anders Wang from AIOSF for collecting this script topic and writing the script sample.  Thanks to Christopher Norris for reviewing the script sample and ensuring its quality before publishing.   Remove Windows Store Apps in Windows 8 (PowerShell) http://gallery.technet.microsoft.com/scriptcenter/Remove-Windows-Store-Apps-a00ef4a4 Description: This script can be used to remove multiple Windows Store Apps from a user account in Windows 8. It provides a list of installed Windows Store applications. You can specify the application IDs, and remove them all at once. Use Scenarios: 1. In Windows 8, you can remove a single Windows Store App by right-clicking the tile in the Start menu and choosing the uninstall command.  
However, no command is provided for removing multiple Windows Store Apps all at once. If you want to do so, you can use this script sample. 2. Sometimes Windows Store Apps may crash in Windows 8.  Even though you can successfully uninstall and reinstall the App, the application may still crash after the reinstallation.  In this situation, you can use this example script to remove these Windows Store Apps cleanly. Acknowledgement: Thanks Edward Qi from AIOSF for collecting the script idea and composing the script sample.  Thanks James Adams (Microsoft Premier Field Engineer) for reviewing the script sample and ensuring its quality.   This is just the beginning, and more and more script samples are coming.  You can follow our blog (http://blogs.technet.com/b/onescript) to get the latest customer-driven script samples for Windows Server 2012 and Windows 8.

    Read the article

  • Proactive Support Sessions at OUG London and OUG Ireland

    - by THE
    Oracle Proactive Support Technology is proud to announce that two members of its team will be speaking at the UK and Ireland User Group Conferences this year. Maurice and Greg plan to run the following sessions (which may be subject to change): Maurice Bauhahn OUG Ireland BI & EPM and Technology Joint SIG Meeting 20 November 2012 BI&EPM SIG event in Ireland (09:00-17:00) and OUG London EPM & Hyperion Conference 2012 Tuesday 23rd to Wednesday 24th Oct 2012 Profit from Oracle Diagnostic Tools Embedded in EPM Oracle bundles valuable toolsets into many of its software suites to collect logs and settings, slice/dice error messages, track performance, and trace activities across services. Become familiar with several enterprise-level diagnostic tools embedded in Enterprise Performance Management (Enterprise Manager Fusion Middleware Control, Remote Diagnostic Agent, Dynamic Monitoring Service, and Oracle Diagnostic Framework). Expedite resolution of Service Requests as you learn to upload output from these tools to My Oracle Support. Who will benefit from attending the session? Geeks will find this most beneficial, but anyone who raises Oracle technical service requests will learn valuable pointers that may speed resolution. The focus is on the EPM stack, but this session will benefit almost everyone who needs to drill deeper into Oracle software environments. What will delegates learn from the session? Delegates who participate in this session will learn: How to access and run Remote Diagnostic Agent, Enterprise Manager Fusion Middleware Control, Dynamic Monitoring Service, and Oracle Diagnostic Framework. How to exploit the strengths of each tool. How to pass the outputs to My Oracle Support. How to restrict exposure of sensitive information. OUG Ireland BI & EPM and Technology Joint SIG Meeting 20 November 2012 BI&EPM SIG event in Ireland (09:00-17:00) and OUG London EPM & Hyperion Conference 2012 Tuesday 23rd to Wednesday 24th Oct 2012 Using EPM-Specific Troubleshooting Tools EPM developers have created a number of EPM-specific tools to collect logs and configuration files, centralize configuration information, and validate a configured installation (Ziplogs, EPM Registry Editor, [Deployment Report, Registry Cleanup Utility, Reset Configuration Tool, EPMSYS Hostname Check] and Validate [EPM System Diagnostic]). Learn how to use these tools on your own or to expedite Service Request resolution. Who will benefit from attending the session? Anyone who monitors Oracle EPM environments or raises service requests will learn valuable lessons that could speed resolution of those requests. Anyone from novices to experts will benefit from this review of custom troubleshooting EPM tools. What will delegates learn from the session? Learn where to locate and start the EPM troubleshooting tools created by EPM developers. Learn how to collect and upload outputs of EPM troubleshooting tools. Learn how these tools have changed across time and versions. Learn how to make critical changes in configurations. Grzegorz Reizer OUG London EPM & Hyperion Conference 2012 Tuesday 23rd to Wednesday 24th Oct 2012 EPM 11.1.2.2: Detailed overview of new features and improvements in Financial Management products. 
This presentation is a detailed overview of new features and improvements introduced in Enterprise Performance Management 11.1.2.2 for Financial Management products (Hyperion Financial Management, Hyperion Planning, Financial Close Management). The presentation will cover a number of new product features, from recently introduced configurable dimensionality in HFM to new functionality enhancements in Planning. We'll close the session with an overview of upgrade options from earlier product releases.

    Read the article

  • Do MORE with WebCenter

    - by Michael Snow
    We’ve been extremely busy here on the Oracle WebCenter team. We hope that you’ve all been keeping up with the interesting news each week. Last week was jammed full of GartnerPCC and Gartner360 buzz. If you missed any of the highlights – be sure to check out both Kellsey’s post from last week: Gartner PCC: A Shovel & Some Ah-Ha's and Christie’s overview of Loren Weinberg’s PCC presentation: "Here Today, Gone Tomorrow: Engage Your Customers or Lose Them". This week, we’ll be focusing on “Doing More with WebCenter” leading up to a great webcast scheduled for Thursday, March 22 (invite and registration link below). This is the 2nd in a series of 3 webcasts dedicated to expanding the understanding of the full capabilities of WebCenter. Yes – that might mean that you are not getting the full benefits of the software you already own or the expansion potential via upgrade to the full WebCenter Suite Plus. Tune in on Thursday at 10 a.m. PT / 1 p.m. ET.  ++++++++++++++ Want to be a Speaker at Oracle OpenWorld 2012? Oracle OpenWorld planning has already kicked off. We know that it is only March and next October is far in the distance. But planning has already started for Oracle OpenWorld 2012. So if you want to be a speaker and propose your own session for this year's event in San Francisco on September 30th - October 4th, start thinking now!  The annual OpenWorld Call for Papers is now open until April 9th! All of the details to submit a paper are available here. Of course, the WebCenter team here is interested in sessions including case studies, thought-leadership, and customer stories around any of the Oracle WebCenter solutions, but the Call for Papers is open to all Oracle topics. When submitting your topic, be sure to describe what you plan to discuss and the value of the presentation to other attendees. Sell your session, because there will be a lot of competition to be selected.  Bonus News: Speakers for selected sessions receive a complimentary full conference pass! Get your papers in and we'll see you in San Francisco! ~~~~~~~~~~~~~~~~~~~~~~ Webcast Series: Do More with Oracle WebCenter - Expand Beyond Content Management Enable Employees, Partners, and Customers to Do More with Your Content Did you know that, in addition to content management, Oracle WebCenter now also includes comprehensive portal, composite application, collaboration, and Web experience management capabilities? Join us for this Webcast and learn how you can provide a new level of user engagement. Learn how Oracle WebCenter:
Drives task-specific application data and content to a single screen for executing specific business processes
Enables mixed internal and external environments where content can be securely shared and filtered with employees, partners, and customers, based upon role-based security
Offers Web experience management, driving contextually relevant, social, and interactive online experiences across multiple channels
Provides social features that enable sharing, activity feeds, collaboration, expertise location, and best-practices communities
Learn how to do more with Oracle WebCenter. Register now for the Webcast. Join us for the second Webcast in the series "Do More With Oracle WebCenter". March 22, 2012, 10 a.m. PT / 1 p.m. ET. Presented by: Michelle Huff, Senior Director, WebCenter Product Management, Oracle, and Greg Utecht, Project Manager, IT Operations, TIES.

    Read the article

  • DATEFROMPARTS

    - by jamiet
    I recently overheard a remark by Greg Low in which he said something akin to "the most interesting parts of a new SQL Server release are the myriad of small things that are in there that make a developer's life easier" (I'm paraphrasing because I can't remember the actual quote, but it was something like that). The new DATEFROMPARTS function is a classic example of that. It simply takes three integer parameters and builds a date out of them (if you have used DateSerial in Reporting Services then you'll understand). Take the following code which generates the first and last day of some given years:

      SELECT 2008 AS Yr INTO #Years
      UNION ALL SELECT 2009
      UNION ALL SELECT 2010
      UNION ALL SELECT 2011
      UNION ALL SELECT 2012

      SELECT [FirstDayOfYear] = CONVERT(DATE,CONVERT(CHAR(8),((y.[Yr] * 10000) + 101))),
             [LastDayOfYear]  = CONVERT(DATE,CONVERT(CHAR(8),((y.[Yr] * 10000) + 1231)))
      FROM   #Years y

    That query returns the first and last day of each of those years, but the code is pretty gnarly with those CONVERTs in there and, worse, if the character string is constructed in a certain way then it could fail due to localisation. Check this out:

      SET LANGUAGE french;
      SELECT dt, Month_Name = DATENAME(mm,dt)
      FROM   (
             SELECT dt = CONVERT(DATETIME,CONVERT(CHAR(4),y.[Yr]) + N'-01-02')
             FROM   #Years y
             ) d;

      SET LANGUAGE us_english;
      SELECT dt, Month_Name = DATENAME(mm,dt)
      FROM   (
             SELECT dt = CONVERT(DATETIME,CONVERT(CHAR(4),y.[Yr]) + N'-01-02')
             FROM   #Years y
             ) d;

    Notice how the datetime has been converted differently based on the language setting. Under French, the string "2012-01-02" gets interpreted as 1st February, whereas under us_english the same string is interpreted as 2nd January. Instead of all this CONVERTing nastiness we have DATEFROMPARTS:

      SELECT [FirstDayOfYear] = DATEFROMPARTS(y.[Yr],1,1),
             [LastDayOfYear]  = DATEFROMPARTS(y.[Yr],12,31)
      FROM   #Years y

    How much nicer is that? The bad news of course is that you have to upgrade to SQL Server 2012 or migrate to SQL Azure if you want to use it, as is the way of the world! Don't forget that if you want to try this code out on SQL Azure right this second, for free, you can do so by connecting up to AdventureWorks On Azure. You don't even need to have SSMS handy - a browser that runs Silverlight will do just fine. Simply head to https://mhknbn2kdz.database.windows.net/ and use the following credentials: Database AdventureWorks2012, User sqlfamily, Password sqlf@m1ly. One caveat: SELECT INTO doesn't work on SQL Azure, so you'll have to use this instead:

      DECLARE @y TABLE ([Yr] INT);
      INSERT @y([Yr])
      SELECT 2008 AS Yr
      UNION ALL SELECT 2009
      UNION ALL SELECT 2010
      UNION ALL SELECT 2011
      UNION ALL SELECT 2012;

      SELECT [FirstDayOfYear] = DATEFROMPARTS(y.[Yr],1,1),
             [LastDayOfYear]  = DATEFROMPARTS(y.[Yr],12,31)
      FROM   @y y;

      SELECT [FirstDayOfYear] = CONVERT(DATE,CONVERT(CHAR(8),((y.[Yr] * 10000) + 101))),
             [LastDayOfYear]  = CONVERT(DATE,CONVERT(CHAR(8),((y.[Yr] * 10000) + 1231)))
      FROM   @y y;

    @Jamiet

    Read the article

  • Beginner SQL question: querying gold and silver tag badges in Stack Exchange Data Explorer

    - by polygenelubricants
    I'm using the Stack Exchange Data Explorer to learn SQL, but I think the fundamentals of the question are applicable to other databases. I'm trying to query the Badges table, which according to Stexdex (that's what I'm going to call it from now on) has the following schema:

      Badges
        Id
        UserId
        Name
        Date

    This works well for badges like [Epic] and [Legendary] which have unique names, but the silver and gold tag-specific badges seem to be mixed in together by having the same exact name. Here's an example query I wrote for the [mysql] tag:

      SELECT UserId as [User Link], Date
      FROM Badges
      Where Name = 'mysql'
      Order By Date ASC

    The (slightly annotated) output, as seen on stexdex, is:

      User Link         Date
      ---------------   -------------------   // all for silver except where noted
      Bill Karwin       2009-02-20 11:00:25
      Quassnoi          2009-06-01 10:00:16
      Greg              2009-10-22 10:00:25
      Quassnoi          2009-10-31 10:00:24   // for gold
      Bill Karwin       2009-11-23 11:00:30   // for gold
      cletus            2010-01-01 11:00:23
      OMG Ponies        2010-01-03 11:00:48
      Pascal MARTIN     2010-02-17 11:00:29
      Mark Byers        2010-04-07 10:00:35
      Daniel Vassallo   2010-05-14 10:00:38

    This is consistent with the current list of silver and gold earners at the moment of this writing, but to speak in more timeless terms: as of the end of May 2010 only 2 users have earned the gold [mysql] tag, Quassnoi and Bill Karwin, as evidenced in the above result by their names being the only ones that appear twice. So this is the way I understand it: the first time an Id appears (in chronological order) is for the silver badge; the second time is for the gold. Now, the above result mixes the silver and gold entries together. My questions are:

    1. Is this a typical design, or is there a much friendlier schema/normalization/whatever you call it?
    2. In the current design, how would you query the silver and gold badges separately? GROUP BY Id and picking the min/max or first/second by the Date somehow?
    3. How can you write a query that lists all the silver badges first and then all the gold badges next? Imagine also that the "real" query may be more complicated, i.e. not just listing by date. How would you write it so that it doesn't have too much repetition between the silver and gold subqueries? Is it perhaps more typical to do two totally separate queries instead?
    4. What is this idiom called? A row "partitioning" query to put them into "buckets" or something?

    Read the article

  • Pros/Cons of MySQL vs Postgresql for production Ruby on Rails environment?

    - by cakeforcerberus
    I will soon be switching from sqlite3 to either postgres or mysql. What should I consider when making this decision? Is mysql more suited for Rails than postgres in some areas and/or vice versa? Or, as I somewhat suspect, does it not really matter either way? Another factor that might play into my decision is the availability of tools to data pump my test data from the sqlite3 db to my new one. Is there anything that ActiveRecord provides natively to do this or any decent plugins/gems to help with this task? BONUS: How do I pronounce "Postgresql" and sound like I know what I'm talking about? :) Thanks Greg Smith for providing the following link that shows the most common pronunciations: http://www.postgresql.org/community/survey.33 UPDATE: Reference this question for more: http://stackoverflow.com/questions/110927/do-you-recommend-postgresql-over-mysql FYI: I ended up using MySQL. There is a neat plugin called yamldb that really saved me some time with the data transfer from my sqlite db to my new mysql one. Instructions on how to install and use it can be found here: http://accidentaltechnologist.com/ruby/change-databases-in-rails-with-yamldb/ Thanks Tom

    Read the article

  • Directly call distutils' or setuptools' setup() function with command name/options, without parsing

    - by Ryan B. Lynch
    I'd like to call Python's distutils' or setuptools' setup() function in a slightly unconventional way, but I'm not sure whether distutils is meant for this kind of usage. As an example, let's say I currently have a 'setup.py' file, which looks like this (lifted verbatim from the distutils docs--the setuptools usage is almost identical): from distutils.core import setup setup(name='Distutils', version='1.0', description='Python Distribution Utilities', author='Greg Ward', author_email='[email protected]', url='http://www.python.org/sigs/distutils-sig/', packages=['distutils', 'distutils.command'], ) Normally, to build just the .spec file for an RPM of this module, I could run python setup.py bdist_rpm --spec-only, which parses the command line and calls the 'bdist_rpm' code to handle the RPM-specific stuff. The .spec file ends up in './dist'. How can I change my setup() invocation so that it runs the 'bdist_rpm' command with the '--spec-only' option, WITHOUT parsing command-line parameters? Can I pass the command name and options as parameters to setup()? Or can I manually construct a command line, and pass that as a parameter, instead? NOTE: I already know that I could call the script in a separate process, with an actual command line, using os.system() or the subprocess module or something similar. I'm trying to avoid using any kind of external command invocations. I'm looking specifically for a solution that runs setup() in the current interpreter. For background, I'm converting some release-management shell scripts into a single Python program. One of the tasks is running 'setup.py' to generate a .spec file for further pre-release testing. Running 'setup.py' as an external command, with its own command line options, seems like an awkward method, and it complicates the rest of the program. I feel like there may be a more Pythonic way.

    Read the article

  • How to parse this xml document?

    - by dfjhdfjhdf
    I get the following XML document with the help of Ajax (var data = request.responseXML;). How do I parse the contacts?

      <?xml version="1.0" encoding="UTF-8"?>
      <Alladresses xmlns="http://somedomain.org/doc/2007-08-02/">
        <Owner>
          <ID>gut74hfowesdfj49fjsifhryh8e8rta3uyhw4</ID>
          <Name>Mr.Bin</Name>
        </Owner>
        <Contacts>
          <Person>
            <Name>Greg</Name>
            <Phone>3254566756</Phone>
          </Person>
          <Person>
            <Name>Smith</Name>
            <Phone>342446446</Phone>
          </Person>
          <Person>
            <Name>Yuliya</Name>
            <Phone>675445566867</Phone>
          </Person>
        </Contacts>
      </Alladresses>

    Read the article

  • Using CURL within a loop to download a file, 1st one works, 2nd one times out

    - by kitenski
    Morning all, I am using CURL to download an image file within a loop. The first time it runs fine and I see the image appear in the directory. The second time it fails with a timeout, despite it being a valid URL. Can anyone suggest why it always fails on the 2nd time and how to fix it? The snippet of code is:

      // download image
      $extension = "gif";
      $ch = curl_init();
      curl_setopt($ch, CURLOPT_TIMEOUT, 90);
      curl_setopt($ch, CURLOPT_URL, $imgurl);
      curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
      curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
      echo $imgurl . " attempting to open URL ";
      $i = curl_exec($ch);
      if ( $i==false ) {
          echo curl_errno($ch).' '.curl_error($ch);
      }
      $image_name=time().'.'.$extension;
      $f = fopen('/fulldirectorypath/' . $image_name ,'w+');
      fwrite($f,$i);
      fclose($f);

    I have put an echo in there to display $imgurl, to check it is valid, and upped the timeout to 90 secs, but it still fails. This is what I see on screen: http://images.eu-xmedia.de/thumbnails/34555861/5676051/pt=pic,lang=2,origfile=yes/image.gif attempting to open URL 28 Operation timed out after 90 seconds with 0 bytes received. An empty file is created in my directory. Thanks a lot, Greg

    Read the article

  • Attempting to Convert Byte[] into Image... but is there platform issues involved

    - by user305535
    Greetings, Currently I'm attempting to develop an application that takes a byte array that is streamed to us from a Linux C language program across a TCPClient (stream) and reassembles it back into an image/jpg. The "sending" application was developed by an off-site developer who claims that the image reassembles back into an image without any problems or errors in his test environment (all Linux)... However, we are not so fortunate. I believe we successfully get all of the data sent, storing it as a string (which lets us append the stream until it is complete) and then we convert it back into a byte[]. This appears to be working fine... But, when we take the byte[] we get from the streaming (and our string assembly) and try to convert it into an image using System.Drawing.Image.FromStream() we get errors.... Anyone have any idea what we're doing wrong? Or, does anyone know if this is a cross-platform issue? We're developing our app for Windows XP and C# .NET, but the off-site developer did his work in C and Linux... perhaps there's some difference as to how each operating system converts images into byte arrays? Anyway, here's the code for converting our received byte array (from the TCPClient stream) into an image. This code works when we send an image from a test machine we built that RUNS on XP, but not from the Linux box...

      System.Text.ASCIIEncoding encoding = new System.Text.ASCIIEncoding();
      byte[] imageBytes = encoding.GetBytes(data);
      MemoryStream ms = new MemoryStream(imageBytes, 0, imageBytes.Length);
      // Convert byte[] to Image
      ms.Write(imageBytes, 0, imageBytes.Length);
      System.Drawing.Image image = System.Drawing.Image.FromStream(ms, false); // <-- DIES here, throws a System.ArgumentException: Parameter is not valid.

    Any advice, suggestions, theories, or HELP would be GREATLY appreciated! Please let me know??? Best wishes all! Thanks in advance! Greg
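    A likely culprit, offered here as a hedged editorial note rather than part of the original question: ASCIIEncoding is 7-bit, so round-tripping binary image data through a string replaces every byte above 0x7F with '?', which corrupts JPEG data. Reading the raw bytes straight from the stream avoids the problem. A minimal C# sketch, assuming a connected TcpClient named client whose sender closes its end of the connection once the image has been sent:

      using System.Drawing;
      using System.IO;
      using System.Net.Sockets;

      static class ImageReceiver
      {
          // Hypothetical sketch: buffer the raw bytes from the network stream and
          // decode them directly, with no string round-trip.
          public static Image ReceiveImage(TcpClient client)
          {
              using (NetworkStream ns = client.GetStream())
              using (MemoryStream ms = new MemoryStream())
              {
                  byte[] buffer = new byte[8192];
                  int read;
                  // Read until the sender closes the connection (Read returns 0).
                  while ((read = ns.Read(buffer, 0, buffer.Length)) > 0)
                  {
                      ms.Write(buffer, 0, read);
                  }
                  ms.Position = 0; // rewind before decoding
                  using (Image fromStream = Image.FromStream(ms))
                  {
                      // Copy into a new Bitmap so the result does not depend on the
                      // MemoryStream staying open (a GDI+ quirk of FromStream).
                      return new Bitmap(fromStream);
                  }
              }
          }
      }

    If the decode still fails, dumping the buffered bytes to a file and checking that it starts with the JPEG marker bytes 0xFF 0xD8 (and matches the file size on the Linux side) is a quick way to see whether bytes are being lost or altered in transit.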

    Read the article

  • XNA Notes 011

    - by George Clingerman
    Even with a lot of the XNA community working on Dream Build Play entries ( I swear I’m going to finish mine this year!) people are still finding time to do side projects and be amazingly active in the XNA and XBLIG community. With my one eye on my code and one eye on the community, here’s what I noticed these over achievers doing this past week! Time Critical XNA News: Xbox LIVE Indie Games sales data will be delayed March 17-20th due to some schedule maintenance http://create.msdn.com/en-us/news/indie_games_data_delay_march2011 GameMarx is releasing a series of videos to help raise donations for victims of the earthquakes and tsunami in Japan. Help out if you can! http://www.gamemarx.com/video/special/29/help-japan-sushido.aspx XNA MVPs: Catalin Zima shares his thoughts on the MVP summit and my book! http://www.catalinzima.com/2011/03/mvp-summit-2011/ Glenn Wilson (@mykre) helps the XNA team announce some new educational content that you don’t want to miss if you’re porting your app or game to Windows Phone 7 http://www.virtualrealm.com.au/Blog/tabid/62/EntryId/653/Porting-your-App-or-Game-to-Windows-Phone-7.aspx and Windows Phone 7 from scratch http://www.virtualrealm.com.au/Blog/tabid/62/EntryId/654/Windows-Phone-from-Scratch.aspx and shares a link to some free architectural models and textures http://twitter.com/#!/Mykre/status/46410160784158720 George (that’s me!) shares his MVP Summit 2011 summary and XBLIG thoughts http://geekswithblogs.net/clingermangw/archive/2011/03/15/144366.aspx XNA Developers: @SmallCaveGames shares a Code of Ethics for Xbox LIVE Indie Game Developers http://smallcavegames.blogspot.com/2011/03/unofficial-xblig-developers-code-of.html Derek S adds more Xbox LIVE Indie Game studios to his master list of XBLIG links http://twitter.com/#!/Mr_Deeke/status/46140996056125440 http://xbl-indieverse.blogspot.com/p/xblig-links.html Making games and want to help kids? Then share your story with GameFace: America! http://gameitupinitiative.com/about-the-initiative/programs/gameface-america/ Xbox LIVE Indie Games (XBLIG): XonaGames shares some video footage of their booth from GDC 2011 Video 1: http://youtu.be/lxIV9nk3Gq4 Video 2: http://youtu.be/GgfrjqkxR_o Video 3: http://youtu.be/yVcpXrTX7SQ Joystiq on Mommy’s Best Games Serious Sam Double D http://www.joystiq.com/2011/03/16/the-most-important-thing-about-serious-sam-double-d/ And The Escapist recommends that gamers start learning to avoid cleavage now http://www.escapistmagazine.com/news/view/108543-Boobie-Bomber-Makes-First-Appearance-in-Serious-Sam-Double-D Magiko Gaming started a blog on the XBLIG dashboard daily Top 10 games in the US. Good way to go back in time and look at the history of which games were in the the Top 10. http://dailytop10indiegames.wordpress.com/ Where are they going now? XBLIG developers at a crossroads.. http://www.gamesetwatch.com/2011/03/where_are_they_going_now_xblig.php http://www.gamasutra.com/view/news/33527/InDepth_Where_Are_They_Going_Now_XBLIG_Developers_At_A_Crossroads_.php BinaryTweed’s Clover: A Curious Tail is Xbox LIVE’s Deal of the Week! http://www.armlessoctopus.com/2011/03/15/what-luck-clover-a-curious-tale-is-half-price-this-week/ Looking for an Xbox LIVE Indie Game to buy? Writings of Mass Deduction has over 125 suggestions at this point! 
http://writingsofmassdeduction.com/ SkaStudios shares Vampire Smile Achievements AND their PAX East 2011 Booth Setup video http://www.ska-studios.com/2011/03/14/vampire-smile-achievement/ http://www.ska-studios.com/2011/03/15/pax-booth-setup-time-lapse/ MasterBlud and VVGTV start a new community for XBLIG developers and gamers to join http://vvgtv.forumotion.com/ Raymond Matthews (@DrakstarMatryx) covers Mommy’s Best Games getting Serious http://www.darkstarmatryx.com/?p=286 XNA Development: Dave Henry (@mort8088) posts the 4th tutorial in his series XNA 4.0 SpriteBatch extended http://mort8088.com/2011/03/11/xna-4-0-tutorial-4-spritebatch-extended/ Tutorial 5 - Creating a manual blank texture http://mort8088.com/2011/03/13/xna-4-tutorial-5-manual-blank-texture/ XNA 4.0 Tutorial 6 - Spritesheet Object http://mort8088.com/2011/03/18/xna-4-0-tutorial-6-spritesheet-object/ Jason Mitchell shares a tutorial on setting the alpha value for spritebatch in XNA 4.0 http://www.jason-mitchell.com/index.php/2011/03/13/setting-alpha-value-for-spritebatch-draw-in-xna-4/ XNA for Silverlight Developers: Part 7 - Collision Detection http://www.silverlightshow.net/items/XNA-for-Silverlight-developers-Part-7-Collision-detection.aspx Markus Ewald (@Cygon4) shares the full Ninject 2.0 binding for XNA and Sunburn http://twitter.com/#!/Cygon4/status/48330203826622464 Michael B. McLaughlin shares an AccelerometerInput XNA GameComponent he created (which I’m probably going to snag for a game I’m working on...) http://geekswithblogs.net/mikebmcl/archive/2011/03/17/accelerometerinput-xna-gamecomponent.aspx Extra Credits tackles the building of a good tutorial. Must watch for all Indie game devs (thanks for pointing it out Evan Johnson!) http://twitter.com/#!/johnsonevan/status/48452115680604160 http://www.escapistmagazine.com/videos/view/extra-credits/2921-Tutorials-101 ExEn is fully funded at this point so definitely something for XBLIG developers to keep an eye on as they consider releasing their games on other platforms http://rockethub.com/projects/752-exen-xna-for-iphone-android-and-silverlight Channel 9 and Greg Duncan post Mixing the Game State Management and Platformer XNA Recipes http://channel9.msdn.com/coding4fun/blog/Mixing-the-Game-State-Management-and-Platformer-XNA-Recipes Sgt. Conker has noticed Mike McLaughlin has been crazy productive and has done a recap of his recent posts http://www.sgtconker.com/2011/03/recap-of-mikebmcls-posts/

    Read the article

  • SQL Saturday 43 in Redmond

    - by AjarnMark
    I attended my first SQLSaturday a couple of days ago, SQLSaturday #43 in Redmond (at Microsoft).  I got there really early, primarily because I forgot how fast I can get there from my home when nobody else is on the road.  On a weekday in rush hour traffic, that would have taken two hours to get there.  I gave myself 90 minutes, and actually got there in about 45.  Crazy! I made the mistake of going to the main Microsoft campus, but that’s not where the event was being held.  Instead it was in a big Microsoft conference center on the other side of the highway.  Fortunately, I had the address with me and quickly realized my mistake.  When I got back on track, I noticed that there were bright yellow signs out on the street corner that looked like they said they were for SOL Saturday, which actually was appropriate since it was the sunniest day around here in a long time. Since I was there so early, the registration was just getting setup, so I found Greg Larsen who was coordinating things and offered to help.  He put me to work with a group of people organizing the pre-printed raffle tickets and stuffing swag bags. I had never been to a SQLSaturday before this one, so I wasn’t exactly sure what to expect even though I have read about a few on some blogs.  It makes sense that each one will be a little bit different since they are almost completely volunteer driven, and the whole concept is still in its early stages.  I have been to the PASS Summit for the last several years, and was hoping for a smaller version of that.  Now, it’s not really fair to compare one free day of training run entirely by volunteers with a multi-day, $1,000+ event put on under the direction of a professional event management company.  But there are some parallels. At this SQLSaturday, there was no opening general session, just coffee and pastries in the common area / expo hallway and straight into the first group of sessions.  I don’t know if that was because there was no single room large enough to hold everyone, or for other reasons.  This worked out okay, but the organization guy in me would have preferred to have even a 15 minute welcome message from the organizers with a little overview of the day.  Even something as simple as, “Thanks to persons X, Y, and Z for helping put this together…Sessions will start in 20 minutes and are all in rooms down this hallway…the bathrooms are on the other side of the conference center…lunch today is pizza and we would like to thank sponsor Q for providing it.”  It doesn’t need to be much, certainly not a full-blown Keynote like at the PASS Summit, but something to use as a rallying point to pull everyone together and get the day off to an official start would be nice.  Again, there may have been logistical reasons why that was not feasible here.  I’m just putting out my thoughts for other SQLSaturday coordinators to consider. The event overall was great.  I believe that there were over 300 in attendance, and everything seemed to run smoothly.  At least from an attendee’s point of view where there was plenty of muffins in the morning and pizza in the afternoon, with plenty of pop to drink.  And hey, if you’ve got the food and drink covered, a lot of other stuff could go wrong and people will be very forgiving.  But as I said, everything appeared to run pretty smoothly, at least until Buck Woody showed up in his Oracle shirt.  Other than that, the volunteers did a great job! I was a little surprised by how few people in my own backyard that I know.  
It makes sense if you really think about it, given how many companies must be using SQL Server around here.  I guess I just got spoiled coming into the PASS Summit with a few contacts that I already knew would be there.  Perhaps I have been spending too much time with too few people at the Summits and I need to step out and meet more folks.  Of course, it also is different since the Summit is the big national event and a number of the folks I know are spread out across the country, so the Summit is the only time we’re all in the same place at the same time.  I did make a few new contacts at SQLSaturday, and bumped into a couple of people that I knew (and a couple others that I only knew from Twitter, and didn’t even realize that they were here in the area). Other than the sheer entertainment value of Buck Woody’s session, the one that was probably the greatest value for me was a quick introduction to PowerShell.  I have not done anything with it yet, but I think it will be a good tool to use to implement my plans for automated database recovery testing.  I saw just enough at the session to take away some of the intimidation factor, and I am getting ready to jump in and see what I can put together in the next few weeks.  And that right there made the investment worthwhile.  So I encourage you, if you have the opportunity to go to a SQLSaturday event near you, go for it!

    Read the article

  • SQLAuthority News – #SQLPASS 2012 Seattle Update – Memorylane 2009, 2010, 2011

    - by pinaldave
    Today is the first day of SQLPASS 2012 and I will soon be posting my SQL Server 2012 experience over here. Today when I landed in Seattle, I got that nostalgic feeling. I used to live in the USA. I stayed here for more than 7 years – I studied here and I worked here. I had lots of friends in Seattle when I lived in the USA. I always wanted to visit Seattle because it is THE place. I remember once I purchased a ticket to travel to Seattle through Priceline (well, it was the cheapest option and I was a student) but could not fly because of an interesting issue. I used to be the Teaching Assistant for an advanced course and the professor asked me to build a pop quiz for the course. I unfortunately had to cancel the trip. Before I returned to India, I pretty much covered every city on my must-visit list, except one – Seattle. It is interesting that I never made it to Seattle even though I wanted to visit while I was in the USA. After that one time I never got a chance to travel to Seattle. After a few years I returned to India for good. Once I saw the movie "Sleepless in Seattle" playing on television and I immediately changed the channel, as it reminded me that I had never made it to Seattle. However, destiny has its own way of handling decisions. After I returned to India, I visited Seattle a total of 5 times, and this is my 6th visit to Seattle in less than 3 years. I was here for 3 previous SQLPASS events – 2009, 2010, and 2011 – as well as two Microsoft Most Valuable Professional Summits in 2009 and 2010. During these five trips I tried to catch up with all of my friends, but I realized that time has its own way of doing things. Many moved out of Seattle and many were too busy to revive the old friendship, but there were a few who always make a point to meet me when I travel to the city. During the course of my visits I have made a few fantastic new friends – Rick Morelan (Joes 2 Pros) and Greg Lynch. Every time I meet them I feel that I have known them for years. I think the city of Seattle has played a very important part in my getting these fantastic friends. SQLPASS is the event where I find all of my SQL Friends and I look forward to this event the entire year. This year my goal is to meet as many new friends as I can. If you are going to be at SQLPASS – FIND ME. I want to have a photo with you. I want to remember each name, as I believe this is a very important part of our life – making new friends and sustaining those friendships. Here are a few of the places where you can find me:
All Keynotes – Blogger's Table
Exhibition Booth – Joes 2 Pros Booth #117 – Do not forget to stop by the booth – I might have goodies for you – limited editions.
Book Signing Events – Check details in tomorrow's blog or stop by Booth #117
Evening Parties 6th Nov – Welcome Reception
Evening Parties 7th Nov – Exhibitor Reception – Do not miss Booth #117
Evening Parties 8th Nov – Community Appreciation Party
Additionally at a few other locations – the Embarcadero Booth and coffee shops in the Convention Center
If you are at SQLPASS – make sure that I find an opportunity to meet you at the event. Reserve a little time and let's have a coffee together. I will be continuously tweeting about my whereabouts on Twitter, so let us stay connected on Twitter. Here is my experience from attending earlier SQLPASS events:
SQLAuthority News – Book Signing Event – SQLPASS 2011 Event Log
SQLAuthority News – Meeting SQL Friends – SQLPASS 2011 Event Log
SQLAuthority News – Story of Seattle – SQLPASS 2011 Event Log
SQLAuthority News – SQLPASS Nov 8-11, 2010-Seattle – An Alternative Look at Experience
SQLAuthority News – Notes of Excellent Experience at SQL PASS 2009 Summit, Seattle
Let us meet! Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: PostADay, SQL, SQL Authority, SQL PASS, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • Reading the tea leaves from Windows Azure support

    - by jamiet
    A few idle thoughts… Three months ago I had an issue regarding Windows Azure where I was unable to log in to the management portal. At the time I contacted Azure support, the issue was soon resolved and I thought no more about it. Until today, that is, when I received an email from Azure support providing a detailed analysis of the root cause, the fix and, moreover, precise details about when and where things occurred. The email itself is interesting and I have included the entirety of it below. A few things were interesting to me: The level of detail and the diligence in investigating and reporting the issue I found really rather impressive. They even outline the number of users that were affected (127 in case you can’t be bothered reading). Compare this to the quite pathetic support that another division within Microsoft, Skype, provided to Greg Low recently: Skype support and dead parrot sketches   This line: “Windows Azure performed a planned change from using the Microsoft account service (formerly Windows Live ID) to the Azure Active Directory (AAD) as its primary authentication mechanism on August 24th. This change was made to enable future innovation in the area of authentication – particularly for organizationally owned identities, identity federation, stronger authentication methods and compliance certification. ” I also found to be particularly interesting. I have long thought that one of the reasons Microsoft has proved to be such a money-making machine in the enterprise is because they provide the infrastructure and then upsell on top of that – and nothing is more infrastructural than Active Directory. It has struck me of late that they are trying to make the same play in the cloud by tying all their services into Azure Active Directory, and here we see a clear indication of that by making AAD the authentication mechanism for anyone using Windows Azure. I get the feeling that we’re going to hear much, much more about AAD in the future; isn’t it about time we could log on to Windows Azure SQL Database (formerly SQL Azure) without resorting to SQL authentication, for example? And why do Microsoft have two identity providers – Microsoft Account (aka Windows Live ID) and AAD – isn’t it about time those things were combined? As I said, just some idle thoughts. Below is the transcript of the email if you are interested. @Jamiet  This is regarding the support request <redacted> where in you were not able to login into the windows azure management portal with live id. We are providing you with the summary, root cause analysis and information about permanent fix: Incident Title: You were unable to access Windows Azure Portal after Microsoft Account to Azure Active Directory account Migration. Service Impacted: Management Portal Incident Start Date and Time: 8/24/2012 4:30:00 PM Date and Time Service was Restored: 10/17/2012 12:00:00 AM Summary: Windows Azure performed a planned change from using the Microsoft account service (formerly Windows Live ID) to the Azure Active Directory (AAD) as its primary authentication mechanism on August 24th.   This change was made to enable future innovation in the area of authentication – particularly for organizationally owned identities, identity federation, stronger authentication methods and compliance certification.   While this migration was largely transparent to Windows Azure users, a small number of users whose sign-in names were part of a Windows Live Custom Domain were unable to login.   
This incompatibility was not discovered during the Quality Assurance testing phase prior to the migration. Customer Impact: Customers whose sign-in names were part of a Windows Live Custom Domain were unable to sign-in the Management Portal after ~4:00 p.m. PST on August 24th, 2012.   We determined that the issue did impact at least 127 users in 98 of these Windows Live Custom Domains and had a maximum potential impact of 1,110 users in total. Root Cause: The root cause of the issue was an incompatibility in the AAD authentication service to handle logins from Microsoft accounts whose sign-in names were part of a Windows Live Custom Domains.  This issue was not discovered during the Quality Assurance testing phase prior to the migration from Microsoft Account (MSA) to AAD. Mitigations: The issue was mitigated for the majority of affected users by 8:20 a.m. PST on August 25th, 2012 by running some internal scripts to correct many known Windows Live Custom Domains.   The remaining affected domains fell into two categories: Windows Live Custom Domains that were not corrected by 8/25/2012. An additional 48 Windows Live Custom Domains were fixed in the weeks following the incident within 2 business days after the AAD team received an escalation from product support regarding those accounts. Windows Live Custom domains that were also provisioned in Office365. Some of the affected Windows Live Custom Domains had already been provisioned in AAD because their owners signed up for Office365 which is a service that also uses AAD.   In these cases the Azure customers had to work around the issue by renaming their Microsoft Account or using a different Microsoft Account to administer their Azure subscription. Permanent Fix: The Azure Active Directory team permanently fixed the issue for all customers on 10/17/2012 in an upgraded release of the AAD service.

    Read the article

  • Delphi TBytesField - How to see the text properly - Source is HIT OLEDB AS400

    - by myitanalyst
    We are connecting to a multi-member AS400 iSeries table via HIT OLEDB and HIT ODBC. You connect to this table via an alias to access a specific member. We create the alias on the AS400 this way:

      CREATE ALIAS aliasname FOR table(membername)

    We can then query each member of the table this way:

      SELECT * FROM aliasname

    We are testing this in Delphi 6 first, but will move it to D2010 later. We are using HIT OLEDB for the AS400. We are pulling down records from a table and the field is being seen as a tBytesField. I have also tried the ODBC driver and it sees it as a tBytesField as well. Directly on the AS400 I can query the data and see readable text. I can use the iSeries Navigation tool and see readable text as well. However, when I bring it down to the Delphi client via the HIT OLEDB or HIT ODBC and try to view it via asString, I just see unreadable text... something like this: ñðð@ðõñððððñ÷@õôððõñòøóóöøñðÂÁÕÒ@ÖÆ@ÁÔÅÙÉÃÁ@@@@@@@@ÂÈÙÉâãæÁðòñè@ÔK@k@ÉÕÃK@@@@@@@@@ç I jumbled up the text above, but those are the kinds of characters that show up. When I did a test in D2010 the text looks like Japanese or Chinese characters, but if I display it as an AnsiString then it looks like what it does in Delphi 6. I am thinking this may have something to do with code pages or character sets, but I have no experience in this area, so it is new to me if it is related. When I look at the Coded Character Set on the AS400 it is set to 65535. What do I need to do to make this text readable? We do have a third-party component (Delphi400) that makes things behave in a more native AS400 manner. When I use its AS400 connection and AS400 query components it shows the field as a tStringField and displays just fine. BUT we are phasing out this product (for a number of reasons) and would really like to get the OLEDB with the ADO components to work. Just for clarification, the HIT OLEDB with tADOQuery does have some fields showing as tStringFields for many of the other tables we use... not sure why it is showing as a tBytesField in this case. I am not an AS400 expert, but looking at the field definitions on the AS400, the ones showing up as tBytesField look the same as the ones showing up as tStringFields... but there must be a difference. Maybe due to being a multi-member? So... does anyone have any guidance on how to get the correct string data that is readable? If you need more info please ask. Greg
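    A hedged editorial aside, not part of the original question: CCSID 65535 on the AS400 means "no conversion", so the provider hands the column back as raw EBCDIC bytes, and the sample above is consistent with EBCDIC viewed through an ANSI code page (0xF0-0xF9 are the EBCDIC digits, 0x40 is the EBCDIC space). The usual fixes are to change or override the column's CCSID on the AS400 side, or to convert the raw bytes on the client. A minimal sketch of the client-side conversion idea, written in C# only to stay consistent with the other code in this listing (in Delphi the equivalent is a code-page conversion of the raw bytes):

      using System.Text;

      static class Ebcdic
      {
          // Hypothetical sketch: decode raw EBCDIC (CCSID 37, EBCDIC US/Canada) bytes to text.
          // On the classic .NET Framework, Encoding.GetEncoding(37) works out of the box;
          // on .NET Core/.NET 5+ it needs the System.Text.Encoding.CodePages package plus
          // Encoding.RegisterProvider(CodePagesEncodingProvider.Instance) at startup.
          public static string DecodeCcsid37(byte[] raw)
          {
              Encoding ebcdic = Encoding.GetEncoding(37); // IBM037
              return ebcdic.GetString(raw);
          }
      }

    The right code page depends on the data's actual CCSID (37 is only a common US/Canada default), so treat the value above as an assumption to verify against the file definition on the AS400.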

    Read the article

  • CQRS - Should a Command try to create a "complex" master-detail entity?

    - by Simon Crabtree
    I've been reading Greg Young and Udi Dahan's thoughts on Command Query Responsibility Separation and a lot of what I read strikes a chord with me. My domain (we track vehicles which are doing deliveries) has the concept of a Route which contains one or more Stops. I need my customers to be able to set these up in our system by calling a webservice, and then be able to retrieve information about a Route and how the vehicle is progressing. In the past I would have "cut-down" DTO classes which closely resemble my domain classes, and the customer would create a RouteDto with an array of StopDto(s), and call our CreateRoute webmethod, passing in the RouteDto. When they query our system by calling the GetRouteDetails method, I would return exactly the same objects to them. One of the appealing aspects of CQRS is that the RouteDto might have all manner of properties that the customer wants to query but has no business setting when they create a Route. So I create a separate CreateRouteRequest class which is passed in when calling the CreateRoute "command", and a Route DTO class which gets returned as a query result.

      class Route {
          string Reference;
          List<Stop> Stops;
      }

    But I need my customer to provide me with Route AND Stop details when they create a route. As I see it I could either...

    1. Give my CreateRouteRequest class a Stops property which is an array of "something" representing the data they need to provide about each stop - but what do I call this class? It's not a Stop, as that's what I'm calling the list of DTOs inside my Route DTO, but I don't like "CreateStopRequest". I also wonder if I'm stuck in a CRUD mindset here, thinking in terms of master-detail information and asking the customer to think like that too.

      class CreateRouteRequest {
          string Reference;
          ...
          List<CreateStopRequest> Stops;
      }

    2. Or they call CreateRoute, and then make a number of calls to an AddStopToRoute method. This feels a bit more "behavioural", but I'm going to lose the ability to treat creating a route, including its stops, as a single atomic command. If they create a Route and then try to add a Stop which fails due to some validation problem, they're going to have a partially correct Route.

    The fact that I can't come up with a good name for the list of "StopCreationData" objects I'd be working with in option 1 makes me wonder if there's something I'm missing.
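    For what it's worth, here is a hedged C# sketch of option 1, with all names hypothetical rather than taken from the original post: nesting the stop-creation data inside the command keeps route creation a single atomic operation and sidesteps the naming clash with the query-side Stop DTO.

      using System;
      using System.Collections.Generic;

      // Hypothetical sketch of option 1: one command carries everything needed to
      // create the route and its stops atomically.
      public class CreateRouteRequest
      {
          public string Reference { get; set; }

          // "Data supplied about a stop at creation time" - deliberately not named
          // Stop, so it cannot be confused with the Stop DTO returned by queries.
          public List<StopDetails> Stops { get; set; } = new List<StopDetails>();

          public class StopDetails
          {
              public string Location { get; set; }
              public DateTime PlannedArrival { get; set; }
          }
      }

    Validating the whole request up front (the route plus every stop) before touching the domain preserves the all-or-nothing behaviour that the AddStopToRoute alternative gives up.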

    Read the article

  • Company Review: Google Products

    Google, Inc offers an array of products and services to all of its end-users. However, their search capabilities are the foundation for Google’s current success and their primary business focus. Currently, Google offers over twenty different search applications that allow users to search the internet for books, maps, videos, images, products and much more. Their product decisions have allowed users’ demands to be met while focusing on the free-based model. This allows users to access Google data free of charge and indirectly gives Google a strong competitive advantage over other competitors, along with the accuracy of the search results. According to Google, Inc, they offer the following types of searching capabilities:
Alerts: Get email updates on the topics of your choice
Blog Search: Find blogs on your favorite topics
Books: Search the full text of books
Custom Search: Create a customized search experience for your community
Desktop: Search and personalize your computer
Dictionary: Search for definitions of words and phrases
Directory: Search the web, organized by topic or category
Earth: Explore the world from your computer
Finance: Business info, news and interactive charts
GOOG-411: Find and connect for free with businesses from your phone
Images: Search for images on the web
Maps: View maps and directions
News: Search thousands of news stories
Patent Search: Search the full text of US Patents
Product Search: Search for stuff to buy
Scholar: Search scholarly papers
Toolbar: Add a search box to your browser
Trends: Explore past and present search trends
Videos: Search for videos on the web
Web Search: Search billions of web pages
Web Search Features: Find movies, music, stocks, books and more
Google’s free-based business model is only one way it differentiates itself from its competition. There is also a strong focus on the accuracy of search results and the speed with which they are returned to the end-user. Quality function deployment (QFD) is a structured method used to help connect user needs to the design features of a project proposed to address those needs. This method is particularly useful in accounting for needs that are not easily articulated or precisely defined, according to the U.S. Department of Transportation Federal Highway Administration. Because QFD is so customer driven, Google is in a constant state of change, attempting to reengineer its search algorithms and other dependent systems so that end-users’ requirements are constantly being met. Value engineering is a key example of this: Google is constantly trying to improve all aspects of its products, system maintainability, and system interoperability. The Bridgefield Group defines value engineering as an organized methodology that identifies and selects the lowest lifecycle cost options in design, materials and processes that achieves the desired level of performance, reliability and customer satisfaction. In addition, it seeks to remove unnecessary costs in the above areas and is often a joint effort with cross-functional internal teams and relevant suppliers. Common issues that appear when developing large-scale systems like Google’s search applications include modular design of a product and/or service and providing accurate value analysis. 
    The Open System Joint Task Force defines modular design as a design approach that adheres to four fundamental tenets (cohesiveness, encapsulation, self-containment, and high binding) in order to design a system component as an independently operable unit that is subject to change. More specifically, M. S. Schmaltz describes modular software design this way: rather than having a large collection of statements strung together in one partition of in-line code, we segment or divide the statements into logical groups called modules. Each module performs one or two tasks, and then passes control to another module. By breaking up the code into "bite-sized chunks", so to speak, we are able to better control the flow of data and control. This is especially true in large software systems.

    Value analysis, as defined by the Healthcare Financial Management Association, is a process to evaluate products and services based on effectiveness, safety, and cost; it involves assessing the quality as well as the cost of a product or service. "Operations Management deals with the design and management of products, processes, services and supply chains. It considers the acquisition, development, and utilization of resources that firms need to deliver the goods and services their clients want." (MIT, 2010)

    Google, Inc. encourages an open environment among all employees, also known as Googlers. This is reinforced by cross-functional teams, made up of members from multiple departments and assigned to every project, so that every department (such as marketing, finance, and quality assurance) has input on every project. In addition, Google is known for its openness to new ideas regardless of the status or seniority of an employee. In fact, Google allows 20% of an employee's time to be devoted to developing new ideas and/or pet projects. HumTech.com defines a cross-functional team as a collection of people with varied levels of skills and experience brought together to accomplish a task. As the name implies, cross-functional team members come from different organizational units. Cross-functional teams may be permanent or ad hoc.

    Google's search application product strategy primarily focuses on mass customization. This allows Google to create a base search application while returning results to end-users quickly based on specific parameters and search settings. In addition, Google also stores the data that is returned in case others desire the same results based on the same customized settings. This allows Google to appear to render search results to the user in virtually real time while allowing for complete customization of the search criteria. Greg Vogl, a professor at Uganda Martyrs University, defines mass customization as when a business gives its customers the opportunity to tailor its products or services to the customer's specifications. The IT staff at Google plays a key role in ensuring that the search application's product strategy is maintained, simply because the IT staff designs, develops, and maintains all of Google's proprietary applications. In fact, they also maintain all network infrastructure to ensure that it is available to all end-users.
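    To make the modular-design idea above concrete, here is a small, purely illustrative C# sketch (the pipeline and its names are invented for this example, not taken from Google or from any of the sources cited below): one long run of in-line code is instead split into modules that each perform a single task and pass control to the next.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Each method below is a small "module": it performs one task and hands its result
    // to the next module, instead of everything living in one long partition of in-line code.
    public static class SearchPipeline
    {
        public static List<string> Run(string query, IEnumerable<string> documents)
        {
            string normalized = NormalizeQuery(query);                  // module 1: clean up the input
            List<string> matches = FindMatches(normalized, documents);  // module 2: select candidates
            return RankResults(matches);                                // module 3: order the output
        }

        static string NormalizeQuery(string query)
        {
            return query.Trim().ToLowerInvariant();
        }

        static List<string> FindMatches(string query, IEnumerable<string> documents)
        {
            return documents.Where(d => d.ToLowerInvariant().Contains(query)).ToList();
        }

        static List<string> RankResults(List<string> matches)
        {
            // A stand-in for a real ranking function: shorter matches first.
            return matches.OrderBy(m => m.Length).ToList();
        }
    }

    Because each module is independently testable and replaceable, changing the ranking logic never touches the query-handling code, which is the maintainability benefit the essay is describing.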
    References:
    http://www.google.com/intl/en/options/
    http://ops.fhwa.dot.gov/freight/publications/ftat_user_guide/sec5.htm
    http://www.bridgefieldgroup.com/bridgefieldgroup/glos9.htm#V
    http://www.acq.osd.mil/osjtf/termsdef.html
    http://www.cise.ufl.edu/~mssz/Pascal-CGS2462/prog-dsn.html
    http://www.hfma.org/publications/business_caring_newsletter/exclusives/Supply+and+Inventory+Terms+Defined.htm
    http://mitsloan.mit.edu/omg/om-definition.php
    http://www.humtech.com/opm/grtl/ols/ols3.cfm
    http://www.gregvogl.net/courses/mis1/glossary.htm

    Read the article

  • SQL Sentry First Impressions

    - by AjarnMark
    After struggling to defend my SQL Servers from a political attack recently, I realized that I needed better tools to back me up, and SQL Sentry is the leading candidate. A couple of weeks ago, seemingly from out of nowhere, complaints from the business users started coming in that one of the core internal applications was running dramatically slower than normal, and fingers were being pointed at the SQL Server.  Unfortunately, we don’t have a production DBA whose entire job is to monitor and maintain our SQL Servers.  The responsibility falls to me to do the best I can, investing only a small portion of my time, because there are so many other responsibilities to take care of, and our industry is still deep in recession.  I inherited these SQL Servers and have made significant improvements in process and procedure, but I had not yet made the time to take real baseline measurements or keep a really close eye on the performance.  Like many DBAs, I wrote several of my own tools and used the “built-in tools” like Profiler, PerfMon, and sp_who2 (did I mention most of our instances are SQL Server 2000?).  These have all served me well for in-the-moment troubleshooting and maintenance, but they really fell down on the job when I was called upon to “prove” that SQL Server performance was acceptable and more importantly had not degraded recently (i.e. historical comparisons).  I really didn’t have anything from a historical comparison perspective, but I was able to show that current performance was acceptable, and deflect attention back onto other components (which in fact turned out to be the real culprit). That experience dramatically illustrated the need for better monitoring tools.  Coincidentally, I had been talking recently to my boss about the mini nightmare of monitoring several critical and interdependent overnight jobs that operate on separate instances of SQL Server.  Among other tools, I had been using Idera’s SQL Job Manager which is a free tool and did a nice job of showing me job schedules and histories in a nice calendar view.  This worked fairly well, and for the money (did I mention it was free?) it couldn’t be beat.  But it is based on the stored job history in MSDB, and there were other performance problems that we ran into when we started changing the settings for how much job history to retain, in order to be able to look back a month or more in the calendar view.  Another coincidence (if you believe in such things) was that when we had some of those performance challenges, I posted a couple of questions to the #sqlhelp hashtag on Twitter and Greg Gonzalez (@SQLSensei) suggested I check out SQL Sentry’s Event Manager.  At the time, I just thought he worked there, but later found out that he founded the company.  When I took a quick look at the features & benefits, the one that really jumped out at me is Chaining and Queueing which sounded like it would really help with our “interdependent jobs on different servers” issue. I know that is a lot of background story and coincidences, but hopefully you have stuck with me so far, and now we have arrived at the point where last week I downloaded and installed the 30-day trial of the SQL Sentry Power Suite, which is Event Manager plus Performance Advisor.  And I must say that I really like what I see so far.  Here are a few highlights: Great Support.  I had two issues getting the trial setup and monitoring a handful of our servers.  
One of which was entirely my fault (missed a security setting in SQL 2008) and the other was mostly my fault (late change to some config settings that were apparently cached and did not get refreshed properly).  In both cases, the support staff at SQL Sentry were very responsive and rather quickly figured out what the cause and fix was for each of them.  This left me with a great impression of the company.  Kudos to them! Chaining and Queueing.  While I have not yet activated this feature, I am very excited about the possibilities.  We have jobs on three different instances of SQL Server that have to be run in a certain order, and each has to finish before the next can successfully begin, and I believe this feature will ensure just that.  It has been a real pain in the backside when one of those jobs runs just a little too long and does not finish before the job on another instance starts, thus triggering a chain reaction of either outright job failures, or worse, successful completion of completely invalid processing. Calendar View.  I really, really like the Event Manager calendar view where I can see all jobs and events across all instances and identify potential resource contention as well as windows of opportunity for maintenance activity.  Very well done, and based on Event Manager’s own database of accumulated historical information rather than querying the source instances every time. Performance Advisor Dashboard History View.  This view let’s me quickly select a date and time range and it displays graphs of key SQL Server and Windows metrics.  This is exactly the thing I needed to answer the “has performance changed recently” question at the beginning of this post. Reporting Services Subscription Jobs with Report Name.  This was a big and VERY pleasant surprise.  If you have ever looked at the list of SQL Server jobs that SQL Server Reporting Services creates when you make a Subscription, you will notice that they all have some sort of GUID as the name of the job.  This is really ugly, and really annoying because when you are just looking at the SQL Agent and Job Activity Monitor, if you see that Job X failed, you really do not have any indication in the name or the properties of the Job itself, as to what Report that was for.  But with SQL Sentry Event Manager you do.  The Jobs list in the Navigator pane in SQL Sentry, amazingly, displays the name of the Report that the Subscription Job is for.  And when you open it to see more details, it shows you the full Reporting Services path to that Report, so you can immediately track it down in the Report Manager in case you want to identify/notify the owner or edit the Subscription information.  I did not expect this at all, but I sure do like it.  HOORAY! That is just my first impressions from using the tools for a few days.  And I haven’t even gotten into how it showed me where I was completely mistaken about one aspect of my SQL Server disk configurations.  I’ll share that lesson in another blog entry.  But I have to say it again, the combination of Event Manager and Performance Advisor working together have really made me a fan.
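    As an aside, for anyone stuck mapping those GUID-named subscription jobs by hand, a rough C# sketch of the kind of lookup involved is below. It assumes the conventional ReportServer catalog tables (ReportSchedule and Catalog) alongside msdb, and the connection string is a placeholder; treat it as a sketch to verify against your own instance rather than a supported query.

    using System;
    using System.Data.SqlClient;

    class SubscriptionJobNames
    {
        static void Main()
        {
            // Placeholder connection string - point it at the instance hosting ReportServer and msdb.
            const string connectionString = "Server=.;Database=ReportServer;Integrated Security=true";

            // SQL Agent names each subscription job after its schedule GUID, so join back
            // through the ReportServer catalog to recover the report path behind each job.
            const string sql = @"
                SELECT j.name AS JobName, c.Path AS ReportPath
                FROM msdb.dbo.sysjobs AS j
                JOIN ReportServer.dbo.ReportSchedule AS rs
                    ON j.name = CAST(rs.ScheduleID AS nvarchar(128))
                JOIN ReportServer.dbo.Catalog AS c
                    ON rs.ReportID = c.ItemID";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                connection.Open();
                using (var reader = command.ExecuteReader())
                {
                    while (reader.Read())
                        Console.WriteLine("{0}  ->  {1}", reader["JobName"], reader["ReportPath"]);
                }
            }
        }
    }

    Of course, part of the appeal of Event Manager is that it surfaces exactly this information without any ad-hoc querying at all.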

    Read the article

  • How to create multiboot flash drive

    - by Nrew
    I've found a guide here: http://www.pendrivelinux.com/boot-multiple-iso-from-usb-multiboot-usb/ And found this menu.lst in my flash drive, which seems to be the one that I'm seeing when I boot using my flash drive: # This Menu Created by Lance http://www.pendrivelinux.com # Ongoing Suggested Menu Entries and the Suggestor are noted! default 0 timeout 30 color NORMAL HIGHLIGHT HELPTEXT HEADING splashimage=(hd0,0)/splash.xpm.gz foreground=FFFFFF background=0066FF title Memtest86+ find --set-root /memtest86+-4.00.iso map --mem /memtest86+-4.00.iso (hd32) map --hook root (hd32) chainloader (hd32) # Suggested by madprofessor title Boot Clonezilla root (hd0,0) kernel /clonezilla/live/vmlinuz live-media-path=clonezilla/live bootfrom=/dev/sd boot=live union=aufs noprompt ocs_live_run="ocs-live-general" ocs_live_extra_param="" ocs_live_keymap="" ocs_live_batch="no" ocs_lang="" vga=791 ip=frommedia initrd /clonezilla/live/initrd.img title Parted Magic 4.9 (Partition Tools) find --set-root /pmagic-4.9.iso map /pmagic-4.9.iso (hd32) map --hook root (hd32) chainloader (hd32) # Suggested by Deb title Partition Wizard 4.2 (Partition Tools) find --set-root /pwhe42.iso map /pwhe42.iso (hd32) map --hook root (hd32) chainloader (hd32) title Balder DOS image (FreeDOS) map --unsafe-boot /balder10.img (fd0) map --hook chainloader --force (fd0)+1 rootnoverify (fd0) # Suggested by Szymon Silski title Linux Mint 8 find --set-root /LinuxMint-8.iso map /LinuxMint-8.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/mint.seed boot=casper persistent iso-scan/filename=/LinuxMint-8.iso splash initrd /casper/initrd.lz title Ubuntu 10.04 find --set-root /ubuntu-10.04-desktop-i386.iso map /ubuntu-10.04-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed boot=casper persistent iso-scan/filename=/ubuntu-10.04-desktop-i386.iso splash initrd /casper/initrd.lz title Xubuntu 10.04 (XFCE Desktop) find --set-root /xubuntu-10.04-desktop-i386.iso map /xubuntu-10.04-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/xubuntu.seed boot=casper persistent iso-scan/filename=/xubuntu-10.04-desktop-i386.iso splash initrd /casper/initrd.lz title Kubuntu 10.04 (KDE Desktop) find --set-root /kubuntu-10.04-desktop-i386.iso map /kubuntu-10.04-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/kubuntu.seed boot=casper persistent iso-scan/filename=/kubuntu-10.04-desktop-i386.iso splash initrd /casper/initrd.lz # Suggested by Ambriel title Lubuntu 10.04 (LXDE Lightweight Desktop) find --set-root /lubuntu-10.04.iso map /lubuntu-10.04.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/lubuntu.seed boot=casper persistent iso-scan/filename=/lubuntu-10.04.iso splash initrd /casper/initrd.lz title Ubuntu 10.04 Netbook Remix (NetBook Distro) find --set-root /ubuntu-10.04-netbook-i386.iso map /ubuntu-10.04-netbook-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/netbook-remix.seed boot=casper persistent iso-scan/filename=/ubuntu-10.04-netbook-i386.iso splash initrd /casper/initrd.lz title Ubuntu 10.04 Server Edition Installer (32 bit Installer Only) find --set-root /ubuntu-10.04-server-i386.iso map /ubuntu-10.04-server-i386.iso (0xff) map --hook root (0xff) kernel /install/vmlinuz file=/cdrom/preseed/ubuntu-server.seed boot=install iso-scan/filename=/ubuntu-10.04-server-i386.iso splash initrd /install/initrd.gz title Ubuntu 9.10 find 
--set-root /ubuntu-9.10-desktop-i386.iso map /ubuntu-9.10-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed boot=casper persistent iso-scan/filename=/ubuntu-9.10-desktop-i386.iso splash initrd /casper/initrd.lz title Xubuntu 9.10 find --set-root /xubuntu-9.10-desktop-i386.iso map /xubuntu-9.10-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/xubuntu.seed boot=casper persistent iso-scan/filename=/xubuntu-9.10-desktop-i386.iso splash initrd /casper/initrd.lz title Kubuntu 9.10 find --set-root /kubuntu-9.10-desktop-i386.iso map /kubuntu-9.10-desktop-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/kubuntu.seed boot=casper persistent iso-scan/filename=/kubuntu-9.10-desktop-i386.iso splash initrd /casper/initrd.lz # Ubuntu Server and Netbook Remix suggested by Wojciech Holek title Ubuntu 9.10 Server Edition Installer (Installer Only) find --set-root /ubuntu-9.10-server-i386.iso map /ubuntu-9.10-server-i386.iso (0xff) map --hook root (0xff) kernel /install/vmlinuz file=/cdrom/preseed/ubuntu-server.seed boot=install iso-scan/filename=/ubuntu-9.10-server-i386.iso splash initrd /install/initrd.gz title Ubuntu 9.10 Netbook Remix (NetBook Distro) find --set-root /ubuntu-9.10-netbook-remix-i386.iso map /ubuntu-9.10-netbook-remix-i386.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/netbook-remix.seed boot=casper persistent iso-scan/filename=/ubuntu-9.10-netbook-remix-i386.iso splash initrd /casper/initrd.lz title Ubuntu 9.10 Rescue Remix (Recovery Tools) find --set-root /ubuntu-rescue-remix-9-10-revision1.iso map /ubuntu-rescue-remix-9-10-revision1.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed boot=casper iso-scan/filename=/ubuntu-rescue-remix-9-10-revision1.iso splash initrd /casper/initrd.lz title DSL 4.4.10 find --set-root /dsl-4.4.10-initrd.iso map --mem /dsl-4.4.10-initrd.iso (hd32) map --hook root (hd32) chainloader (hd32) title AVG Rescue CD (Anti-Virus + Anti-Spyware) find --set-root /avg_arl_en_90_100114.iso map /avg_arl_en_90_100114.iso (hd32) map --hook chainloader (hd32) title Ultimate Boot CD 4.11 find --set-root /ubcd411.iso map /ubcd411.iso (hd32) map --hook chainloader (hd32) title OphCrack XP 2.3.1 (XP Password Cracker) find --set-root /ophcrack-xp-livecd-2.3.1.iso map /ophcrack-xp-livecd-2.3.1.iso (0xff) map --hook root (0xff) kernel /boot/bzImage rw root=/dev/null vga=normal lang=C kmap=us screen=1024x768x16 autologin initrd /boot/rootfs.gz title OphCrack Vista 2.3.1 (Vista Password Cracker) find --set-root /ophcrack-vista-livecd-2.3.1.iso map /ophcrack-vista-livecd-2.3.1.iso (0xff) map --hook root (0xff) kernel /boot/bzImage rw root=/dev/null vga=normal lang=C kmap=us screen=1024x768x16 autologin initrd /boot/rootfs.gz # Suggested by Greg Steer title Offline NT Password & Registy Editor find --set-root /cd080802.iso map /cd080802.iso (hd32) map --hook chainloader (hd32) title SliTaz 2.0 find --set-root /slitaz-2.0.iso map --mem /slitaz-2.0.iso (hd32) map --hook chainloader (hd32) title Riplinux 9.3 find --set-root /RIPLinuX-9.3.iso map --heads=0 --sectors-per-track=0 /RIPLinuX-9.3.iso (0xff) || map --heads=0 --sectors-per-track=0 --mem /RIPLinuX-9.3.iso (0xff) map --hook chainloader (0xff) # Suggested by Sunny title YlmF (Windows Like OS) find --set-root /YlmF_OS_EN_v1.0.iso map /YlmF_OS_EN_v1.0.iso (0xff) map --hook root (0xff) kernel /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed 
boot=casper persistent iso-scan/filename=/YlmF_OS_EN_v1.0.iso splash initrd /casper/initrd.lz # Suggested by Martin Andersson title DBAN 1.0.7 (Drive Nuker) find --set-root /dban-1.0.7_i386.iso map --mem /dban-1.0.7_i386.iso (hd32) map --hook root (hd32) chainloader (hd32) # Suggested by Robin McGough title xPUD 0.9.2 (NetBook Distro) find --set-root --ignore-floppies --ignore-cd /xpud-0.9.2.iso map --heads=0 --sectors-per-track=0 /xpud-0.9.2.iso (hd32) map --hook chainloader (hd32) title Puppy 4.3.1 find --set-root /puppy/pup-431.sfs kernel /puppy/vmlinuz initrd /puppy/initrd.gz # Suggested by Relst title Run a Linux OS from the Internet kernel /gpxe.lkrn

    I also put some .iso files for OS installers (Windows XP SP2 and Ubuntu 10.04) on the drive, but they didn't show up in the list when I booted. Do I need to:

    - extract the .iso files and put them in their respective folders?
    - add the OSes that I added to the menu.lst?

    How do I add an ISO image (OS) to the menu.lst? Before adding the .iso files I first made a folder named "Windows xp sp2" and then placed the .iso files in there. Please help - I think I need to add the folder name or the file name to the menu.lst, but I don't know how.
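    For reference, entries in this menu are not auto-detected: an ISO only shows up if menu.lst contains a stanza that points at its exact path, so simply creating a folder and dropping ISOs into it does nothing. A hypothetical stanza for an Ubuntu 10.04 desktop ISO kept in a folder named "iso" might look like the one below, modeled directly on the Ubuntu entries already in this menu (the folder and file names are assumptions; folder names with spaces such as "Windows xp sp2" are best avoided, and a Windows XP installer ISO generally cannot be started just by mapping it this way):

    title Ubuntu 10.04 (ISO in /iso folder)
    find --set-root /iso/ubuntu-10.04-desktop-i386.iso
    map /iso/ubuntu-10.04-desktop-i386.iso (0xff)
    map --hook
    root (0xff)
    kernel /casper/vmlinuz file=/cdrom/preseed/ubuntu.seed boot=casper persistent iso-scan/filename=/iso/ubuntu-10.04-desktop-i386.iso splash
    initrd /casper/initrd.lz

    The iso-scan/filename= value has to match where the file actually lives on the drive, which is why ISOs added without a matching menu.lst entry never appear in the boot list.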

    Read the article

  • Supporting Piping (A Useful Hello World)

    - by blastthisinferno
    I am trying to write a collection of simple C++ programs that follow the basic Unix philosophy by:

    - Make each program do one thing well.
    - Expect the output of every program to become the input to another, as yet unknown, program.

    I'm having an issue trying to get the output of one to be the input of the other, and getting the output of one to be the input of a separate instance of itself. Very briefly, I have a program add which takes arguments and spits out the summation. I want to be able to pipe the output to another add instance: ./add 1 2 | ./add 3 4. That should yield 6 but currently yields 10. I've encountered two problems:

    1. cin waits for user input from the console. I don't want this, and haven't been able to find a simple example showing the use of the standard input stream without querying the user in the console. If someone knows of an example, please let me know.
    2. I can't figure out how to use standard input while supporting piping. Currently, it appears it does not work. If I issue the command ./add 1 2 | ./add 3 4, it results in 7.

    The relevant code is below:

    add.cpp snippet
    // ... COMMAND LINE PROCESSING ...
    std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing
    if (numbers.size() > 0) {
        double sum = numbers[0];
        double arg;
        for (int i=1; i < numbers.size(); i++) {
            arg = numbers[i];
            sum += arg;
        }
        std::cout << sum << std::endl;
    } else {
        double input;
        // right now this is test code while I try and get standard input streaming working as expected
        while (std::cin) {
            std::cin >> input;
            std::cout << input << std::endl;
        }
    }
    // ... MORE IRRELEVANT CODE ...

    So I guess my question is: does anyone see what is incorrect with this code in order to support piping standard input? Are there some well-known (or hidden) resources that explain clearly how to implement an example application supporting the basic Unix philosophy?

    @Chris Lutz I've changed the code to what's below. The problem is that cin still waits for user input on the console, and doesn't just take the standard input passed from the pipe. Am I missing something trivial for handling this? I haven't tried Greg Hewgill's answer yet, but I don't see how that would help since the issue is still with cin.

    // ... COMMAND LINE PROCESSING ...
    std::vector<double> numbers = multi.getValue(); // using TCLAP for command line parsing
    double sum = numbers[0];
    double arg;
    for (int i=1; i < numbers.size(); i++) {
        arg = numbers[i];
        sum += arg;
    }
    // right now this is test code while I try and get standard input streaming working as expected
    while (std::cin) {
        std::cin >> arg;
        std::cout << arg << std::endl;
    }
    std::cout << sum << std::endl;
    // ... MORE IRRELEVANT CODE ...

    Read the article

  • Microsoft Channel 9 Interviews Mei Liang to Introduce Sample Browser Extension for Visual Studio 2012 and 2010

    - by Jialiang
    This morning, Microsoft Channel 9 interviewed Mei Liang - Group Manager of Microsoft All-In-One Code Framework - to introduce the newest Sample Browser extension for Visual Studio 2012 &2010.   This extension provides a way for developers to search and download more than 4500 code samples from within Visual Studio, including over 700 Windows 8 samples and more than 1000 All-In-One Code Framework customer-driven code samples. Mei shows us not only the extension, but also the standalone version of the Sample Browser.   http://channel9.msdn.com/Shows/Visual-Studio-Toolbox/Sample-Browser-Visual-Studio-Extension   Microsoft All-In-One Code Framework, working in close partnership with the Visual Studio product team and MSDN Samples Gallery, developed the Sample Browser extension for both Visual Studio 2012 and Visual Studio 2010.  As an effort to evolve the code sample use experience and improve developers' productivity, the Sample Browser allows programmers to search, download and open over 4500 code samples from within Visual Studio with just a few simple clicks.  If no existing code sample can meet the needs, developers can even request a code sample easily from Microsoft thanks to the free “Sample Request Service” offered by Microsoft All-In-One Code Framework.  Through innovations, the teams hope to put the power of tens of thousands of code samples at developers’ fingertips. In short 3 months, the Sample Browser Visual Studio Extension has been installed by 100K global users.  It is also selected as one of the six most highly regarded and commonly used tools for Visual Studio that will make your programming experience feel like never before.   Got to love the All-In-One Code Framework team! You guys know this is THE go to source for code samples. Get this extension and you'll never need to leave VS2012 (well except for bathroom trips, but that's TMI anyway... ;) Read More... From: Greg Duncan (Author of CoolThingOfTheDay) 9/6/2011 12:00 AM The one software design pattern that I have used in just about every application I’ve written is “cut-and-paste,” so the new “Sample Browser” – read sample as a noun not an adjective – is a great boon to my productivity. Read More... From: Jim O'Neil (Microsoft Developer Evangelist) 9/28/2011 12:00 AM Install: http://aka.ms/samplebrowservsx Microsoft All-In-One Code Framework also offers the standalone version of Sample Browser.   The standalone version is particularly useful to Visual Studio Express edition or Visual Studio 2008 users, who cannot install the Sample Browser Visual Studio extension.   From Grassroots’ Passion for Developers to the Innovation of Sample Browser This Sample Browser has come a very long way improving the code sample use experience.  The history can be traced back to a grass-root innovation three years ago.   In early 2009, a few MSDN forum support engineers observed that lots of developers were struggling to work in Visual Studio without adequate code samples. Programming tasks seem harder than they should be when you only read through the documentation.  Just a couple of lines of sample code could answer a lot of questions.   They had a brilliant idea: What if we produce code samples based on developers’ frequently asked programming tasks in forums, social networks and support incidents, and then aggregate all our sample code in a one-stop library to benefit developers?  And what if developers can request code samples directly from Microsoft, free of charge?  
This small group of grassroots at Microsoft devoted their nights and weekends to prototyping such a customer-driven code sample library.  This simple idea eventually turned into “Microsoft All-In-One Code Framework”, aka. OneCode.  With the support from more and more passionate developers at Microsoft and the leaders in the Community and Online Support team and Microsoft Commercial Technical Services (CTS), the idea has become a continually growing library with over 1000 customer-driven code samples covering almost all Microsoft development technologies.  These code samples originated from developers’ common pains and needs should be able to help many developers.  However, if developers cannot easily discover the code samples, the effort would still be in vain.  So in early 2010, the team started the idea of Sample Browser to ease the discovery and access of these samples.  In just two months, the first version of Sample Browser was finished and released by a passionate developer.  It was a very simple application, only supporting the basic sample offline search.  Users had to download the whole 100MB sample package containing all samples first, and run the Sample Browser to search locally.   Though developers could not search and download samples on-demand, this simple application laid a solid foundation for the team’s continuous innovations of Sample Browsing experience. In 2011, MSDN Samples Gallery had a big refresh.  The online sample experience was brought to a new level thanks to its PM Steven Wilssens and the gallery team’s effort.  Microsoft All-In-One Code Framework Team saw the opportunity to realize the “on-demand” sample search and download feature with the new gallery.  The two teams formed a strong partnership to upload all the customer-driven code samples to MSDN Samples Gallery, and released the new version of Sample Browser to support “on-demand” sample downloading in April, 2011.  Mei Liang, the Group Manager of Microsoft All-In-One Code Framework, was interviewed by Channel 9 to demo the Sample Browser.  Customers love the effort and the innovation!!  This can be clearly seen from the user comments in the publishing page.   It was very encouraging to the team of All-In-One Code Framework. The team continues innovating and evolving the Sample Browser.  They found the Visual Studio product team this time, and integrated the Sample Browsing experience into the latest Visual Studio 2012.  The newly released Sample Browser Visual Studio extension makes good use of Visual Studio 2012 IDE such as the new Quick Launch bar, the code editor, the toolbar and menus to offer easy access to thousands of code samples from within the development environment.   The Visual Studio Senior Program Manager Lead - Anthony Cangialosi, the Program Manager - Murali Krishna Hosabettu Kamalesha, the MSDN Samples Gallery PM – Steven Wilssens, and the Visual Studio Senior Escalation Engineer - Ed Dore shared lots of insightful suggestions with the team.  Thanks to the brilliant cross-group collaboration inside Microsoft, tens of new features including “Local Language Support” and “Favorite Samples”, as well as a face-lifted user interface, were added to further enhance the user experience. Since the new Sample Browser Visual Studio extension was released, it has received over 100 thousand downloads and five-star ratings.  A customer told the team that he officially falls in LOVE with Microsoft All-In-One Code Framework.   The Sample Browser Innovation for Developers Never Stops! 
The teams would never stop improving the Sample Browser for developers’ easier lives.   The Microsoft All-In-One Code Framework, Visual Studio and MSDN Samples Gallery teams are working closely to develop the next version of Sample Browser.  Here are the key functions in development or in discussion.  We hope to learn your feedback of the effort.  You can submit your suggestions to the official Visual Studio UserVoice site.  We look forward to hearing from you! 1) Offline Sample Search This is one of the top feature requests that we have received for Sample Browser.   The Sample Browser will support the offline search mode so that developers can search downloaded code samples when they do not have internet access.  This is particularly useful to developers in Enterprises with strict proxy settings. 2) Code Snippet Support and Visual Studio Editor Integration Today, the Sample Browser supports downloading and opening sample project.   However, when developers are searching for code samples, a better user experience would be to see the code snippets in the search result first.  Developers can quickly decide if the code snippet is relevant.   They can also drag and drop the code snippet into the Visual Studio Editor to solve some simple programming tasks.  If developers want to learn more about the sample, they can then choose to download the sample project and open it in Visual Studio. 3) Enterprise Sample Sharing and Searching Large enterprises have many code samples for their own internal tools and APIs that are not appropriate to be shared publicly in MSDN Samples Gallery.   In that case, today’s Sample Browser and MSDN Samples Gallery cannot help these Enterprise developers.  The idea is to create a Code Sample Repository in TFS, and provide an additional Visual Studio extension for Enterprise developers to quickly share code samples to TFS.  The Sample Browser can be configured to connect to the TFS Code Sample Repository to search for and download code samples.  This would potentially enable the Enterprise developers to be more productive. 4) Windows Store Sample Browser With the upcoming release of Windows RT and Microsoft Surface, developers are facing a completely new world of application platform.   Not like laptop, people would often use Microsoft Surface in commute and in travel.  Internet may not be available.  Today’s Visual Studio cannot be installed and run on Windows RT, however, our enthusiastic developers would hope to spend every minute on code.  They love code!   The idea is to create a Windows Store version of Sample Browser. Search and download samples from the online Samples Gallery when the user has internet access. Browse the sample code files and learn the sample documentation of downloaded samples with or without internet access.   In addition to the "browse” function, the Sample Browser could further support “bookmark”, “learning notes”, “code review”, and “quick social sharing". Make full use of the new touch and Windows Store App UI to give developers a new “relaxing” code browsing and learning experience, anytime, anywhere. With Windows Store Sample Browser, developers can enjoy A new relaxing and enjoyable experience for developers to learn code samples You do not have to sit in front of desk and formally open Visual Studio to read code samples.  Many developers get sub-health due to staying in front of desk for a very long time.  
With Windows RT, Microsoft Surface and this Windows Store Sample Browser combining with the online MSDN Samples Gallery, developers can sit in a sofa, relaxingly hold the tablet and enjoy to learn their beloved sample code with detailed documentation. Anytime, anywhere Whether you have internet access or not, whether you are at home, in office, or in commute/airplane, developers can always easily access and browse the sample code. Lightweight and fast Particularly for learning a small sample project, the Windows Store Sample Browser would be more lightweight and faster to open and browse the sample code. Please submit your feedback and suggestion to Visual Studio UserVoice.  We look forward to hearing from you and deliver a better and better sample use experience.  Happy Coding!   Special Thanks to People working behind the latest release of Sample Browser Visual Studio Extension and the great partnerships!

    Read the article

  • CodePlex Daily Summary for Wednesday, May 26, 2010

    CodePlex Daily Summary for Wednesday, May 26, 2010New Projects3D File Manager: 3D File manager is an application that aims to show how could look file manager in 3D. It´s developed in C# and XNA frameworkAcies: Acies is a dungeon crawler game done with C# and XNA.ActiveWinery: The open source winery and vineyard application.CC.Yacht: CC.Yacht is a client/server yacht dice game written in C# .NET. It utilizes a net.tcp WCF duplex service for client/server communication.Community Forums NNTP bridge: Community project for accessing the MS Web-Forums via an open source NNTP newsserver (bridge).Dojo Timer: WPF timer for Coding Dojo meetings. Timer feito em WPF para Coding Dojo.GameFX - The Game Development Framework: The Game Development Framework (GameFX) is simply a set of libraries to be used as the foundation for any simple 2D tile-based game. It can be used...Greg Roberts MVC Extensions: Asp.Net MVC Extensions including JSONP ActionResult. Targeted for MVC 2 and .NET 4.0.IIS Deploy: Project to develop a tool that automates the deploy Web sites and WCF services in single server environments and clustered.MarkLogic Sample Authoring App for Word: The MarkLogic Authoring Sample App for Word lets authors enrich Word documents using Content Controls, associate and manage metadata with those Con...Mono.Addins: Mono.Addins is a framework for creating extensible applications, and for creating add-ins which extend applications.MPCLI: MPCLI is a library that brings the power of the GNU MP big numbers library to those who use CLS-compliant languages such as C#, F#, and Visual Basi...NTFS parser classes: This is a C++ library to help parsing an NTFS volume, as well as file records and attributes. It will facilitate much when handling NTFS filesystem...Oddworld Level Gen: A 2D platform game, with Oddworld : Abe's Oddysee asset. The game introduce a dynamic system to generate the next level according to the previous l...Page Action Web Part for SharePoint 2007: This Web Part for SharePoint 2007 allows you to perform actions (such as causing an "Access is denied", redirect to another web page, view content ...Piggy Bank: Piggy Bank is a web-based financial application targetted towards kids.Productivity Hub Solutions: The Productivity Hub 2010 is a customizable, on-premise training solution for technology products. Developed by RedTech for Microsoft, the Producti...PyQt port of TortoiseHg: PyQt port of TortoiseHg (aka TortoiseHg 2.0)Releaser™: This is my private project. Currently, I'm not going to support it publicly.SLManagers: SLManagers 用于动态加载组件 实现对程序不同的的管理Smith Async .NET Memcached Client: Async .NET Memcached Client is a fully asynchronous implementation of a memcached client. The advantage of a fully asynchronous client is that you...Tauck Public API: Tauck's public API allows for travel agencies and other parterners to use Tauck's product information in their websites and other systems. Virtualizing WrapPanel: Virtualizing WrapPanel improves performance when binding a ListBox/ListView to a large amount of data. It is written in C#New Releases3D File Manager: 3D File manager: 3D File managerAragon Online Client: Aragon Online Client: The executable version of the Aragon Online Client can be installed from the Aragon Online page: http://aragon-online.net/aoclient/publish.phpASP.NET MVC CMS ( Using CommonLibrary.NET ): CommonLibrary CMS Alpha 2: CommonLibrary CMSA simple yet powerful CMS system in ASP.NET MVC 2 using C# 3.5. 
ActiveRecord based support for Blogs, Widgets, Pages, Parts, Ev...BFBC2 PRoCon: PRoCon 0.5.1.8: It's not even funny anymore =\Code for Rapid C# Windows Development eBook: LLBLGen LINQPad Data Context Driver Ver 1.0.0.3: Second release of a Static LLBLGen Pro Data Context Driver for LINQPad For LLBLGen Pro versions 2.6 and 3.0 beta. Fixed 'connection string not ini...Community Forums NNTP bridge: Community Forums NNTP Bridge V01: This is the first release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open sourcen NNTP Bridge.Community Forums NNTP bridge: Community Forums NNTP Bridge V02: This is the second release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open sourcen NNTP Bridge. ...DDDSample.Net: 0.9: Release 0.9 contains two major improvements: Vanilla version (both Synch and Asynch) has been updated so its model more closely resembles Java orig...DEWD: DEWD for Umbraco: Alpha release of the package. Usable for simple SQL editing, but lacking some core features such as validation, user friendly error handling, confi...Dojo Timer: Dojo Timer v1: Primeira versão Dojo Timer.eXpress Persistent Objects (XPO) Toolkit: Samples: Video Channel Channel.zip sample shows how to build a video site using XPO and WCF Data Services. DevExpress Channel DevExpress Channel Browse ...F# Project Extender: V0.9.2.1 (VS2008,VS2010): F# project extender for Visual Studio 2008 and Visual Studio 2010. Fixed bugs: -Project extender 0.9.2.0 can't be loaded in VS2008 without SDKFeedback Form: Feedback Application: Installer of the projectFeedback Form: Feedback Form: .sln for Feedback Form ApplicationGameFX - The Game Development Framework: Version 1.0 (Beta): Project is Visual Studio 2008 solution. GameFX Source code and sample program. The sample program allows you to create maps of any size, and drop ...MarkLogic Sample Authoring App for Word: MarkLogic Sample Authoring App for Word 1.0-1: Initial release of the MarkLogic Sample Authoring App for Word. See the home page for an overview on functionality. Within the release you'll ...MarkLogic Toolkit for Word: MarkLogic Toolkit for Word 1.2-1: Release built in support of the MarkLogic Sample Authoring App for Word. Updates include: update to XQuery API to expose functions for working w...Microsoft SQL Server Community & Samples: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release contains sample code for Microsoft SQL Server 2008R2. For many of these samples you will also need...Microsoft SQL Server Product Samples: Analysis Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Data Programming: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Database: AdventureWorks 2008R2 RTM: Sample Databases for Microsoft SQL Server 2008R2 (RTM)This release is dedicated to the sample databases that ship for Microsoft SQL Server 2008R2. ...Microsoft SQL Server Product Samples: End to End: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. 
For many of these samples y...Microsoft SQL Server Product Samples: Engine: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Integration Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Replication: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Reporting Services: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Scripts: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: Service Broker: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...Microsoft SQL Server Product Samples: XML: SQL Server 2008R2 RTM: Microsoft SQL Server 2008R2 (RTM) This release is dedicated to the samples that ship for Microsoft SQL Server 2008R2. For many of these samples y...NLog - Advanced .NET Logging: Nightly Build 2010.05.25.001: Changes since the last build:2010-05-24 23:08:47 Jarek Kowalski Fixed base constructor invocation to ensure consistency. Added tests for common wra...NTFS parser classes: NTFS parser lib 0.55: 0.55openrs: Revision 3: Things that have been added since last release: Vector expanding Dynamic vectors Vector put method chaining Basic ISAAC implementation Wor...Page Action Web Part for SharePoint 2007: Page Action Web Part v1.0.0.0: First release of the Page Action Web Part v1.0.0.0.Productivity Hub Solutions: Silverlight Bookshelf: The Silverlight Bookshelf component of the 2010 Productivity Hub provides 4 accordion-style vertical tabs dispalying Featured Video, Featured Conte...Productivity Hub Solutions: Silverlight Product Carousel: The Product Carousel Silverlight component provides a rich navigation experience to the home page of the 2010 Productivity Hub - presenting the pro...Rawr: Rawr 2.3.18: >Rawr3 Public Beta has been released! Click here for details.< - Fix for bug in parsing characters with certain abnormal characters in their data. 
...Runtime Intelligence Data Visualizer: RI Data Visualizer Release 1: This release of the RI Data Visualizer contains both a WPF client that displays application usage data and a Silverlight client that displays featu...sGSHOPedit: sGSHOPedit v1.1a: Fixed: bug in parsing description from "itemextdesc.txt" Fixed: surface change event Fixed: range for numeric values Added: search featureSLManagers: SlManagers: 实现简单的组件动态下载 使用Mef技术Sudoku (Multiplayer in RnD): Sudoku (Multiplayer in RnD) 1.0.1.0 program: Sudoku project was to practice on C# by making a desktop application using some algorithm Idea: The basic idea of algorithm is from http://www.ac...Sudoku (Multiplayer in RnD): Sudoku (Multiplayer in RnD) 1.0.1.0 source: user-interface, multi-threading, formatting Sudoku project was to practice on C# by making a desktop application using some algorithm Idea: The...Tauck Public API: XML Package 1.0: Current Release of XML dataTeach.Net: Teach.Net 1.0 Alpha: First alpha version. It should work, but there's gonna be bugs. Also, no intellisense documentation (or any other sort of documentation) yet. I'm w...VCC: Latest build, v2.1.30525.0: Automatic drop of latest buildVista Media Center TCP/IP Controller: Win7 64 and 32 bit Alpha - button command fix: button command fix , button-play, button-pause, button-skip back, button-skip fwd. Confirmed works on x64. Has not been tested on x32XsltDb - DotNetNuke Module Universal Building Block: 01.01.21: ASP.NET controls TreeView and TextEditor usage Live demo site Attention This release requires DNN 5.2 or higher as it using Telerik classes.in...Most Popular ProjectsRawrWBFS ManagerAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight ToolkitWindows Presentation Foundation (WPF)patterns & practices – Enterprise LibraryMicrosoft SQL Server Community & SamplesPHPExcelASP.NETMost Active ProjectsAStar.netpatterns & practices – Enterprise Librarypatterns & practices: Windows Azure Security GuidanceRawrSqlServerExtensionsMono.AddinsBlogEngine.NETGMap.NET - Great Maps for Windows Forms & PresentationCodeReviewCaliburn: An Application Framework for WPF and Silverlight

    Read the article

  • CodePlex Daily Summary for Friday, November 02, 2012

    CodePlex Daily Summary for Friday, November 02, 2012Popular ReleasesVST.NET: VST.NET 1.0: Long overdue, but here is version 1.0! The zip contains the Debug and Release binaries for x86 and x64 as well as .NET 2.0 and .NET 4.0. Note that the samples sources are not included in the release. Refer to "Building the Source Code" to get them working in the Visual Studio Express editions. The documentation will be uploaded later... Changes: Removed nuget and fixed samples copy operation (*.exe). Finalized build automation support. Bug fixes in VS templates. Compiled and Packaged n...DevTreks -social budgeting that improves lives and livelihoods: DevTreks Version 1.0: This is the first production release.Mouse Jiggler: MouseJiggle-1.3: This adds the much-requested minimize-to-tray feature to Mouse Jiggler.Umbraco CMS: Umbraco 4.10.0 Release Candidate: This is a Release Candidate, which means that if we do not find any major issues in the next week, we will release this version as the final release of 4.10.0 on November 9th, 2012. The documentation for the MVC bits still lives in the Github version of the docs for now and will be updated on our.umbraco.org with the final release of 4.10.0. Browse the documentation here: https://github.com/umbraco/Umbraco4Docs/tree/4.8.0/Documentation/Reference/Mvc If you want to do only MVC then make sur...Skype Auto Recorder: SkypeAutoRecorder 1.3.4: New icon and images. Reworked settings window. Implemented high-quality sound encoding. Implemented a possibility to produce stereo records. Added buttons with system-wide hot keys for manual starting and canceling of recording. Added buttons for opening folder with records. Added Help button. Fixed an issue when recording is continuing after call end. Fixed an issue when recording doesn't start. Fixed several bugs and improved stability. Major refactoring and optimization...Access 2010 Application Platform - Build Your Own Database: Application Platform - 0.0.1: Initial Release This is the first version of the database. At the moment is all contained in one file to make development easier, but the obvious idea would be to split it into Front and Back End for a production version of the tool. The features it contains at the moment are the "Core" features.Python Tools for Visual Studio: Python Tools for Visual Studio 1.5: We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RTM. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. For a quick overview of the general IDE experience, please watch this video There are a number of exciting improvement in this release comp...Devpad: 4.25: Whats new for Devpad 4.25: New Theme support New Export Wordpress Minor Bug Fix's, improvements and speed upsAssaultCube Reloaded: 2.5.5: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. 
The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: Fixed potential bot bugs: Map change, OpenAL...Edi: Edi 1.0 with DarkExpression: Added DarkExpression theme (dialogs and message boxes are not completely themed, yet)DirectX Tool Kit: October 30, 2012 (add WP8 support): October 30, 2012 Added project files for Windows Phone 8MCEBuddy 2.x: MCEBuddy 2.3.6: Changelog for 2.3.6 (32bit and 64bit) 1. Fixed a bug in multichannel audio conversion failure. AAC does not support 6 channel audio, MCEBuddy now checks for it and force the output to 2 channel if AAC codec is specified 2. Fixed a bug in Original Broadcast Date and Time. Original Broadcast Date and Time is reported in UTC timezone in WTV metadata. TVDB and MovieDB dates are reported in network timezone. It is assumed the video is recorded and converted on the same machine, i.e. local timezone...MVVM Light Toolkit: MVVM Light Toolkit V4.1 for Visual Studio 2012: This version only supports Visual Studio 2012 (and all Express editions too). If you use Visual Studio 2010, please stay tuned, we will publish an update in a few days with support for VS10. V4.1 supports: Windows Phone 8 Windows 8 (Windows RT) Silverlight 5 Silverlight 4 WPF 4.5 WPF 4 WPF 3.5 And the following development environments: Visual Studio 2012 (Pro, Premium, Ultimate) Visual Studio 2012 Express for Windows 8 Visual Studio 2012 Express for Windows Phone 8 Visual...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.73: Fix issue in Discussion #401101 (unreferenced var in a for-in statement was getting removed). add the grouping operator to the parsed output so that unminified parsed code is closer to the original. Will still strip unneeded parens later, if minifying. more cleaning of references as they are minified out of the code.RiP-Ripper & PG-Ripper: PG-Ripper 1.4.03: changes NEW: Added Support for the phun.org forum FIXED: Kitty-Kats new Forum UrlLiberty: v3.4.0.1 Release 28th October 2012: Change Log -Fixed -H4 Fixed the save verification screen showing incorrect mission and difficulty information for some saves -H4 Hopefully fixed the issue where progress did not save between missions and saves would not revert correctly -H3 Fixed crashes that occurred when trying to load player information -Proper exception dialogs will now show in place of crashesPlayer Framework by Microsoft: Player Framework for Windows 8 (Preview 7): This release is compatible with the version of the Smooth Streaming SDK released today (10/26). Release 1 of the player framework is expected to be available next week. IMPROVEMENTS & FIXESIMPORTANT: List of breaking changes from preview 6 Support for the latest smooth streaming SDK. Xaml only: Support for moving any of the UI elements outside the MediaPlayer (e.g. into the appbar). Note: Equivelent changes to the JS version due in coming week. Support for localizing all text used in t...Send multiple SMS via Way2SMS C#: SMS 1.1: Added support for 160by2Quick Launch: Quick Launch 1.0: A Lightweight and Fast Way to Manage and Launch Thousands of Tools and ApplicationsPress Win+Q and start to search and run. http://www.codeplex.com/Download?ProjectName=quicklaunch&DownloadId=523536Orchard Project: Orchard 1.6: Please read our release notes for Orchard 1.6: http://docs.orchardproject.net/Documentation/Orchard-1-6-Release-Notes Please do not post questions as reviews. Questions should be posted in the Discussions tab, where they will usually get promptly responded to. 
If you post a question as a review, you will pollute the rating, and you won't get an answer.New ProjectsAnother Green World: Another Green WorldApplication Data across Web Farm: A library containing a new version of the HttpApplicationState class that allow the synchronization of data across a web farm. With this library, on the web farm Data can be: • Shared • Auto synchronized • Locked for a useras3 game: it's as3 framework . Context: UIFramework,GameFrameworkBit Moose: Bit Moose is a bitcoin mining assistant program. It allows miners to run under a background windows service. Includes a GUI and console host.BUMO: A TFS Build MOnitoring Tool: BUMO is short name of Build Monitoring tool for TFS. BUMO provides a platform to Monitor TFS builds, Statistics of Builds,View Build History, Clone Builds.cantinho: cantinhoCMTS: CMTSCP866U Encoding: A simple implementation of cp866u encoding class written in C#.D3D9Client: This is a DirectX 9 Graphics Client for Orbiter SpaceFlight SimulatorDatabase Exporter for DotNetNuke By IowaComputerGurus Inc.: Database Exporter is a customized SQL module designed for selects and supports export to XML/CSV of query resultsDistributed Systems at TUM: The initial project is a simple client connecting to a server through sockets. The client may send a message and the server will echo back that message.Enneract Project: Enneract makes it easier for LDS Leadership works in theirs responsibilities through BI techniques.EntityFramework Reverse POCO Code First Generator: Reverse engineers an existing database and generates EntityFramework Code First POCO classes, DbContext and Configuration mappingsfastCSharp: ?????????????????,????????????????????,??????。 ??:????????????????,???????????,???????????。 ??? http://www.51nod.com/topic/index.html#!topicId=100000056Fault Logger!: Fault Logger is a simple web application ideal for schools and small business which allows staff to report computer faults and allows technicians to keep logs.Flow Sequencer: Simple task sequencer using Johnson's algorithm for two and three machines. Client side is made in wpf technology. Free Template Filler (FTF): Tool for generating text from templates, when given parameters. Good for generating code, etc.GameTrakXNA: This project aims to create a simple library to use the unique GameTrak controller within XNA and Flash.Greg DNN Task Manager: This is a test project that I am using to learn about DNN Module Development.GroupToolbar: A Toolbar that can group your items and is totally templatable :)Helper Project: Helpdesk project, using C# .NET, Linq, with Nhibernate ORM and interface written using ExtJS 4.HIPO Legacy Systems Flowcharter: Creates Visio HIPO flowcharts from legacy system batch files.huangli1101: jabbr projectHusqvarna Svishtov: The idea is to create content management system for web site of a small town store.Hybrid Lab Workflow: This workflow allows you to incorporate snapshots into a Build-Deploy-Test using a Standard Environment that is composed of Virtual Machines in TFS 2012.jas: Project to make an application for Jeugdondernemingen Aartselaar Service...keleyi: MD5 C# WinFormKnockout.js Declarations for TypeScript: A set of declarations for intellisense and type completion for Knockout.js in TypeScript. Last edited Today at 8:39 AM by jnosek, version 2List Rollup web part: Hi I was wondering that how to show a list from other site on my home page. But I didn’t get any proper & free solution. 
So I decided to create a Cross Site LisMidlands Community Management Solution (MCMS): This project is to develop an open source residential community management solution. This initiative has been taken by the IT guys of Srijan Midlands Community.MultipartHttpClient for Windows Phone 7: This is a simple MultipartHttpClient for windows phone 7MyGProject: GprojectN2F Yverdon FirePHP Extension: Extension to add FirePHP support to your N2F Yverdon projects.netbee: ???Over Look Pa Controller: Real World Application for the Collection of Credit Card, Gift Card and Driver License information. Controls access to an Observation Tower via a turnstile.PureSystems DotNetNuke GoSquared tracking module: A DotNetNuke module which adds the GoSquared tracking code to your pages.Quiz Module for DotNetNuke by IowaComputerGurus Inc.: The IowaComputerGurus DotNetNuke Quiz Module is a free extension that allows users to quickly and easily create custom quizzes for their site.RadioSmart: ? ??d??a? t?? radiosmartRSA ID Validation for SQL Server: Solution for validating South African identity numbers. Provides SQL Server CLR bindings which allow identity numbers to be efficiently validated within T-SQLSales Visualization Web App: Sales VisualizationsSonar Connector (Wagga Wagga Christian College Network connector): A network settings manager for Wagga Wagga christian college. Supports switching back and forth settings and can be used for personal use.Stopwatch - Windows Phone: This project is a Stopwatch for windows phone app. Now, it had published in Windows phone store.Study: Study for ExtJS! Come on!Team Foundation Server Test Management Tools: Team Foundation Server Test Management Tools ? Team Foundation Server ???? ???? Visual Studio ??????????????????????????。testdd11012012git01: ttesttom11012012tfs02: gdgf dgf dTimBazinga EVoting: Undergrad project - designing an e-voting software system.TNT Scripts: TNT ScriptsTrombone: Trombone makes it easier for Windows Mobile Professional users to automate status reply through SMS. It's developed in Visual C# 2008.TurbofilmTV: Turbofilm metro is the greate metro app for turbofilm users.VinculacionMicrosoft: Vinculacion Microsoft is a project for distributing Dreamsparks and Faculty Connection codes to students and professors. It is developed in ASP .Net and designed for Universities in Mexico interested in the different benefits that Microsoft has for them. Visual Studio Reference Swap: Winforms app and Nant task that will handle swapping out project references to file references.??????: ??????????: ??

    Read the article

  • Project Management Helps AmeriCares Deliver International Aid

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT (Oracle), by Alison Weiss

    Handle with Care: sound project management helps AmeriCares bring international aid to those in need.

    The stakes are always high for AmeriCares. On a mission to restore health and save lives during times of disaster, the nonprofit international relief and humanitarian aid organization delivers donated medicines, medical supplies, and humanitarian aid to people in the U.S. and around the globe. Founded in 1982 with the express mission of responding as quickly and efficiently as possible to help people in need, the Stamford, Connecticut-based AmeriCares has delivered more than US$10.5 billion in aid to 147 countries over the past three decades.

    “It’s critically important to us that we steward all the donations and that the medical supplies and medicines get to people as quickly as possible with no loss,” says Kate Sears, senior vice president for finance and technology at AmeriCares. “Whether we’re shipping IV solutions to victims of cholera in Haiti or antibiotics to Somali famine victims, we need to get the medicines there sooner because it means more people will be helped and lives improved or even saved.”

    Ten years ago, the tracking systems used by AmeriCares associates were paper-based. In recent years, staff started using spreadsheets, but the tracking processes were not standardized between teams. “Every team was tracking completely different information,” says Megan McDermott, senior associate, Sub-Saharan Africa partnerships, at AmeriCares. “It was just a few key things. For example, we tracked the date a shipment was supposed to arrive and the date we got reports from our partner that a hospital received aid on their end.” While the data was accurate, much detail was being lost in the process.

    AmeriCares management knew it could do a better job of tracking this enterprise data and in 2011 took a significant step by implementing Oracle’s Primavera P6 Professional Project Management. “It’s a comprehensive solution that has helped us improve the monitoring and controlling processes. It has allowed us to do our distribution better,” says Sears. In addition, the implementation effort has been a change agent, helping AmeriCares leadership rethink project management across the entire organization. Initially, much of the focus was on standardizing processes, but staff members also learned the importance of thinking proactively to prevent possible problems and evaluating results to determine if goals and objectives are truly being met. Such data about process efficiency and overall results is critical not only to AmeriCares staff but also to the donors supporting the organization’s life-saving missions.

    Efficiency Saves Lives

    One of AmeriCares’ core operations is to gather product donations from the private sector, establish where the most-urgent needs are, and solicit monetary support to send the aid via ocean cargo or airlift to welfare- and health-oriented nongovernmental organizations, hospitals, health networks, and government ministries based in areas in need. In 2011 alone, AmeriCares sent more than 3,500 shipments to 95 countries in response to both ongoing humanitarian needs and more than two dozen emergencies, including deadly tornadoes and storms in the U.S. and the devastating tsunami in Japan.

    When it comes to nonprofits in general, donors want to know that the charitable organizations they support are using funds wisely.
    Typically, nonprofits are evaluated by donors in terms of efficiency, an area where AmeriCares has an excellent reputation: 98 percent of expenses go directly to supporting programs and less than 2 percent represent administrative and fundraising costs. Donors, however, should look at more than simple efficiency, says Peter York, senior partner and chief research and learning officer at TCC Group, a nonprofit consultancy headquartered in New York, New York. They should also look at whether organizations have the systems in place to sustain their missions and continue to thrive.

    An expert on nonprofit organizational management, York has spent years studying sustainable charitable organizations. He defines them as nonprofits that are able to achieve the ongoing financial support to stay relevant and continue doing core mission work. In his analysis of well over 2,500 larger nonprofits, York has found that many are not sustaining, and are actually scaling back in size. “One of the biggest challenges of nonprofit sustainability is the general public’s perception that every dollar donated has to go only to the delivery of service,” says York. “What our data shows is that there are some fundamental capacities that have to be there in order for organizations to sustain and grow.”

    York’s research highlights the importance of data-driven leadership at successful nonprofits. “You’ve got to have the tools, the systems, and the technologies to get objective information on what you do, the people you serve, and the results you’re achieving,” says York. “If leaders don’t have the knowledge and the data, they can’t make the strategic decisions about programs to take organizations to the next level.”

    Historically, AmeriCares associates have used time-tested and cost-effective strategies to ship and then track supplies from donation to delivery to their destinations in designated time frames. When disaster strikes, AmeriCares ships by air and generally pulls out all the stops to deliver the most urgently needed aid within the first few days and weeks. Then, as situations stabilize, AmeriCares turns to delivering sea containers for the postemergency and ongoing aid so often needed over the long term.

    According to McDermott, getting a shipment out the door is fairly complicated, requiring as many as five different AmeriCares teams to collaborate. The entire process can take months: products are received in the warehouse, recipients are chosen, customs and governmental approvals are put in place, products are actually shipped, and finally staff must ensure that the products are received in-country. Delivering that aid is no small affair. “Our volume exceeds half a billion dollars a year worth of donated medicines and medical supplies, so it’s a sizable logistical operation to bring these products in and get them out to the right place quickly to have the most impact,” says Sears. “We really pride ourselves on our controls and efficiencies.”

    Adding to that complexity is the fact that the longer it takes to deliver aid, the more dire the human need can be. Any time AmeriCares associates can shave off the complicated aid delivery process can translate into lives saved. “It’s really being able to track information consistently that will help us to see where are the bottlenecks and where can we work on improving our processes,” says McDermott.
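    The article does not describe AmeriCares’ internal data model, but the idea behind McDermott’s point, recording a date for each standard milestone of every shipment so that per-stage durations can be compared and the slowest stage identified, can be sketched in a few lines. The milestone names and dates below are illustrative assumptions, not AmeriCares or Primavera data:

```python
from datetime import date

# Hypothetical milestones, in the order the article describes the process:
# receive goods, allocate to a recipient, clear approvals, ship, confirm receipt.
MILESTONES = ["received", "allocated", "approved", "shipped", "delivered"]

def stage_durations(shipment: dict) -> dict:
    """Days spent between each pair of consecutive milestones."""
    return {
        f"{earlier} -> {later}": (shipment[later] - shipment[earlier]).days
        for earlier, later in zip(MILESTONES, MILESTONES[1:])
    }

def bottleneck(shipment: dict) -> str:
    """The stage that consumed the most calendar days."""
    durations = stage_durations(shipment)
    return max(durations, key=durations.get)

# A made-up sea-container shipment recorded against the same milestone set.
container = {
    "received":  date(2011, 3, 1),
    "allocated": date(2011, 3, 10),
    "approved":  date(2011, 4, 20),  # customs and governmental approvals
    "shipped":   date(2011, 4, 28),
    "delivered": date(2011, 6, 5),
}

print(stage_durations(container))
print("slowest stage:", bottleneck(container))  # allocated -> approved
```

    The comparison only works when every shipment records the same milestones, which is essentially what the standardized templates described in the next section provide.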
    Setting a Standard

    Productivity and information management improvements were key objectives for AmeriCares when staff began the process of implementing Oracle’s Primavera solution. But before configuring the software, the staff needed to take the time to analyze the systems already in place. According to Greg Loop, manager of database systems at AmeriCares, the organization received guidance from several consultants, including Rich D’Addario, consulting project manager in the Primavera Global Business Unit at Oracle, who was instrumental in shepherding the critical requirements-gathering phase.

    D’Addario encouraged staff to begin documenting shipping processes by considering the order in which activities occur and which ones are dependent on others to get accomplished. This exercise helped everyone realize that to be more efficient, they needed to keep track of shipments in a more standard way. “The staff didn’t recognize formal project management methodology,” says D’Addario. “But they did understand what the most important things are and that if they go wrong, an entire project can go off course.”

    Before, if a boatload of supplies was being sent to Haiti and there was a problem somewhere, a lot of time was taken up finding out where the problem was, because staff was not tracking things in a standard way. As a result, even more time was needed to find possible solutions to the problem and alert recipients that the aid might be delayed. “For everyone to put on the project manager hat and standardize the way every single thing is done means that now the whole organization is on the same page as to what needs to occur from the time a hurricane hits Haiti and when a boat pulls in to unload supplies,” says D’Addario.

    With so much care taken to put a process foundation firmly in place, configuring the Primavera solution was actually quite simple. Specific templates were set up for different types of shipments, and dashboards were implemented to provide executives with clear overviews of every project in the system. AmeriCares’ Loop reports that system planning, refining, and testing, followed by writing up documentation and training, took approximately four months. The system went live in spring 2011 at AmeriCares’ Connecticut headquarters. While the nonprofit has an international presence, with warehouses in Europe and offices in Haiti, India, Japan, and Sri Lanka, most donated medicines come from U.S. entities and are shipped from the U.S. out to the rest of the world. In addition, all shipments are tracked from the U.S. office.

    AmeriCares doesn’t expect the Primavera system to take months off the shipping time, especially for sea containers. However, any time saved is still important because it will allow aid to be delivered to people more quickly at a lower overall cost. “If we can trim a day or two here or there, that can translate into lives that we’re saving, especially in emergency situations,” says Sears.
    A Cultural Change

    Beyond the measurable benefits that come with IT-driven process improvement, AmeriCares management is seeing a change in culture as a result of the Primavera project. One change has been treating every shipment of aid as a project, and everyone involved with facilitating shipments as a project manager. “This is a revolutionary concept for us,” says McDermott. “Before, we were used to thinking we were doing logistics—getting a container from point A to point B without looking at it as one project and really understanding what it meant to manage it.”

    AmeriCares staff is also happy to report that collaboration within the organization is much more efficient. When someone creates a shipment in the Primavera system, the same shared template is used, which means anyone can log in to the system to see the status of a shipment. Knowledgeable staff can access a shipment project to help troubleshoot a problem. Management can easily check the status of projects across the organization. “Dashboards are really useful,” says McDermott. “Instead of going into the details of each project, you can just see the high-level real-time information at a glance.”

    The new system is helping team members focus on proactively managing shipments rather than simply reacting when problems occur. For example, when a container is shipped, documents must be included for customs clearance. Now, the shipping template has built-in reminders to prompt team members to ask for copies of these documents from freight forwarders and to follow up with partners to discover if a shipment is on time. In the past, staff may not have worked on securing these documents until they’d been notified a shipment had arrived in-country.

    Another benefit of capturing and adopting best practices within the Primavera system is that staff training is easier. “Capturing the processes in documented steps and milestones allows us to teach new staff members how to do their jobs faster,” says Sears. “It provides them with the knowledge of their predecessors so they don’t have to keep reinventing the wheel.”

    With the Primavera system already generating positive results, management is eager to take advantage of advanced capabilities. Loop is working on integrating the company’s proprietary inventory management system with the Primavera system so that when logistics or warehousing operators input data, the information will automatically go into the Primavera system. In the past, this information had to be manually keyed into spreadsheets, often leading to errors.

    Mining Historical Data

    Another feature on the horizon for AmeriCares is utilizing Primavera P6 Professional Project Management reporting capabilities. As the system begins to include more historical data, management soon will be able to draw on this information to conduct analysis that has not been possible before and create customized reports. For example, at the beginning of the shipment process, staff will be able to use historical data to more accurately estimate how long the approval process should take for a particular country. This could help ensure that food and medicine with limited shelf lives do not get stuck in customs or used beyond their expiration dates.

    The historical data in the Primavera system will also help AmeriCares with better planning year to year. The nonprofit’s staff has always put together a plan at the beginning of the year, but this has been very challenging simply because it is impossible to predict disasters. Now, management will be able to look at historical data and see trends and statistics as they set current objectives and prepare for future need. In addition, this historical data will provide AmeriCares management with the ability to review year-end data and compare actual project results with goals set at the beginning of the year, to see if desired outcomes were achieved and if there are areas that need improvement.
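    Primavera P6’s reporting features are not described in detail here, but the per-country estimate the article anticipates reduces to a simple aggregation over past shipments. The sketch below is a rough illustration under that assumption; the countries, durations, and field names are made up:

```python
from collections import defaultdict
from statistics import median

# Hypothetical history: (destination country, days from submission to customs approval).
history = [
    ("Haiti", 18), ("Haiti", 25), ("Haiti", 21),
    ("Kenya", 40), ("Kenya", 33),
    ("Japan", 9),
]

def typical_approval_days(records):
    """Median approval time per destination, computed from past shipments."""
    by_country = defaultdict(list)
    for country, days in records:
        by_country[country].append(days)
    return {country: median(days) for country, days in by_country.items()}

estimates = typical_approval_days(history)
print(estimates)  # {'Haiti': 21, 'Kenya': 36.5, 'Japan': 9}

# A planner could flag shipments whose product shelf life is tighter than the estimate.
shelf_life_days = 30
for country, estimate in estimates.items():
    if estimate > shelf_life_days:
        print(f"Warning: typical approval in {country} ({estimate} days) exceeds the {shelf_life_days}-day shelf life")
```

    The same aggregation run inside the project system’s own reporting layer, rather than in a standalone script, is what would let planners set more realistic approval milestones for each destination.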
    It’s this type of information that is so valuable to donors. And, according to York, project management software can play a critical role in generating the data to help nonprofits sustain and grow. “It is important to invest in systems to help replicate, expand, and deliver services,” says York. “Project management software can help because it encourages nonprofits to examine program or service changes and how to manage moving forward.”

    Sears believes that AmeriCares donors will support the return on investment the organization will achieve with the Primavera solution. “It won’t be financial returns, but rather how many more people we can help for a given dollar or how much more quickly we can respond to a need,” says Sears. “I think donors are receptive to such arguments.”

    And for AmeriCares, it is all about the future and increasing results. The project management environment currently may be quite simple, but IT staff plans to expand the complexity and functionality as the organization grows in its knowledge of project management and the goals it wants to achieve. “As we use the system over time, we’ll continue to refine our best practices and accumulate more data,” says Sears. “It will advance our ability to make better data-driven decisions.”

    Read the article

< Previous Page | 28 29 30 31 32 33  | Next Page >