Search Results

Search found 9154 results on 367 pages for 'job market'.


  • CodePlex Daily Summary for Tuesday, July 03, 2012

    CodePlex Daily Summary for Tuesday, July 03, 2012Popular ReleasesMini SQL Query: Mini SQL Query: Just a bug fix release for when the connections try to refresh after an edit. Make sure you read the http://pksoftware.net/Content/MiniSqlQuery/Help/MiniSqlQueryQuickStart.docx for an introduction.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.58: Fix for Issue #18296: provide "ALL" value to the -ignore switch to ignore all error and warning messages. Fix for issue #18293: if encountering EOF before a function declaration or expression is properly closed, throw an appropriate error and don't crash. Adjust the variable-renaming algorithm so it's very specific when renaming variables with the same number of references so a single source file ends up with the same minified names on different platforms. add the ability to specify kno...LogExpert: 1.4 build 4566: This release for the 1.4 version line contains various fixes which have been made some times ago. Until now these fixes were only available in the 1.5 alpha versions. It also contains a fix for: 710. Column finder (press F8 to show) Terminal server issues: Multiple sessions with same user should work now Settings Export/Import available via Settings Dialog still incomple (e.g. tab colors are not saved) maybe I change the file format one day no command line support yet (for importin...DynamicToSql: DynamicToSql 1.0.0 (beta): 1.0.0 beta versionCommonLibrary.NET: CommonLibrary.NET 0.9.8.5 - Final Release: A collection of very reusable code and components in C# 4.0 ranging from ActiveRecord, Csv, Command Line Parsing, Configuration, Holiday Calendars, Logging, Authentication, and much more. FluentscriptCommonLibrary.NET 0.9.8 contains a scripting language called FluentScript. Releases notes for FluentScript located at http://fluentscript.codeplex.com/wikipage?action=Edit&title=Release%20Notes&referringTitle=Documentation Fluentscript - 0.9.8.5 - Final ReleaseApplication: FluentScript Versio...SharePoint 2010 Metro UI: SharePoint 2010 Metro UI8: Please review the documentation link for how to install. Installation takes some basic knowledge of how to upload and edit SharePoint Artifact files. Please view the discussions tab for ongoing FAQsBack-Propagation Neural Networks Simulation: Back-Propagation Neural Networks Simulation: This is the first release application for Back-Propagation Neural Networks Simulation. It is required .NET Framework 4.0. Check this to use http://backpronn.codeplex.comnopCommerce. Open source shopping cart (ASP.NET MVC): nopcommerce 2.60: Highlight features & improvements: • Significant performance optimization. • Use AJAX for adding products to the cart. • New flyout mini-shopping cart. • Auto complete suggestions for product searching. • Full-Text support. • EU cookie law support. To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).THE NVL Maker: The NVL Maker Ver 3.51: http://download.codeplex.com/Download?ProjectName=nvlmaker&DownloadId=371510 ????:http://115.com/file/beoef05k#THE-NVL-Maker-ver3.51-sim.7z ????:http://www.mediafire.com/file/6tqdwj9jr6eb9qj/THENVLMakerver3.51tra.7z ======================================== ???? ======================================== 3.51 beta ???: ·?????????????????????? ·?????????,?????????0,?????????????????????? ·??????????????????????????? ·?????????????TJS????(EXP??) ·??4:3???,???????????????,??????????? 
·?????????...????: ????2.0.3: 1、???????????。 2、????????。 3、????????????。 4、bug??,????。Generic enumeration: Enumeration.zip: Generic enumeration dll with a test console application.Apworks: Apworks (v2.5.4563.21309, 30JUN2012): Installation Prerequisites: 1. Microsoft .NET Framework 4.0 SP1 2. Microsoft Visual Studio 2010 SP1 3. Other required libraries & assemblies are now included in the installation package so no more prerequisites needed Functional Updates: 1. Refactor the identity field of the IEntity interface from 'Id' to 'ID' 2. Changed the MySql Storage to use the MySql NetConnector version 6.4.4. 3. Implemented the paging support for the repositories. 4. Added the Eager Loading Property specification t...AssaultCube Reloaded: 2.5 Intrepid: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, download the Linux package. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) You should delete /home/config/saved.cfg to reset binds/other stuff If you us...WallSwitch: WallSwitch 1.0.4: Version 1.0.4 Changes: Changes in previous version broke hotkey support; fixed this. When minimizing the window to tray, if there are unsaved changes, then prompt to save first. When changing the path on a folder location, reset the file list. Fixed app not shutting down properly when closed externally. Improved installer to shut down app instead of requiring reboot. Clear History action now clears all history, rather than just the current theme.List Form Manipulation Framework (LFMF) for SharePoint 2010: LFMF v1.1: Added new SPFieldIO field type to reference filenamesSystem.Net.FtpClient: API Reference 2012.06.29: Updated with new file listing example as well as any API changes since the last release.Magelia WebStore Open-source Ecommerce software: Magelia WebStore 2.0: User Right Licensing ContentType version 2.0.267.1Designing Windows 8 Applications with C# and XAML: Chapters 1 - 7 Release Preview: Source code for all examples from Chapters 1 - 7 for the Release PreviewSQL Server FineBuild: Version 3.1.0: Top SQL Server FineBuild Version 3.1.0This is the stable version of FineBuild for SQL Server 2012, 2008 R2, 2008 and 2005 Documentation FineBuild Wiki containing details of the FineBuild process Known Issues Limitations with this release FineBuild V3.1.0 Release Contents List of changes included in this release Please DonateFineBuild is free, but please donate what you think FineBuild is worth as everything goes to charity. Tearfund is one of the UK's leading relief and de...EasySL: RapidSL V2: Rewrite RapidSL UI Framework, Using Silverlight 5.0 EF4.1 Code First Ria Service SP2 + Lastest Silverlight Toolkit.New Projects2DoTasks: This is a simple Dnn Module for Project/Tasks/time management. I hope you all found it useful.AOPify: AOPify lightweight fluent AOP framework that provides basic AOP features both before,after,onerror etc with fluent syntax..Capacitacion: This is my first Project - mvillasecoCLFiling: Client filing with search functionsContoso University MVC NTier: This is first version reference application for ASP.NET MVC NTier.DynamicToSql: DynamicToSql is a lightweight API that maps dynamically typed entities to a database and vice versa. 
It works against any database that has ADO.NET drivers.Enlightener: Deobfuscator target at Confuser. Child project of http://netdeob0.codeplex.com/Excel Adjacency List to Dot: Excel add-in to generate graphs in GraphViz dot file format from an adjacency list in Excel.FFXIV Job Bar: This is a little utility to help with Job changes in FFXIV. Gallery: A simple javascript/jquery based image gallery. Just include the Gallery.js and jquery in your html page.Grocery Droid: Transfer your P.C. designed shopping list to androidLOCOL: Locol is a internet medium to provide local information and services to district local residents.Meter: Log communal meters dataNthDownload: NthDownload is yet another attempt at a download manager.Paginacion de un objeto en un gridview: Como paginar un objeto con multiples resultados en un gridviewRunman game: "Runman" is port of "Pac-Man" game writing on C# and XNASB01: Project will be a startup for a warehouse management and easy to extend by any developer. Plug n play kind of setup.SC2Ranks API: A .NET 4.0 SC2Ranks APISharePoint 2010 Cookie Approval Control: If you need to comply with EU cookie law this control displays a message to each user that cookies are in use on the site and allows them to accept the messageSharePoint Advanced Visibility Options: Advanced Visibility Options - is an extension to SharePoint providing configurable List Field Iterator with PowerShell scripts.smtools2: Add-on tools for Service MonsterSparqube Picture Column Lite for SharePoint 2010: Sparqube Picture Column Lite is simple component for uploading and displaying images in SharePoint 2010 lists.SpudloFizzBuzz: This project is part of a job application process.SwiftMVVM: An extremely fast, easily refactorable implementation of INotifyPropertyChanged/ing, using dynamic proxy generation, as well as a robust change tracking engine.ZPL Converter: ZPL Converter converts Zune Playlists (.ZPL) easily and fast into other playlist formats, e.g. '.M3U' or '.PLS'.

    Read the article

  • Oracle on Oracle: Is that all?

    - by Darin Pendergraft
    On October 17th, I posted a short blog and a podcast interview with Chirag Andani, talking about how Oracle IT uses its own IDM products. Blog link here. In response, I received a comment from reader Jaime Cardoso ([email protected]) who posted: “- You could have talked about how by deploying Oracle's Open standards base technology you were able to integrate any new system in your infrastructure in days. - You could have talked about how by deploying federation you were enabling the business side to keep all their options open in terms of companies to buy and sell while maintaining perfect employee and customer's single view. - You could have talked about how you are now able to cut response times to your audit and security teams into 1/10th of your former times Instead you spent 6 minutes talking about single sign on and self provisioning? If I didn't knew your IDM offer so well I would now be wondering what its differences from Microsoft's offer was. Sorry for not giving a positive comment here but, please your IDM suite is very good and, you simply aren't promoting it well enough” So I decided to send Jaime a note asking him about his experience, and to get his perspective on what makes the Oracle products great. What I found out is that Jaime is a very experienced IDM Architect with several major projects under his belt. Darin Pendergraft: Can you tell me a bit about your experience? How long have you worked in IT, and what is your IDM experience? Jaime Cardoso: I started working in "serious" IT in 1998 when I became Netscape's technical specialist in Portugal. Netscape Portugal didn't exist so, I was working for their VAR here. Most of my work at the time was with Netscape's mail server and LDAP server. Since that time I've been bouncing between the system's side like Sun resellers, Solaris stuff and even worked with Sun's Engineering in the making of an Hierarchical Storage Product (Sun CIS if you know it) and the application's side, mostly in LDAP and IDM. Over the years I've been doing support, service delivery and pre-sales / architecture design of IDM solutions in most big customers in Portugal, to name a few projects: - The first European deployment of Sun Access Manager (SAPO – Portugal Telecom) - The identity repository of 5/5 of the Biggest Portuguese banks - The Portuguese government federation of services project DP: OK, in your blog response, you mentioned 3 topics: 1. Using Oracle's standards based architecture; (you) were able to integrate any new system in days: can you give an example? What systems, how long did it take, number of apps/users/accounts/roles etc. JC: It's relatively easy to design a user management strategy for a static environment, or if you simply assume that you're an <insert vendor here> shop and all your systems will bow to that vendor's will. We've all seen that path, the use of proprietary technologies in interoperability solutions but, then reality kicks in. As an ISP I recall that I made the technical decision to use Active Directory as a central authentication system for the entire IT infrastructure. Clients, systems, apps, everything was there. As a good part of the systems and apps were running on UNIX, then a connector became needed in order to have UNIX boxes to authenticate against AD. And, that strategy worked but, each new machine required the component to be installed, monitoring had to be made for that component and each new app had to be independently certified. 
A self care user portal was an ongoing project, AD access assumes the client is inside the domain, something the ISP's customers (and UNIX boxes) weren't nor had any intention of ever being. When the Windows 2008 rollout was done, Microsoft changed the Active Directory interface. The Windows administrators didn't have enough know-how about directories and the way systems outside the MS world behaved so, on the go live, things weren't properly tested and a general outage followed. Several hours and 1 roll back later, everything was back working. But, the ISP still had to change all of its applications to work with the new access methods and reset the effort spent on the self service user portal. To keep with the same strategy, they would also have to trust Microsoft not to change interfaces again. Simply by putting up an Oracle LDAP server in the middle and replicating the user info from the AD into LDAP, most of the problems went away. Even systems for which no AD connector existed had PAM in them so, integration was made at the OS level, fully supported by the OS supplier. Sun Identity Manager already had a self care portal, combined with a user workflow so, all the clearances had to be given before the account was created or updated. Adding a new system as a client for these authentication services was simply a new checkbox in the OS installer and, even True64 systems were, for the first time integrated also with a 5 minute work of a junior system admin. True, all the windows clients and MS apps still went to the AD for their authentication needs so, from the start everybody knew that they weren't 100% free of migration pains but, now they had a single point of problems to look at. If you're looking for numbers: - 500K directory entries (users) - 2-300 systems After the initial setup, I personally integrated about 20 systems / apps against LDAP in 1 day while being watched by the different IT teams. The internal IT staff did the rest. DP: 2. Using Federation allows the business to keep options open for buying and selling companies, and yet maintain a single view for both employee and customer. What do you mean by this? Can you give an example? JC: The market is dynamic. The company that's being bought today tomorrow will be sold again. Companies that spread on different markets may see the regulator forcing a sale of part of a company due to monopoly reasons and companies that are in multiple countries have to comply with different legislations. Our job, as IT architects, while addressing the customers and employees authentication services, is quite hard and, quite contrary. On one hand, we need to give access to all of our employees to the relevant systems, apps and resources and, we already have marketing talking with us trying to find out who's a customer of the bough company but not from ours to address. On the other hand, we have to do that and keep in mind we may have to break up all that effort and that different countries legislation may became a problem with a full integration plan. That's a job for user Federation. you don't want to be the one who's telling your President that he will sell that business unit without it's customer's database (making the deal worth a lot less) or that the buyer will take with him a copy of your entire customer's database. Federation enables you to start controlling permissions to users outside of your traditional authentication realm. So what if the people of that company you just bought are keeping their old logins? 
Do you want, because of that, to have a dedicated system for their expenses reports? And do you want to keep their sales (and pre-sales) people out of the loop in terms of your group's path? Control the information flow, establish a Federation trust circle and give access to your apps to users that haven't (yet?) been brought into your internal login systems. You can still see your users in a unified view, you obviously control if a user has access to any particular application, either that user is in your local database or stored in a directory on the other side of the world. DP: 3. Cut response times of audit and security teams to 1/10. Is this a real number? Can you give an example? JC: No, I don't have any backing for this number. One of the companies I did system Administration for has a SOX compliance policy in place (I remind you that I live in Portugal so, this definition of SOX may be somewhat different from what you're used to) and, every time the audit team says they'll do another audit, we have to negotiate with them the size of the sample and we spend about 15 man/days gathering all the required info they ask. I did some work with Sun's Identity auditor and, from what I've been seeing, Oracle's product is even better and, I've seen that most of the information they ask would have been provided in a few hours with the help of this tool. I do stand by what I said here but, to be honest, someone from Identity Auditor team would do a much better job than me explaining this time savings. Jaime is right: the Oracle IDM products have a lot of business value, and Oracle IT is using them for a lot more than I was able to cover in the short podcast that I posted. I want to thank Jaime for his comments and perspective. We want these blog posts to be informative and honest – so if you have feedback for the Oracle IDM team on any topic discussed here, please post your comments below.
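    Jaime's point about integrating a new system or app against the central LDAP directory in a matter of hours is easier to picture with a small fragment. The sketch below is illustrative only and is not part of the original interview: it shows roughly what a simple-bind authentication check against an LDAP server looks like from a .NET application; the host name and user DN are hypothetical.

```csharp
using System;
using System.Net;
using System.DirectoryServices.Protocols;

class LdapAuthCheck
{
    // Returns true if the directory accepts a simple bind with these credentials.
    static bool Authenticate(string host, string userDn, string password)
    {
        using (var connection = new LdapConnection(host))
        {
            connection.SessionOptions.ProtocolVersion = 3; // LDAP v3
            connection.AuthType = AuthType.Basic;          // simple bind

            try
            {
                connection.Bind(new NetworkCredential(userDn, password));
                return true;
            }
            catch (LdapException)
            {
                // Invalid credentials or the server is unreachable.
                return false;
            }
        }
    }

    static void Main()
    {
        // Hypothetical host and DN - substitute your directory's values.
        bool ok = Authenticate("ldap.example.com:389",
                               "uid=jsmith,ou=people,dc=example,dc=com",
                               "secret");
        Console.WriteLine(ok ? "Authenticated" : "Rejected");
    }
}
```

    Most off-the-shelf components (PAM modules, application servers, portals) perform the equivalent of this bind internally, which is why pointing them at a standards-based directory is usually a configuration exercise rather than a coding one.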

    Read the article

  • How To Make Hundreds of Complex Photo Edits in Seconds With Photoshop Actions

    - by Eric Z Goodnight
    Have a huge folder of images needing tweaks? A few hundred adjustments may seem like a big, time consuming job—but read on to see how Photoshop can do repetitive tasks automatically, even if you don’t know how to program! Photoshop Actions are a simple way to program simple routines in Photoshop, and are a great time saver, allowing you to re-perform tasks over and over, saving you minutes or hours, depending on the job you have to work on. See how any bunch of images and even some fairly complicated photo tweaking can be done automatically to even hundreds of images at once.
    When Can I use Photoshop Actions?
    Photoshop actions are a way of recording the tools, menus, and keys pressed while using the program. Each time you use a tool, adjust a color, or use the brush, it can be recorded and played back over any file Photoshop can open. While it isn’t perfect and can get very confused if not set up correctly, it can automate editing hundreds of images, saving you hours and hours if you have big jobs with complex edits. The image illustrated above is a template for a polaroid-style picture frame. If you had several hundred images, it would actually be a simple matter to use Photoshop Actions to create hundreds of new images inside the frame in almost no time at all. Let’s take a look at how a simple folder of images and some image editing automation can turn lots of work into a simple and easy job.
    Creating a New Action
    Actions is part of the “Essentials” panel set Photoshop begins with by default. If you can’t see the panel button under the “History” button, you can find Actions by going to Window > Actions or pressing Alt + F9. Click the button in the Actions Panel, pictured in the previous illustration on the left. Choose to create a “New Set” in order to begin creating your own custom Actions. Name your action set whatever you want. Names are not relevant, you’ll simply want to make it obvious that you have created it. Click OK. Look back in the Actions panel. You’ll see your new Set of actions has been added to the list. Click it to highlight it before going on. Click the button again to create a “New Action” in your new set. If you care to name your action, go ahead. Name it after whatever it is you’re hoping to do—change the canvas size, tint all your pictures blue, send your image to the printer in high quality, or run multiple filters on images. The name is for your own usage, so do what suits you best. Note that you can simplify your process by creating shortcut keys for your actions. If you plan to do hundreds of edits with your actions, this might be a good idea. If you plan to record an action to use every time you use Photoshop, this might even be an invaluable step. When you create a new Action, Photoshop automatically begins recording everything you do. It does not record the time in between steps, but rather only the data from each step. So take your time when recording and make sure you create your actions the way you want them. The square button stops recording, and the circle button starts recording again. With these basics ready, we can take a look at a sample Action.
    Recording a Sample Action
    Photoshop will remember everything you input into it when it is recording, even specific photographs you open. So begin recording your action when your first photo is already open. Once your first image is open, click the record button. If you’re already recording, continue on. Using the File > Place command to insert the polaroid image can be easier for Actions to deal with.
Photoshop can record with multiple open files, but it often gets confused when you try it. Keep your recordings as simple as possible to ensure your success. When the image is placed in, simply press enter to render it. Select your background layer in your layers panel. Your recording should be following along with no trouble. Double click this layer. Double clicking your background layer will create a new layer from it. Allow it to be renamed “Layer 0” and press OK. Move the “polaroid” layer to the bottom by selecting it and dragging it down below “Layer 0” in the layers panel. Right click “Layer 0” and select “Create Clipping Mask.” The JPG image is cropped to the layer below it. Coincidentally, all actions described here are being recorded perfectly, and are reproducible. Cursor actions, like the eraser, brush, or bucket fill don’t record well, because the computer uses your mouse movements and coordinates, which may need to change from photo to photo. Click the to set your Photograph layer to a “Screen” blending mode. This will make the image disappear when it runs over the white parts of the polaroid image. With your image layer (Layer 0) still selected, navigate to Edit > Transform > Scale. You can use the mouse to resize your Layer 0, but Actions work better with absolute numbers. Visit the Width and Height adjustments in the top options panel. Click the chain icon to link them together, and adjust them numerically. Depending on your needs, you may need to use more or less than 30%. Your image will resize to your specifications. Press enter to render, or click the check box in the top right of your application. + Click on your bottom layer, or “polaroid” in this case. This creates a selection of the bottom layer. Navigate to Image > Crop in order to crop down to your bottom layer selection Your image is now resized to your bottommost layer, and Photoshop is still recording to that effect. For additional effect, we can navigate to Image > Image Rotation > Arbitrary to rotate our image by a small tilt. Choosing 3 degrees clockwise , we click OK to render our choice. Our image is rotated, and this step is recorded. Photoshop will even record when you save your files. With your recording still going, find File > Save As. You can easily tell Photoshop to save in a new folder, other than the one you have been working in, so that your files aren’t overwritten. Navigate to any folder you wish, but do not change the filename. If you change the filename, Photoshop will record that name, and save all your images under whatever you type. However, you can change your filetype without recording an absolute filename. Use the pulldown tab and select a different filetype—in this instance, PNG. Simply click “Save” to create a new PNG based on your actions. Photoshop will record the destination and the change in filetype. If you didn’t edit the name of your file, it will always use the variable filename of any image you open. (This is very important if you want to edit hundreds of images at once!) Click File > Close or the red “X” in the corner to close your filetype. Photoshop can record that as well. Since we have already saved our image as a JPG, click “NO” to not overwrite your original image. Photoshop will also record your choice of “NO” for subsequent images. In your Actions panel, click the stop button to complete your action. You can always click the record button to add more steps later, if you want. This is how your new action looks with its steps expanded. 
    Curious how to put it into effect? Read on to see how simple it is to use that recording you just made. Editing Lots of Images with Your New Action Open a large number of images—as many as you care to work with. Your action should work immediately with every image on screen, although you may have to test and re-record, depending on how you did. Actions don’t require any programming knowledge, but often can get confused or work in a counter-intuitive way. Record your action until it is perfect. If it works once without errors, it’s likely to work again and again! Find the “Play” button in your Actions Panel. With your custom action selected, click “Play” and your routine will edit, save, and close each file for you. Keep bashing “Play” for each open file, and it will keep saving and creating new files until you run out of work you need to do. And in mere moments, a complicated stack of work is done. Photoshop actions can be very complicated, far beyond what is illustrated here, and can even be combined with scripts and other actions, creating automated creation of potentially very complex files, or applying filters to an entire portfolio of digital photos. Have questions or comments concerning Graphics, Photos, Filetypes, or Photoshop? Send your questions to [email protected], and they may be featured in a future How-To Geek Graphics article. Image Credits: All images copyright Stephanie Pragnell and author Eric Z Goodnight, protected under Creative Commons.

    Read the article

  • The Business of Winning Innovation: An Exclusive Blog Series

    - by Kerrie Foy
    "The Business of Winning Innovation” is a series of articles authored by Oracle Agile PLM experts on what it takes to make innovation a successful and lucrative competitive advantage. Our customers have proven Agile PLM applications to be enormously flexible and comprehensive, so we’ve launched this article series to showcase some of the most fascinating, value-packed use cases. In this article by Keith Colonna, we kick-off the series by taking a look at the science side of innovation within the Consumer Products industry and how PLM can help companies innovate faster, cheaper, smarter. This article will review how innovation has become the lifeline for growth within consumer products companies and how certain companies are “winning” by creating a competitive advantage for themselves by taking a more enterprise-wide,systematic approach to “innovation”.   Managing the Science of Innovation within the Consumer Products Industry By: Keith Colonna, Value Chain Solution Manager, Oracle The consumer products (CP) industry is very mature and competitive. Most companies within this industry have saturated North America (NA) with their products thus maximizing their NA growth potential. Future growth is expected to come from either expansion outside of North America and/or by way of new ideas and products. Innovation plays an integral role in both of these strategies, whether you’re innovating business processes or the products themselves, and may cause several challenges for the typical CP company, Becoming more innovative is both an art and a science. Most CP companies are very good at the art of coming up with new innovative ideas, but many struggle with perfecting the science aspect that involves the best practice processes that help companies quickly turn ideas into sellable products and services. Symptoms and Causes of Business Pain Struggles associated with the science of innovation show up in a variety of ways, like: · Establishing and storing innovative product ideas and data · Funneling these ideas to the chosen few · Time to market cycle time and on-time launch rates · Success rates, or how often the best idea gets chosen · Imperfect decision making (i.e. the ability to kill projects that are not projected to be winners) · Achieving financial goals · Return on R&D investment · Communicating internally and externally as more outsource partners are added globally · Knowing your new product pipeline and project status These challenges (and others) can be consolidated into three root causes: A lack of visibility Poor data with limited access The inability to truly collaborate enterprise-wide throughout your extended value chain Choose the Right Remedy Product Lifecycle Management (PLM) solutions are uniquely designed to help companies solve these types challenges and their root causes. However, PLM solutions can vary widely in terms of configurability, functionality, time-to-value, etc. Business leaders should evaluate PLM solution in terms of their own business drivers and long-term vision to determine the right fit. Many of these solutions are point solutions that can help you cure only one or two business pains in the short term. Others have been designed to serve other industries with different needs. Then there are those solutions that demo well but are owned by companies that are either unable or unwilling to continuously improve their solution to stay abreast of the ever changing needs of the CP industry to grow through innovation. 
What the Right PLM Solution Should Do for You Based on more than twenty years working in the CP industry, I recommend investing in a single solution that can help you solve all of the issues associated with the science of innovation in a totally integrated fashion. By integration I mean the (1) integration of the all of the processes associated with the development, maintenance and delivery of your product data, and (2) the integration, or harmonization of this product data with other downstream sources, like ERP, product catalogues and the GS1 Global Data Synchronization Network (or GDSN, which is now a CP industry requirement for doing business with most retailers). The right PLM solution should help you: Increase Revenue. A best practice PLM solution should help a company grow its revenues by consolidating product development cycle-time and helping companies get new and improved products to market sooner. PLM should also eliminate many of the root causes for a product being returned, refused and/or reclaimed (which takes away from top-line growth) by creating an enterprise-wide, collaborative, workflow-driven environment. Reduce Costs. A strong PLM solution should help shave many unnecessary costs that companies typically take for granted. Rationalizing SKU’s, components (ingredients and packaging) and suppliers is a major opportunity at most companies that PLM should help address. A natural outcome of this rationalization is lower direct material spend and a reduction of inventory. Another cost cutting opportunity comes with PLM when it helps companies avoid certain costs associated with process inefficiencies that lead to scrap, rework, excess and obsolete inventory, poor end of life administration, higher cost of quality and regulatory and increased expediting. Mitigate Risk. Risks are the hardest to quantify but can be the most costly to a company. Food safety, recalls, line shutdowns, customer dissatisfaction and, worst of all, the potential tarnishing of your brands are a few of the debilitating risks that CP companies deal with on a daily basis. These risks are so uniquely severe that they require an enterprise PLM solution specifically designed for the CP industry that safeguards product information and processes while still allowing the art of innovation to flourish. Many CP companies have already created a winning advantage by leveraging a single, best practice PLM solution to establish an enterprise-wide, systematic approach to innovation. Oracle’s Answer for the Consumer Products Industry Oracle is dedicated to solving the growth and innovation challenges facing the CP industry. Oracle’s Agile Product Lifecycle Management for Process solution was originally developed with and for CP companies and is driven by a specialized development staff solely focused on maintaining and continuously improving the solution per the latest industry requirements. Agile PLM for Process helps CP companies handle all of the processes associated with managing the science of the innovation process, including: specification management, new product development/project and portfolio management, formulation optimization, supplier management, and quality and regulatory compliance to name a few. And as I mentioned earlier, integration is absolutely critical. Many Oracle CP customers, both with Oracle ERP systems and non-Oracle ERP systems, report benefits from Oracle’s Agile PLM for Process. 
    In future articles we will explain in greater detail how both existing Oracle customers (like Gallo, Smuckers, Land-O-Lakes and Starbucks) and new Oracle customers (like ConAgra, Tyson, McDonalds and Heinz) have all realized the benefits of Agile PLM for Process and its integration with their ERP systems.
    More to Come
    Stay tuned for more articles in our blog series “The Business of Winning Innovation.” While we will also feature articles focused on other industries, look forward to more on how Agile PLM for Process addresses innovation challenges facing the CP industry. Additional topics include: Innovation Data Management (IDM), New Product Development (NPD), Product Quality Management (PQM), Menu Management, Private Label Management, and more! Watch this video for more info about Agile PLM for Process.

    Read the article

  • Building dynamic OLAP data marts on-the-fly

    - by DrJohn
    At the forthcoming SQLBits conference, I will be presenting a session on how to dynamically build an OLAP data mart on-the-fly. This blog entry is intended to clarify exactly what I mean by an OLAP data mart, why you may need to build them on-the-fly and finally outline the steps needed to build them dynamically. In subsequent blog entries, I will present exactly how to implement some of the techniques involved.
    What is an OLAP data mart?
    In data warehousing parlance, a data mart is a subset of the overall corporate data provided to business users to meet specific business needs. Of course, the term does not specify the technology involved, so I coined the term "OLAP data mart" to identify a subset of data which is delivered in the form of an OLAP cube which may be accompanied by the relational database upon which it was built. To clarify, the relational database is specifically created and loaded with the subset of data and then the OLAP cube is built and processed to make the data available to the end-users via standard OLAP client tools.
    Why build OLAP data marts?
    Market research companies sell data to their clients to make money. To gain competitive advantage, market research providers like to "add value" to their data by providing systems that enhance analytics, thereby allowing clients to make best use of the data. As such, OLAP cubes have become a standard way of delivering added value to clients. They can be built on-the-fly to hold specific data sets and meet particular needs and then hosted on a secure intranet site for remote access, or shipped to clients' own infrastructure for hosting. Even better, they support a wide range of different tools for analytical purposes, including the ever popular Microsoft Excel.
    Extension Attributes: The Challenge
    One of the key challenges in building multiple OLAP data marts based on the same 'template' is handling extension attributes. These are attributes that meet the client's specific reporting needs, but do not form part of the standard template. Now clearly, these extension attributes have to come into the system via additional files and ultimately be added to relational tables so they can end up in the OLAP cube. However, processing these files and filling dynamically altered tables with SSIS is a challenge as SSIS packages tend to break as soon as the database schema changes. There are two approaches to this: (1) dynamically build an SSIS package in memory to match the new database schema using C#, or (2) have the extension attributes provided as name/value pairs so the file's schema does not change and can easily be loaded using SSIS. The problem with the first approach is the complexity of writing an awful lot of complex C# code. The problem with the second approach is that name/value pairs are useless to an OLAP cube; so they have to be pivoted back into a proper relational table somewhere in the data load process WITHOUT breaking SSIS. How this can be done will be the subject of a future blog entry.
    What is involved in building an OLAP data mart?
    There are a great many steps involved in building OLAP data marts on-the-fly. The key point is that all the steps must be automated to allow for the production of multiple OLAP data marts per day (i.e. many thousands, each with its own specific data set and attributes). Now most of these steps have a great deal in common with standard data warehouse practices. The key difference is that the databases are all built to order.
    The only permanent database is the metadata database (shown in orange) which holds all the metadata needed to build everything else (i.e. client orders, configuration information, connection strings, client specific requirements and attributes etc.). The staging database (shown in red) has a short life: it is built, populated and then ripped down as soon as the OLAP Data Mart has been populated. In the diagram below, the OLAP data mart comprises the two blue components: the Data Mart which is a relational database and the OLAP Cube which is an OLAP database implemented using Microsoft Analysis Services (SSAS). The client may receive just the OLAP cube or both components together depending on their reporting requirements. So, in broad terms the steps required to fulfil a client order are as follows:
    Step 1: Prepare metadata
    - Create a set of database names unique to the client's order
    - Modify all package connection strings to be used by SSIS to point to the new databases and file locations
    Step 2: Create relational databases
    - Create the staging and data mart relational databases using dynamic SQL and set the database recovery mode to SIMPLE as we do not need the overhead of logging anything
    - Execute SQL scripts to build all database objects (tables, views, functions and stored procedures) in the two databases
    Step 3: Load staging database
    - Use SSIS to load all data files into the staging database in a parallel operation
    - Load extension files containing name/value pairs; these will provide client-specific attributes in the OLAP cube
    Step 4: Load data mart relational database
    - Load the data from staging into the data mart relational database, again in parallel where possible
    - Allocate surrogate keys and use SSIS to perform surrogate key lookup during the load of fact tables
    Step 5: Load extension tables & attributes
    - Pivot the extension attributes from their native name/value pairs into proper relational tables
    - Add the extension attributes to the views used by the OLAP cube
    Step 6: Deploy & Process OLAP cube
    - Deploy the OLAP database directly to the server using a C# script task in SSIS
    - Modify the connection string used by the OLAP cube to point to the data mart relational database
    - Modify the cube structure to add the extension attributes to both the data source view and the relevant dimensions
    - Remove any standard attributes that are not required
    - Process the OLAP cube
    Step 7: Backup and drop databases
    - Drop the staging database as it is no longer required
    - Backup the data mart relational and OLAP databases and ship these to the client's infrastructure
    - Drop the data mart relational and OLAP databases from the build server
    - Mark the order complete
    - Start processing the next order, ad infinitum.
    So my future blog posts and my forthcoming session at the SQLBits conference will all focus on some of the more interesting aspects of building OLAP data marts on-the-fly such as handling the load of extension attributes and how to dynamically alter the structure of an OLAP cube using C#.
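    As a rough illustration of Step 6, here is a minimal sketch (not taken from the original post) of what the C# script task might do with Analysis Management Objects (AMO) to repoint the freshly deployed OLAP database at the client's relational data mart and then process it. The server, database and connection-string values are hypothetical placeholders.

```csharp
using Microsoft.AnalysisServices; // AMO assembly, referenced by the SSIS script task

public static class CubeBuilder
{
    // Repoint the deployed OLAP database at the client's data mart and fully process it.
    public static void RepointAndProcess(string ssasInstance, string olapDbName, string martConnectionString)
    {
        var server = new Server();
        server.Connect("Data Source=" + ssasInstance);

        Database olapDb = server.Databases.FindByName(olapDbName);

        // Modify the connection string used by the OLAP cube so it reads
        // from the relational data mart that was just loaded.
        DataSource ds = olapDb.DataSources[0];
        ds.ConnectionString = martConnectionString;
        ds.Update();

        // Process the OLAP cube so the data is available to client tools.
        olapDb.Process(ProcessType.ProcessFull);

        server.Disconnect();
    }
}
```

    In the automated build, a call like CubeBuilder.RepointAndProcess("BUILDSERVER", "Client1234_OLAP", martConnectionString) - all names invented for the example - would run as one of the last steps before the backup-and-ship stage.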

    Read the article

  • Taking web sites offline for demonstration

    While working in software development in general, and in web development for a couple of customers it is quite common that it is necessary to provide a test bed where the client is able to get an image, or better said, a feeling for the visions and ideas you are talking about. Usually here at IOS Indian Ocean Software Ltd. we set up a demo web site on one of our staging servers, and provide credentials to the customer to access and review our progress and work ad hoc. This gives us the highest flexibility on both sides, as the test bed is simply online and available 24/7. We can update the structure, the UI and data at any time, and the client is able to view it as it suits best for her/him. Limited or lack of online connectivity But what is going to happen when your client is not capable to be online - no matter for what reasons; here are some more obvious ones: No internet connection (permanently or temporarily) Expensive connection, ie. mobile data package, stay at a hotel, etc. Presentation devices at an exhibition, ie. using tablets or iPads Being abroad for a certain time, and only occasionally online No network coverage, especially on mobile Bad infrastructure, like ie. in Third World countries Providing a catalogue on CD or USB pen drive Anyway, it doesn't matter really. We should be able to provide a solution for the circumstances of our customers. Presentation during an exhibition Recently, we had the following request from a customer: Is it possible to let us have a desktop version of ResortWork.co.uk that we can use for demo purposes at the forthcoming Ski Shows? It would allow us to let stand visitors browse the sites on an iPad to view jobs and training directory course listings. Yes, sure we can do that. Eventually, you might think why don't they simply use 3G enabled iPads for that purpose? As stated above, there might be several reasons for that - low coverage, expensive data packages, etc. Anyway, it is not a question on how to circumvent the request but to deliver a solution to that. Possible solutions... or not? We already did offline websites earlier, and even established complete mirrors of one or two web sites on our systems. There are actually several possibilities to handle this kind of request, and it mainly depends on the system or device where the offline site should be available on. Here, it is clearly expressed that we have to address this on an Apple iPad, well actually, I think that they'd like to use multiple devices during their exhibitions. Following is an overview of possible solutions depending on the technology or device in use, and how it can be done: Replication of source files and database The above mentioned web site is running on ASP.NET, IIS and SQL Server. In case that a laptop or slate runs a Windows OS, the easiest way would be to take a snapshot of the source files and database, and transfer them as local installation to those Windows machines. This approach would be fully operational on the local machine. Saving pages for offline usage This is actually a quite tedious job but still practicable for small web sites Tool based approach to 'harvest' the web site There quite some tools in the wild that could handle this job, namely wget, httrack, web copier, etc. Screenshots bundled as PDF document Not really... ;-) Creating screencast or video Simply navigate through your website and record your desktop session. 
Actually, we are using this kind of approach to track down difficult problems in order to see and understand exactly what the user was doing to cause an error. Of course, this list isn't complete and I'd love to get more of your ideas in the comments section below the article. Preparations for offline browsing The original website is dynamically and data-driven by ASP.NET, and looks like this: As we have to put the result onto iPads we are going to choose the tool-based approach to 'download' the whole web site for offline usage. Again, depending on the complexity of your web site you might have to check which of the applications produces the best results for you. My usual choice is to use wget but in this case, we run into problems related to the rewriting of hyperlinks. As a consequence of that we opted for using HTTrack. HTTrack comes in different flavours, like console application but also as either GUI (WinHTTrack on Windows) or Web client (WebHTTrack on Linux/Unix/BSD). Here's a brief description taken from the original website about HTTrack: HTTrack is a free (GPL, libre/free software) and easy-to-use offline browser utility. It allows you to download a World Wide Web site from the Internet to a local directory, building recursively all directories, getting HTML, images, and other files from the server to your computer. HTTrack arranges the original site's relative link-structure. Simply open a page of the "mirrored" website in your browser, and you can browse the site from link to link, as if you were viewing it online. And there is an extensive documentation for all options and switches online. General recommendation is to go through the HTTrack Users Guide By Fred Cohen. It covers all the initial steps you need to get up and running. Be aware that it will take quite some time to get all the necessary resources down to your machine. Actually, for our customer we run the tool directly on their web server to avoid unnecessary traffic and bandwidth. After a couple of runs and some additional fine-tuning - explicit inclusion or exclusion of various external linked web sites - we finally had a more or less complete offline version available. A very handsome feature of HTTrack is the error/warning log after completing the download. It contains some detailed information about errors that appeared on the pages and the links within the pages that have been processed. Error: "Bad Request" (400) at link www.resortwork.co.uk/job-details_Ski_hire:tech_or_mgr_or_driver_37854.aspx (from www.resortwork.co.uk/Jobs_A_to_Z.aspx)Error: "Not Found" (404) at link www.247recruit.net/images/applynow.png (from www.247recruit.net/css/global.css)Error: "Not Found" (404) at link www.247recruit.net/activate.html (from www.247recruit.net/247recruit_tefl_jobs_network.html) In our situation, we took the records of HTTP 400/404 errors and passed them to the web development department. Improvements are to be expected soon. ;-) Quality assurance on the full-featured desktop Unfortunately, the generated output of HTTrack was still incomplete but luckily there were only images missing. Being directly on the web server we simply copied the missing images from the original source folder into our offline version. After that, we created an archive and transferred the file securely to our local workspace for further review and checks. 
    From that point on, it wasn't necessary to get any more files from the original web server, and we could focus ourselves completely on the process of browsing and navigating through the offline version to isolate visual differences and functional problems. As said, the original web site runs on ASP.NET Web Forms and uses Postback calls for interaction like search, pagination and partly for navigation. This is the main field of improving the offline experience. Of course, same as for standard web development it is advised to test with various browsers, and strangely we discovered that the offline version looked pretty good on Firefox, Chrome and Safari, but not in Internet Explorer. A quick look at the HTML source shed some light on this: there are conditional CSS inclusions based on the user agent. HTTrack does not act as Internet Explorer, so we didn't have the necessary overrides for this browser. Not problematic after all in our case, but you might have to pay attention to this and get the IE-specific files explicitly. And while having a look at the source code, we also found out that HTTrack actually modifies the generated HTML output. On several occasions we discovered that <div> elements were converted into <table> constructs for no obvious reason; even nested structures.
    Search 'e'nd destroy - sed (or Notepad++) to the rescue
    During our intensive root cause analysis for a couple of HTML/CSS problems that needed some extra attention, it is very helpful to be familiar with an editor that allows search and replace over multiple files, like sed - the stream editor for filtering and transforming text - on Linux, or my personal favourite Notepad++ on Windows. This allowed us to quickly fix a lot of anchors with onclick attributes and Javascript code that was addressed to ASP.NET files instead of their generated HTML counterparts, like so:
    grep -lr -e '\.aspx' * | xargs sed -i -e 's/\.aspx/.html?/g'
    The additional question mark after the HTML extension helps to separate the query string from the actual target, and solved all our missing hyperlinks very fast. The same can be done in Notepad++ on Windows, too. Just use the 'Replace in files' feature and you are settled, especially in combination with Regular Expressions (regex).
    Landscape of browsers
    Okay, after several runs of HTML/CSS code analysis, and searching and replacing some strings in a pool of more than 4,000 files, we finally had a very good match of an offline browsing experience in Firefox and Chrome on Linux. Next, we transferred that modified set of files to a Windows 8 machine for review on Firefox, Chrome and Internet Explorer 7 to 10, and a Mac mini running Mac OS X 10.7 to check the output on Safari and again on Chrome. Besides IE, for reasons already mentioned above, the results were identical. And last but not least, it was time to check the web site on tablets. Please continue reading in the following articles: Taking web sites offline for demonstration on Galaxy Tablet Taking web sites offline for demonstration on iPad
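    For completeness, the same link rewrite shown in the grep/sed one-liner above can also be scripted on Windows without Notepad++. The following C# fragment is only a sketch - the folder path and the list of text file extensions are assumptions - and it simply mirrors the sed substitution rather than being part of the original workflow.

```csharp
using System;
using System.Collections.Generic;
using System.IO;

class LinkRewriter
{
    static void Main()
    {
        // Hypothetical root of the HTTrack mirror - adjust to your output folder.
        string root = @"C:\offline\resortwork";

        // Only touch text-based files; binary files (images etc.) must be left alone.
        var textExtensions = new HashSet<string>(StringComparer.OrdinalIgnoreCase)
        {
            ".html", ".htm", ".js", ".css"
        };

        foreach (string file in Directory.EnumerateFiles(root, "*", SearchOption.AllDirectories))
        {
            if (!textExtensions.Contains(Path.GetExtension(file)))
                continue;

            string text = File.ReadAllText(file);
            if (text.Contains(".aspx"))
            {
                // Same substitution as the sed command: the trailing '?' keeps
                // any query string separate from the rewritten target.
                File.WriteAllText(file, text.Replace(".aspx", ".html?"));
            }
        }

        Console.WriteLine("Link rewrite complete.");
    }
}
```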

    Read the article

  • Augmenting your Social Efforts via Data as a Service (DaaS)

    - by Mike Stiles
    The following is the 3rd in a series of posts on the value of leveraging social data across your enterprise by Oracle VP Product Development Don Springer and Oracle Cloud Data and Insight Service Sr. Director Product Management Niraj Deo. In this post, we will discuss the approach and value of integrating additional “public” data via a cloud-based Data-as-as-Service platform (or DaaS) to augment your Socially Enabled Big Data Analytics and CX Management. Let’s assume you have a functional Social-CRM platform in place. You are now successfully and continuously listening and learning from your customers and key constituents in Social Media, you are identifying relevant posts and following up with direct engagement where warranted (both 1:1, 1:community, 1:all), and you are starting to integrate signals for communication into your appropriate Customer Experience (CX) Management systems as well as insights for analysis in your business intelligence application. What is the next step? Augmenting Social Data with other Public Data for More Advanced Analytics When we say advanced analytics, we are talking about understanding causality and correlation from a wide variety, volume and velocity of data to Key Performance Indicators (KPI) to achieve and optimize business value. And in some cases, to predict future performance to make appropriate course corrections and change the outcome to your advantage while you can. The data to acquire, process and analyze this is very nuanced: It can vary across structured, semi-structured, and unstructured data It can span across content, profile, and communities of profiles data It is increasingly public, curated and user generated The key is not just getting the data, but making it value-added data and using it to help discover the insights to connect to and improve your KPIs. As we spend time working with our larger customers on advanced analytics, we have seen a need arise for more business applications to have the ability to ingest and use “quality” curated, social, transactional reference data and corresponding insights. The challenge for the enterprise has been getting this data inline into an easily accessible system and providing the contextual integration of the underlying data enriched with insights to be exported into the enterprise’s business applications. The following diagram shows the requirements for this next generation data and insights service or (DaaS): Some quick points on these requirements: Public Data, which in this context is about Common Business Entities, such as - Customers, Suppliers, Partners, Competitors (all are organizations) Contacts, Consumers, Employees (all are people) Products, Brands This data can be broadly categorized incrementally as - Base Utility data (address, industry classification) Public Master Reference data (trade style, hierarchy) Social/Web data (News, Feeds, Graph) Transactional Data generated by enterprise process, workflows etc. This Data has traits of high-volume, variety, velocity etc., and the technology needed to efficiently integrate this data for your needs includes - Change management of Public Reference Data across all categories Applied Big Data to extract statics as well as real-time insights Knowledge Diagnostics and Data Mining As you consider how to deploy this solution, many of our customers will be using an online “cloud” service that provides quality data and insights uniformly to all their necessary applications. 
In addition, they are requesting a service that is: Agile and Easy to Use: Applications integrated with the service can obtain data on-demand, quickly and simply Cost-effective: Pre-integrated into applications so customers don’t have to Has High Data Quality: Single point access to reference data for data quality and linkages to transactional, curated and social data Supports Data Governance: Becomes more manageable and cost-effective since control of data privacy and compliance can be enforced in a centralized place Data-as-a-Service (DaaS) Just as the cloud has transformed and now offers a better path for how an enterprise manages its IT from their infrastructure, platform, and software (IaaS, PaaS, and SaaS), the next step is data (DaaS). Over the last 3 years, we have seen the market begin to offer a cloud-based data service and gain initial traction. On one side of the DaaS continuum, we see an “appliance” type of service that provides a single, reliable source of accurate business data plus social information about accounts, leads, contacts, etc. On the other side of the continuum we see more of an online market “exchange” approach where ISVs and Data Publishers can publish and sell premium datasets within the exchange, with the exchange providing a rich set of web interfaces to improve the ease of data integration. Why the difference? It depends on the provider’s philosophy on how fast the rate of commoditization of certain data types will occur. How do you decide the best approach? Our perspective, as shown in the diagram below, is that the enterprise should develop an elastic schema to support multi-domain applicability. This allows the enterprise to take the most flexible approach to harness the speed and breadth of public data to achieve value. The key tenet of the proposed approach is that an enterprise carefully federates common utility, master reference data end points, mobility considerations and content processing, so that they are pervasively available. One way you may already be familiar with this approach is in how you do Address Verification treatments for accounts, contacts etc. If you design and revise this service in such a way that it is also easily available to social analytic needs, you could extend this to launch geo-location based social use cases (marketing, sales etc.). Our fundamental belief is that value-added data achieved through enrichment with specialized algorithms, as well as applying business “know-how” to weight-factor KPIs based on innovative combinations across an ever-increasing variety, volume and velocity of data, will be where real value is achieved. Essentially, Data-as-a-Service becomes a single entry point for the ever-increasing richness and volume of public data, with enrichment and combined capabilities to extract and integrate the right data from the right sources with the right factoring at the right time for faster decision-making and action within your core business applications. As more data becomes available (and in many cases commoditized), this value-added data processing approach will provide you with ongoing competitive advantage. Let’s look at a quick example of creating a master reference relationship that could be used as an input for a variety of your already existing business applications. In phase 1, a simple master relationship is achieved between a company (e.g. General Motors) and a variety of car brands’ social insights. 
    The reference data allows for easy sort, export and integration into a set of CRM use cases for analytics, sales and marketing CRM. In phase 2, you create more data relationships (e.g. competitors, contacts, other brands) to build broader and deeper references (social profiles, social meta-data) for more use cases across CRM, HCM, SRM, etc. This is just the tip of the iceberg, as the number of master reference relationships is constrained only by your imagination and the availability of quality curated data you have to work with. DaaS is just now emerging onto the marketplace as the next step in cloud transformation. For some of you, this may be the first you have heard about it. Let us know if you have questions or perspectives. In the meantime, we will continue to share insights as we can. Photo: Erik Araujo, stock.xchng

    Read the article

  • CodePlex Daily Summary for Monday, October 08, 2012

    CodePlex Daily Summary for Monday, October 08, 2012Popular ReleasesSofire Suite v1.6: XSqlModelGenerator.AddIn: 1、?? VS2010/2012 2、?? .NET FRAMEWORK 2.0 3、?? SOFIRE V1.6Sofire XSql: XSqlFormDemo: ???? MSSQL ???????ClosedXML - The easy way to OpenXML: ClosedXML 0.68.0: ClosedXML now resolves formulas! Yes it finally happened. If you call cell.Value and it has a formula the library will try to evaluate the formula and give you the result. For example: var wb = new XLWorkbook(); var ws = wb.AddWorksheet("Sheet1"); ws.Cell("A1").SetValue(1).CellBelow().SetValue(1); ws.Cell("B1").SetValue(1).CellBelow().SetValue(1); ws.Cell("C1").FormulaA1 = "\"The total value is: \" & SUM(A1:B2)"; var...Picturethrill: Version 2.10.7.0: Major FeaturesWindows 8 Support!!! Picturethrill is fully tested on latest Windows 8. Try it now.New AboutPicturethrill Dialog Available from "Advanced" window. Describes Version, Developers and other infoFull Installation Information in Control Panel Picturethrill in ControlPanel has full information, including Version and Size. Minor Bug FixesImproved Logging AppDomain Unhandled Exception Logging Delete WallpaperDownload Folder (Saves Diskspace) Advanced Settings "Close" button bug ...VidCoder: 1.4.3 Beta: Fixed crash on switching container types. Updated subtitle selection to only allow one PGS subtitle to be selected under an MP4 container as PGS passthrough in MP4 is not supported and the sub must be burned in.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.68: Fixes for issues: 17110 - added support for -moz-calc or -webkit-calc to previous fix for calc support. 18652 - if being run under Mono runtime, don't mistake UNIX-style paths for command-line switches. 18659 - added capability to specify JS settings for CSS files that have JS embedded in expressions. Did a lot of internal house-keeping: add DLL support for -pponly switch. Add support for -js:expr, -js:evt, and -js:json to allow easy parsing for expressions, event handlers, and JSON code...Json.NET: Json.NET 4.5 Release 10: New feature - Added Portable build to NuGet package New feature - Added GetValue and TryGetValue with StringComparison to JObject Change - Improved duplicate object reference id error message Fix - Fixed error when comparing empty JObjects Fix - Fixed SecAnnotate warnings Fix - Fixed error when comparing DateTime JValue with a DateTimeOffset JValue Fix - Fixed serializer sometimes not using DateParseHandling setting Fix - Fixed error in JsonWriter.WriteToken when writing a DateT...Readable Passphrase Generator: KeePass Plugin 0.7.2: Changes: Tested against KeePass 2.20.1 Tested under Ubuntu 12.10 (and KeePass 2.20) Added GenerateAsUtf8 method returning the encrypted passphrase as a UTF8 byte array.JSLint for Visual Studio 2010: 1.4.2: 1.4.2GPdotNET - artificial intelligence tool: GPdotNET v2 BETA3: This is hopefully the last beta before final version.RiP-Ripper & PG-Ripper: PG-Ripper 1.4.02: changes NEW: Added Support Big Naturals Only forum NEW: Added Setting to enable/disable "Show last download image"patterns & practices: Prism: Prism for .NET 4.5: This is a release does not include any functionality changes over Prism 4.1 Desktop. These assemblies target .NET 4.5. These assemblies also were compiled against updated dependencies: Unity 3.0 and Common Service Locator (Portable Class Library).WinRT XAML Toolkit: WinRT XAML Toolkit - 1.3.2: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. 
You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Snoop, the WPF Spy Utility: Snoop 2.8.0: Snoop 2.8.0Announcing Snoop 2.8.0! It's been exactly six months since the last release, and this one has a bunch of goodies in it. In particular, there is now a PowerShell scripting tab, compliments of Bailey Ling. With this tab, the possibilities are limitless. It basically lets you automate/script the application that you are Snooping. Bailey has a couple blog posts (one and two) on his tab already, and I am sure more is to come. Please note that if you do not have PowerShell installed, y....NET Micro Framework: .NET MF 4.3 (Beta): This is the 4.3 Beta version of the .NET Micro Framework. Feature List for v4.3 Support for Visual Studio 2012 (including the Windows Desktop Express version) All v4.2 QFEs features and bug fixes (PWM enhancements, lwIP and network driver reliability improvements, Analog Output, WinUSB and latest GCC support) Improved diagnostic information for deployment Decreased boot time Bug fixes Work Item 1736 - Create link for MFDeploy under start menu Work Item 1504 - Customizing lwIP o...MCEBuddy 2.x: MCEBuddy 2.3.1: 2.3.1All new Remote Client Server architecture. Reccomended Download. The Remote Client Installation is OPTIONAL, you can extract the files from the zip archive into a local folder and run MCEBuddy.GUI directly. 2.2.15 was the last standalone release. Changelog for 2.3.1 (32bit and 64bit) 1. All remote MCEBuddy Client Server architecture (GUI runs remotely/independently from engine now) 2. Fixed bug in Audio Offset 3. Added support for remote MediaInfo (right click on file in queue to get ...D3 Loot Tracker: 1.5: Support for session upload to website. Support for theme change through general settings. Time played counter will now also display a count for days. Tome of secrets are no longer logged as items.Team Foundation Server Word Add-in: Version 1.0.12.0622: Welcome to the Visual Studio Team Foundation Server Word Add-in Supported Environments Microsoft Office Word 2007 and 2010 X86 (32-bit) Team Foundation Server 2010 Object Model TFS 2010, 2012 and TFS Service supported, using TFS OM / Explorer 2010. 
Quality-Bar Details Tool has been reviewed by Visual Studio ALM Rangers Tool has been through an independent technical and quality review All critical bugs have been resolved Known Issues / Bugs WI#43553 - The Acceptance criteria is not pu...DirectX Tool Kit: October 2012: October 2, 2012 Added ScreenGrab module Added CreateGeoSphere for drawing a geodesic sphere Put DDSTextureLoader and WICTextureLoader into the DirectX C++ namespace Renamed project files for better naming consistency Updated WICTextureLoader for Windows 8 96bpp floating-point formats Win32 desktop projects updated to use Windows Vista (0x0600) rather than Windows 7 (0x0601) APIs Tweaked SpriteBatch.cpp to workaround ARM NEON compiler codegen bugCRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1002.3): Visual Ribbon Editor 1.3.1002.3 What's New: Multi-language support for Labels/Tooltips for custom buttons and groups Support for base language other than English (1033) Connect dialog will not require organization name for ADFS / IFD connections Automatic creation of missing labels for all provisioned languages Minor connection issues fixed Notes: Before saving the ribbon to CRM server, editor will check Ribbon XML for any missing <Title> elements inside existing <LocLabel> elements...New Projectsamplifi: This project is still under construction. We will add more information here as soon as it is available.autoclubpigue: Proyecto para el auto club de piguecoiffeurprj: coiffeurprjerutufym: erutufym is gnol eht gge.Express AOP: A .NET AOP ToolsFile Slurpee: The purpouse of this application is to help security professionals during their penetration tests. Given a configuration this application will scour the target Guardian - Google Authenticator: Guardian is a windows client application for Google Authenticator - a software based two-factor authentication token developed by Google.ManagedCuda Galaxy Simulator: This project is a test of ManagedCuda and graphics interop to OpenTK to simulate a simple galaxy on the GPU.mostafanote: ????? ???? ??????? ?? ???? ?? ?? ?? ? ?? ??????? ?? ??????? ??? ?? ??? ?? ?? ???????? ??????? ??? ??? ?? ???? ?????? NXT Controller: A simple controller for the LEGO Mindstorms NXTPowerShell Security: PoshSec is short for PowerShell security, a module provided to allow PowerShell users testing, analysis and reporting on Window securityRevenant Raiders - Sunstrider: Tool to provide information about Revenant Raiders - Sunstrider.Run: animal runSharedDeploy: Simple ASP.MVC app , that use 2 types of view : mobile and desktop. In core WebApi, jQuery, jQuery.mobileSkyCoffee: Cafeteria ApplicationSOSA Analysis Module: Analysis program for data from SOSA psychological experiment software.SQL Connector: SQL Connector is a .DLL file that make it easy to comunicate with SQL Server Works in : Windows Desktop Applications ASP.Net Web (MVC - WEB Services) Windows cSQL Job Scripter: SQL Job Scripter is a command line utility that produces scripts of SQL Agent jobs. It will script either to a single file or to one file per job. 
SQL Server Watcher: A tool for collecting and monitoring information about sql server.Team Build Inspector: An add-in for Visual Studio Team Explorer that monitors the status of build definitions by its latest build, latest good build & underlying source code status.TED Talk Download Manager: "TED Talk Download Manager" provides a central point in which to download multiple TED talks, while keeping track of any talk previously downloaded.Temp project mycollection: My collection temporary projectUnreal Unit Converter: A Simple UDK <-> 3DS Max <-> Meters Unit Converter - with some other stuff.Watermelon Site: This Site is The site of Watermelon Inc. In This Site: Downloads News about Watermelon And more... WiX Extended Bootstrapper Application: An extended WiX bootstrapper Application based on the WiX standard bootstrapper application.Wrox NHibernate with ASP.NET Problem-Design-Solution -C# version: sample code of Wrox NHibernate with ASP.NET Problem-Design-Solution (http://nhibernateasp.codeplex.com/) in C# which is done as a practice

    Read the article

  • Projected Results

    - by Sylvie MacKenzie, PMP
    Excerpt from PROFIT - ORACLE - by Monica Mehta Yasser Mahmud has seen a revolution in project management over the past decade. During that time, the former Primavera product strategist (who joined Oracle when his company was acquired in 2008) has not only observed a transformation in the way IT systems support corporate projects but the role project portfolio management (PPM) plays in the enterprise. “15 years ago project management was the domain of project management office (PMO),” Mahmud recalls of earlier days. “But over the course of the past decade, we've seen it transform into a mission critical enterprise discipline, that has made Primavera indispensable in the board room. Now, as a senior manager, a board member, or a C-level executive you have direct and complete visibility into what’s kind of going on in the organization—at a level of detail that you're going to consume that information.” Now serving as Oracle’s vice president of product strategy and industry marketing, Mahmud shares his thoughts on how Oracle’s Primavera solutions have evolved and how best-in-class project portfolio management systems can help businesses stay competitive. Profit: What do you feel are the market dynamics that are changing project management today? Mahmud: First, the data explosion. We're generating data at twice the rate at which we can actually store it. The same concept applies for project-intensive organizations. A lot of data is gathered, but what are we really doing with it? Are we turning data into insight? Are we using that insight and turning it into foresight with analytics tools? This is a key driver that will separate the very good companies—the very competitive companies—from those that are not as competitive. Another trend is centered on the explosion of mobile computing. By the year 2013, an estimated 35 percent of the world’s workforce is going to be mobile. That’s one billion people. So the question is not if you're going to go mobile, it’s how fast you are going to go mobile. What kind of impact does that have on how the workforce participates in projects? What worked ten to fifteen years ago is not going to work today. It requires a real rethink around the interfaces and how data is actually presented. Profit: What is the role of project management in this new landscape? Mahmud: We recently conducted a PPM study with the Economist Intelligence Unit centered to determine how important project management is considered within organizations. Our target was primarily CFOs, CIOs, and senior managers and we discovered that while 95 percent of participants believed it critical to their business, only six percent were confident that projects were delivered on time and on budget. That’s a huge gap. Most organizations are looking for efficiency, especially in these volatile financial times. But senior management can’t keep track of every project in a large organization. As a result, executives are attempting to inventory the work being conducted under their watch. What is often needed is a very high-level assessment conducted at the board level to say, “Here are the 50 initiatives that we have underway. How do they line up with our strategic drivers?” This line of questioning can provide early warning that work and strategy are out of alignment; finding the gap between what the business needs to do and the actual performance scorecard. That’s low-hanging fruit for any executive looking to increase efficiency and save money. 
    But it can only be obtained through proper assessment of existing projects—and you need a project system of record to get that done. Over the next decade or so, project management is going to transform into holistic work management. Business leaders will want to make sure key projects align with corporate strategy, but they will also want the ability to drill down into daily activity and smaller projects to make sure they line up as well. Keeping employees from working on tasks—even for a few hours—that don’t line up with corporate goals will, in many ways, become a competitive differentiator. Profit: How do all of these market challenges and shifting trends impact Oracle’s Primavera solutions and meeting customers’ needs? Mahmud: For Primavera, it’s a transformation from being a project management application to a PPM system in the enterprise. It also means making that system a mission-critical application by connecting it to other key applications within the ecosystem, such as the enterprise resource planning (ERP), supply chain, and CRM systems. Analytics have also become a huge component. Business analytics have made Oracle’s Primavera applications pertinent in the boardroom. Now, as a senior manager, a board member, a CXO, CIO, or CEO, you have direct visibility into what’s going on in the organization at a level at which you're able to consume that information. In addition, all of this information pairs up really well with your financials and other data. Certainly, when you're an Oracle shop, you have that visibility that you didn’t have before from a project execution perspective. Profit: What new strategies and tools are being implemented to create a more efficient workplace for users? Mahmud: We believe very strongly that just because you call something an enterprise project portfolio management system doesn’t make it so—you have to get people to want to participate in the system. This can’t be mandated down from the top. It simply doesn’t work that way. A truly adoptable solution is one that makes it super easy for all types of users to participate, by providing them interfaces where they live. Keeping that in mind, a major area of development has been alternative user interfaces. This is increasingly resulting in the creation of lighter-weight, targeted interfaces such as iOS applications and smartphone interfaces for the iPhone and Android platforms. Profit: How does this translate into the development of Oracle’s Primavera solutions? Mahmud: Let me give you a few examples. We recently announced the launch of our Primavera P6 Team Member application, which is a native iOS application for the iPhone. This interface makes it easier for team members to do their jobs quickly and effectively. Similarly, we introduced the Primavera analytics application, which can be consumed via mobile devices, and when married with Oracle Spatial capabilities, users can get a geographical view of what’s going on and which projects are occurring in various locations around the world. Lastly, we introduced advanced email integration that allows project team members to status work via e-mail. This functionality leverages the fact that users are in their e-mail system throughout the day and allows them to status their work without the need to launch the Primavera application. It comes back to a mantra: provide as many alternative user interfaces as possible, so you can give people the ability to work, to participate, to raise issues, to create projects, in the places where they live. 
Do it in such a way that it’s non-intrusive, do it in such a way that it’s easy and intuitive and they can get it done in a short amount of time. If you do that, workers can get back to doing what they're actually getting paid for.

    Read the article

  • Developer’s Life – Every Developer is a Batman

    - by Pinal Dave
    Batman is one of the darkest superheroes in the fantasy canon.  He does not come to his powers through any sort of magical coincidence or radioactive insect, but through a lot of psychological scarring caused by witnessing the death of his parents.  Despite his dark back story, he possesses a lot of admirable abilities that I feel bear comparison to developers. Batman has the distinct advantage that his alter ego, Bruce Wayne, is a millionaire (or billionaire in today’s reboots).  This means that he can spend his time working on his athletic abilities, building a secret lair, and investing his money in cool tools.  This might not be true for developers (well, most developers), but I still think there are many parallels. So how are developers like Batman? Well, read on for my list of reasons. Develop Skills Batman works on his skills.  He didn’t get the strength to scale Gotham’s skyscrapers by inheriting his powers or suffering an industrial accident.  Developers also hone their skills daily.  They might not be doing pull-ups and scaling buildings, but I think their skills are just as impressive. Clear Goals Batman is driven to build a better Gotham.  He knows that the criminal who killed his parents was a small-time thief, not a super villain – so he has larger goals in mind than simply chasing one villain.  He wants his city as a whole to be better.  Developers are also driven to make things better.  It can be easy to get hung up on one problem, but in the end it is best to focus on the well-being of the system as a whole. Ultimate Teamplayers Batman is the hero Gotham needs – even when that means appearing to be the bad guy.  Developers probably know that feeling well.  Batman takes the fall for a crime he didn’t commit, and developers often have to deliver bad news about the limitations of their networks and servers.  It’s not always a job filled with glory and thanks, but someone has to do it. Always Ready Batman and the Boy Scouts have this in common – they are always prepared.  Let’s add developers to this list.  Batman has an amazing tool belt with gadgets and gizmos, and let’s not even get into all the functions of the Batmobile!  Developers’ skills might be the knowledge and skills they have developed, not tools they can carry in a utility belt, but that doesn’t make them any less impressive. 100% Dedication Bruce Wayne cultivates the personality of a playboy, never keeping the same girlfriend for long and spending his time partying.  Even though he hides it, his driving force is his deep concern and love for his friends and the city as a whole.  Developers also care a lot about their company and employees – even when it is driving them crazy.  You do your best work when you care about your job on a personal level. Quality Output Batman believes the city deserves to be saved.  The citizens might have a love-hate relationship with both Batman and Bruce Wayne, and employees might not always appreciate developers.  Batman and developers, though, keep working for the best of everyone. I hope you are all enjoying reading about developers-as-superheroes as much as I am enjoying writing about them.  Please tell me how else developers are like superheroes in the comments – especially if you know any developers who are faster than a speeding bullet and can leap tall buildings in a single bound. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: Developer, Superhero

    Read the article

  • Required Skill Sets Of A Software Architect

    The question has been asked as to what the required skill sets of a software architect are. The answer to this is that it truly depends. When I state that it depends, I mean it depends on the organization, industry, and skill sets available on the open market and internally within a company. With open-ended skill sets, even Napoleon Dynamite could be an architect. Napoleon Dynamite’s Skills Pedro: Have you asked anybody yet? Napoleon Dynamite: No, but who would? I don't even have any good skills. Pedro: What do you mean? Napoleon Dynamite: You know, like nunchuck skills, bow hunting skills, computer hacking skills... Girls only want boyfriends who have great skills. Pedro: Aren't you pretty good at drawing, like animals and warriors and stuff? This example might be a little off base but it does illustrate a point. What are the real required skills of a software architect? In my opinion, an architect needs to demonstrate knowledge of the following three main skill set categories in order to be successful. General Skill Sets of an Architect Basic Engineering Skills Organizational Skills Interpersonal Skills Basic Engineering Skills are a very large part of what a software architect deals with on a daily basis when designing or updating systems. Think about it: how good would a lead mechanic be if they did not know how to fix or repair cars? They would not be, and that is my point: architects need to have at least some basic skills regarding engineering. The skills listed below are generic in nature because they change from job to job, so in this discussion I am trying to focus more on generalities so that anyone can apply this information to their individual situation. Common Basic Engineering Skills Data Modeling Code Creation Configuration Testing Deployment/Publishing System and Environment Knowledge Organizational Skills If an architect works for or with an organization, then they will need strong organizational skills to survive. An architect is no use to a project if the project is mismanaged. Additionally, budgets and timelines can really affect a company and their products when established deadlines are repeatedly not met. By not meeting these timelines a company is forced to cancel the project and waste all the money and time spent, or spend more money until it is completed, if it is ever completed. Common Organizational Skills Project Management Estimation (Cost and Time) Creation and Maintenance of Accepted Standards Interpersonal Skills For me personally, interpersonal skills rank above the other types of skill sets because an architect can quickly pick up the other two skill sets by communicating with other team/project members so that they are quickly up to speed on a project. Additionally, in order for an architect to manage a project or even derive rough estimates, they will more than likely have to consult with others actually working on the code (Programmers/Software Engineers) to get their estimates, since they will be the ones actually working on the changes to be implemented. Common Interpersonal Skills Good Communicator Focuses on project success over personal success Honors roles within a team Reference: Taylor, R. N., Medvidovic, N., & Dashofy, E. M. (2009). Software architecture: Foundations, theory, and practice. Hoboken, NJ: John Wiley & Sons

    Read the article

  • How To Rip an Audio CD to FLAC with Foobar2000

    - by Mysticgeek
    Foobar2000 is a great audio player that is fully customizable, is light on system resources, and contains a lot of tools and features. Today we show you how to use it to rip an audio CD to FLAC format. Note: For this tutorial we’re going to assume this is the first time you’re ripping a disc with Foobar2000. We’re running it on Windows 7 Ultimate 64-bit. Install Foobar2000 and FLAC First download and install Foobar2000 (link below). The main thing you’ll want to make sure to enable during the install process is Audio CD Support… and the freedb Tagger, which are located under Optional Features; then continue through the rest of the install wizard. Next you need to install the latest version of the FLAC codec (link below) following the defaults. Rip Audio CD To rip a CD, place it in your CDROM drive, launch Foobar2000 and click File \ Open Audio CD. Select the appropriate CD drive and click the Rip button. Next you’ll want to look up the disc information with freedb…or you can manually enter in the track data if it’s a custom disc. Select the proper tag information in the freedb tagger window, then click Update files. The data will be entered in; make sure the radio button next to Go to the Converter Setup dialog is selected, and click the Rip button. In the Converter Setup screen you can select the output format; in our case we’re selecting FLAC. In this window you can choose several other options like the output path, merging the tracks into one or individual files…etc. When you have those settings completed click OK. Next you’ll need to find flac.exe, which is located wherever you installed it. On our 64-bit Windows 7 system the default path is C:\Program Files (x86)\FLAC. Now wait while your CD is ripped and converted to FLAC. You’ll get a Converter Status Report…after you’ve checked it over you can close out of it. If you set the option to show the output files after conversion you can take a look, make sure all tracks were converted, and play them right away if you want. You can play the tracks in Foobar2000 or any player that supports FLAC. If you want to use WMC or WMP, see our article on how to play FLAC files in Windows 7 Media Center or Player. That’s all there is to it! If you’re a fan of Foobar2000 and enjoy your music converted to FLAC format, Foobar2000 does the job quite well. There are a lot of customizations and tools you can use in Foobar2000 that we’ll be taking a look at in future articles. For more information check out our look at this fully customizable music player. Foobar2000 runs on XP, Vista, and Windows 7. Links: Download Foobar2000, Download FLAC

    Read the article

  • Is HR/Recruitment Really Ready For Innovative Candidates

    - by david.talamelli
    Before I begin this blog post, I want to acknowledge that there are some great HR/Recruitment people out there who are innovative and are leading the way in using new means to successfully attract and connect with talented people. For those of you who fit in this category, please keep thinking outside the square - just because what you do may not be the norm doesn't mean it is bad. Ok, with that acknowledgment out of the way - Earlier this morning (I started this post Friday morning) I came across this online profile via a tweet from Philip Tusing. I love the information that Jason has put on his web-pages. From his work Jason clearly demonstrates his skills/experience, and I love how he relates his experience and shows how it will help an employer and what the value add of having him on your team is. Looking at Jason's profile makes me think, though: is HR/Recruitment in general terms ready to deal with innovative candidates? Sure most Recruiters are online in some form or another, but how many actually have a process that is flexible enough to deal with someone who may not fit into your processes? Is your company's recruitment practice proactive enough to find Jason's web-pages? I am not sure what he is doing in terms of a job search, but if he is not mailing a resume or replying to ads on a Job Board - hopefully Jason comes up on some of the candidate searching you are doing. Once you find this information, would the information Jason provides fit nicely into your Applicant Tracking System or your Database? If not, how much of the intangible information are you losing and potentially not passing on to a Hiring Manager? I think what has worked in the past will not necessarily work in the future. Candidates want to work somewhere they will be challenged and learn and grow. If your HR/Recruitment team displays processes that don't necessarily convey this message, this potentially could turn people away who were once interested in your company. For example (and I have to admit I still do some of these things myself), after calling up and having a talk with a candidate, a company may say: 1) HR Question: Send me in a copy of your resume - Candidate Reply - you actually already have my resume, the web-page is http:// 2) HR Question: Come in for a chat so we can get to know you - Candidate Reply - if this is the basis of a meeting, you already know me and my thoughts by looking at my online links (blog, portfolio, homepage, etc...) These questions if not handled properly could potentially turn a candidate from being interested in your company to not being interested in your company. It potentially could demonstrate that your company is not social media savvy or maybe give the impression of not really being all that innovative. A candidate may think, if this company isn't able to take information I have provided in the public forum and use it, is it really a company I want to work for? I think when liaising with candidates a company should utilise the information the person has provided in the public domain. A candidate may inadvertently give you answers to many of the questions you are seeking on their online presence and save everyone time instead of having to fill out forms or paperwork. If you build this into your conversations with your candidates it becomes a much more individualised service you are providing and really demonstrates to a candidate you are thinking of them as an individual. 
    Yes, I know we need to have processes in place and I am not saying don't work to those processes, but don't let process take away a candidate's individuality. Don't let your process inadvertently scare away the top candidates that you may want in your company. This article was originally posted on David Talamelli's Blog - David's Journal on Tap

    Read the article

  • 11 Types of Developers

    - by Lee Brandt
    Jack Dawson Jack Dawson is the homeless drifter in Titanic. At one point in the movie he says, “I figure life’s a gift, and I don’t intend on wasting it.” He is happy to wander wherever life takes him. He works his way from place to place, making just enough money to make it to his next adventure. The “Jack Dawson” developer clings on to any new technology as the ‘next big thing’, and will find ways to shoe-horn it into places where it is not a fit. He is very appealing to the other developers because they want to try the newest techniques and tools too. He will only stay until the new technology either bores him or becomes problematic. Jack will also be hard to find once the technology has been implemented, because he will be on to the next shiny thing. However, having a Jack Dawson on your team can be beneficial. Jack can be a great ally when attempting to convince a stodgy, corporate entity to upgrade. Jack usually has an encyclopedic recall of all the new features of the technology upgrade and is more than happy to interject them into any conversation. Tom Smykowski Tom is the neurotic employee in Office Space, and is deathly afraid of being fired. He will do only what is necessary to keep the status quo. He believes as long as nothing changes, his job is safe. He will scoff at anything new and be the naysayer during any change initiative. Tom can be useful in off-setting Jack Dawson. Jack will constantly be pushing for change and Tom will constantly be fighting it. When you see that Jack is getting kind of bored with a new technology and Tom has finally stopped wetting himself at the mere mention of it, then it is probably the sweet spot for beginning to implement that new technology (providing it is the right tool for the job). Ray Consella Ray is the guy who built the Field of Dreams. He took a risk. Sometimes he screwed it up, but he knew he didn’t want to end up regretting not attempting it. He constantly doubted himself, but he knew he had to keep going. Granted, he was doing what the voices in his head were telling him to do, but my point is he was driven to do something that most people considered crazy. Even when his friends, his wife and even he himself thought he was crazy, somewhere inside, he knew it was the right thing to do. These are the innovators. These are the Bill Gates and Steve Jobs of the world. They take risks, they fail, they learn and they get better. Obviously, this kind of person thrives in start-ups and smaller companies, but that is due to their natural aversion to bureaucracy. They want to see their ideas put into motion quickly, and withdrawn quickly if they don’t work. Short feedback cycles are essential to Ray. He wants to know if his idea is working or not. He wants to modify or reverse his idea if it is not working or makes things worse. These are the agilistas. May I always be one.

    Read the article

  • From the Tips Box: Pre-installation Prep Work Makes Service Pack Upgrades Smoother

    - by Jason Fitzpatrick
    Last month Microsoft rolled out Windows 7 Service Pack 1 and, as with many SP releases, quite a few people are hanging back to see what happens. If you want to update but still err on the side of caution, reader Ron Troy offers a step-by-step guide. Ron’s cautious approach does an excellent job minimizing the number of issues that could crop up in a Service Pack upgrade by doing a thorough job updating your driver sets and clearing out old junk before you roll out the update. Read on to see how he does it: Just wanted to pass on a suggestion for people worried about installing Service Packs.  I came up with a ‘method’ a couple years back that seems to work well. Run Windows / Microsoft Update to get all updates EXCEPT the Service Pack. Use Secunia PSI to find any other updates you need. Use CCleaner or the Windows disk cleanup tools to get rid of all the old garbage out there.  Make sure that you include old system updates. Obviously, back up anything you really care about.  An image backup can be really nice to have if things go wrong. Download the correct SP version from Microsoft.com; do not use Windows / Microsoft Update to get it.  Make sure you have the 64-bit version if that’s what you have installed on your PC. Make sure that EVERYTHING that affects the OS is up to date.  That includes all sorts of drivers, starting with video and audio.  And if you have an Intel chipset, use the Intel Driver Utility to update those drivers.  It’s very quick and easy.  For the video and audio drivers, some can be updated by Intel, some by utilities on the vendor web sites, and some you just have to figure out yourself.  But don’t be lazy here; old drivers and Windows Service Packs are a poor mix. If you have 3rd party software, check to see if they have any updates for you.  They might not say that they are for the Service Pack but you cut your risk of things not working if you do this. Shut off the Antivirus software (especially if 3rd party). Reboot, hitting F8 to get the SafeMode menu.  Choose SafeMode with Networking. Log into the Administrator account to ensure that you have the right to install the SP. Run the SP.  It won’t be very fancy this way.  Maybe 45 minutes later it will reboot and then finish configuring itself, finally letting you log in. Total installation time on most of my PC’s was about 1 hour but that followed hours of preparation on each. On a separate note, I recently got on the Nvidia web site and their utility told me I had a new driver available for my GeForce 8600M GS.  This laptop had come with Vista, now has Win 7 SP1.  I had a big surprise from this driver update; the Windows Experience Score on the graphics side went way up.  Kudos to Nvidia for doing a driver update that actually helps day to day usage.  And unlike ATI’s updates (which I need for my AGP based system), this update was fairly quick and very easy.  Also, Nvidia drivers have never, as I can recall, given me BSOD’s, many of which I’ve gotten from ATI (TDR errors).

    Read the article

  • 2010 April Fools Joke

    - by Dane Morgridge
    I started at my current job at the end of March last year and there were some pretty funny April fools jokes.  Nothing super crazy, but pretty funny.  One guy came in and there was a tree in his cube.  We (me and the rest of my team) were planning for a couple of weeks on what we could do that would be just awesome.  We had a lot of really good ideas but nothing was spectacular.  Then Steve Andrews had a brilliant idea (yes it's true).  Since we have internal DNS servers we could redirect DNS to our internal servers for a site such as cnn.com.  Then we would lift the code from the site and create our own home page that would contain news about people in the company.  Steve was actually laughing so hard when he thought of the idea that it took him almost 30 minutes to spit it out. I thought, "this is perfect". I had enlisted a couple of people to help come up with the stories and at the same time we were trying to figure out how to get everybody to the site the morning of the 1st.  Then it hit me.  We could have the main article be one of my getting picked up by the FBI on hacking charges.  Then Chris (my boss) could send an email out telling everyone that I would not be there today and direct them to the site.  That would for sure get everyone to go to cnn.com first thing and see our prank.  I begun the process of looking for photos I could crop myself into and found the perfect one.  Then my wife took a good pic with our Canon 40D and I went to work.  The night before I didn't have any other stories due to everyone being really busy at work, but I decided to go ahead with just the FBI bust on it's own.  I got everything working and tested and coordinated with Chris for me to come in late so no one would see me at the office until after everyone had seen the joke. And so the morning of April fools came and I was waiting at home and the email was perfect.  Chris told everyone that I wouldn't be in and that not to answer any questions if you got any calls from anybody.  The Photoshop job I did was not perfect, but good enough and I even wrote an article with it that went into more detail about how I had been classified as a terrorist and all kinds of stuff. People at work started getting the emails and a few people didn't realize it was a joke (as I had hoped), including some from senior management (one person in particular who shall remain nameless in this post).  Emails started flying around about how to contain the situation and how to handle bad PR.  He basically bought it hook, line and sinker and then went in to crisis mode.  It was awesome! He did finally realize it was a joke and I will likely print and frame the email he sent out.  In short, April fools this year was a huge success.

    Read the article

  • SQL SERVER – Sends backups to a Network Folder, FTP Server, Dropbox, Google Drive or Amazon S3

    - by pinaldave
    Let me tell you about one of the most useful SQL tools that every DBA should use – it is SQLBackupAndFTP. I have been using this tool since 2009 – and it is the first program I install on a SQL server. Download the free version, spend a minute on configuration, and your daily backups are safe in the cloud. In summary, SQLBackupAndFTP Creates SQL Server database and file backups on schedule Compresses and encrypts the backups Sends backups to a network folder, FTP Server, Dropbox, Google Drive or Amazon S3 Sends email notifications of job’s success or failure SQLBackupAndFTP comes in Free and Paid versions (starting from $29) – see version comparison. The free version is fully functional for unlimited ad hoc backups or for scheduled backups of up to two databases – it will be sufficient for many small customers. What has impressed me from the beginning is that I understood how it works and was able to configure the job from a single form (see Image 1 – Main form above) Connect to your SQL server and select databases to be backed up Click “Add backup destination” to configure where backups should go to (network, FTP Server, Dropbox, Google Drive or Amazon S3) Enter your email to receive email confirmations Set the time to start daily full backups (or go to Settings if you need Differential or Transaction Log backups on a flexible schedule) Press “Run Now” button to test You can get to this form if you click the “Settings” button in the “Schedule” section. Select what types of backups and how often you want to run them and you will see the scheduled backups in the “Estimated backup plan” list. A detailed tutorial is available on the developer’s website. Along with SQLBackupAndFTP, the setup gives you the option to install “One-Click SQL Restore” (you can install it stand-alone too) – a basic tool for restoring just Full backups. However basic, you can drag and drop onto it the zip file created by SQLBackupAndFTP, it unzips the BAK file if necessary, connects to the SQL server on startup, selects the right database, and it is smart enough to restart the server to drop open connections if necessary – very handy for developers who need to restore databases often. You may ask why this tool is better than the maintenance tasks available in SQL Server. While maintenance tasks are easy to set up, SQLBackupAndFTP is still way easier and integrates solutions for compression, encryption, FTP, cloud storage and email, which makes it superior to maintenance tasks in every aspect. On the flip side, SQLBackupAndFTP is not the fanciest tool to manage backups or check their health. It only works reliably on local SQL Server instances. In other words, it has to be installed on the SQL server itself. For remote servers it uses scripting, which is less reliable. This limitation is actually inherent in SQL Server itself, as the BACKUP DATABASE command creates the backup not on the client, but on the server itself. This tool is compatible with almost all the known SQL Server versions. It works with SQL Server 2008 (all versions) and many of the previous versions. It is especially useful for SQL Server Express 2005 and SQL Server Express 2008, as they lack built-in tools for backup. I strongly recommend this tool to all the DBAs. They must absolutely try it as it is free and does exactly what it promises. You can download your free copy of the tool from here. Please share your experience about using this tool. I am eager to receive your feedback regarding this article. 
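    For readers who want to see what the tool (or a maintenance plan) is ultimately asking the server to run, here is a minimal T-SQL sketch of a full backup; the database name and target path are illustrative assumptions, so adjust them for your own instance. Note that the target path must be visible to the SQL Server service itself, which is exactly the limitation noted above for remote servers.

        -- Minimal full backup sketch (assumed database name and path).
        -- The file is written by the SQL Server service, on the server itself.
        BACKUP DATABASE AdventureWorks
        TO DISK = N'D:\Backups\AdventureWorks_Full.bak'
        WITH INIT,        -- overwrite any existing backup sets in this file
             CHECKSUM,    -- validate page checksums while writing
             STATS = 10;  -- report progress every 10 percent

    SQLBackupAndFTP's value-add is everything that happens after a statement like this runs: compressing, encrypting, shipping the .bak file to the network/FTP/cloud destination and sending the confirmation email.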
Reference: Pinal Dave (http://blog.SQLAuthority.com)   Filed under: PostADay, SQL, SQL Authority, SQL Backup and Restore, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, SQLServer, T SQL, Technology

    Read the article

  • Top Tier, A-Game Talent - How to Land em'

    - by GeekAgilistMercenary
    Recently the question came up from a close friend of mine, "will my PhD help me attain a higher income in the north west?"  I had to tell him that it might get him a little more, but it won't get him into the top income brackets for the occupation.  Another time, a few days later, someone else asked this too.  Then again, I see a job posting that requires a Bachelors Degree and some other nonsense.  The job posting even states they want "A-Game" talent. I am almost shocked that part of this industry doesn't realize how unimportant a degree is to getting real top tier, A-game talent (and yes, I get a little riled up about this matter). You Can't Make Good Software Developers.  No college out there is going to train someone to be in the top 10%, and absolutely not to be in the top 5% of skill levels.  Colleges can NOT do this.  It is up to the individual, and the individual alone.  If top tier talent seems to come from a college, one should check their premise and look at the motivations the individuals have to go to that school.  There is most likely a reason that top tier talent appears to be made there.  The college, however, can only guide or assist, but I repeat that "top tier talent is a very individualistic endeavor". Some might say, well a group is needed, support is needed, this and that are needed.  True, an individual needs a support system and a college can provide that, but it generally ends there.  The support group helps, provides a sounding board, and provides correlation to good ideas for the A-game top tier geek.  But again, the endeavor is the individual's desire. top tier talent is a very individualistic endeavor - Me Hiring Top Tier, A-Game Talent There are a few things to keep in mind when trying to hire this level of player. The first thing is to not require a degree of any sort.  Sure, it looks good, but it won't dictate anything other than the individual was able to go through the regimented steps of college. List the skills and ideas that you would like to find in an individual.  Think of two people meeting for the first time: what do you want to know about the other individual?  Team fit is absolutely fundamental for top tier talent.  As for that support group I mentioned above: top tier talent works best with a solid group of players. Keep your technology up to date, moving forward, and don't bore your top talent if you manage to get it.  If the company slows down, they will leave.  The more valuable they find out they are, the lower tolerance they'll have for this.  For managers, directors, and leaders in an organization, this is THE challenge. Provide opportunities not just for advancement, but also ways for them to advance their knowledge, such as training, a book budget, or other means.  Even if some software they want to use isn't used on the project, get it for them (within reason of course; a couple hundred dollars or even a few thousand for a good software license to MSDN, Telerik, or another suite of software is ideal). Don't push them too hard, and don't let them overwork themselves into burnout.  This, as a leader in an organization, is easy to do if one finds oneself actually hiring top talent.  Because top talent just provides results and more results.  But they are human; they will break. Don't be the cause of that or you'll lose your talent. For now, that is it from me on this topic; back to the revenue, code, projects, and pushing forward. For the original entry, check out my personal blog with other juicy tech tidbits, rants, raves, and the like. Agilist Mercenary

    Read the article

  • Monitoring the Application alongside SQL Server

    - by Tony Davis
    Sometimes, on Simple-Talk, it takes a while to spot strange and unexpected patterns of user activity, or small bugs. For example, one morning we spotted that an article’s comment count had leapt to 1485, but that only four were displayed. With some rooting around in Google Analytics, and the endlessly annoying Community Server admin-interface, we were able to work out that a few days previously the article had been subject to a spam attack and that the comment count was for some reason including both accepted and unaccepted comments (which in turn uncovered a bug in the SQL). This sort of incident made us a lot keener on monitoring Simple-talk website usage more effectively. However, the metrics we wanted are troublesome, because they are far too specific for Google Analytics to measure, and the SQL Server backend doesn’t keep sufficient information to enable us to plot trends. The latter could provide, for example, the total number of comments made on, or votes cast for, articles, over all time, but not the number that occur by hour over a set time. We lacked a baseline, in other words. We couldn’t alter the database, as it is a bought-in package. We had neither the resources nor inclination to build-in dedicated application monitoring. Possibly, we could investigate a third-party tool to do the job; but then it occurred to us that we were already using a monitoring tool (SQL Monitor) to keep an eye on the database. It stored data, made graphs and sent alerts. Could we get it to monitor some aspects of the application as well? Of course, SQL Monitor’s single purpose is to check and monitor SQL Server, over time, rather than to monitor applications that use SQL Server. However, how different is the business of gathering and plotting SQL Server Wait Stats, from gathering and plotting various aspects of user activity on the site? Not a lot, it turns out. The latest version allows us to write our own custom monitoring scripts, meaning that we could now monitor any metric in the application that returns an integer. It took little time to write a simple SQL Query that collects basic metrics of the total number of subscribers, votes cast, comments made, or views of articles, over time. The SQL Monitor database polls Simple-Talk every second or so in order to get the latest totals, and can then store and plot this information, or even correlate SQL Server usage to application usage. You can see the live data by visiting monitor.red-gate.com. Click the "Analysis" tab, and select one of the "Simple-talk:" entries in the "Show" box and an appropriate data range (e.g. last 30 days). It’s nascent, and we’re still working on it, but it’s already given us more confidence that we’ll spot quickly trends, bugs, or bursts of ‘abnormal’ activity. If there is a sudden rise in comments, we get an alert, and if it’s due to a spam attack, we can moderate or ban the perpetrator very quickly. We’ve often argued that a tool should perform a single job well rather than turn into a Swiss-army knife, but ironically we’ve rather appreciated being able to make best use of what’s there anyway for a slightly different purpose. Is this a good or common practice? What do you think? Cheers, Tony.
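    To make the idea concrete, a custom metric script for SQL Monitor just needs to return a single integer, so it can be as simple as a COUNT over the application's tables. The sketch below is hypothetical; the table and column names stand in for the Community Server schema, which the post does not show, but it illustrates the shape of the query being polled:

        -- Hypothetical metric: approved comments posted in the last hour.
        -- Table and column names are placeholders, not the real Community Server schema.
        SELECT COUNT(*)
        FROM dbo.ArticleComments
        WHERE IsApproved = 1
          AND CreatedDate >= DATEADD(HOUR, -1, GETUTCDATE());

    Plotted over time, a baseline like this is what makes a spam attack show up as an obvious spike rather than a mystery comment count.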

    Read the article

  • How can I back up my ubuntu system?

    - by Eloff
    I'm sure there's a lot of questions on here similar to this, and I've been reading them, but I still feel this warrants a new question. I want nightly, incremental backups (full disk images would waste a lot of space - unless compressed somehow.) Preferably rotating or deleting old backups when running out of space or after a fixed number of backups. I want to be able to quickly and painlessly restore my system from these backups. This is my first time running ubuntu as my main development machine and I know from my experience with it as a server and in virtual machines that I regularly manage to make it unbootable or damage it to the point of being unable to rescue it. So how would you recommend I do this? There are so many options out there I really don't know where to start. There seems to be a vocal school of thought that it's sufficient to backup your home directory and the list of installed packages from the package manager. I've already installed lots of things from source, or outside of the package manager (development tools, ides, compilers, graphics drivers, etc.) So at the very least, if I do not back up the operating system itself I need to grab all config files, all program binaries, all created but required files, etc. I'd rather backup too much than too little - an ubuntu install is tiny anyway. Also this drastically reduces the restore time, which would cost me more in my time than the extra storage space. I tried using Deja Dup to backup the root partition, excluding some things like /mnt /media /dev /proc etc. Although many websites assured me you can backup a running linux system this way - that seems to be false as it complained that it could not backup the following files: /boot/System.map-3.0.0-17-generic /boot/System.map-3.2.0-22-generic /boot/vmcoreinfo-3.0.0-17-generic /boot/vmlinuz-3.0.0-17-generic /boot/vmlinuz-3.2.0-22-generic /etc/.pwd.lock /etc/NetworkManager/system-connections/LAN Connection /etc/apparmor.d/cache/lightdm-guest-session /etc/apparmor.d/cache/sbin.dhclient /etc/apparmor.d/cache/usr.bin.evince /etc/apparmor.d/cache/usr.lib.telepathy /etc/apparmor.d/cache/usr.sbin.cupsd /etc/apparmor.d/cache/usr.sbin.tcpdump /etc/apt/trustdb.gpg /etc/at.deny /etc/ati/inst_path_default /etc/ati/inst_path_override /etc/chatscripts /etc/cups/ssl /etc/cups/subscriptions.conf /etc/cups/subscriptions.conf.O /etc/default/cacerts /etc/fuse.conf /etc/group- /etc/gshadow /etc/gshadow- /etc/mtab.fuselock /etc/passwd- /etc/ppp/chap-secrets /etc/ppp/pap-secrets /etc/ppp/peers /etc/security/opasswd /etc/shadow /etc/shadow- /etc/ssl/private /etc/sudoers /etc/sudoers.d/README /etc/ufw/after.rules /etc/ufw/after6.rules /etc/ufw/before.rules /etc/ufw/before6.rules /lib/ufw/user.rules /lib/ufw/user6.rules /lost+found /root /run/crond.reboot /run/cups/certs /run/lightdm /run/lock/whoopsie/lock /run/udisks /var/backups/group.bak /var/backups/gshadow.bak /var/backups/passwd.bak /var/backups/shadow.bak /var/cache/apt/archives/lock /var/cache/cups/job.cache /var/cache/cups/job.cache.O /var/cache/cups/ppds.dat /var/cache/debconf/passwords.dat /var/cache/ldconfig /var/cache/lightdm/dmrc /var/crash/_usr_lib_x86_64-linux-gnu_colord_colord.102.crash /var/lib/apt/lists/lock /var/lib/dpkg/lock /var/lib/dpkg/triggers/Lock /var/lib/lightdm /var/lib/mlocate/mlocate.db /var/lib/polkit-1 /var/lib/sudo /var/lib/urandom/random-seed /var/lib/ureadahead/pack /var/lib/ureadahead/run.pack /var/log/btmp /var/log/installer/casper.log /var/log/installer/debug /var/log/installer/partman 
/var/log/installer/syslog /var/log/installer/version /var/log/lightdm/lightdm.log /var/log/lightdm/x-0-greeter.log /var/log/lightdm/x-0.log /var/log/speech-dispatcher /var/log/upstart/alsa-restore.log /var/log/upstart/alsa-restore.log.1.gz /var/log/upstart/console-setup.log /var/log/upstart/console-setup.log.1.gz /var/log/upstart/container-detect.log /var/log/upstart/container-detect.log.1.gz /var/log/upstart/hybrid-gfx.log /var/log/upstart/hybrid-gfx.log.1.gz /var/log/upstart/modemmanager.log /var/log/upstart/modemmanager.log.1.gz /var/log/upstart/module-init-tools.log /var/log/upstart/module-init-tools.log.1.gz /var/log/upstart/procps-static-network-up.log /var/log/upstart/procps-static-network-up.log.1.gz /var/log/upstart/procps-virtual-filesystems.log /var/log/upstart/procps-virtual-filesystems.log.1.gz /var/log/upstart/rsyslog.log /var/log/upstart/rsyslog.log.1.gz /var/log/upstart/ureadahead.log /var/log/upstart/ureadahead.log.1.gz /var/spool/anacron/cron.daily /var/spool/anacron/cron.monthly /var/spool/anacron/cron.weekly /var/spool/cron/atjobs /var/spool/cron/atspool /var/spool/cron/crontabs /var/spool/cups

    Read the article

  • BIP BIServer Query Debug

    - by Tim Dexter
    With some help from Bryan, I have uncovered a way of being able to debug, or at least log, what BIServer is doing when BIP sends it a query request. This is not for those of you querying the database directly, but for those of you using the BIServer and its data model to fetch data for a BIP report. If you have written or used the query builder against BIServer, and when you run the report it chokes with a cryptic message that you have no clue about, read on. When BIP runs a piece of BIServer logical SQL to fetch data, it does not appear to validate it; it just passes it through. So what is BIServer doing on its end? As you may know, you are not writing regular physical SQL - it's actually logical SQL, e.g.

        select Jobs."Job Title" as "Job Title", Employees."Last Name" as "Last Name", Employees.Salary as Salary, Locations."Department Name" as "Department Name", Locations."Country Name" as "Country Name", Locations."Region Name" as "Region Name" from HR.Locations Locations, HR.Employees Employees, HR.Jobs Jobs

    The tables might not even be physical tables; we don't care - that's what the BIServer and its model are for. You have put all the effort into building the model, so just go get me the data from wherever it might be. The BIServer takes the logical SQL and uses its vast brain to work out what the physical SQL is, executes it and passes the result back to BIP.

        select distinct T32556.JOB_TITLE as c1, T32543.LAST_NAME as c2, T32543.SALARY as c3, T32537.DEPARTMENT_NAME as c4, T32532.COUNTRY_NAME as c5, T32577.REGION_NAME as c6 from JOBS T32556, REGIONS T32577, COUNTRIES T32532, LOCATIONS T32569, DEPARTMENTS T32537, EMPLOYEES T32543 where ( T32532.COUNTRY_ID = T32569.COUNTRY_ID and T32532.REGION_ID = T32577.REGION_ID and T32537.DEPARTMENT_ID = T32543.DEPARTMENT_ID and T32537.LOCATION_ID = T32569.LOCATION_ID and T32543.JOB_ID = T32556.JOB_ID )

    Not a very tough example, I know, but you get the idea. How do I know what the BIServer is up to? How can I find out what the issue might be if BIServer chokes on my query? There are a couple of steps. In the Administrator tool you need to set the logging level for the Administrator user to something greater than the default '0'; '7' is going to give you the max. Just remember to take it back down after you have finished the debug. I needed to bounce my BIServer service. Now here's the secret sauce: prefix the following to your BIP query: set variable LOGLEVEL = 7; Set the log level to match the one you configured in the Administrator tool, then run your BIP report. With the prefix in place, BIServer will write to the NQQuery.log file. This is located in the ./OracleBI/server/Log directory. In there you are going to find the complete process the BIServer has gone through to try and get the data back for you. A quick note: if the BIServer can, it's going to hit that great BIEE cache to get your data and you may not see the full log. If this is the case, get into the Administration page (via the browser login) and clear out your BIP report cursor, then re-run. This will hopefully help out if you are trying to debug that annoying BIP report that will not run or is getting some strange data. Don't forget to turn that logging level back down once you are done; this will avoid the DBA screaming at you for sucking up all the disk space on the system.
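    To make that concrete, here is the earlier example logical SQL with the logging prefix applied. This is just a sketch reusing the HR model query from above, and it assumes you have already raised the Administrator user's logging level in the Administrator tool as described:

        set variable LOGLEVEL = 7;
        select Jobs."Job Title" as "Job Title",
               Employees."Last Name" as "Last Name",
               Employees.Salary as Salary,
               Locations."Department Name" as "Department Name",
               Locations."Country Name" as "Country Name",
               Locations."Region Name" as "Region Name"
        from HR.Locations Locations, HR.Employees Employees, HR.Jobs Jobs

    When the report runs with this prefix in place, the full query plan should appear in ./OracleBI/server/Log/NQQuery.log; remove the prefix (and drop the logging level) once you are done debugging.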

    Read the article

  • What’s the Difference Between Succession Management and Talent Reviews?

    - by HCM-Oracle
    By Marcie Van Houten Is there a difference, or are they pieces of one holistic strategic talent process? And can you have one without the other? First, let me give a quick definition of each. Succession planning (or management) is about creating succession slates or talent pools in support of a critical job or position, or sets thereof, and then using those plans to help mitigate risk and plan talent needs for the organization. Talent reviews (often known by other names) are sets of meetings where managers and executives come together to review, discuss and often heatedly debate the merits and potential of their employees, and then place and sometimes calibrate that talent on a performance-to-potential matrix. These are some of the most strategic conversations happening in conference rooms across the globe. I speak with a lot of organizations about their practices in this area, and the answers to these questions are as varied and nuanced as there are organizations thinking about them. Some are passionate about their talent review processes and have a very evolved and thoughtful approach. They really know their people, where their talent is, and the opportunities they plan to offer them. And to them that is their succession process. They may never create a slate of named candidates for a job or assign employees to formal talent pools. On the flip side, there are other organizations that create slate after slate and often multiple talent pools to support their strategic positions. Through these, they are able to mitigate the risk associated with having a key player leave their organization. And for them, that is their succession process. Some will start from the lower levels of their organization and roll up their succession plans, while other organizations only cover their top 200 executives and key positions with plans. And then there are organizations that leverage some or all of these. Ultimately, the goals are to increase employee engagement, reduce talent-related risk, ensure the right talent is aligned to the strategic initiatives and to drive business value. The approaches are as unique as the organizations they represent and the business opportunities they are looking to seize upon. And that's OK - it's great, in fact. Because the one thing they have in common is the recognition that knowing your people and aligning your top talent to the future needs of the organization is mission critical. Sure, there is a set of commonly recognized best practices and guiding principles for all of this, but there is no one right or perfect answer. And that is what makes this all so much darn fun. With Talent Review and Succession Management from Oracle HCM Cloud, we've blended the ability to support your strategic talent review conversations with both succession plans and talent pools, allowing for one very seamless and interactive process. So whether you create a lot of succession plans, only focus on talent pools, have a robust talent review process, or all of the above, Oracle has you covered. I'm looking forward to spending time with our customers at the upcoming OHUG Global Conference 2014, happening June 9-13 in Las Vegas. It's an opportunity for me to talk to customers about their business and how they are doing strategic talent processes like talent reviews and succession. I hope to see you there. Marcie Van Houten brings over 20 years of management consulting, information systems and human capital management experience to her role as director of product strategy at Oracle.
Ms. Van Houten has spent the past several years at Oracle working closely with customers to help drive the direction of the company's talent and succession management applications. Additionally, she spent nine years at PeopleSoft as Director of Information Systems, leading human capital management implementation projects. Marcie Van Houten lives in Walnut Creek, California, and holds an MBA from Southern Methodist University in Dallas, Texas. You can follow her on Twitter: @MarcieVH

    Read the article

  • Using the RSSBus Salesforce Excel Add-In From Excel Macros (VBA)

    - by dataintegration
    The RSSBus Salesforce Excel Add-In makes it easy to retrieve and update data from Salesforce from within Microsoft Excel. In addition to the built-in wizards that make data manipulation possible without code, the full functionality of the RSSBus Excel Add-Ins is available programmatically with Excel Macros (VBA) and Excel Functions. This article shows how to write an Excel macro that can be used to perform bulk inserts into Salesforce. Although this article uses the Salesforce Excel Add-In as an example, the same process can be applied to any of the Excel Add-Ins available on our website. Step 1: Download and install the RSSBus Excel Add-In available on our website. Step 2: Open Excel and create placeholder cells for the connection details that are needed by the macro. In this article, a spreadsheet will be created for batch inserts; these cells will store the connection details and will be used to report the job Id, the batch Id, and the batch status. Step 3: Switch to the Developer tab in Excel. Add a new button on the spreadsheet, and create a new macro associated with it. This macro will contain the code needed to insert a batch of rows into Salesforce. Step 4: Add a reference to the Excel Add-In by selecting Tools --> References --> RSSBus Excel Add-In. The macro functions of the Excel Add-In will be available once the reference has been added. The following code shows how to call a stored procedure. In this example, a job is created to insert Leads by calling the CreateJob stored procedure. CreateJob returns a jobId that can be used to upload a large number of Leads in one transaction. Note the use of cells B1, B2, B3, and B4 that were created in Step 2 to read the connection settings from the Excel spreadsheet and to write out the status of the procedure.

        methodName = "CreateJob"
        module.SetProviderName ("Salesforce")
        ' Parameters for the CreateJob stored procedure
        nameArray = Array("ObjectName", "Action", "ConcurrencyMode")
        valueArray = Array("Lead", "insert", "Serial")
        ' Connection details read from the placeholder cells created in Step 2
        user = Range("B1").value
        pass = Range("B2").value
        atoken = Range("B3").value
        If (Not user = "" And Not pass = "" And Not atoken = "") Then
            module.SetConnectionString ("User=" + user + ";Password=" + pass + ";Access Token=" + atoken + ";")
            If module.CallSP(methodName, nameArray, valueArray) Then
                Dim ColumnCount As Integer
                ColumnCount = module.GetColumnCount
                ' Find the index of the "id" column in the result set
                Dim idIndex As Integer
                For Count = 0 To ColumnCount - 1
                    Dim colName As String
                    colName = module.GetColumnName(Count)
                    If module.GetColumnName(Count) = "id" Then
                        idIndex = Count
                    End If
                Next
                ' Write the returned job Id into the status cell (B4)
                While (Not module.EOF)
                    Range("B4").value = module.GetValue(idIndex)
                    module.MoveNext
                Wend
            Else
                MsgBox "The CreateJob query failed."
            End If
            Exit Sub
        Else
            MsgBox "Please specify the connection details."
            Exit Sub
        End If
        Error:
        ' Error handler target; only reached when an On Error GoTo Error statement is in effect
        MsgBox "ERROR: " & Err.Description

    Step 5: Add the code to your macro. If you use the code above, you can check the results at Salesforce.com; they can be seen at Administration Setup -> Monitoring -> Bulk Data Load Jobs. Download the attached sample file for a more complete demo. Distributing an Excel File With Macros: An Excel file with macros is saved using the .xlsm extension. The code for the macro remains in the Excel file, and you can distribute your Excel file to any machine where the RSSBus Salesforce Excel Add-In is already installed. Macro Sample File: Please download the fully functional sample Excel file that includes the code referenced here. You will also need the RSSBus Excel Add-In to make the connection. You can download a free trial here.
Note: You may get an error message stating: "Can't find project or library." in Excel 2007, since this example is made using Excel 2010. To resolve this, navigate to Tools -> References and uncheck the "MISSING: RSSBus Excel Add-In", then scroll down and check the "RSSBus Excel Add-In" listed below it.
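    As a quick illustration of Step 2, here is a minimal sketch that captions the placeholder cells. The B1-B4 layout is taken from the Range calls in the macro above; the label text and the use of column A are assumptions made for this example only:

        Sub LabelConnectionCells()
            ' Assumed layout: values go in column B (as read by the macro above),
            ' with descriptive labels alongside in column A.
            Range("A1").Value = "Salesforce User"   ' B1 = user name
            Range("A2").Value = "Password"          ' B2 = password
            Range("A3").Value = "Access Token"      ' B3 = security/access token
            Range("A4").Value = "Job Id"            ' B4 = job Id written back by the macro
        End Sub

    You would run this once against the batch-insert sheet before filling in the actual connection values and wiring up the button from Step 3.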

    Read the article

  • A quick look at: sys.dm_os_buffer_descriptors

    - by Jonathan Allen
    SQL Server places data into cache as it reads it from disk so as to speed up future queries. This DMV lets you see how much data is cached at any given time, and knowing how this changes over time can help you ensure your servers run smoothly and are adequately resourced to run your systems. This DMV gives the number of cached pages in the buffer pool along with the database id that they relate to:

        USE [tempdb]
        GO
        SELECT COUNT(*) AS cached_pages_count ,
               CASE database_id
                    WHEN 32767 THEN 'ResourceDb'
                    ELSE DB_NAME(database_id)
               END AS Database_name
        FROM sys.dm_os_buffer_descriptors
        GROUP BY DB_NAME(database_id) , database_id
        ORDER BY cached_pages_count DESC;

    This gives you results which are quite useful, but if you add a new column with the code …to convert the pages value to show an MB value, then they become more relevant and meaningful (a version with the MB column added is sketched below). To see how your server reacts to queries, start up SSMS and connect to a test server and database – mine is called AdventureWorks2008. Make sure you start from a known position by running:

        -- Only run this on a test server otherwise your production server's
        -- performance may drop off a cliff and your phone will start ringing.
        DBCC DROPCLEANBUFFERS
        GO

    Now we can run a query that would normally turn a DBA's hair white:

        USE [AdventureWorks2008]
        GO
        SELECT *
        FROM [Sales].[SalesOrderDetail] AS sod
        INNER JOIN [Sales].[SalesOrderHeader] AS soh ON [sod].[SalesOrderID] = [soh].[SalesOrderID]

    …and then check our cache situation: A nice low figure – not! Almost 2000 pages of data in cache, equating to approximately 15MB. Luckily these tables are quite narrow; if this had been on a table with more columns then this could be even more dramatic. So, let's make our query more efficient. After resetting the cache with the DROPCLEANBUFFERS and FREEPROCCACHE code above, we'll only select the columns we want and implement a WHERE predicate to limit the rows to a specific customer.

        SELECT [sod].[OrderQty] ,
               [sod].[ProductID] ,
               [soh].[OrderDate] ,
               [soh].[CustomerID]
        FROM [Sales].[SalesOrderDetail] AS sod
        INNER JOIN [Sales].[SalesOrderHeader] AS soh ON [sod].[SalesOrderID] = [soh].[SalesOrderID]
        WHERE [soh].[CustomerID] = 29722

    …and check the effect on our cache: Now that is more sympathetic to our server and the other systems sharing its resources. I can hear you asking: "What has this got to do with logging, Jonathan?" Well, a smart DBA will keep an eye on this metric on their servers so they know how their hardware is coping and be ready to investigate anomalies, so that no 'disruptive' code starts to unsettle things. Capturing this information over a period of time can lead you to build a picture of how a database relies on the cache and how it interacts with other databases. This might allow you to decide on appropriate schedules for overnight jobs or otherwise balance the work of your server.
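    For reference, here is a rough sketch of that first query with an MB column added, using the same pages-to-MB conversion (pages are 8 KB each, so pages * 8.0 / 1024) that appears in the logging INSERT below:

        USE [tempdb]
        GO
        SELECT COUNT(*) AS cached_pages_count ,
               ( COUNT(*) * 8.0 ) / 1024 AS MB ,    -- 8 KB pages converted to MB
               CASE database_id
                    WHEN 32767 THEN 'ResourceDb'
                    ELSE DB_NAME(database_id)
               END AS Database_name
        FROM sys.dm_os_buffer_descriptors
        GROUP BY DB_NAME(database_id) , database_id
        ORDER BY cached_pages_count DESC;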
You could schedule this to run with a SQL Agent job and store the data in your DBA's database by creating a table with:

        IF OBJECT_ID('CachedPages') IS NOT NULL
            DROP TABLE CachedPages

        CREATE TABLE CachedPages
            (
              cached_pages_count INT ,
              MB INT ,
              Database_Name VARCHAR(256) ,
              CollectedOn DATETIME DEFAULT GETDATE()
            )

    …and then filling it with:

        INSERT INTO [dbo].[CachedPages]
                ( [cached_pages_count] ,
                  [MB] ,
                  [Database_Name]
                )
        SELECT COUNT(*) AS cached_pages_count ,
               ( COUNT(*) * 8.0 ) / 1024 AS MB ,
               CASE database_id
                    WHEN 32767 THEN 'ResourceDb'
                    ELSE DB_NAME(database_id)
               END AS Database_name
        FROM sys.dm_os_buffer_descriptors
        GROUP BY database_id

    After this has been left logging your system metrics for a while, you can easily see how your databases use the cache over time and may see some spikes that warrant your attention. This sort of logging can be applied to all sorts of server statistics so that you can gather information that will give you baseline data on how your servers are performing. This means that when you get a problem you can see which statistics are out of their normal range and target your efforts to resolve the issue more rapidly.
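    Once the CachedPages table has some history in it, a simple aggregate query is one way to spot those spikes. This is only a sketch, using the table exactly as created above and assuming SQL Server 2008 or later for the DATE conversion:

        -- Average and peak cache use per database per day, from the CachedPages logging table
        SELECT Database_Name ,
               CONVERT(DATE, CollectedOn) AS CollectedDate ,
               AVG(1.0 * MB) AS avg_MB ,            -- 1.0 * forces a non-integer average
               MAX(MB) AS max_MB
        FROM dbo.CachedPages
        GROUP BY Database_Name , CONVERT(DATE, CollectedOn)
        ORDER BY Database_Name , CollectedDate;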

    Read the article

  • SQLAuthority News – Learning Never Ends – Becoming Student Again

    - by pinaldave
    From my past few blog posts you may see a pattern - learning. I finished my own college education a few years ago, but I firmly believe that learning should never stop. We can learn on the job or from outside reading, but we should always try to be learning new things. It keeps the brain sharp! In fact, I often find myself learning new things from reviewing old material. If you have been reading my blog lately, you will recognize the name Koenig Solutions. You might also be rolling your eyes at me and my enthusiasm for learning and training. College was hard work, so why continue it? Didn't we all get educations so that we could get jobs and go on vacation? Of course, having a job means that you cannot take vacations all the time. I have often asked my friend who owns Koenig, jokingly, when he is going to open a Koenig center in Bangalore. I relocated to Bangalore 1.5 years ago, so I wanted a center I could walk to anytime. Last week I was very happy to hear that they have opened a center in Bangalore. Pinal Dave at Friend's Company. I could not let a new center open without visiting it and congratulating my friend, so I recently stopped by. I was immediately taken by the desire to go back to "school" and learn something new. I have signed up to take a continuing education course through the new Koenig center, and here is the exciting part: I will be blogging about it so that you all can be inspired to learn, too! Keep checking back here for further updates and blog posts about my learning experience. However, where is the fun in attending a session in the town where you live? I did indeed visit their center in Bangalore, but I have opted to take the course in another city. Well, more information about that in the near future. Pinal Dave is going to be a student again. Honestly, why not learn new things and become more confident? When we have more education we become better at our jobs, which can lead to more confidence and efficiency, and may also bring more tangible rewards - like a raise or promotion. We don't always have to focus on shallow rewards like money and recognition, so think about how much more you will enjoy your work when you know more about it. Koenig is offering training for new certifications in SQL Server 2012, and I am planning on investigating these for sure. I feel good that I am going to be a student again and will be learning new things. As I said, I will blog about my experience as I go. I hope that my continuing-education blog posts will inspire you, my readers, to go out and learn more. I am serious about my education, and my goal is to prove how serious I am here, on my blog. I am a big fan of learning and sharing, and I hope this series will inspire you to learn new technology that can help you progress in your career and balance your life with work. Note: This blog post is about what inspired me to sign up for a learning course. Being a student should be a lifelong attitude. This post is not about a career change. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article
