Search Results

Search found 3253 results on 131 pages for 'progress bars'.

  • Silverlight Cream for November 13, 2011 -- #1166

    - by Dave Campbell
In this Issue: Pontus Wittenmark, Jeff Blankenburg(-2-), Colin Eberhardt, Charles Petzold, Dhananjay Kumar, Igor, Beth Massi, Kunal Chowdhury(-2-), Shawn Wildermuth, XAMLNinja, and Peter Kuhn(-2-). Above the Fold: Silverlight: "Silverlight Page Navigation Framework - Learn about UriMapper" Kunal Chowdhury WP7: "31 Days of Mango" Jeff Blankenburg WinRT/Metro/W8: "An Introduction to Semantic Zoom in Windows 8 Metro" Colin Eberhardt LightSwitch: "Common Validation Rules in LightSwitch Business Applications" Beth Massi Shoutouts: Michael Palermo's latest Desert Mountain Developers is up Michael Washington's latest Visual Studio #LightSwitch Daily is up From SilverlightCream.com: 10 tips about porting Silverlight apps to WinRT/Metro style apps (Part 1) Pontus Wittenmark spent some time porting his Silverlight game to WinRT and says it was easier than expected. He has posted 10 tips for porting... and promises more. 31 Days of Mango Looks like Jeff Blankenburg started another 31 days series... this one on Mango dev... and looks like I'm late to the party, but that's ok, gives me more stuff to blog about... this time you can get the posts by email, and he has a hashtag for discussion too. 31 Days of Mango | Day #1: The New Windows Phone Emulator Tools Day 1 of Jeff Blankenburg's journey is this post on what's new in the emulator tools. An Introduction to Semantic Zoom in Windows 8 Metro This is Colin Eberhardt's latest... getting familiar with semantic zoom in Metro by creating a WP7-style jumplist experience... check out the video on his blog post for a better idea of what he's up to. .NET Streams and Windows 8 IStreams In his first real post on his new series writing an EPUB viewer for W8, Charles Petzold describes using IInputStream to get the contents of a disk file... and includes source for the project in progress. Video on How to work with Page Navigation and Back Button in Windows Phone 7 Dhananjay Kumar has a video tutorial up on Page Navigation and Back Button usage in WP7. Screen capture to media library instead of isolated storage Igor discusses a class that lets you save screen captures for use in your application and also save them to the media library on the phone. Common Validation Rules in LightSwitch Business Applications Beth Massi's latest is this LightSwitch post on validation rules... showing how to define declarative rules and also write custom validation code. Silverlight Page Navigation Framework - Learn about UriMapper Kunal Chowdhury continues his Page Navigation discussion with this post on the UriMapper, and how to hide the actual URL of the page you're navigating to. How to use PlaySoundAction Behavior in WP7 Application? Kunal Chowdhury also has this post up on using the PlaySoundAction Behavior in WP7... a nice tutorial on using Blend to get the job done. What Win8 Should Learn from Windows Phone After spending time with Windows 8, Shawn Wildermuth has this post up about features from WP7 that should be brought over to Windows 8, and finishes with features that WP8 (?) could learn from Win8 too. WP7Contrib – FindaPad and the fastest list in the west XAMLNinja discusses the WP7 app FindaPad, which spawned the creation of WP7Contrib, and uses the app to describe some nuances that may not be readily obvious. Windows Phone 7: The kind of bug you don't want to discover Peter Kuhn discusses a problem he came across while programming WP7 that, interestingly enough, shows up only in the emulator and has to do with a UInt64 cast. He does offer a workaround. 
Announcing: Your Last About Dialog (YLAD) Peter Kuhn also has this post up that's a take-off on a post by Jeff Wilcox about a generic About Dialog. Peter has some great additions... and he's right... it may be your last About Dialog... get it via NuGet, too! Stay in the 'Light!

  • Convert .3GP and .3G2 Files to AVI / MPEG for Free

    - by DigitalGeekery
.3GP and .3G2 are common video capture formats used on many mobile phones, but they may not be supported by your favorite media player. Today we’ll show you a quick and easy way to convert those files to AVI or MPG format with the free Windows application, Pazera Free 3GP to AVI Converter. Download the Pazera Free 3GP to AVI Converter. You’ll have to unzip the download folder, but there is no need to install the application. Just double-click the 3gptoavi.exe file to run the application. To add your 3GP or 3G2 files to the queue to be converted, click on the Add files button at the top left. Browse for your file, and click Open. Your video will be added to the Queue. You can add multiple files to the queue and convert them all at one time. Most users will find it preferable to use one of the pre-configured profiles for their conversion settings. To load a profile, choose one from the Profile drop down list and then click the Load button. You will see the profile update the settings in the panels at the bottom of the application. We tested Pazera Free 3GP to AVI Converter with 3GP files recorded on a Motorola Droid, and found the AVI H.264 Very High Q. profile to return the best results for AVI output, and the MPG – DVD NTSC: MPEG-2 profile the best results for MPG output. Other profiles produced smaller file sizes, but at the cost of reduced quality video output. More advanced users may tweak video and audio settings to their liking in the lower panels. Click the AVI button under Output file format / Video settings to adjust the settings for AVI output, or the MPG button to adjust the settings for MPG output. By default, the converted file will be output to the same location as the input directory. You can change it by selecting the radio button next to the output path text box and browsing for a different folder. When you’ve chosen your settings, click Convert to begin the conversion process. A conversion output box will open and display the progress. When finished, click Close. Now you’re ready to enjoy your video in your favorite media player. Pazera Free 3GP to AVI Converter isn’t the most robust media conversion tool, but it does what it is intended to do. It handles the task of 3GP to AVI / MPG conversion very well. It’s easy enough for the beginner to manage without much trouble, but also has enough options to please more experienced users. Download Pazera Free 3GP to AVI Converter
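
    If you prefer scripting to a GUI, the same kind of batch conversion can be done from the command line. Below is a minimal sketch in Python that shells out to ffmpeg to convert every .3GP/.3G2 file in a folder to AVI; it assumes ffmpeg is installed and on your PATH (Pazera itself does not expose a command-line interface we rely on here), and the folder name is hypothetical.

    ```python
    import subprocess
    from pathlib import Path

    SOURCE_DIR = Path("videos")  # hypothetical folder of phone clips

    def convert_to_avi(src: Path) -> None:
        """Convert one .3gp/.3g2 file to AVI by shelling out to ffmpeg."""
        dest = src.with_suffix(".avi")
        # -i names the input file; ffmpeg picks reasonable defaults for AVI output
        subprocess.run(["ffmpeg", "-i", str(src), str(dest)], check=True)

    if __name__ == "__main__":
        for clip in sorted(SOURCE_DIR.iterdir()):
            if clip.suffix.lower() in (".3gp", ".3g2"):
                convert_to_avi(clip)
    ```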

  • Upgrading from MVC 1.0 to MVC2 in Visual Studio 2010 and VS2008.

    - by Sam Abraham
With MVC2 officially released, I was involved in a few conversations regarding the feasibility of upgrading existing MVC 1.0 projects to quickly leverage the newly introduced MVC features. Luckily, Microsoft has proactively addressed this question for both Visual Studio 2008 and 2010, and many online resources discussing the upgrade process are a "Bing/Google Search" away. As I happen to be speaking about MVC2 and Visual Studio 2010 at the Ft Lauderdale ArcSig .Net User Group Meeting on April 20th 2010 (check http://www.fladotnet.com for more info), I decided to include a quick demo on upgrading the NerdDinner project (which I consider the "Hello MVC World" project) from MVC 1.0 to MVC2 using Visual Studio 2010, to demonstrate how simple the upgrade process is. In the next few lines, I will briefly touch on upgrading to MVC2 for Visual Studio 2008, then discuss, in more detail, the upgrade process using Visual Studio 2010 while highlighting the advantage of its multi-targeting support. Using Visual Studio 2008 SP1 For upgrading to MVC2 using VS2008 SP1, a Microsoft white paper [1] presents two approaches: 1) using a provided automated upgrade tool, or 2) manually upgrading the project. I personally prefer using the automated tool, although it comes with an "AS IS" disclaimer. For those brave souls, or those who end up with no luck using the tool, detailed manual upgrade steps are also provided as a second option. Backing up the project in question is a must, regardless of which route one takes to upgrade. Using Visual Studio 2010 Life is much easier for developers who have already adopted Visual Studio 2010. Simply opening the MVC 1.0 solution file brings up the upgrade wizard, as shown in figures 1, 2, 3 and 4. As we proceed with the upgrade process, the wizard requests confirmation on whether we choose to upgrade our target framework version to .Net 4.0 or keep the existing .Net 3.5 (Figure 5). VS2010 does a good job with multi-targeting: we can still develop .Net 3.5 applications while leveraging all the new bells and whistles that VS2010 brings to the table (multi-targeting enables us to develop with frameworks as early as .Net 2.0 in VS2010). Figure 1 - Open Solution File Using VS2010 Figure 2 - VS2010 Conversion Wizard Figure 3 - Ready To Convert To VS2010 Confirmation Screen Figure 4 - VS2010 Solution Conversion Progress Figure 5 - Confirm Target Framework Upgrade In an attempt to make my demonstration realistic, I opted to keep the project targeted to the .Net 3.5 Framework. After the successful completion of the conversion process, a quick sanity check revealed that the NerdDinner project is still targeted to the .Net 3.5 framework, as shown in figure 6. Inspecting the Web.Config revealed that the MVC DLL version our code compiles against has been successfully upgraded to 2.0 (Figure 7), and hence we should now be able to leverage the newly introduced features in MVC2 and VS2010 with no effort or time invested in modifying existing code. Figure 6 - Confirm Target Framework Remained .Net 3.5 Figure 7 - Confirm MVC DLL Version Has Been Upgraded In conclusion, Microsoft has empowered developers with the tools necessary to quickly and seamlessly upgrade their MVC solutions to the newly released MVC2. The multi-targeting feature in Visual Studio 2010 enables us to adopt this latest and greatest development tool while supporting development in frameworks as early as .Net 2.0. References 1. 
"Upgrading an ASP.NET MVC 1.0 Application to ASP.NET MVC 2" http://www.asp.net/learn/whitepapers/aspnet-mvc2-upgrade-notes

  • regarding the Windows Phone 7 series, XNA and Visual Basic

    - by Chris Williams
as long as we're talking about VB... I figured I would share this as well. Hi everyone, I'm about to express a sentiment that might ruffle a few feathers, but I think most of you know me well enough to know I love… like… accept VB for what it is, and that what I'm about to say is with good intentions. (The rest of you, who don't know me, please take my word for it.) The world is full of VB developers; I was one of them for a long time. I think it's safe to assume that none of us are ignorant people who require handholding. We're working professionals, making a living by using our skills as developers. I'm also willing to bet that quite a few of us are fluent in C# as well as VB. It may not be your preferred language, but many of you can do it and you prove that nearly every day. Honestly, I don't know ANY developers or consultants that have only known ONE language ever. So it pains me greatly when I see the word "CAN'T" being tossed around like a crutch... as in "we CAN'T develop for the Windows Phone" or "we CAN'T develop XNA games." At MIX, Microsoft hath decreed that C# is the language of choice for developing for the Windows Phone 7. I think it's a safe bet that you won't see VB support if it isn't there already. (Just like XNA... which is up to version 4.0 by now.) So what? (Yeah... I said it.) I think everyone here can agree that actual coding is only one part of software design and development. There is nothing stopping ANY of you from beginning the process of designing your killer phone app, writing up specs, requirements, doing UI design, workflow, mockups, storyboards, art, etc.... None of these things are language dependent. IF by the time you've got that stuff out of the way there's still no VB support, then start doing some rapid prototyping of your app in C# (I know, I know... heresy!). You still have to spend time learning how the phone does things, what UI tricks do what, what paradigms make sense, and how to use the accelerometer and the tilt and the multitouch functionality. I can guarantee you that time spent doing this is a great investment, no matter WHAT extension your code files have. Eventually, you may have a working prototype. IF by this time there's STILL no VB support... fret not, you've made significant progress on your app. You've designed it, prototyped it, figured out how to use the phone-specific features... so you might as well finish it and pat yourself on the back for learning something new... and possibly being first to market with your new app. I'll be happy to argue any and all of these points online or off with anyone who cares to do so, but there is one undeniable point that you simply can't argue: your potential customers do not care AT ALL what programming language you used to write the app they are about to purchase. They care that it works. If your biggest concern is being first to market, then stop complaining and get busy, because you're running out of time and the 3000+ people who were at MIX certainly aren't waiting for you. They've already started working on their apps.

  • 256 Worker Role 3D Rendering Demo is now a Lab on my Azure Course

    - by Alan Smith
Ever since I came up with the crazy idea of creating an Azure application that would spin up 256 worker roles (please vote if you like it) to render a 3D animation created using the Kinect depth camera, I have been trying to think of something useful to do with it. I have also been busy working on developing training materials for a Windows Azure course that I will be delivering through a training partner in Stockholm, and for customers wanting to learn Windows Azure. I hit on the idea of combining the render demo and a course lab, creating a lab where the students would create and deploy their own mini render farms, which would participate in a single render job consisting of 2,000 frames. The architecture of the solution is shown below. As students would be creating and deploying their own applications, I thought it would be fun to introduce some competitiveness into the lab. In the 256 worker role demo I capture the rendering statistics for each role, so it was fairly simple to include the student's name in these statistics. This allowed the process monitor application to capture the number of frames each student had rendered and display a high-score table. When I demoed the application I deployed one instance that started rendering a frame every few minutes, and the challenge for the students was to deploy and scale their applications, and then overtake my single role instance by the end of the lab time. I had the process monitor running on the projector during the lab so the class could see the progress of their deployments, and how they were performing against my implementation and their classmates. When I tested the lab for the first time in Oslo last week it was a great success; the students were keen to be the first to build and deploy their solution and then watch the frames appear. As the students mostly had MSDN subscriptions, they were able to scale to the full 20 worker role instances, and before long we had over 100 worker roles working on the animation. There were, however, a couple of issues caused by the competitive nature of the lab. The first student to scale the application to 20 instances would render the most frames and win; there was no way for others to catch up. Also, as they were competing against each other, there was no incentive to help others on the course get their application up and running. I have now re-written the lab to divide the students into teams that will compete to render the most frames. This means that if one developer on the team can deploy and scale quickly, the rest of the team still has a chance to catch up. It also means that if a student finishes quickly and puts their team in the lead, they will have an incentive to help the other developers on their team get up and running. As I was using "Sharks with Lasers" for a lot of my demos, and reserved the sharkswithfreakinlasers namespaces for some of the Azure services (well, somebody had to do it), the students came up with some creative alternatives, like "Camels with Cannons" and "Honey Badgers with Homing Missiles". That gave me the idea for the teams having to choose a creative name involving animals and weapons. The team rendering architecture diagram is shown below.

Render Challenge Rules

In order to ensure fair play, a number of rules are imposed on the lab:
· The class will be divided into teams; each team chooses a name.
· The team name must consist of a ferocious animal combined with a hazardous weapon.
· Teams can allocate as many worker roles as they can muster to the render job.
· Frame processing statistics and rendered frames will be vigilantly monitored; any cheating, tampering, and other foul play will result in penalties.

The screenshot below shows an example of the team render farm in action: Badgers with Bombs have taken a lead over Camels with Cannons, and both are leaving the Sharks with Lasers standing. If you are interested in attending a scheduled delivery of my Windows Azure or Windows Azure Service Bus courses, or would like on-site training, more details are here.
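
    The high-score table described above is easy to picture in code. Here is a minimal sketch of aggregating per-role rendering statistics into a team leaderboard; it is in Python with hypothetical field names, since the actual demo is a .NET application and its statistics schema isn't shown here.

    ```python
    from collections import Counter

    # Hypothetical per-frame statistics records, as captured by the process
    # monitor; the real demo stores these per worker role, tagged with a name.
    frame_stats = [
        {"team": "Badgers with Bombs", "frame": 17},
        {"team": "Camels with Cannons", "frame": 18},
        {"team": "Badgers with Bombs", "frame": 19},
        {"team": "Sharks with Lasers", "frame": 20},
    ]

    def leaderboard(stats):
        """Count rendered frames per team and return teams ordered by score."""
        totals = Counter(record["team"] for record in stats)
        return totals.most_common()

    for rank, (team, frames) in enumerate(leaderboard(frame_stats), start=1):
        print(f"{rank}. {team}: {frames} frames")
    ```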

  • Extract Audio from a Video File with Pazera Free Audio Extractor

    - by DigitalGeekery
Have you ever wanted to extract some or all of the audio from a video file? Today we’ll take a look at Pazera Free Audio Extractor, a simple audio converter that specializes in that very task. Download the Pazera Free Audio Extractor. (See download link below.) You’ll need to unzip the download folder, but there is no need to install the application. Simply double-click the AudioExtractor.exe file to run the application. To add your video files to the queue to be converted, click on the Add files button at the top left. You can add multiple files to the queue and convert them all at one time. Browse for your video file, and click Open. Your video will be added to the Queue for processing. Under Output directory you can choose to output to a folder of your choice; outputting to the same folder as the input folder is the default. Pazera Free Audio Extractor includes pre-configured profiles that will simplify the process of choosing conversion settings. To load a profile, choose one from the Profile drop down list and then click the Load button. You can choose to output to MP3, AAC, AC3, WMA, FLAC, OGG or WAV file format. You will see the profile update the Audio settings in the panels at the lower left of the application. If you wish, you may also select your own custom settings. The Advanced settings can be used if you want to extract only a portion of the audio, such as a clip of dialog or a song from a movie. To extract only a portion of the audio, set the start time by selecting the Start time offset check box, then entering the time in the video clip where the audio begins. To set the end time, begin by selecting the Duration check box. Now, you can either select the Duration radio button and enter the amount of time for which you would like to extract the audio, or you can select the End time offset radio button and enter the time in the video clip where the audio ends. When you are ready to convert, click the CONVERT button on the menu at the top of the screen. An output box will open and display the conversion progress. When finished, click Close. Now you are ready to enjoy your audio clip. Pazera Free Audio Extractor is a basic audio tool that is easy enough for everyone to use. It runs on Windows only and supports most common video formats including AVI, FLV, MP4, MPG, MOV, 3GP, and WMV. Download Free Audio Extractor 1.3
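
    Extracting a clip of audio maps naturally onto a command line as well. A minimal sketch in Python, shelling out to ffmpeg (an assumption: ffmpeg is installed with MP3 support; the file names, offset and duration are illustrative). The -ss and -t flags play the role of the GUI's Start time offset and Duration options, and -vn drops the video stream.

    ```python
    import subprocess

    def extract_clip(video, audio_out, start="00:01:30", duration="00:00:45"):
        """Extract a slice of a video's audio track by shelling out to ffmpeg."""
        subprocess.run([
            "ffmpeg",
            "-ss", start,     # start time offset into the source
            "-t", duration,   # length of the clip to keep
            "-i", video,
            "-vn",            # drop the video stream, keep audio only
            audio_out,
        ], check=True)

    extract_clip("movie.mp4", "dialog.mp3")
    ```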

  • Deploy Oracle Management Agent using RPM File

    - by cristiano.toni
1) Create an RPM package on Enterprise Manager 12c.

a) As root:
# yum install rpmbuild
# mkdir /usr/lib/oracle

b) As the oracle user:
# cd $<OMS_HOME>/bin/
# emcli get_supported_platforms
-----------------------------------------------
Version = 12.1.0.3.0
Platform = Linux x86-64
-----------------------------------------------
Platforms list displayed successfully.
# emcli get_agentimage_rpm -destination=/tmp/agentRPM -platform="Linux x86-64" -version=12.1.0.3.0
Platform: Linux x86-64
Destination: /tmp/agentRPM
Exalogic: false
Checking for disk space requirements...
=== Partition Detail ===
Space free : 6 GB
Space required : 1 GB
RPM creation in progress ...
Check the logs at /Oracle/gc_inst/em/EMGC_OMS1/sysman/emcli/setup/.emcli/get_agentimage_rpm_date-PM.log
Copying agent image from software library to /tmp/agentRPM
Setting property ORACLE_HOME to: /Oracle/middleware/oms
calling pulloneoffs with arguments: /Oracle/middleware/oms /Oracle/middleware/oms/sysman/agent/12.1.0.3.0_AgentCore_226.zip 12.1.0.3.0 Linux x86-64 /tmp/agentRPM true
Agent Image copied successfully...
Creation of RPM started...
RPM creation successful.
Agent image to rpm conversion completed successfully

2) Copy it to all new hosts and install it, as the root user.

c) Check and install the RPM file:
# rpm -ivh --test oracle-agt-12.1.0.3.0-1.0.x86_64.rpm
Preparing...                ########################################### [100%]
# rpm -ivh oracle-agt-12.1.0.3.0-1.0.x86_64.rpm
Preparing...                ########################################### [100%]
Running the prereq
   1:oracle-agt             ########################################### [100%]
Agent RPM installation is completed successfully.
Now to configure the agent follow the below steps:
1. Edit the properties file /usr/lib/oracle/agent/agent.properties with the correct values.
2. Execute the script /etc/init.d/oracle-agt RESPONSE_FILE=/usr/lib/oracle/agent/agent.properties

d) Create a user for the agent:
# useradd -m -d /home/em12adm -s /bin/bash -g dba -G oinstall em12adm
# passwd em12adm

e) Edit the file /usr/lib/oracle/agent/agent.properties:
# vi /usr/lib/oracle/agent/agent.properties
OMS_HOST=<host_Enterprise_Manager>
OMS_PORT=<HTTPS Upload Port>
AGENT_REGISTRATION_PASSWORD=oracle
AGENT_USERNAME=em12adm
AGENT_GROUP=dba
ORACLE_HOSTNAME=oraclevm-mgmt
# chown -R em12adm:dba /usr/lib/oracle/agent/

Start the agent and register the new host server on EM12c:
# /etc/init.d/oracle-agt RESPONSE_FILE=/usr/lib/oracle/agent/agent.properties

Now you have registered your new target host on EM12c.

  • Introducing Oracle User Productivity Kit (UPK) 12.1 Thursday 26th June 2014 – Oracle, Reading, Berkshire

    - by Kathryn Lustenberger
Join Oracle UPK Product Management and Product Development, in conjunction with Larmer Brown. Register Now.

UPK Client Event – Introducing v12.1
Thursday 26th June 2014
Oracle, Thames Valley Park, Reading, Berkshire

Agenda:
10.00am - Registration and Coffee
10.30am - Introductions and Objectives
10.45am - Twin-track session: Introduction to UPK (Standard) Version 12.1, an overview and demonstration for delegates new to UPK; or Upgrading to UPK (Standard) Version 12.1, a demonstration of the latest release for delegates with experience of UPK
12.25pm - Q&A: an opportunity for delegates to raise specific questions about the tool or about the latest release
12.45pm - Lunch
1.30pm - Larmer Brown Development Tracker: Larmer Brown's Development Tracker addresses the challenge of ensuring that a content development project will meet agreed deadlines, identifying risks with sufficient notice to take action
1.50pm - Case Study: how the Development Tracker addressed this client's requirement to track, monitor and report progress on a large-scale implementation project
2.10pm - Larmer Brown Library Content for UPK: this session will showcase some of Larmer Brown's content library and consider how pre-built content can be used to your advantage
2.30pm - Coffee Break
2.45pm - Making the most of UPK Professional: this presentation and demonstration seeks to unlock the potential of UPK Professional for those that may not be fully utilising the tool
3.20pm - Case Study: how this client has utilised the tracking and reporting features within UPK Professional
3.40pm - Summary and Conclusions
4.00pm - Close

  • Moving the Oracle User Experience Forward with the New Release 7 Simplified UI for Oracle Sales Cloud

    - by mvaughan
By Kathy Miedema, Oracle Applications User Experience

In September 2013, Release 7 for Oracle Cloud Applications became generally available for Oracle Sales Cloud and HCM Cloud. This significant release allowed the Oracle Applications User Experience (UX) team to finally talk freely about Simplified UI, a user experience project in the works since Oracle OpenWorld 2012. Simplified UI represents the direction that the Oracle user experience – for all of its enterprise applications – is heading. Oracle’s Apps UX team began by building a Simplified UI for sales representatives. You can find that today in Release 7, and it was demoed extensively during OpenWorld 2013 in San Francisco. This screenshot shows how Opportunities appear in the new Simplified UI for Oracle Sales Cloud, a user interface built for sales reps. Analyst Rebecca Wettemann, vice president of Nucleus Research, saw Simplified UI at Oracle OpenWorld 2013 and talked about it with CRM Buyer in “Oracle Revs Its Cloud Engines for a Better Customer Experience.” Wettemann said there are distinct themes to the latest release: "One is usability. Oracle Sales Cloud, for example, is designed to have zero training for onboarding sales reps, which it does," she explained. "It is quite impressive, actually -- the intuitive nature of the application and the design work they have done with this goal in mind." The software uses as few buttons and fields as possible, she pointed out. "The sales rep doesn't have to ask, 'what is the next step?' because she can see what it is." In fact, there are three themes driving the usability that Wettemann noted. They are simplicity, mobility, and extensibility, and we write more about them on the Usable Apps web site. These three themes embody the strategy for Oracle’s cloud applications user experiences.

Simplified UI for Oracle Sales Cloud

In developing a Simplified UI for Oracle Sales Cloud, Oracle’s UX team concentrated on the tasks that sales reps need to do most frequently, and that are most important. “Knowing that the majority of their work lives are spent on the road and on the go, they need to be able to quickly get in and qualify and convert their leads, monitor and progress their opportunities, update their customer and contact information, and manage their schedule,” Jeremy Ashley, Vice President of the Applications UX team, said. Ashley said the Apps UX team has a good reason for creating a Simplified UI that focuses on self-service. “Sales people spend the day selling stuff,” he said. “The only reason they use software is because the company wants to track what they’re doing.” Traditional systems of tracking that information include filling in a spreadsheet of leads or sales. Oracle wants to automate this process for the salesperson, and enable that person to keep everyone who needs to know up-to-date easily and quickly. Simplified UI addresses that problem by providing light-touch input. “It has to be useful to the salesperson,” Ashley said about the Sales Cloud user experience. Simplified UI can tell sales reps about key opportunities, or provide information about a contact in just a click or two. Customer information is accessible quickly and easily with Simplified UI for the Oracle Sales Cloud. Simplified UI for Sales Cloud can also be extended easily, Ashley said. Users usually just need to add various business fields or create and modify analytical reports. The way that Simplified UI is constructed allows extensibility to happen by hiding or showing a few necessary fields. 
The Settings user interface, starting in Release 7, allows for the simple configuration of the most important visual elements. “With Sales Cloud, we identified a need to make the application useful and very simple,” Ashley said. Simplified UI meets that need.

Where can you find out more?

To find out more about the Simplified UI and Oracle’s ongoing investment in applications user experience innovations, come to one of our sessions at a user group conference near you. Stay tuned to the Voice of User Experience (VoX) blog – the next post will be about Simplified UI and HCM Cloud.

  • Python Coding standards vs. productivity

    - by Shroatmeister
I work for a large humanitarian organisation, on a project building software that could help save lives in emergencies by speeding up the distribution of food. Many NGOs desperately need our software and we are weeks behind schedule. One thing that worries me in this project is what I think is an excessive focus on coding standards. We write in python/django and use a version of PEP 8, with various modifications, e.g. line lengths can go up to 160 chars and all lines should go that long if possible, no blank lines between imports, line wrapping rules that apply only to certain kinds of classes, lots of templates that we must use even if they aren't the best way to solve a problem, etc. etc. One core dev spent a week rewriting a major part of the system to meet the then new coding standards, throwing away several suites of tests in the process, as the rewrite meant they were 'invalid'. We spent two weeks rewriting all the functionality that was lost, and fixing bugs. He is the lead dev and his word carries weight, so he has convinced the project manager that these standards are necessary. The junior devs do as they are told. I sense that the project manager has a strong feeling of cognitive dissonance about all this but nevertheless agrees with it vehemently, as he feels unsure what else to do. Today I got in serious trouble because I had forgotten to put some spaces after commas in a keyword argument. I was literally shouted at by two other devs and the project manager during a Skype call. Personally I think coding standards are important, but I also think that we are wasting a lot of time obsessing over them, and when I verbalized this it provoked rage. I'm seen as a troublemaker in the team, a team that is looking for scapegoats for its failings. Since the introduction of the coding standards, the team's productivity has measurably plummeted; however, this only reinforces the obsession, i.e. the lead dev simply blames our non-adherence to standards for the lack of progress. He believes that we can't read each other's code if we don't adhere to the conventions. This is starting to turn sticky. Now I am trying to modify various scripts, autopep8, pep8ify and PythonTidy, to try to match the conventions. We also run pep8 against source code, but there are so many implicit amendments to our standard that it's hard to track them all. The lead dev simply picks faults that the pep8 script doesn't pick up and shouts at us in the next stand-up meeting. Every week there are new additions to the coding standards that force us to rewrite existing, working, tested code. Thank heavens we still have tests (I reverted some commits and fixed a bunch of the ones he removed). All the while there is increasing pressure to meet the deadline. I believe a fundamental issue is that the lead dev and another core dev refuse to trust other developers to do their job. But how to deal with that? We can't do our job because we are too busy rewriting everything. I've never encountered this dynamic in a software engineering team. Am I wrong to question their adherence to coding standards? Has anyone else experienced a similar situation and how have they dealt with it successfully? (I'm not looking for a discussion, just actual solutions people have found.)
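
    One pragmatic way out of the "implicit amendments" trap described in the question is to make a script the sole arbiter: encode every team rule as a flag and run the formatter the same way everywhere, so "the standard" is whatever the script enforces. A minimal sketch follows; the 160-character limit comes from the question, while the rest is standard autopep8/pycodestyle usage, not the team's actual configuration.

    ```python
    import subprocess
    import sys

    # Team-wide settings live in one place, so the standard is executable.
    MAX_LINE_LENGTH = "160"  # from the team's modified PEP 8

    def format_files(paths):
        """Rewrite files in place with autopep8, then report remaining violations."""
        subprocess.run(
            ["autopep8", "--in-place", f"--max-line-length={MAX_LINE_LENGTH}", *paths],
            check=True,
        )
        # pycodestyle (formerly the pep8 tool) exits nonzero if violations remain
        result = subprocess.run(
            ["pycodestyle", f"--max-line-length={MAX_LINE_LENGTH}", *paths]
        )
        return result.returncode

    if __name__ == "__main__":
        sys.exit(format_files(sys.argv[1:]))
    ```

    Run as a pre-commit hook or CI step, a script like this removes the human fault-picking from the loop: if the checker passes, the code meets the standard by definition.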

  • DAC pack up all your troubles

    - by Tony Davis
Visual Studio 2010, or perhaps its apparently-forthcoming sister, "SQL Studio", is being geared up to become the natural way for developers to create databases. Central to this drive is the introduction of 'data-tier application components', or DACs. Applications are developed as normal, but when it comes to deployment, instead of supplying the DBA with a bunch of scripts to create the required database objects, the developer creates a single DAC Package ("DAC Pack"); a zipped XML file containing all the database objects needed by the application, along with versioning information, policies for deployment, and so on. It's an intriguing prospect. Developers can work on their development database using their existing tools and source control, and then package up the changes into a single DACPAC for deployment and management. DBAs get an "application level view" of how their instances are being used and the ability to collectively, rather than individually, manage the objects. The DBA needing to manage a large number of relatively small databases can use "DAC snapshots" to get a quick overview of what has changed across all the databases they manage. The reason that DAC packs haven't caused more excitement is that they can only be pushed to SQL Server 2008 R2, and they must be developed or inspected using Visual Studio 2010. Furthermore, what we see right now in VS2010 is more of a 'work-in-progress' or 'vision of the future', with serious shortcomings and restrictions that render it unsuitable for anything but small 'non-critical' departmental databases. The first problem is that DAC packs support a limited set of schema objects (corresponding closely to the features available on 'Azure'). This means that Service Broker queues, CLR objects, and, perhaps most critically, security (permissions, certificates, etc.) are off-limits. Applications that require these objects will need to add them via a post-deployment TSQL script, rather defeating the whole idea. More worrying still is the process for altering a database with a DAC pack. The grand 'collective' philosophy, whereby a single XML file can be used for deploying and managing builds and changes, extends, unfortunately, to database upgrades. Any change to a database object will result in the creation of a new database, copying the data from the old version, nuking the previous one, and then renaming the new one. Simple, eh? The problem is that even something as trivial as adding a comment to a stored procedure in a 5GB database will require the server to find at least twice as much space, as well as sufficient elbow-room in the transaction log for copying the largest table. Of course, you'll need to take the database offline for the full course of the deployment, which is likely to take a long time if there is a lot of data. This upgrade/rename process breaks the log chain, makes any subsequent full restore operation highly complicated, and will also break log shipping. As with any grand vision, the devil is always in the detail. It's hard to fathom why Microsoft hasn't used a SQL Compare-style approach to the upgrade process, altering a database with a change script, and this will surely be adopted in the near future. Something had to be in place for VS2010, but right now DAC packs only make sense for Azure. For this, they're cute, but hardly compelling. Nevertheless, DBAs would do well to get familiar with VS 2010 and DAC packs. Like it or not, they're both coming. Cheers, Tony.

  • Pull Request Changes, Multi-Selection in Advanced View, and Advertisement Changes

[Do you tweet? Follow us on Twitter @matthawley and @adacole_msft] We deployed a new version of the CodePlex website today. Pull Request Changes In this release, we have begun to re-focus on Pull Requests to ensure a productive experience between the project users and developers. We feel we made significant progress in this area for this release and look forward to using your feedback to drive future iterations. One of the biggest hurdles people have indicated is the inability to see what a pull request includes without pulling the source down from a Mercurial client. With today’s changes, any user has the ability to view a pull request, the changesets / changes included, and perform an inline diff of the file. When a pull request is made, the CodePlex website will query for all outgoing changes from the fork to the main repository for a point-in-time comparison. Because of this point-in-time comparison… All existing pull requests created prior to this release will not have changesets associated with them. If new commits are pushed to the fork while a pull request is active, they will not appear associated with the pull request. The pull request will need to be re-submitted for them to appear. Once a pull request is created, you can “View the Pull Request”, which takes you to a page like the one shown below. As you may notice, we now display much more detailed information regarding that pull request, including who it was requested by and when, the associated changesets, the description, who it’s assigned to (we’ll come back to this), and the listing of summarized file changes. You’ll also notice that each modified file has the ability to view a diff of all changes made. When you click “(view diff)” for a file, an inline diff experience appears. This new experience allows you to quickly navigate through all of the modified files as well as viewing the various change blocks for each file. You’ll also notice, as you browse through each file’s changes, that we update the URL to include the file path so you can quickly send a direct link to a pull request’s file. Clicking “(close diff)” will bring you back to the original pull request view. View this pull request live on WikiPlex. Pull Request Review Assignment Another new feature we added for pull requests is the ability for project members to assign pull requests for review. Any project member has the ability to assign (and re-assign if needed) a pull request to a project member. Once the assignment has been made, that project member will be notified via email of the assignment. Once they complete the review of the pull request, they can either accept or deny it, similarly to the previous process. Multi-Selection in Advanced View Filters One of the more recent requests we have heard from users is the ability to multi-select advanced view filters for work items. We are happy to announce this is now possible. Simply control-click the multiple options for each filter item and your work item query will be refined as such. Should you happen to unselect all options for a given filter, it will automatically reset to the default option for that filter. Furthermore, the “Direct Link” URL will be updated to include the multi-selected options for each filter. Note: The “Direct Link” feature was released in our previous deployment, just never written about. It allows you to capture the current state of your query and send it to other individuals. 
Advertisement Changes Very recently, the advertiser (The Lounge) we partnered with to provide advertising revenue for projects, or donated to charity, was acquired by Lake Quincy Media. There has been no change in the advertising platform offering, and all projects have been converted over to using the new infrastructure. Project owners should note the new contact information for getting paid. The CodePlex team values your feedback, and is frequently monitoring Twitter, our Discussions and Issue Tracker for new features or problems. If you’ve not visited the Issue Tracker recently, please take a few moments to log an idea or vote for the features you would most like to see implemented on CodePlex.

  • How to Add Your Gmail Account to Outlook 2013 Using IMAP

    - by Lori Kaufman
If you use Outlook to check and manage your email, you can easily use it to check your Gmail account as well. You can set up your Gmail account to allow you to synchronize email across multiple machines using email clients instead of a browser. We will show you how to use IMAP in your Gmail account so you can synchronize your Gmail account across multiple machines, and then how to add your Gmail account to Outlook 2013. To set up your Gmail account to use IMAP, sign in to your Gmail account and go to Mail. Click the Settings button in the upper, right corner of the window and select Settings from the drop-down menu. On the Settings screen, click Forwarding and POP/IMAP. Scroll down to the IMAP Access section and select Enable IMAP. Click Save Changes at the bottom of the screen. Close your browser and open Outlook. To begin adding your Gmail account, click the File tab. On the Account Information screen, click Add Account. On the Add Account dialog box, you can choose the E-mail Account option, which automatically sets up your Gmail account in Outlook. To do this, enter your name, email address, and the password for your Gmail account twice. Click Next. The progress of the setup displays. The automatic process may or may not work. If the automatic process fails, select Manual setup or additional server types, instead of E-mail Account, and click Next. On the Choose Service screen, select POP or IMAP and click Next. On the POP and IMAP Account Settings screen, enter the User, Server, and Logon Information. For the Server Information, select IMAP from the Account Type drop-down list and enter the following for the incoming and outgoing server information: Incoming mail server: imap.googlemail.com Outgoing mail server (SMTP): smtp.googlemail.com Make sure you enter your full email address for the User Name, and select Remember password if you want Outlook to automatically log you in when checking email. Click More Settings. On the Internet E-mail Settings dialog box, click the Outgoing Server tab. Select My outgoing server (SMTP) requires authentication and make sure the Use same settings as my incoming mail server option is selected. While still in the Internet E-mail Settings dialog box, click the Advanced tab. Enter the following information: Incoming server: 993 Incoming server encrypted connection: SSL Outgoing server encrypted connection: TLS Outgoing server: 587 NOTE: You need to select the type of encrypted connection for the outgoing server before entering 587 for the Outgoing server (SMTP) port number. If you enter the port number first, the port number will revert back to port 25 when you change the type of encrypted connection. Click OK to accept your changes and close the Internet E-mail Settings dialog box. Click Next. Outlook tests the account settings by logging into the incoming mail server and sending a test email message. When the test is finished, click Close. You should see a screen saying “You’re all set!”. Click Finish. Your Gmail address displays in the account list on the left with any other email addresses you have added to Outlook. Click the Inbox to see what’s in your Inbox in your Gmail account. Because you’re using IMAP in your Gmail account and you used IMAP to add the account to Outlook, the messages and folders in Outlook reflect what’s in your Gmail account. Any changes you make to folders, and any time you move email messages among folders in Outlook, the same changes are made in your Gmail account, as you will see when you log into your Gmail account in a browser. This works the other way as well: any changes you make to the structure of your account (folders, etc.) in a browser will be reflected the next time you log into your Gmail account in Outlook.
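
    If you ever need to verify these same server settings outside of Outlook, they can be exercised in a few lines of script. A minimal sketch using Python's standard imaplib and smtplib modules, with the server names and ports from the walkthrough above; the address and password are placeholders, and Gmail will typically require an app-specific password for this kind of login.

    ```python
    import imaplib
    import smtplib

    USER = "you@gmail.com"     # hypothetical account
    PASSWORD = "app-password"  # use an app-specific password

    # IMAP over SSL on port 993, as configured for incoming mail above
    with imaplib.IMAP4_SSL("imap.googlemail.com", 993) as imap:
        imap.login(USER, PASSWORD)
        status, counts = imap.select("INBOX", readonly=True)
        print("INBOX message count:", counts[0].decode())

    # SMTP with STARTTLS on port 587, as configured for outgoing mail above
    with smtplib.SMTP("smtp.googlemail.com", 587) as smtp:
        smtp.starttls()
        smtp.login(USER, PASSWORD)
        print("SMTP login OK")
    ```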

  • SQL SERVER – Partition Parallelism Support in expressor 3.6

    - by pinaldave
I am very excited to learn that there is a new version of expressor’s data integration platform coming out in March of this year. It will be version 3.6, and I look forward to using it and telling everyone about it. Let me describe a little bit more about what will be so great in expressor 3.6: Greatly enhanced user interface Parallel Processing Bulk Artifact Upgrading The User Interface First let me cover the most obvious enhancements. The expressor Studio user interface (UI) has had some significant work done. Kudos to the expressor Engineering team; the entire UI is a visual masterpiece that is very responsive and intuitive. The improvements are more than just eye candy; they provide significant productivity gains when developing expressor Dataflows. Operator shape icons now include a description that identifies the function of each operator, instead of having to guess at the function from the icon. Operator shapes and highlighting depict the current function and status: disabled, enabled, complete, incomplete, and error. Each status displays an appropriate message in the message panel with correction suggestions. Floating or docking property panels provide descriptive tool tips for each property, as well as auto-resize when adjusting the canvas, without having to search Help or scroll around to get access to the property. Progress and status indicators let you know when an operation is working. A “no limit” canvas with snap-to-grid allows automatic sizing and accurate positioning when you have numerous operators in the Dataflow. The inline tool bar offers quick access to pan, zoom, fit and overview functions. Selecting multiple artifacts with a right-click context allows you to easily manage your workspace more efficiently. Partitioning and Parallel Processing Partitioning allows each operator to process multiple subsets of records in parallel, as opposed to processing all records that flow through that operator in a single sequential set. This capability allows the user to configure the expressor Dataflow to run in a way that most efficiently utilizes the resources of the hardware where the Dataflow is running. Partitions can exist in most individual operators. Using partitions increases the speed of an expressor data integration application, therefore improving performance and load times. With the expressor 3.6 Enterprise Edition, expressor simplifies enabling parallel processing by adding intuitive partition settings that are easy to configure. Bulk Artifact Upgrading Bulk Artifact Upgrading sounds a bit intimidating, but it actually is not, and it is a welcome addition to expressor Studio. In past releases, users were prompted to confirm that they wanted to upgrade their individual artifacts only when opened. This was a cumbersome and repetitive process. Now with bulk artifact upgrading, a user can easily select what artifact or group of artifacts to upgrade all at once. As you can see, there are many new features and upgrade options that will prove to make expressor Studio quicker and more efficient. I hope I’m not the only one who is excited about all these new upgrades, and I hope that you will try expressor and share your experience with me. Reference: Pinal Dave (http://blog.sqlauthority.com)
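
    expressor's engine is proprietary, but the general idea of partition parallelism described above is easy to demonstrate: split the record stream into subsets and run the same operator over each subset concurrently, then recombine the results. A conceptual sketch in Python (illustrative only, not expressor's API):

    ```python
    from multiprocessing import Pool

    def transform(record):
        """A stand-in for one operator's per-record work."""
        return record * 2

    def run_partitioned(records, partitions=4):
        """Process subsets of the record stream in parallel, then recombine."""
        chunk = max(1, len(records) // partitions)
        with Pool(processes=partitions) as pool:
            # chunksize splits the stream into subsets handed to each worker
            return pool.map(transform, records, chunksize=chunk)

    if __name__ == "__main__":
        print(run_partitioned(list(range(12))))
    ```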

  • Oracle Social Network Developer Challenge: Fishbowl Solutions

    - by Kellsey Ruppel
Originally posted by Jake Kuramoto on The Apps Lab blog. Today, I give you the final entry in the Oracle Social Network Developer Challenge, held last week during OpenWorld. This one comes from Friend of the ‘Lab and Fishbowl Solutions (@fishbowle20) hacker, John Sim (@jrsim_uix), whom you might remember from his XBox Kinect demo at COLLABORATE 12 (presentation slides and abstract) hacks and other exploits with WebCenter. We put this challenge together specifically for developers like John, who like to experiment with new tools, push the envelope of what’s possible, and build cool things, and as you can see from his entry, John did just that, mashing together Google Maps and Oracle Social Network into a mobile app built with PhoneGap that uses the device’s camera and GPS to keep teams on the move in touch. He calls it a Mobile GeoTagging Solution, but I think Avengers Assemble! would have been equally descriptive, given that was obviously his inspiration. Here’s his description of the mobile app: My proposed solution was to design and simplify GeoLocation mapping, and automate updates for users and teams on the move; who don’t have access to a laptop or want to take their iPads out – but allow them to make quick updates to OSN and upload photos taken from their mobile device – there and then. As part of this, the plan was to include a rules engine that could be configured by the user to allow the device to automatically update and post messages when they arrived at a set location(s). Inspiration for this came from on{x} – automate your life. Unfortunately, John didn’t make it to the conference to show off his hard work in person, but luckily, he had a colleague from Fishbowl and a video to showcase his work. Here are some shots of John’s mobile app for your viewing pleasure: John’s thinking is sound. Geolocation is usually relegated to consumer use cases, thanks to services like foursquare, but distributed teams working on projects out in the world definitely need a way to stay in contact. Consider a construction job. Different contractors all converge on a single location, and time is money. Rather than calling or texting each other and risking a distracted driving accident, an app like John’s allows everyone on the job to see exactly where the other contractors are. Using his GPS rules, they could easily be notified about how close each is to the site, definitely useful when you have a flooring contractor sitting idle, waiting for an electrician to finish the wiring. The best part is that the project manager or general contractor could stay updated on all the action (or inaction) using Oracle Social Network, either sitting at a desk using the browser app or desktop client, or on the go, using one of the native mobile apps built for Oracle Social Network. I can see this being used by insurance adjusters too, and really any team that, erm, assembles at a given spot. Of course, it’s also useful for meeting at the pub after the day’s work is done. Beyond people, this solution could also be implemented for physical objects that are en route to a destination. Say you’re a customer waiting on a rail shipment or a package delivery. You could track your valuables’ whereabouts easily as they report their progress via checkins. If they deviated from the GPS rules, you’d be notified. You might even be able to get a picture into Oracle Social Network with some light hacking. Thanks to John and his colleagues at Fishbowl for participating in our challenge. We hope everyone had a good experience. 
Make sure to check out John’s blog post on his work and the experience of using Oracle Social Network. Although this is the final official entry, tomorrow I’ll show you the work of someone who finished code but wasn’t able to make the judging event. Stay tuned.
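
    The GPS rules John describes boil down to a proximity check: fire a notification when a device's reported position comes within some radius of a site. A sketch of that core check, in Python for illustration (the actual app is PhoneGap/JavaScript, and the coordinates, radius, and posting callback here are hypothetical):

    ```python
    from math import asin, cos, radians, sin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometers."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = (sin((lat2 - lat1) / 2) ** 2
             + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371 * asin(sqrt(a))  # mean Earth radius ~6371 km

    SITE = (51.5007, -0.1246)  # hypothetical job-site coordinates
    RADIUS_KM = 1.0            # rule: notify when within 1 km of the site

    def check_rule(lat, lon, post_update):
        """Evaluate one GPS rule and post a status update when it fires."""
        distance = haversine_km(lat, lon, *SITE)
        if distance <= RADIUS_KM:
            post_update(f"Arrived: {distance:.2f} km from site")

    # Stand-in for posting to Oracle Social Network
    check_rule(51.5033, -0.1195, post_update=print)
    ```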

  • How to run RCU from the command line

    - by Kevin Smith
When I was trying to figure out how to run RCU on 64-bit Linux I found this post. It shows how to run RCU from the command line. It didn't actually work for me, so you can see my post on how to run RCU on 64-bit Linux. But seeing how to run RCU from the command line got me started thinking about running RCU from the command line to create the schema for WebCenter Content. That post got me part of the way there, since it shows how to run RCU silently from the command line, but to do this you need to know the name of the RCU component for WebCenter Content. I poked around in the RCU files and found the component name for WCC is CONTENTSERVER11. There is a contentserver11 directory in rcuHome/rcu/integration, and when you look at the contentserver11.xml file you will see:

<RepositoryConfig COMP_ID="CONTENTSERVER11">

With the component name for WCC in hand, I was able to use this command line to run RCU and create the schema for WCC:

.../rcuHome/bin/rcu -silent -createRepository -databaseType ORACLE -connectString localhost:1521:orcl1 -dbUser sys -dbRole sysdba -schemaPrefix TEST -component CONTENTSERVER11 -f <rcu_passwords.txt

To make the silent part work and not have it prompt you for the passwords needed (the sys password and the password for each schema), you use the -f option and specify a file containing the passwords, one per line, in the order the components are listed on the -component argument. Here is the output from rcu when I ran the above command:

Processing command line ....
Repository Creation Utility - Checking Prerequisites
Checking Global Prerequisites
Repository Creation Utility - Checking Prerequisites
Checking Component Prerequisites
Repository Creation Utility - Creating Tablespaces
Validating and Creating Tablespaces
Repository Creation Utility - Create
Create in progress.
Percent Complete: 0
...
Percent Complete: 100
Repository Creation Utility: Create - Completion Summary
Database details:
Host Name    : localhost
Port         : 1521
Service Name : ORCL1
Connected As : sys
Prefix for (prefixable) Schema Owners : TEST
RCU Logfile  : /u01/app/oracle/logdir.2012-09-26_07-53/rcu.log
Component schemas created:
Component                            Status   Logfile
Oracle Content Server 11g - Complete Success  /u01/app/oracle/logdir.2012-09-26_07-53/contentserver11.log
Repository Creation Utility - Create : Operation Completed

This works fine if you want to use the default tablespace sizes and options, but there does not seem to be a way to specify the tablespace options on the command line. You can specify the name of the tablespace and temp tablespace, but they must already exist in the database before running RCU. I guess you can always create the tablespaces first using your desired sizes and options, and then run RCU and specify the tablespaces you created. When looking up the command line options in the RCU doc I found it has the list of components for each product that it supports. See Appendix B in the RCU User's Guide.

    Read the article

  • Do you want to be an ALM Consultant?

    - by Martin Hinshelwood
    Northwest Cadence is looking for our next great consultant! At Northwest Cadence, we have created a work environment that emphasizes excellence, integrity, and out-of-the-box thinking. Our customers have high expectations (rightfully so) and we wouldn't have it any other way!

Northwest Cadence has some of the most exciting customers I have ever worked with, and even though I have only been here just over a month I have already:

- Provided training/consulting for 3 government departments
- Created and taught courseware for delivering Scrum to teams within a high-profile multinational company
- Started presenting Microsoft's ALM Engagement Program

So if you are interested in helping companies build better software more efficiently, enquire at [email protected].

Application Lifecycle Management (ALM) Consultant

An ALM Consultant with a minimum of 8 years of relevant experience with Application Lifecycle Management, Visual Studio (including Visual Studio Team System), and software design is needed. The candidate must provide thought leadership on best practices for enterprise architecture, understand the Microsoft technology solution stack, and have a thorough understanding of enterprise application integration.

The ALM Practice Lead will play a central role in designing and implementing the overall ALM Practice strategy, including creating, updating, and delivering ALM courseware and consultancy engagements. This person will also provide project support, deliverables, and quality solutions on Visual Studio Team System that exceed client expectations. Engagements will vary and will involve providing expert training, consulting, mentoring, formulating technical strategies and policies, and acting as a "trusted advisor" to customers and internal teams. A sound sense of business and technical strategy is required, and strong interpersonal skills as well as solid strategic thinking are key. The ideal candidate will be capable of envisioning the solution based on the early client requirements, communicating the vision to both technical and business stakeholders, and leading teams through implementation, as well as training, mentoring, and hands-on software development. The ideal candidate will demonstrate successful use of both agile and formal software development methods, enterprise application patterns, and effective leadership on prior projects.

Job Requirements

Minimum Education: Bachelor's Degree (computer science, engineering, or math preferred).
Locale / Travel: The Practice Lead position requires an estimated 50% travel, most of which will be in the Continental US (a valid national passport must be maintained). This is a full-time position and will be based in the Kirkland office.
Preferred Education: Master's Degree in Information Technology or Software Engineering; premium Microsoft certifications on .NET (MCSD) or MCPD, or relevant experience; Microsoft Certified Trainer (MCT) or relevant experience.
Minimum Experience and Skills: 7+ years of experience with business information systems integration or custom business application design and development in a professional technology consulting, corporate MIS, or software development environment.

Essential Duties & Responsibilities:

- Provide training, consulting, and mentoring to organizations on topics that include Visual Studio Team System and ALM.
- Create content, including labs and demonstrations, to be delivered as training classes by Northwest Cadence employees.
- Lead development teams through the complete ALM and/or Visual Studio Team System solution.
- Be able to communicate in detail how a solution will integrate into the larger technical problem space for large, complex enterprises.
- Define technical solution requirements.
- Provide guidance to the customer and project team with respect to technical feasibility, complexity, and level of effort required to deliver a custom solution.
- Ensure that the solution is designed, developed, and deployed in accordance with the agreed-upon development work plan.
- Create and deliver weekly status reports of training and/or consulting progress.

Engagement Responsibilities:

- Have a strong desire to provide thought leadership related to technology and to help grow the business.
- Work effectively and professionally with employees at all levels of a customer's organization.
- Have strong verbal and written communication skills.
- Have effective presentation, organizational, and planning skills.
- Have effective interpersonal skills and the ability to work in a team environment.

Enquire at [email protected]

    Read the article

  • How to Convert Videos to 3GP for Mobile Phones

    - by DigitalGeekery
    Would you like to play videos on your phone, but the device only supports 3GP files? We'll show you how to convert popular video files into the 3GP mobile phone video format with Pazera Free Video to 3GP Converter.

Download the Pazera Free Video to 3GP Converter (download link below). It will allow you to convert popular video files (AVI, MPEG, MP4, FLV, MKV, and MOV) to work on your mobile phone. There is no installation to run; you'll just need to unzip the downloaded folder and double-click the videoto3gp.exe file to run the application.

To add video files to the queue, click on the Add files button, browse for your file, and click Open. Your video will be added to the queue. You can add multiple files to the queue and convert them all at one time.

The converter comes with several pre-configured profiles for conversion settings. To load a profile, select one from the Profile drop-down list and then click the Load button. The settings in the panels at the bottom of the application will be updated automatically. If you are a more advanced user, the options on the lower panels allow for adjusting settings to your liking. You can choose between the 3GP and 3G2 (for some older phones) container formats, the H.263, MPEG-4, and XviD video codecs, and the AAC or AMR-NB audio codecs, as well as a variety of bitrates, resolutions, etc.

By default, the converted file will be output to the same location as the input directory. You can change it by clicking the text box input radio button and browsing for a different folder. Click Convert to start the conversion process. A conversion output box will open and display the progress. When finished, click Close. Now you're ready to load the video onto your phone and enjoy.

Conclusion: Pazera Free Video to 3GP Converter is not exactly the ultimate video conversion tool, but it is quick and simple enough for the average user to convert most video formats to 3GP. Plus, it's portable; you can copy the folder to a USB drive and take it with you. Do you have some 3GP video files you'd like to convert to more common formats? Check out our earlier article on how to convert 3GP to AVI and MPEG for free.

Download Pazera Free Video to 3GP Converter

    Read the article

  • Moving monarchs and dragons: migrating the JDK bugs to JIRA

    - by darcy
    Among insects, monarch butterflies and dragonflies have the longest migrations; migrating JDK bugs involves a long journey as well! As previously announced by Mark back in March, we've been working according to a revised plan to transition JDK bug management from Sun's legacy system to an Oracle-internal JIRA instance, which will afterward be made visible and usable externally. I've been busily working on this project for the last few months and the team has made good progress on many aspects of the effort.

JDK bugs will be imported into JIRA regardless of age; bugs will also be imported regardless of state, including closed bugs. Consequently, the JDK bug project will start pre-populated with over 100,000 existing bugs, some dating all the way back to 1994. This will allow a continuity of information and allow new issues to be linked to old ones.

Using a custom import process, the Sun bug numbers will be preserved in JIRA. For example, the Sun bug with bug number 4040458 will become "JDK-4040458" in JIRA; in JIRA the project name, "JDK" in our case, is part of the bug's identifier. Bugs created after the JIRA migration will be numbered starting at 8000000; bugs imported from the legacy system have numbers ranging between 1000000 and 7999999.

We're working with the bugs.sun.com team to try to maintain continuity of the ability both to read JDK bug information and to file new incidents. At least for now, the overall architecture of bugs.sun.com will be the same as it is today: it will be a gateway bridging to an Oracle-internal system, but the internal system will change from the legacy database to JIRA. Generally we are aiming to preserve the visibility of bugs currently viewable on bugs.sun.com; however, bugs in areas not related to the JDK will not be visible after the transition to JIRA. New incoming incidents will be sent to a separate JIRA project for initial triage before possibly being moved into the JDK project.

JDK bug management leans heavily on being able to track the state of bugs in multiple releases, especially to coordinate delivering synchronized security releases (known as CPUs, critical patch updates, in Oracle parlance). For a security release, it is common for half a dozen or more release trains to be affected (for example, JDK 5, JDK 6 update, OpenJDK 6, JDK 7 update, JDK 8, virtual releases for HotSpot express, etc.). We've determined we need to track at least the tuple of (release, responsible engineer/assignee for the release, status in the release) for the release trains a fix is going into. To do this in JIRA, we are creating a separate port/backport issue type along with a custom link type to allow the multiple-release information to be easily grouped and presented together.

The Sun legacy system had a three-level classification scheme: product, category, and subcategory. Out of the box, JIRA has only a one-level classification, component. We've implemented a custom second-level classification, subcomponent. As part of the bug migration we've taken the opportunity to think about how bugs should be grouped under a two-level system, and the new system will be simpler and more regular. The main top-level components of the JDK product will include:

- core-libs
- client-libs
- deploy
- install
- security-libs
- other-libs
- tools
- hotspot

For the libs areas, the primary name of the subcomponent will be the package of the API in question.
In the core-libs component, there will be subcomponents like:

- java.lang
- java.lang.class_loading
- java.math
- java.util
- java.util:i18n

In the tools component, subcomponents will primarily correspond to command names in $JDK/bin, like jar, javac, and javap. The first several bulk imports of the JDK bugs into JIRA have gone well, and we're continuing to refine the import to have greater fidelity to the current data, including by reconstructing information not brought over in a structured fashion during the previous large JDK bug system migration back in 2004. We don't currently have a firm timeline for when the new system will be usable externally, but as it becomes available, I'll share further information in follow-up blog posts.
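As an aside, once the JDK project is externally visible, finding bugs under the new two-level scheme should presumably look something like this in JQL (a hypothetical query; it assumes the custom subcomponent field is exposed to JQL under that name):

    project = JDK AND component = core-libs AND "Subcomponent" = java.lang ORDER BY created DESC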

    Read the article

  • Seamless STP with Oracle SOA Suite

    - by user12339860
    STP stands for "Straight Through Processing". Wikipedia describes STP as a solution that enables "the entire trade process for capital markets and payment transactions to be conducted electronically without the need for re-keying or manual intervention, subject to legal and regulatory restrictions". I will deal with the latter part of the definition, i.e. "payment transactions without manual intervention", in this article. The STP that I am writing about involves the interaction between a Bank and its corporate customers; to that extent this business case is also called "Corporate Payments". Simply put, a Corporate Payments STP solution needs to connect the payment transaction right from the corporate ERP into the Bank's payment hub. A SOA-based STP solution can do a lot more than just process transactions, but before I get to the solution let me describe the perspectives of the two primary parties in this interaction: the corporate customer and the Bank.

Corporate's interaction with the Bank: Typically it is the treasury department of an enterprise which interacts with the Bank on a daily basis. Here is how a day of interaction would look from the treasury department of a corporate:

Corporate Cash
- Retrieve beginning-of-day totals
- Monitor cash accounts
- Send or receive cash between accounts
- Supply chain payments

Payment Settlements
- Calculate settlement positions
- Retrieve end-of-day totals
- Assess transaction financial impact

Short Term Investment Desk
- Retrieve current account information
- Conduct investment activities

Bank's interaction with the Corporate: From the Bank's perspective, the interaction starts from the point of onboarding a corporate customer and extends to billing the corporate for the value-added services it provides. Once the corporate is onboarded, the daily interaction involves:

- Handling the various formats of data arriving from customers
- Processing beginning-of-day and end-of-day reporting requests from customers
- Meeting compliance requirements
- Processing payments
- Transmitting payment status

Challenges with this interaction: Both the Bank and the Corporate face many challenges from these interactions, including:

- Keeping a consistent view of transaction data for the various LOBs of the corporate and the Bank
- Corporate customers use different ERPs, hence the data formats are bound to be different; can the Bank's IT systems convert the data formats so they can be easily mapped to the corporate ERP?
- How does the Bank manage the communication profiles of these customers?
- Corporate customers are demanding near-real-time visibility into their corporate accounts
- Corporate customers can make better cash management decisions if they can analyse the impact
- Can the Bank create opportunities to sell its products to the investment desks at corporate houses and manage their orders?
- How will the Bank bill the corporate customer for the value-added services it provides?

What does a SOA-based seamless STP solution bring to the table?
Highlights of the Oracle SOA-based STP solution:

For the Corporate Customer:
- No manual or paper-based banking transactions
- Secure delivery of payment data to the Bank from multiple ERPs without customization
- A single portal for monitoring and administering payment transactions
- Rule-based validation of payments
- The customer has the data necessary for more effective handling of payment and cash management decisions
- Business measurements track progress toward payment cost goals

For the Bank:
- Reduces the time and complexity of transactions
- Simplifies the process of introducing new products to corporate customers
- A single payment hub for all corporate ERP payments across multiple instruments
- New revenue sources from delivering value-added services to customers
- Leverages the existing payment infrastructure
- Removes inconsistent data formats and interchange between bank and corporate systems
- Compliance, and many other benefits

    Read the article

  • Rolling Along: PASS Board Year 2, Q2

    - by Denise McInerney
    Eighteen months into my time as a PASS Director, I'm especially proud of what the Virtual Chapters have accomplished and want to share that progress with you. I'm also pleased that the organization has invested more resources to support the VCs. In this quarter I got to attend two conferences and meet more members of the SQL community.

Virtual Chapters
In the first six months of 2013, VCs have hosted more than 50 webinars, offering free technical education to over 6,200 attendees. This is a great benefit to PASS members; thanks to the VC leaders, volunteers, and speakers who contribute their time to produce these events. The Performance VC held their "Summer Performance Palooza", an event featuring eight back-to-back sessions. Links to the session recordings can be found on the VC's web site. The new webinar platform, GoToWebinar, has been rolled out to all the VCs. This is a more stable, scalable platform and represents an important investment in the future of the VCs. A few new VCs are in the planning stages, including one focused on security and one for Russian speakers. Visit the Virtual Chapter home page to sign up for the chapters that interest you. Each Virtual Chapter is offering a discount code for PASS Summit 2013; be sure to ask your VC leader for the code to save $200 on Summit registration.

24 Hours of PASS
The next 24HOP will be on July 31. This Summit Preview edition will feature 24 consecutive webcasts presented by experts who will be speaking at Summit in October. Registration for this free event is open now, and we will be using the GoToWebinar platform for 24HOP as well.

Business Analytics Conference
April marked the first PASS Business Analytics Conference in Chicago. This introduced PASS to another segment of data professionals: the analysts and data scientists who work with the world's growing collection of data. Overall the inaugural event was a success and gave us a glimpse into this increasingly important space. After Chicago the Board had several serious discussions about the lessons learned from this event and what we should do next. We agreed to apply those lessons and continue to invest in this event; there will be a PASS Business Analytics Conference in 2014. I'm very pleased the next event will be in San Jose, CA, the heart of Silicon Valley, a place where a great deal of investment and innovation in data analytics is taking place.

Global SQL Community
Over the last couple of years PASS has been taking steps to become more relevant to SQL communities in different parts of the world. In May I had the opportunity to attend SQL Bits XI in Nottingham, England. It was enlightening to meet and talk with SQL professionals from around the U.K. as well as many other European countries. The many SQL Bits volunteers put on a great event and were gracious hosts.

Budgets
The Board passed the FY14 budget at the end of June. The budget process can be challenging and requires the Board to make some difficult choices about where to allocate resources. Overall I'm satisfied with the decisions we made and think we are investing in the right activities and programs.

Next Up
The Board is meeting July 18-19 in Kansas City. We will be holding the Executive Committee election for the Exec Co that will take office in 2014. We will also be discussing plans for the next BA conference as well as the next steps for our Global Growth initiative. Applications for the upcoming Board of Directors election open on July 24.
If you are considering running for the Board you can visit the PASS elections site to learn more about the election process. And I encourage anyone considering running to reach out to current and past Board members to learn about what the role entails. Plans for the next PASS Summit are in full swing. We are working on some fun new ideas to introduce attendees to the many ways to become involved in the SQL community.

    Read the article

  • Join us on our Journey to be #1 in SaaS!

    - by jessica.ebbelaar(at)oracle.com
    WHY ORACLE? Oracle is a robust organization that has proven it can maintain growth and innovation at all levels with a constantly evolving attitude. The main ingredient of Oracle's success is the 105,000 talented employees who constantly amaze each other in building a better and more innovative organization. Oracle is a company where YOU can make a difference.

What is OD? Oracle Direct is a state-of-the-art, multi-channel EMEA sales operation bringing to life the benefits of Oracle's complete technology stack. It offers you the unique opportunity to work with the most talented and like-minded sales professionals in the industry. You will have access to world-class training and structured career development programmes allowing you to accelerate your Solution Sales career across a multitude of product lines and a choice of attractive locations.

What positions is OD hiring for? Oracle is on a journey to be the #1 SaaS vendor in EMEA. Due to recent expansion and acquisitions within our Cloud Business, we are now growing our EMEA Cloud Applications Sales Group in Dublin. We have many exciting NEW opportunities across our CRM and HCM SaaS Sales teams. As a SaaS Sales Account Manager, you will proactively manage an assigned territory/vertical with responsibility for the full sales cycle. This role requires strong business development, solution selling, account management, and closing skills.

What is the Business Development Group (BDG)? The Business Development Group is the key entry point into Oracle for the future Sales and Management talent of the organisation. We are the demand generation engine for Oracle in EMEA. We provide revenue-generating, quality sales pipeline to our Inside and Field Sales professionals as well as to our Channel Partners. Our current focus is to provide an agile and flexible service offering to our customers and stakeholders to meet ever-changing business needs, whilst constantly striving to improve the customer experience, the quality of our pipeline, market coverage, and penetration. As a SaaS Business Development Consultant (BDC) you will be the first touch point with new customers.
Your goal is to proactively identify and qualify business opportunities leading to revenue for Oracle. You will work closely with your Inside Sales colleagues, who will progress your qualified pipeline and opportunities.

Work for us:
- Work for the only multi-pillar SaaS vendor in the market
- Be part of a FUN, fast-paced, and truly international sales team
- Develop your solution sales EXPERTISE
- Drive your CAREER development within a structured and supportive environment

The Profile:
- You have a passion for selling cutting-edge technology
- You thrive in a fast-paced and dynamic work environment where being the best is paramount
- Your priority is always the customer
- You live for a challenge and you love to win

Join us on our journey to be #1 in SaaS and be part of our Cloud success story! You will find more information about open roles here.

    Read the article

  • Impressions of Pivotal Tracker

    Pivotal Tracker is a free, online agile project management system. I've been using it recently to better communicate to customers about the current state of our project. In Pivotal Tracker, the unit of work is a story, and stories are arranged into iterations or delivery cycles. Stories can be any level of granularity you want, but the idea is to use stories to communicate clearly to customers, so you don't want to write a novel. You especially don't want to write a list of detailed programming tasks. A good story for a point of sale system might be: "Allow managers to override the price of an item while ringing up a customer." A less useful story: "Script out the process of adding a manager flag to the user table and stage that script into the deploy directory."

Stories are estimated using a point scale, by default 1, 2, or 3. Iterations are then automatically laid out by combining enough tasks to fill the point total for that period of time. You have to start with a guess on how many points your team can do in an iteration, then adjust with real data as you complete iterations (if the team closed 24, 30, and 27 points over the last three iterations, plan the next one at roughly 27). This is basic agile methodology, but where Pivotal Tracker adds value is that it automatically and graphically lays out iterations for you on your project site. This makes communication and planning easy, and compiling release notes is no longer painful, as it has been clear from the outset what work is going on.

While I much prefer Pivotal Tracker's customer-facing interface over what we used previously (TFS), I see a couple of gaps. First, I have not been able to make much headway with the reporting tools; despite my complaints about TFS, it can produce some nice reports. Second, it's not clear where, if at all, I'd keep track of purely internal tasks. I'm talking about server maintenance, cleaning up source control, checking back on some code which you never quite felt right about. There's no purpose in cluttering up an iteration backlog with these items, but if you don't track them, you lose them. I'm not sure what a good answer for that is.

One gap I thought I'd see, which I don't, is more granular dev tasks. If I'm implementing a story, I'll write out the steps and track my progress, but really, those steps aren't useful to anybody but me. The only time I've found that level of detail really useful is when my tasks are defined at too high a level anyway, or when I'm working with someone who needs more coaching and might not be able to finish a story in time without some scaffolding to get them going.

You can learn more about Pivotal Tracker at: http://www.pivotaltracker.com/learnmore.

--- Relevant Links ---
A good intro to stories: http://www.agilemodeling.com/artifacts/userStory.htm

    Read the article

  • Lead, Follow, or Get out of the way

    - by Daniel Moth
    This is one of the sayings (attributed to Thomas Paine) that totally resonated with me from the first time I heard it, which was only 3 years ago during some training course at work: "Lead, Follow, or Get out of the way". You'll find many books with this title and you'll find it quoted by politicians and other leaders in various countries at various times... the quote is open to interpretation and works on many levels.

To set the tone of what this means to me, I'll use a simple micro example: In any given conversation, you are either leading it or following it, at different times/snapshots of the conversation. If you are not willing or able to lead it, and you are not willing or able to follow it, then you should depart. The bad alternative, which this guidance encourages you NOT to do, is to stick around and obstruct progress by not following, not leading, and simply complaining or trying to derail the discussion in no particular direction.

The same pattern applies to your position/role at work. Either follow your management/leadership team, or try to lead them to what you think is a better place, or change jobs. Don't stick around complaining about the direction things are going while not actively trying to either change things or make peace with it. In the previous paragraph you can replace the words "your management" with "the people reporting to you" and the guidance still holds: either lead your direct reports to where you think they should go, or follow their lead, or change jobs. Complaining about folks not taking direction while doing nothing is not a maintainable state.

To me this quote is not about a permanent state; it is not about some people always leading and some always following. It is about a role/hat that anybody can play/wear at any given moment. One minute I am leading you, the next I am following you, and the next we are both following someone else, and so on... When there is disagreement, debate the different directions for as long as it takes for you to be comfortable that you can either follow or lead. If you don't become comfortable with either of those, get out of the way. Something to remember is that it is impossible to learn how to lead well without learning how to follow well (probably deserves its own blog entry)...

Things go wrong when someone thinks that they must always be leading, or when everybody wants to follow and nobody steps up to lead... Things go wrong when more than one person wants to lead and they don't try to reach agreement on a shared direction, stubbornly sticking to their guns and pulling the rest of the team in multiple directions... Things go wrong when more than one person wants to lead and, after numerous and lengthy discussions, none of them decides to follow or get out of the way... Things go wrong when people don't want to lead, don't want to follow, and insist on sticking around...

While there are a few ways things can go wrong, as enumerated in the previous paragraph, the most common one in my experience is the last one I mentioned. You'll recognize these folks as the ones that always complain about everything that is wrong with their company/product but do nothing about it. Every time you hear someone giving feedback on how something is wrong or suboptimal, ask them: "So now that you identified the problem, what do you think the solution is, and what are you doing to drive us to that solution?" The next time things start going wrong, step up and remind everyone: Lead, Follow, or Get out of the way.
For more perspectives, and for input to help you form your own interpretation, search the web for this phrase to see in what contexts it is being used (bing, google). Finally, regardless of your political views, I hope you can appreciate if only as an example this perspective of someone leading by actually getting out of the way. Comments about this post by Daniel Moth welcome at the original blog.

    Read the article

  • Crime Scene Investigation: SQL Server

    - by Rodney Landrum
    “The packages are running slower in Prod than they are in Dev.” My week began with this simple declaration from one of our lead BI developers, quickly followed by an emailed spreadsheet demonstrating that, over 5 executions, an extensive ETL process was running on average 630 seconds faster on Dev than on Prod. The situation needed some scientific investigation to determine why the same code, the same data, and the same schema would yield consistently slower results on a more powerful server. Prod had yet to be officially christened with a “Go Live” date, so I had the time, and, having recently been binge-watching CSI: New York, I also had the inclination.

An inspection of the two systems, Prod and Dev, revealed the first surprise: although Prod was indeed a “bigger” system, with double the amount of RAM of Dev, the latter actually had twice as many processor cores. On neither system did I see much sign of resources being heavily taxed while the ETL process was running. Without any real supporting evidence, I jumped to a conclusion that my years of performance tuning should have helped me avoid: that the hardware differences explained the better performance on Dev. We spent time setting up a Test system, similarly scoped to Prod except with 4 times the cores, and ported everything across. The results of our careful benchmarks left us truly bemused; the ETL process on the new server was slower than on both other systems. We burned more time tweaking server configurations and monitoring IO and network latency, several times believing we’d uncovered the smoking gun, until the results of subsequent test runs pitched us back into confusion.

Finally, I decided, enough was enough. Hadn’t I learned very early in my DBA career that almost all bottlenecks were caused by code and database design, not hardware? It was time to get back to basics. With over 100 SSIS packages and hundreds of queries, each handling specific tasks such as file loads, bulk inserts, transforms, logging, and so on, the task seemed formidable. And yet, after barely an hour spent with Profiler, Extended Events, and wait statistics DMVs, I had a lead in the shape of a query that joined three tables containing millions of rows, returned 3279 results, but performed 239K logical reads. As soon as I looked at the execution plans for the query in Dev and Test I saw the culprit: an implicit conversion warning on a join predicate field that was numeric in one table and a varchar(50) in another! I turned this information over to the BI developers, who quickly resolved the data type mismatches and found and fixed “several” others as well. After the schema changes, the same query with the same databases ran in under 1 second on all systems and reduced the logical reads to fewer than 300.

The analysis also revealed that on Dev the ETL task was pulling data across a LAN, whereas Prod and Test were connected across a slower WAN, in large part explaining why the same process ran slower on the latter two systems. Loading the data locally on Prod delivered a further 20% gain in performance.

As we progress through our DBA careers we learn valuable lessons. Sometimes, with a project deadline looming and pressure mounting, we choose to forget them. I was close to giving in to the temptation to throw more hardware at the problem. I’m pleased at least that I resisted, though I still kick myself for not looking at the code on day one.
It can seem a daunting prospect to return to the fundamentals of the code so close to roll out, but with the right tools, and surprisingly little time, you can collect the evidence that reveals the true problem. It is a lesson I trust I will remember for my next 20 years as a DBA, if I’m ever again tempted to bypass the evidence.
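For readers who want to picture the culprit, here is a minimal, hypothetical T-SQL repro of the kind of mismatch described above (table and column names invented). The join compiles fine, but because numeric has higher data type precedence than varchar, SQL Server implicitly converts the varchar column on every row, which blocks index seeks and inflates logical reads, and the plan carries a CONVERT_IMPLICIT warning:

    -- Hypothetical repro: a numeric key joined to a varchar(50) reference.
    CREATE TABLE dbo.Orders   (OrderID   numeric(18,0) PRIMARY KEY, ItemCount int);
    CREATE TABLE dbo.Invoices (InvoiceID int IDENTITY PRIMARY KEY,  OrderRef  varchar(50));

    -- The mismatched join: OrderRef is implicitly converted to numeric
    -- on every row, so an index on OrderRef cannot be used for a seek.
    SELECT i.InvoiceID
    FROM   dbo.Orders o
    JOIN   dbo.Invoices i ON o.OrderID = i.OrderRef;

    -- One possible schema-level fix, aligning the types, e.g.:
    ALTER TABLE dbo.Invoices ALTER COLUMN OrderRef numeric(18,0);

Aligning the column types, as the BI developers did, removes the conversion entirely; casting in the query would merely move it around.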

    Read the article

< Previous Page | 95 96 97 98 99 100 101 102 103 104 105 106  | Next Page >