Search Results

Search found 6922 results on 277 pages for 'moving platforms'.


  • Today's Links (6/29/2011)

    - by Bob Rhubart
    Event-Driven SOA: Events meet Services | Guido Schmutz
    Oracle ACE Director Guido Schmutz shows you how to achieve extreme loose coupling within a Service-Oriented Architecture by using event-driven interactions.

    Misconceptions About Software Architecture | Sanjeev Kumar
    A concise, to-the-point, and informative article by Sanjeev Kumar.

    Good Leaders Acknowledge What Can't Be Done | Jeffrey Pfeffer, Harvard Business Review
    "None of us likes to admit to bad decisions," says Jeffrey Pfeffer. "But imagine how much harder that is for someone who has been chosen to lead a large organization precisely because he or she is thought to have the power to see the future more clearly and chart a wise course."

    Suboptimal Thinking within Enterprise Architecture | James McGovern
    McGovern says: "We need to remember that enterprises live and thrive beyond just the current person at the helm."

    Boundaryless Information Flow | Richard Veryard
    "If all the boundaries are removed or porous, then the (extended) enterprise or ecosystem becomes like a giant sponge, in which all information permeates the whole," Veryard says. "Some people may think that's a good idea, but it's not what I'd call loose coupling."

    Coming to a City Near You: Oracle Business Analytics Summits | Rob Reynolds
    This series of events includes a Technology and Architecture track.

    New Date for Implementation of Sun Hands-On Course Requirement (Oracle Certification)
    As announced on the Oracle Certification website, the Java Architect, Java Developer, Solaris System Administrator and Solaris Security Administrator certification tracks will include a new mandatory course attendance requirement.

    VirtualBox 4.0.10 is now available for download | Bob Netherton
    Netherton shares information on the new release.

    Updated Technical Best Practices whitepaper | Anthony Shorten
    The Technical Best Practices whitepaper has been updated with the latest advice. "New advice includes new installation advice, advanced settings, new security settings and advice for both Oracle WebLogic and IBM WebSphere installations," says Shorten.

    Kscope 11 ADF, AIA and Business Rules | Peter Paul van de Beek
    Whitehorses Solution Architect Peter Paul van de Beek shares his impressions of KScope11 presentations by Markus Eisele, Sten Vesterli, and Edwin Biemond.

    Amazon AWS for the learning experience | Andrej Koelewijn
    "Using AWS changes your expectations of how your internal data center should operate," says Koelewijn.

    BPMN is dead, long live BPEL! (SOA Partner Community Blog)
    Jürgen Kress shares information, including a long list of speakers, for the SOA & BPM Integration Days 2011 conference, October 12th & 13th, 2011 in Düsseldorf.

    InfoQ: HTML5 and the Dawn of Rich Mobile Web Applications
    James Pearce introduces cross-platform web app development using HTML5 and web frameworks such as jQTouch, jQuery Mobile, Sencha Touch, and PhoneGap, outlining what makes a good framework.

    InfoQ: Interview and Book Excerpt: CMMI for Development
    "Frameworks like TOGAF are used to define an architecture that aligns IT assets and resources to support key business needs and processes of key stakeholders," says SEI's Mike Konrad. "But the individual application systems, capabilities, services, networks, and other IT assets and infrastructure still need to be acquired, developed, or sustained."

    InfoQ: Architecting a Cloud-Scale Identity Fabric | Eric Olden
    "The most cited reason for not moving to the cloud is concern about security," says Olden. "In particular, managing user identity and access in the cloud is a tough problem to solve and a big security concern for organizations."

    Read the article

  • Upgrade Office 2003 to 2010 on XP or Run them Side by Side

    - by Mysticgeek
    If you’re still running XP, currently have Office 2003 installed on your machine, and skipped Office 2007, you might want to upgrade to Office 2010. In this guide we will show you the upgrade process, or how to run them side by side. In this example we are upgrading from Office 2003 Standard to Office Professional Plus 2010 RTM (Final) on XP Professional.

    System Requirements
    To run Office 2010 on your XP machine you have to make sure you have Service Pack 3 and Microsoft Silverlight installed (links below). Or you can just install them through Windows Update.

    Recommended Hardware
    - 1 GHz CPU or higher
    - 512 MB of RAM or higher
    - 1024×768 resolution or higher
    - DirectX 9.0c compatible graphics card with 64 MB of memory or higher

    Installing Office 2010
    Simply kick off the Office Professional Plus 2010 installation. Enter your product key, agree to the EULA, then select the Customize button. Setup will detect Office 2003 and allow you to remove all applications, keep them, or select only the ones you want to keep. In this example we’re going to remove Excel and PowerPoint, and keep Outlook and Word 2003. Next, click the Installation Options tab and select the Office programs you want to install. Since we’re keeping Outlook 2003 and don’t want to use Outlook 2010, we’re making sure not to install Outlook 2010. However, we want to run Word 2003 and 2010 on the same machine. After you’ve made your selections click the Upgrade button. The installation begins and you’re shown the progress; the amount of time it takes will vary between systems. When installation is complete you can close out of the installer. Now when you go into the Start menu under Microsoft Office, you’ll see both versions of the Office apps available. Here is a shot of Word 2003 and 2010 running together on our XP machine.

    Conclusion
    If you’re moving from Office 2003 to 2010, this allows you to install both versions side by side. It gives you a chance to learn 2010 features, and still work in the familiar 2003 environment when you need to get things done quickly. If you’re having problems installing Office 2010, make sure to check out our article on how to fix problems upgrading Office 2010 beta to RTM (Final) release. Also, if you were using Office 2007 and are currently using the 2010 beta, we have a guide on how to switch back to Office 2007 after the 2010 beta ends.

    Links
    - XP Service Pack 3
    - Microsoft Silverlight
    - Details on Office 2010 System Requirements

    Read the article

  • SharePoint 2010 release date - is it that important?

    - by CharlesLee
    There has been lots of excitement in the SharePoint community over the last few days as Microsoft have announced the official release date of SharePoint 2010. May 12th is the date for your diaries (RTM in April). The twittersphere has been telling everyone about this news and there is much excitement. The major conferences this year all seem to have a SharePoint 2010 focus and some are entirely focussed on the new product (e.g. SharePoint Evolution Conference). Now, by all accounts Microsoft have plugged some significant functionality gaps that exist in WSS 3.0 and MOSS 2007 and provided some exciting new functionality. You don't need me to tell you about these as the MVPs (and other community members) are doing a sterling job; after all, that is why Microsoft has MVPs in the first place.

    Let's get real for a second though, as there is a significant investment involved in moving to SharePoint 2010:
    - Firstly, you need 64-bit architecture across the board; for some environments that's a pretty significant roadblock.
    - The development farm, test farm and UAT farm are all going to require the same infrastructure upgrades.
    - To take advantage of the tooling for SP2010 you will need to upgrade to Visual Studio 2010, and your development team is going to require 64-bit hardware/OS too. I would not recommend installing SP2010 in client installation mode (i.e. for Windows 7) on your developer machines; I would use this for demo machines only.
    - Something that lots of people seem to forget in all their whooping and hollering about the new release is that a large amount of end user training is going to be required, as the browser UI has now adopted the omnipotent ribbon interface and there are other new and more complicated features.
    - SharePoint Designer has also entirely changed in both look and feel, and some significant feature changes have taken place.
    - Lest we forget, some companies have not long upgraded to MOSS 2007 and are yet to see a significant ROI for that project.
    - And there is the reticence that most companies feel about implementing v1 Microsoft products.

    This is only the surface of the deeper issues which would be involved in any upgrade process, so I guess I share a small part of the concern voiced by Mark Miller of EndUserSharePoint.com: is SharePoint 2010 relevant? I don't share this sentiment in its entirety, as I firmly believe that all companies should be looking at SharePoint 2010 from day one; however, most large-scale existing implementations of MOSS 2007 are going to be several years away from a serious upgrade project. So should the conference organisers and the SharePoint community as a whole be a little more understanding of the real-world issues? It's easy to get carried away in the excitement of a new product and new tools to play with, but there needs to be a focus on the real-world issues that most people are facing day to day, and for the short-term future (at the very least the next 12 months) that is fairly and squarely in the WSS 2.0/3.0 and SPS 2003/MOSS 2007 camps. Don't get me wrong, I am very, very excited about getting to grips with SharePoint 2010 in the real world and I cannot wait for my first real project to come along, but for now I am just being realistic about the reality for most people who work with SharePoint.
I have been spending a lot of time on www.sharepointoverflow.com recently as there is a community of people building up who are committed to answering the real world questions that folks are dealing with every day.  I urge you to take a look and either ask or answer some questions direct from the front line of the SharePoint world.

    Read the article

  • Learn About Oracle’s Strategy for a Simple, Modern User Experience at OpenWorld 2012

    - by Applications User Experience
    By Kathy Miedema, Oracle Applications User Experience

    If you’re interested in what the best possible user experience looks like, you’ll want to hear what Oracle’s Applications User Experience team is planning for OpenWorld 2012, Sept. 30 - Oct. 4 in San Francisco. This year, we will talk Fusion, Fusion, Fusion. We were among the first to show Oracle Fusion Applications in the last couple of years, and we’ll be showing it again this year so you can see what Oracle is planning for the next generation of enterprise applications. Attend our sessions to learn more about the user experience strategy in which Oracle is investing. Simplicity is the driving force behind the demos that we are unveiling now, which you can see at OpenWorld. “We want to create opportunities for productivity and efficiency, and deliver enterprise data across devices to help you do your work in the way best suited to your job and needs,” said Jeremy Ashley, Vice President, Oracle Applications User Experience.

    You can see the new look for Fusion Applications at a general session led by Ashley at 3:30 p.m. on Wednesday, Oct. 3. You’ll also have the chance to learn more about tailoring in Oracle Fusion Applications, and gain a new understanding of the investment in the user experience behind Fusion Applications, at our sessions (see session information below).

    [Photo: Inside the Oracle Applications User Experience team’s on-site lab at Oracle OpenWorld 2011.]

    Head to the demogrounds to see new demos from the Applications User Experience team, including the new look for Fusion Applications and what we’re building for mobile platforms. Take a spin on our eye tracker, a very cool tool that we use to research the usability of a particular design. Visit the Usable Apps OpenWorld page to find out where our demopods will be located. We are also recruiting participants for our on-site lab, in which we gather feedback on new user experience designs, and taking reservations for a charter bus that will bring you to Oracle headquarters for a lab tour Thursday, Oct. 4, or Friday, Oct. 5. Tours leave at 10 a.m. and 1:45 p.m. from the Moscone Center in San Francisco. You’ll see more of our newest designs at the lab tour, and some of our research tools in action. Can’t participate in a customer feedback session or take a lab tour this time around? Visit Usable Apps to participate or book a tour another time.

    For more information on any OpenWorld sessions, check the content catalog, also available at www.oracle.com/openworld. For information on Applications User Experience (Apps UX) sessions and activities, go to the Usable Apps OpenWorld page.

    APPS UX OPENWORLD SESSIONS

    Oracle’s Roadmap to a Simple, Modern User Experience
    Presenter: Jeremy Ashley, Vice President Applications User Experience, Oracle; with Debra Lilley, Fujitsu Consulting; Basheer Khan, Innowave; and Edward Roske, InterRel
    Session ID: CON9467 | Date: Wednesday, Oct. 3 | Time: 3:30 - 4:30 p.m. | Location: Moscone West - 3002/3004

    Oracle Fusion Applications: Transforming Insight into Action
    Presenters: Killian Evers and Kristin Desmond, Oracle
    Session ID: CON8718 | Date: Thursday, Oct. 4 | Time: 11:15 a.m. - 12:15 p.m. | Location: Moscone West - 2008

    “FRIENDS OF UX” OPENWORLD SESSIONS

    Sessions by Oracle Usability Advisory Board (OUAB) members:

    Advances in Oracle Enterprise Governance, Risk, and Compliance Manager
    Presenters: Koen Delaure, KPMG Advisory NV, and Oracle Usability Advisory Board member; Russell Stohr, Oracle
    Session ID: CON9389 | Date: Tuesday, Oct. 2 | Time: 1:15 - 2:15 p.m. | Location: Palace Hotel - Concert

    Optimize Oracle E-Business Suite Procure-to-Pay: Cut Inefficiencies/Fraud with Oracle GRC Apps
    Presenters: Koen Delaure, KPMG Advisory NV, and Solveig Wagner, Seadrill Management AS, both Oracle Usability Advisory Board members; and Swarnali Bag, Oracle
    Session ID: CON9401 | Date: Monday, Oct. 1 | Time: 12:15 - 1:15 p.m. | Location: Intercontinental - Sutter

    Showcase of JD Edwards EnterpriseOne Mobility
    Presenters: Jon Wells, Westmoreland Coal Co., Oracle Usability Advisory Board member; Rob Mills and Liz Davson, Town of Oakville; Keith Sholes and Louise Farner, Oracle
    Session ID: CON9123 | Date: Tuesday, Oct. 2 | Time: 1:15 - 2:15 p.m. | Location: InterContinental - Grand Ballroom B

    Sessions by the Fusion User Experience Advocates (FXA):

    Usability and Features of Oracle Fusion Applications, Built upon Oracle Fusion Middleware
    Presenters: Debra Lilley, Fujitsu Consulting and Oracle Usability Advisory Board member; John King, King Training Resources
    Session ID: UGF10371 | Date: Sunday, Sept. 30 | Time: 11 a.m. - 11:45 a.m. | Location: Moscone West - 2010

    Ten Things to Love About Oracle Fusion Project Portfolio Management
    Presenter: Floyd Teter, EiS Technologies
    Session ID: CON6021 | Date: Tuesday, Oct. 2 | Time: 10:15 - 11:15 a.m. | Location: Moscone West - 2003

    Read the article

  • Hello World Pagelet

    - by astemkov
    Introduction
    The goal of this exercise is to give you a basic feel of how you can use Pagelet Producer to proxy a web page. We will proxy a simple static Hello World web page, cut one section out of that page, and present it as a pagelet that you can later insert on your own application page or on your portal page, such as a WebCenter Portal space or a WebCenter Interaction community page.

    Hello World sample app
    This is the static web page we will work with. Let's assume the following:
    - The Hello World web page is running on server http://appserver.company.com:1234/
    - The Hello World web page path is: http://appserver.company.com:1234/helloworld/

    Initial Pagelet Producer setup
    Let's assume that the Pagelet Producer server is running on http://pageletserver.company.com:8889/pagelets/. First let's check that Pagelet Producer is up and running. In order to do that we just need to access the following URL: http://pageletserver.company.com:8889/pagelets/

    Now you can access the Pagelet Producer administration screens using this URL: http://pageletserver.company.com:8889/pagelets/admin

    If you connect to the internet via a proxy server, you need to configure the proxy in Pagelet Producer settings:
    1. In the Navigator pane: Jump To - Settings
    2. Click on "Proxy"
    3. Enter your proxy server configuration

    Creating a resource
    The first thing that you need to do is to create a resource for your web page. This will tell Pagelet Producer that all sub-paths of the web page should be proxied. It will also allow you to set up common rules for how your web page should be proxied, and will serve as a container for your pagelets.
    1. In the Navigator pane: Jump To - Resources
    2. Click on any existing resource (e.g. welcome_resource)
    3. Click on the "Create selected type" toolbar button at the top of the Navigator pane
    4. Select "Web" in the "Select Producer Type" dialog box and click "OK"
    5. After the resource is created, click on the "General" sub-item and specify the following values:
       Name = AppServer
       Source URL = http://appserver.company.com:1234/
       Destination URL = /appserver/
    6. Click on the "Save" toolbar button at the top of the Navigator pane

    After the resource is created, our web page becomes accessible by the URL: http://pageletserver.company.com:8889/pagelets/appserver/helloworld/ So in the original web page address, Source URL is replaced with the Pagelet Producer URL (http://pageletserver.company.com:8889/pagelets) + Destination URL.

    Creating a pagelet
    Now let's create the "Hello World" pagelet.
    1. Under the resource node, activate the Pagelets subnode
    2. Click on the "Create selected type" toolbar button at the top of the Navigator pane
    3. Click on the "General" sub-node of the newly created pagelet and specify the following values:
       Name = Hello_World
       Library = MyLib
       URL Suffix = helloworld/index.html (this is where the Hello World page html is served from)
    4. Click on the "Save" toolbar button at the top of the Navigator pane

    Library is used for logical grouping. The portals use the "Library" value to group pagelets in their respective UIs. For example, when adding pagelets to a WebCenter Portal space you would see the individual pagelets listed under the "Library" name. The Library name can be anything you want; it doesn't have to match the resource name at all. You can include pagelets from multiple resources in the same library, or create a new library for each pagelet.

    After you save the pagelet you can access it here: http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/MyLib/Hello_World which is: http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/ + [Library] + [Name]

    Or, to test the injection of the pagelet into an iframe, you can click on the pagelet's "Documentation" sub-node and use the "Access Pagelet using REST" URL.

    Clipping
    The pagelet that we just created covers the whole web page, but we want just the "Hello World" segment of it. So let's clip it.
    1. Under the Hello_World pagelet node, activate the Clipper sub-node
    2. Click on the "Create selected type" toolbar button at the top of the Navigator pane
    3. Specify a Name for the newly created clipper, for example "c1"
    4. Click on the "Content" sub-node of the clipper
    5. Click on the "Launch Clipper" button; a new browser window will open
    6. By moving the mouse pointer over the web page, select the area you want to clip
    7. Click the left mouse button - the browser window will disappear and you will see that the Clipping Path was automatically generated

    Now let's save and access the link from the "Documentation" page again. Here's our pagelet nicely clipped and ready for being used on your WebCenter Space.
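    To illustrate the injection URL (a sketch, not part of the original tutorial, reusing the hypothetical URLs from this exercise), a plain test page could embed the clipped pagelet in an iframe:

        <!-- test.html: renders the Hello_World pagelet via the REST injection URL -->
        <html>
          <body>
            <iframe src="http://pageletserver.company.com:8889/pagelets/inject/v2/pagelet/MyLib/Hello_World"
                    width="400" height="200" frameborder="0">
            </iframe>
          </body>
        </html>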

    Read the article

  • How to use caching to increase render performance?

    - by Christian Ivicevic
    First of all I am going to cover the basic design of my 2d tile-based engine written with SDL in C++, then I will point out what I am up to and where I need some hints.

    Concept of my engine
    My engine uses the concept of GameScreens, which are stored on a stack in the main game class. The main methods of a screen are usually LoadContent, Render, Update and InitMultithreading. (I use the last one because I am using v8 as a JavaScript bridge to the engine.) The main game loop then renders the top screen on the stack (if there is one; otherwise, it exits the game) - actually it calls the render methods, but stores all items to be rendered in a list. After gathering all this information, methods like SDL_BlitSurface are called by my GameUIRenderer, which draws the enqueued content and then draws some overlay. The code looks like this:

        while(Game is running) {
            Handle input
            if(Screens on stack == 0) exit
            Update timer etc.
            Clear the screen
            Peek the screen on the stack and collect information on what to render
            Actually render the enqueued screen stuff and some overlay etc.
            Flip the screen
        }

    The GameUIRenderer uses, as hinted, a std::vector<std::shared_ptr<ImageToRender>> to hold all necessary information, described by this class:

        class ImageToRender {
        public: // public so the renderer can reach the fields directly
            SDL_Surface* pImage;
            int x, y, w, h, xOffset, yOffset;
        };

    This bunch of attributes is usually needed if I have a texture atlas with all tiles in one SDL_Surface, and then the engine should crop one specific area and draw it to the screen. The GameUIRenderer::Render() method then just iterates over all elements and renders them something like this:

        std::for_each(
            this->m_vImageVector.begin(),
            this->m_vImageVector.end(),
            [this](std::shared_ptr<ImageToRender> pCurrentImage) {
                SDL_Rect rc = { pCurrentImage->x, pCurrentImage->y, 0, 0 };
                // For the sake of simplicity ignore offsets...
                SDL_Rect srcRect = { 0, 0, pCurrentImage->w, pCurrentImage->h };
                SDL_BlitSurface(pCurrentImage->pImage, &srcRect, g_pFramework->GetScreen(), &rc);
            }
        );
        this->m_vImageVector.clear();

    Current ideas which need to be reviewed
    The specified approach works really well and IMHO it has a good structure; however, the performance could definitely be increased. I would like to know what you suggest: how should I implement efficient caching of surfaces etc., so that there is no need to redraw the same scene over and over again? The map itself would be almost static; only when the player moves would we need to move the map. Furthermore, animated entities would either require updates of the whole map, or updates of only the specific areas the entities are currently moving in. My first approach was to include a flag IsTainted, which should be used by the GameUIRenderer to decide whether to redraw everything or use a cached version (or to not render anything, so that we do not have to clear the screen and can let the last frame persist). However this seems to be quite messy if I have to manually handle in my Render method of the screen class whether something has changed or not.
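    One common pattern here (a sketch only, not from the question; names such as MapLayer and RenderTiles are made up) is to render the static tile layer once into an off-screen SDL_Surface and re-blit that cached surface every frame, invalidating the cache only when the map actually changes. This keeps the IsTainted idea, but hides it inside the layer instead of spreading it across the screen classes:

        #include <SDL/SDL.h>

        // Caches the static tile layer in an off-screen surface.
        class MapLayer {
        public:
            MapLayer(int w, int h)
                : m_pCache(SDL_CreateRGBSurface(SDL_SWSURFACE, w, h, 32, 0, 0, 0, 0)),
                  m_bDirty(true) {}
            ~MapLayer() { SDL_FreeSurface(m_pCache); }

            // Call this when the map scrolls or a tile changes.
            void Invalidate() { m_bDirty = true; }

            void Render(SDL_Surface* pScreen) {
                if (m_bDirty) {
                    RenderTiles(m_pCache); // expensive: redraw every tile, but only once
                    m_bDirty = false;
                }
                // cheap: a single blit per frame, no per-tile work
                SDL_BlitSurface(m_pCache, NULL, pScreen, NULL);
            }

        private:
            // Stand-in for the existing per-tile loop (atlas crops + blits).
            void RenderTiles(SDL_Surface* pTarget) {
                SDL_FillRect(pTarget, NULL, 0); // clear, then blit each tile here
            }
            SDL_Surface* m_pCache;
            bool m_bDirty;
        };

    Animated entities would then be blitted on top of the cached layer each frame, so only the entities cost per-frame work while the map costs a redraw only when it changes.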

    Read the article

  • How do I fix my resolution after DirectX install through Steam?

    - by Justin
    I'm a bit long-winded so see bottom for quick version and specs.

    Friendly Hello: Hello all on these askUbuntu pages, I just recently built my own computer and decided to switch to Ubuntu for the extra coolness. I've been learning a lot through all this, and mostly been trying to figure out issues on my own (read: Google searches). However, I couldn't seem to find others with this problem so I've come here for help.

    Detailed Recount: So I just used WINE and WINETRICKS to install Steam. All went well and it worked. Then I went to trying a game out. I remembered that Orcs Must Die! worked from http://www.steamgamesonlinux.com/ so I tried that out. After selecting to download it, that's when the problem occurred. The screen suddenly zoomed in!!! I think it's the resolution, right? Half the screen is cut off and I can't see parts of the right side of windows. My theory is that this is due to DirectX being installed through Steam, as Steam automatically installed it as I chose to download the game. It didn't even ask me whether to install DirectX or not ): It all happened so fast. This all being said, the game works fine! It looks a little strange, as if the resolution was off, but it plays just fine.

    What I did so far:
    - Restarted my computer. Didn't work -_-
    - Researched Steam installing DirectX on Ubuntu then messing up resolution and couldn't really find anything.
    - Researched uninstalling DirectX from Ubuntu but only found uninstalling DirectX after having been installed with Wine, not through Steam.
    - Got mad and ate my feelings.
    - Tried "xrandr -s 0" but it didn't do anything.
    - Ran xrandr alone and the terminal showed this:

        Screen 0: minimum 8 x 8, current 640 x 480, maximum 16384 x 16384
        DVI-I-0 connected 640x480+0+0 (normal left inverted right x axis y axis) 0mm x 0mm
           640x480        59.9*+
           320x240       120.1
        DVI-I-1 disconnected (normal left inverted right x axis y axis)
        HDMI-0 disconnected (normal left inverted right x axis y axis)
        DP-0 disconnected (normal left inverted right x axis y axis)
        DVI-D-0 disconnected (normal left inverted right x axis y axis)
        DP-1 disconnected (normal left inverted right x axis y axis)

    About now I was mad so I played Odin's Sphere then took a nap. Back to it! I entered the following:

        xrandr --output DVI-I-0 --mode 1024x768

    But I was met with this message:

        xrandr: cannot find mode 1024x768

    I get the same message for 800x600, 1400x1050, and seemingly any other combination of numbers. I then tried going into System Settings, then Displays, then playing around in there. My Resolution is set to 640x480 and there are no other options for me to choose from. Rotation has Normal, Clockwise, Counter Clockwise, and 180 Degrees; it's set to Normal and I haven't messed with that. Launcher Placement has Unknown and All Displays as its two options; it's set to Unknown, but moving it to All Displays doesn't seem to do anything. Finally, when I click Detect Displays, nothing seems to happen.

    Quick Version: Linux noob. Steam installed with Wine and Winetricks. Steam downloaded and installed game + DirectX. Resolution messed up now (I think; pretty sure), can't fix it, very annoying, no idea what's going on, halp!

    Specs:
    - Ubuntu Version 12.04
    - Wine Version 1.4.1
    - Have not changed any settings in Wine
    - Using Winetricks
    - Graphics Card: http://www.gigabyte.com/products/pro...px?pid=4361#sp
    - Drivers: Proprietary (Installing those were a LOT of fun)
    - Also let it be known that I have a DVI to VGA cord running from my Graphics card to my monitor.

    If any more information is needed I am ready to report.
Thank You: Thanks a lot for your help and all the work you do to support noob ubuntuers like me (:
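    A typical fix for the "cannot find mode" error above (a hedged sketch, not part of the original question; it assumes the DVI-I-0 output name from the xrandr dump and takes its timings from cvt) is to define the missing mode by hand:

        $ cvt 1024 768
        # 1024x768 59.92 Hz (CVT 0.79M3) hsync: 47.82 kHz; pclk: 63.50 MHz
        Modeline "1024x768_60.00"   63.50  1024 1072 1176 1328  768 771 775 798 -hsync +vsync
        $ xrandr --newmode "1024x768_60.00" 63.50 1024 1072 1176 1328 768 771 775 798 -hsync +vsync
        $ xrandr --addmode DVI-I-0 "1024x768_60.00"
        $ xrandr --output DVI-I-0 --mode "1024x768_60.00"

    Note that some proprietary drivers ignore modelines added this way; in that case the vendor's own settings tool is the place to look.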

    Read the article

  • asynchrony is viral

    - by Daniel Moth
    It is becoming hard to write code today without introducing some form of asynchrony and, if you are using .NET (e.g. for Windows Phone 8 or Windows Store apps), that means sooner or later you have to await something and mark your method as async. My most recent examples included introducing speech recognition in my Translator By Moth phone app, where I had to await mySpeechRecognizerUI.RecognizeWithUIAsync(), and when moving that code base to a Windows Store project, just to show a MessageBox I had to await myMessageDialog.ShowAsync().

    Any time you need to invoke an asynchronous method in your code, you have a choice to make: kick off the operation but don’t wait for it to complete (otherwise known as fire-and-forget), synchronously wait for it to complete (which will entail blocking, which can be bad, especially on a UI thread), or asynchronously wait for it to complete before continuing on with the rest of the method’s work. In most cases, you want the latter, and the await keyword makes that trivial to implement.

    When you use the magical await keyword in front of an API call, then you typically have to make additional changes to your code:
    - This await usage is within a method of course, and now you have to annotate that method with async.
    - Furthermore, you have to change the return type of the method you just annotated so it returns a Task (if it previously returned void), or Task<myOldReturnType> (if it previously returned myOldReturnType). Note that if it returns void, in some cases you could cheat and stop there.
    - Furthermore, any method that called this method you just annotated with async will now also be invoking an asynchronous operation, so you have to make that change in the body of the caller method to introduce the await keyword before the call to the method.
    - …you guessed it, you now have to change this caller method to be annotated with async and have its return type tweaked...
    - …and it goes on virally…
    - At some point you reach the root of your user code, e.g. a GUI event handler, and whoever calls that void method can already deal with the fact that you marked it as async, and the viral introduction of the keywords stops there…

    This is all wonderful progress and a very powerful mechanism, and I just wish someone had written a refactoring tool to take care of this… anyone?

    I mentioned earlier that you have a choice when invoking an asynchronous operation. If the first time you encounter this you wish to localize the impact of all these changes and essentially try to turn the asynchronous behavior into synchronous by blocking - don't! For reasons why you don't want to do that, read Toub's excellent blog post (and check out the rest of his blog with gems on async programming, starting with the Async FAQ). Just embrace the pattern, knowing that when you use one instance of an await, you'll propagate the change all the way to the root user code method, typically an event handler.

    Related aside: I just finished re-writing my MessageBox wrapper class for Phone projects, including making it work in Windows Store projects, and it does expect you to use it with an await :-). I'll share that in an upcoming post for those of you that have the same need…

    Comments about this post by Daniel Moth welcome at the original blog.
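    A minimal sketch of that viral propagation (the method names here are illustrative, not from the post; the real awaits in the post were RecognizeWithUIAsync() and ShowAsync()):

        using System;
        using System.Threading.Tasks;

        class Example
        {
            // Was: static string LoadGreeting() - synchronous.
            // One await inside forces async plus a Task<string> return type:
            static async Task<string> LoadGreetingAsync()
            {
                await Task.Delay(100); // stands in for the awaited API call
                return "hello";
            }

            // Was: static void PrintGreeting() - the caller now changes too, virally:
            static async Task PrintGreetingAsync()
            {
                Console.WriteLine(await LoadGreetingAsync());
            }

            // The chain stops at a root method, typically an async void GUI
            // event handler; here an async Main (C# 7.1+) plays that role:
            static async Task Main()
            {
                await PrintGreetingAsync();
            }
        }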

    Read the article

  • Social Business Forum Milano: Day 2

    - by me
    @YourService. The business world has flipped and small business can capitalize, by Frank Eliason (twitter: @FrankEliason)
    Technology and social media tools have made it easier than ever for companies to communicate with consumers. They can listen and join in on conversations, solve problems, get instant feedback about their products and services, and more. So why, then, are most companies not doing this? Instead, it seems as if customer service is at an all-time low, and the few companies who are choosing to focus on their customers are experiencing a great competitive advantage. At Your Service explains the importance of refocusing your business on your customers and your employees, and just how to do it. It explains how to create a culture of empowered employees who understand the value of a great customer experience, and advises on the need to communicate that experience to their customers and potential customers. Frank Eliason, recognized by BusinessWeek as the "most famous customer service manager in the US, possibly in the world," has built a reputation for helping large businesses improve the way they connect with customers and enhance their relationships.

    Quotes from the audience:
    - Bertrand Duperrin @bduperrin: social service is not about shutting up the loudest customers! #sbf12 @frankeliason
    - Paolo Pelloni @paolopelloni and Gautam Ghosh @GautamGhosh: RT @cecildijoux: #sbf12 @frankeliason you need to change things and fix the approach - it's not about social media, it's about driving change
    - Peter H. Reiser @peterreiser: #sbf12 Company Experience = Product Experience + Customer Interactions + Employee Experience @yourservice Engage or lose!

    Socialize, mobilize, conversify: engage your employees to improve business performance, by Christian Finn (twitter: @cfinn)
    First Christian presented the flying monkey. Then he outlined the four principles to fix the Intranet:
    1. Socialize the Intranet
    2. Get Thee to a Single Repository
    3. Mobilize the Intranet
    4. Conversationalize Your Processes

    Quotes from the audience:
    - Oscar Berg @oscarberg: Engaged employees think their work brings out the best of their ideas @cfinn #sbf12 http://pic.twitter.com/68eddp48
    - John Stepper @johnstepper: I like @cfinn's "conversify your processes" - a nice related concept to "narrating your work", part of working out loud. http://johnstepper.com/2012/05/26/working-out-loud-your-personal-content-strategy/
    - Oscar Berg @oscarberg: Organizations are talent markets - socializing your intranet makes this market function better @cfinn #sbf12

    For profit, productivity, and personal benefit: creating a collaborative culture at Deutsche Bank, by John Stepper (twitter: @johnstepper)
    Driving adoption of collaboration and social media platforms at Deutsche Bank. John shared some great best practices on how to deploy an enterprise-wide community model in a large company. He started with the most important question: what is the commercial value of adding social? Then he talked about the success of the Communities of Practice deployment and outlined some key use cases, including the relevant measures to prove the ROI of the investment. Examples:
    - Community of practice -> measure: systematic collection of value stories
    - Self-service website -> measure: based on representative models
    - Optimizing asset inventory -> measure: actual counts

    This last use case was particularly interesting. It is a crowd-sourced infrastructure spending/saving model: users can cancel IT services they don't need (for example, Software xx), and 5% of the saving goes to social responsibility projects.

    John then outlined some best practices on how to address the WIIFM (What's In It For Me) question of the individual users:
    - change from hierarchy to graph
    - working out loud = observable work + narrating your work
    - add social skills to career objectives; for example, a course/training on building a purposeful social network as part of the job development curriculum

    And last but not least, John gave some important tips on how to get senior management buy-in: establish management-sponsored, division-level collaboration boards which define clear use cases and measures. These divisional use cases are then implemented using a common social platform. Thanks John - I learned a lot from your presentation!

    Quotes from the audience:
    - Ana Silva @AnaDataGirl: #sbf12 what's in it for individuals at Deutsche Bank? Shaping their reputations in a big org says @johnstepper #e20
    - Ana Silva @AnaDataGirl: Any reason why not? MT @magatorlibero #sbf12 is Deutsche B. experience on applying social inside company applicable to Italian people?
    - Oscar Berg @oscarberg: Your career is not a ladder, it is a network that opens up opportunities - @johnstepper #sbf12
    - Oscar Berg @oscarberg: @johnstepper: Institutionalizing collaboration is next - collaboration woven into the fabric of daily work #sbf12
    - Ana Silva @AnaDataGirl: #sbf12 @johnstepper talking about how Deutsche Bank is using #socbiz to build purposeful CoP & save money

    Read the article

  • Interviews Gone Bad.....Now What Do I Do?

    - by david.talamelli
    We have all done it at some stage of our working careers - you know, those times when you leave an interview and then think to yourself "why didn't I ask that question" or "I can't believe I said that" or "how could I have forgotten to say that". It happens to everyone, but how you handle things moving forward could be critical in helping you land that dream job. There is nothing better than seeing that dream job with the dream company you are looking to work for advertised (or, in some cases, getting called by the Recruiter to let you know about that job). The role may seem perfect, it could be just what you are looking for, and it is with the right company as well. You have sent in your resume and have subsequently had one, two or maybe three interviews for the role. After each step of the process you get a little bit more excited about the role as you start to think about your work day in your new role/company. Then it happens: you get The Phone Call to inform you that you have not been successful in securing the position that you have invested so much time and effort into. It can be disappointing to hear this news, but what you do next is important in potentially keeping that door open for future opportunities with that company. How you handle yourself in this situation is important. If any of you remember the Choose Your Own Adventure books, do you:
    1. Tell the Recruiter (maybe get aggressive) they are wrong in their assessment and that you are the right candidate for the role
    2. Switch off, say "ok, thanks", and hang up without engaging in any further dialogue
    3. Thank the company for their time and enquire if there may be any other opportunities in the future to explore

    If you chose the first option, the company in question may reconsider whether to look at you for other opportunities. How you handle yourself in the recruitment process could be an indication of how you would deal with clients/colleagues in your role, and the impression that you leave a potential employer may be what sticks in their mind when they think of you (e.g. isn't that the person who couldn't handle it when we told him he wasn't right for our role?). The second option potentially produces a similar outcome. If you rush to get off the phone, the company may come back to you to talk about other roles when they come up, but you also leave open the potential thought with the company that you were only interested in that role and therefore not interested in any other opportunities. Why take the risk of the company thinking that and potentially not getting back to you in the future? By picking the third option, you actively engage with the company and keep the dialogue open for future discussions. Ok, so you didn't get the role you interviewed for. You don't know who else the company may have been interviewing - maybe they found someone who was a better fit, or maybe there were too many boxes you didn't tick to step straight into that specific role. Take a deep breath and keep the company engaged. You are fresh in their mind - take advantage of that fact and let them know that, while you respect their decision, you are still interested in the company and would like to be kept in mind for future roles. Ask if it is ok to keep in touch, and when they would like to keep in touch; as long as you are interested, let them know. You do need to balance that, though: if you come across as too keen or start stalking people, it could equally damage your brand.
Companies normally have more than one open role. New roles are created all the time, circumstances change and hiring people is not a static business, it changes course from everyone's best laid initial plans. If you didn't get that initial role you wanted, keep the door open with that company so that when those new roles do come up or when circumstances do change you have already laid the ground to step into those new positions.

    Read the article

  • Book “Team Foundation Server 2012 Starter” published!

    - by Jakob Ehn
    During the summer and fall this year, my colleague Terje Sandstrøm and I worked together on a book project that has now finally hit the stores! The title of the book is Team Foundation Server 2012 Starter and it is published by Packt Publishing. You can find it at http://www.packtpub.com/team-foundation-server-2012-starter/book or from Amazon http://www.amazon.com/dp/1849688389

    The book is part of a concept that Packt has with starter books, intended for people new to Team Foundation Server 2012 who want a quick guideline to get it up and working. It covers the fundamentals, from installing and configuring it, to how to use it with source control, work items and builds. It is done as a step-by-step guide, but also includes best-practices advice in the different areas. It covers the use of both the on-premises and the TFS Services versions. It also has a list of links and references at the end to the most relevant Visual Studio 2012 ALM sites. Our good friend and fellow ALM MVP Mathias Olausson did the review of the book - thanks again, Mathias! We hope the book fills the gap between the different online guide sites and the more advanced books that are out. Check it out and please let us know what you think of the book!

    Book Description
    Your quick start guide to TFS 2012, top features, and best practices with hands-on examples.

    Overview
    - Install TFS 2012 from scratch
    - Get up and running with your first project
    - Streamline release cycles for maximum productivity

    In Detail
    Team Foundation Server 2012 is Microsoft's leading ALM tool, integrating source control, work item and process handling, build automation, and testing. This practical "Team Foundation Server 2012 Starter Guide" will provide you with clear step-by-step exercises covering all major aspects of the product. This is essential reading for anyone wishing to set up, organize, and use a TFS server. This hands-on guide looks at the top features in Team Foundation Server 2012, starting with a quick installation guide and then moving into using it for your software development projects. Manage your team projects with Team Explorer, one of the many new features for 2012. Covering all the main features in source control to help you work more efficiently, including tools for branching and merging, we will delve into the Agile Planning Tools for planning your product and sprint backlogs. Learn to set up build automation, allowing your team to become faster, more streamlined, and ultimately more productive with this "Team Foundation Server 2012 Starter Guide".

    What you will learn from this book
    - Install TFS 2012 on premises
    - Access TFS Services in the cloud
    - Quickly get started with a new project with product backlogs, source control, and build automation
    - Work efficiently with source control using the top features
    - Understand how the tools for branching and merging in TFS 2012 help you isolate work and teams
    - Learn about the existing process templates, such as Visual Studio Scrum 2.0
    - Manage your product and sprint backlogs using the Agile planning tools

    Approach
    This Starter guide is a short, sharp introduction to Team Foundation Server 2012, covering everything you need to get up and running.

    Who this book is written for
    If you are a developer, project lead, tester, or IT administrator working with Team Foundation Server 2012, this guide will get you up to speed quickly and with minimal effort.

    Read the article

  • NoSQL Memcached API for MySQL: Latest Updates

    - by Mat Keep
    With data volumes exploding, it is vital to be able to ingest and query data at high speed. For this reason, MySQL has implemented NoSQL interfaces directly to the InnoDB and MySQL Cluster (NDB) storage engines, which bypass the SQL layer completely. Without SQL parsing and optimization, Key-Value data can be written directly to MySQL tables up to 9x faster, while maintaining ACID guarantees. In addition, users can continue to run complex queries with SQL across the same data set, providing real-time analytics to the business or anonymizing sensitive data before loading it to big data platforms such as Hadoop, while still maintaining all of the advantages of their existing relational database infrastructure. This and more is discussed in the latest Guide to MySQL and NoSQL, where you can learn more about using the APIs to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database.

    The native Memcached API is part of the MySQL 5.6 Release Candidate, and is already available in the GA release of MySQL Cluster. By using the ubiquitous Memcached API for writing and reading data, developers can preserve their investments in Memcached infrastructure by re-using existing Memcached clients, while also eliminating the need for application changes. Speed, when combined with flexibility, is essential in the world of growing data volumes and variability. Complementing NoSQL access, support for on-line DDL (Data Definition Language) operations in MySQL 5.6 and MySQL Cluster enables DevOps teams to dynamically update their database schema to accommodate rapidly changing requirements, such as the need to capture additional data generated by their applications. These changes can be made without database downtime. Using the Memcached interface, developers do not need to define a schema at all when using MySQL Cluster. Let's look a little more closely at the Memcached implementations for both InnoDB and MySQL Cluster.

    Memcached Implementation for InnoDB
    The Memcached API for InnoDB is previewed as part of the MySQL 5.6 Release Candidate. As illustrated in the following figure, Memcached for InnoDB is implemented via a Memcached daemon plug-in to the mysqld process, with the Memcached protocol mapped to the native InnoDB API.

    Figure 1: Memcached API Implementation for InnoDB

    With the Memcached daemon running in the same process space, users get very low latency access to their data, while also leveraging the scalability enhancements delivered with InnoDB and a simple deployment and management model. Multiple web / application servers can remotely access the Memcached / InnoDB server to get direct access to a shared data set. With simultaneous SQL access, users can maintain all the advanced functionality offered by InnoDB, including support for Foreign Keys, XA transactions and complex JOIN operations. Benchmarks demonstrate that the NoSQL Memcached API for InnoDB delivers up to 9x higher performance than the SQL interface when inserting new key/value pairs, with a single low-end commodity server supporting nearly 70,000 Transactions per Second.

    Figure 2: Over 9x Faster INSERT Operations

    The delivered performance demonstrates that MySQL with the native Memcached NoSQL interface is well suited for high-speed inserts, with the added assurance of transactional guarantees. You can check out the latest Memcached / InnoDB developments and benchmarks here. You can learn how to configure the Memcached API for InnoDB here.

    Memcached Implementation for MySQL Cluster
    Memcached API support for MySQL Cluster was introduced with General Availability (GA) of the 7.2 release, and joins an extensive range of NoSQL interfaces that are already available for MySQL Cluster. Like Memcached, MySQL Cluster provides a distributed hash table with in-memory performance. MySQL Cluster extends Memcached functionality by adding support for write-intensive workloads, a full relational model with ACID compliance (including persistence), rich query support, auto-sharding and 99.999% availability, with extensive management and monitoring capabilities. All writes are committed directly to MySQL Cluster, eliminating cache invalidation and the overhead of data consistency checking, to ensure complete synchronization between the database and cache.

    Figure 3: Memcached API Implementation with MySQL Cluster

    Implementation is simple:
    1. The application sends reads and writes to the Memcached process (using the standard Memcached API).
    2. This invokes the Memcached Driver for NDB (which is part of the same process).
    3. The NDB API is called, providing very quick access to the data held in MySQL Cluster's data nodes.

    The solution has been designed to be very flexible, allowing the application architect to find a configuration that best fits their needs. It is possible to co-locate the Memcached API in either the data nodes or application nodes, or alternatively within a dedicated Memcached layer. The benefit of this flexible approach to deployment is that users can configure behavior on a per-key-prefix basis (through tables in MySQL Cluster) and the application doesn't have to care - it just uses the Memcached API and relies on the software to store data in the right place(s) and to keep everything synchronized.

    Using Memcached for Schema-less Data
    By default, every Key / Value is written to the same table, with each Key / Value pair stored in a single row - thus allowing schema-less data storage. Alternatively, the developer can define a key-prefix so that each value is linked to a pre-defined column in a specific table. Of course, if the application needs to access the same data through SQL, then developers can map key prefixes to existing table columns, enabling Memcached access to schema-structured data already stored in MySQL Cluster.

    Conclusion
    Download the Guide to MySQL and NoSQL to learn more about NoSQL APIs and how you can use them to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database. See how to build a social app with MySQL Cluster and the Memcached API from our on-demand webinar, or take a look at the docs. Don't hesitate to use the comments section below for any questions you may have.
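    As a concrete illustration (a sketch, not from the article; it assumes the Memcached endpoint is listening on localhost:11211, the usual Memcached default, and uses the libmemcached C client), writing and reading a key/value pair looks exactly like talking to any other Memcached server, while the data actually lands in an InnoDB or NDB table:

        #include <libmemcached/memcached.h>
        #include <cstdio>
        #include <cstdlib>
        #include <cstring>

        int main() {
            memcached_st *memc = memcached_create(NULL);
            memcached_return_t rc;
            memcached_server_st *servers =
                memcached_server_list_append(NULL, "localhost", 11211, &rc);
            memcached_server_push(memc, servers);

            // Write a key/value pair: no SQL parsing, but still transactional.
            const char *key = "user:42", *value = "{\"name\":\"mat\"}";
            rc = memcached_set(memc, key, strlen(key), value, strlen(value),
                               (time_t)0 /* no expiry */, (uint32_t)0 /* flags */);

            // Read it back through the same interface; SQL clients see the same row.
            size_t len; uint32_t flags;
            char *fetched = memcached_get(memc, key, strlen(key), &len, &flags, &rc);
            if (fetched) { printf("%.*s\n", (int)len, fetched); free(fetched); }

            memcached_server_list_free(servers);
            memcached_free(memc);
            return 0;
        }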

    Read the article

  • Silverlight Cream for May 15, 2010 -- #862

    - by Dave Campbell
    In this Issue: Victor Gaudioso, Antoni Dol (-2-), Brian Genisio, Shawn Wildermuth, Mike Snow, Phil Middlemiss, Pete Brown, Kirupa, Dan Wahlin, Glenn Block, Jeff Prosise, Anoop Madhusudanan, and Adam Kinney.

    Shoutouts:
    - Victor Gaudioso would like you to check out his Interview with Microsoft’s Murray Gordon at MIX 10
    - Pete Brown announced: Connected Show Podcast #29 With … Me!

    From SilverlightCream.com:

    - New Silverlight Video Tutorial: How to Create Fast Forward for the MediaElement. Victor Gaudioso's latest video tutorial is on creating the ability to fast-forward a MediaElement... check it out in the tutorial player itself!
    - Overlapping TabItems with the Silverlight Toolkit TabControl. Antoni Dol has a very cool tutorial up on the Toolkit TabItems control... not only is he overlapping them quite nicely, but this is a very cool tutorial...
    - QuoteFloat: Animating TextBlock PlaneProjections for a spiraling effect in Silverlight. Antoni Dol also has a Blend tutorial up on animating TextBlock items... run the demo and you'll want to read the rest :)
    - Adventures in MVVM – My ViewModel Base – Silverlight Support! Brian Genisio continues his MVVM tutorials with this update on his ViewModel base using some new C# 4.0 features; it fully supports Silverlight and WPF.
    - My Thoughts on the Windows Phone 7. Shawn Wildermuth gives his take on WP7. He included a port of his XBoxGames app to WP7... thanks Shawn!
    - Silverlight Tip of the Day #20 – Using Tooltips in Silverlight. I figured Mike Snow was going to overrun me with tips since I have missed a couple days, but there's only one! ... and it's on Tooltips.
    - Animating the Silverlight opacity mask. Phil Middlemiss has an article at SilverZine describing a Behavior he wrote (and is sharing) that turns a FrameworkElement into an opacity mask for its parent container... cool demo on the page too.
    - Breaking Apart the Margin Property in Xaml for better Binding. Pete Brown dug in on a Twitter message and put some thoughts down about breaking a Margin apart to see about binding to the individual elements.
    - Building a Simple Windows Phone App. Kirupa has a 6-part tutorial up on building a not-your-typical first WP7 application... all good stuff!
    - Integrating HTML into Silverlight Applications. Dan Wahlin has a post up discussing three ways to display HTML inside a Silverlight app.
    - Hello MEF in Silverlight 4 and VB! (with an MVVM Light cameo). Glenn Block has a post up discussing MEF and MVVM, and it's in VB this time... and it's actually a great tutorial top to bottom... all source included of course :)
    - Understanding Input Scope in Silverlight for Windows Phone. Jeff Prosise has a good post up on the WP7 SIP and how to set the proper InputScope to get the SIP you want.
    - Thinking about Silverlight ‘desktop’ apps – Creating a Standalone Installer for offline installation (no browser). Anoop Madhusudanan is discussing something that's been floating around for a while... installing Silverlight from, say, a CD or DVD when someone installs your app. He's got some good code, but be sure to read Tim Heuer and Scott Guthrie's comments, and consider digging deeper into that part.
    - Using FluidMoveBehavior to animate grid coordinates in Silverlight. Adam Kinney has a cool post up on animating an object using the FluidMoveBehavior of Blend 4... looks great moving across a checkerboard... check out the demo, then grab the code.

    Stay in the 'Light!

    Read the article

  • Big Data: Size isn’t everything

    - by Simon Elliston Ball
    Big Data has a big problem; it’s the word “Big”. These days, a quick Google search will uncover terabytes of negative opinion about the futility of relying on huge volumes of data to produce magical, meaningful insight. There are also many clichéd but correct assertions about the difficulties of correlation versus causation in massive data sets. In reading some of these pieces, I begin to understand how climatologists must feel when people complain ironically about “global warming” during snowfall.

    Big Data has a name problem. There is a lot more to it than size. Shape, Speed, and…err…Veracity are also key elements (now I understand why Gartner and the gang went with V’s instead of S’s).

    The need to handle data of different shapes (Variety) is not new. Data developers have always had to mold strange-shaped data into our reporting systems, integrating with semi-structured sources, and even straying into full-text searching. However, what we lacked was an easy way to add semi-structured and unstructured data to our arsenal. New “Big Data” tools such as MongoDB, and other NoSQL (Not Only SQL) databases, or a graph database like Neo4J, fill this gap. Still, to many, they simply introduce noise to the clean signal that is their sensibly normalized data structures.

    What about speed (Velocity)? It’s not just high-frequency trading that generates data faster than a single system can handle. Many other applications need to make trade-offs that traditional databases won’t, in order to cope with high data insert speeds, or to quickly extract the required information from data streams. Unfortunately, many people equate Big Data with the Hadoop platform, whose batch-driven queries and job processing queues have little to do with “velocity”. StreamInsight, Esper and Tibco BusinessEvents are examples of Big Data tools designed to handle high-velocity data streams. Again, the name doesn’t do the discipline of Big Data any favors. Ultimately, though, does analyzing fast-moving data produce insights as useful as the ones we get through a more considered approach, enabled by traditional BI?

    Finally, we have Veracity and Value. In many ways, these additions to the classic Volume, Velocity and Variety trio acknowledge the criticism that without high-quality data and genuinely valuable outputs, data, big or otherwise, is worthless. As a discipline, Big Data has recognized this, and data quality and cleaning tools are starting to appear to support it. Rather than simply decrying the irrelevance of Volume, we need as a profession to focus on how to improve Veracity and Value. Perhaps we should just declare the ‘Big’ silent, embrace these new data tools and help develop better practices for their use, just as we did with the good old RDBMS?

    What does Big Data mean to you? Which V gives your business the most pain, or the most value? Do you see these new tools as a useful addition to the BI toolbox, or are they just enabling a dangerous trend to find ghosts in the noise?

    Read the article

  • Big Data – Role of Cloud Computing in Big Data – Day 11 of 21

    - by Pinal Dave
In yesterday’s blog post we learned the importance of NewSQL. In this article we will look at the role of the cloud in the Big Data story. What is Cloud? The cloud has been the biggest buzzword of the last few years. Everyone knows about the cloud, and it is extremely well defined online. In this article we will discuss the cloud in the context of Big Data. Cloud computing is a method of providing shared computing resources to applications that require dynamic resources. These resources include applications, computing, storage, networking, development and various deployment platforms. The fundamental idea of cloud computing is that pretty much all resources are shared and delivered to end users as a service. Examples of Cloud Computing meeting Big Data are Google and Amazon.com; both have fantastic Big Data offerings with the help of the cloud. We will discuss this later in this blog post. There are two different Cloud Deployment Models: 1) the Public Cloud and 2) the Private Cloud. Public Cloud: cloud infrastructure built by commercial providers (Amazon, Rackspace, etc.) that creates a highly scalable data center, hiding the complex infrastructure from the consumer and providing various services. Private Cloud: cloud infrastructure built by a single organization, which manages a highly scalable data center internally. Here is a quick comparison between Public Cloud and Private Cloud, from Wikipedia:

                     Public Cloud                         Private Cloud
    Initial cost     Typically zero                       Typically high
    Running cost     Unpredictable                        Unpredictable
    Customization    Impossible                           Possible
    Privacy          No (host has access to the data)     Yes
    Single sign-on   Impossible                           Possible
    Scaling up       Easy, while within defined limits    Laborious, but no limits

Hybrid Cloud: cloud infrastructure built from a composition of two or more clouds, such as a public and a private cloud. A hybrid cloud gives the best of both worlds, as it combines multiple cloud deployment models. Cloud and Big Data – Common Characteristics: Many characteristics of cloud architecture and cloud computing are essential for Big Data as well. The two overlap heavily, and in many places it simply makes sense to use the power of both and build a highly scalable framework. Here is the list of cloud computing characteristics important to Big Data: Scalability, Elasticity, Ad-hoc Resource Pooling, Low Infrastructure Setup Cost, Pay on Use (Pay as You Go), High Availability. Leading Big Data Cloud Providers: There are many players in the Big Data cloud, but we will list a few of the well-known ones. Amazon: Amazon is arguably the most popular Infrastructure as a Service (IaaS) provider. The history of how Amazon got into this business is very interesting. They started out with a massive infrastructure to support their own business. Gradually they figured out that their own resources were underutilized most of the time. They decided to get the maximum out of the resources they had, and hence launched the Amazon Elastic Compute Cloud (Amazon EC2) service in 2006. Their products have evolved a lot since, and this is now one of their primary businesses besides retail. Amazon also offers Big Data services under Amazon Web Services.
Here is the list of the included services: Amazon Elastic MapReduce – processes very high volumes of data. Amazon DynamoDB – a fully managed NoSQL (Not Only SQL) database service. Amazon Simple Storage Service (S3) – a web-scale service designed to store and accommodate any amount of data. Amazon High Performance Computing – provides low-latency, tuned high-performance computing clusters. Amazon Redshift – a petabyte-scale data warehousing service. Google: Though Google is known for its search engine, we all know that it is much more than that. Google Compute Engine – offers secure, flexible computing from energy-efficient data centers. Google BigQuery – allows SQL-like queries to run against large datasets. Google Prediction API – a cloud-based machine learning tool. Other Players: Besides Amazon and Google there are other players in the Big Data market as well. Microsoft is also tackling Big Data in the cloud with Microsoft Azure. Additionally, Rackspace and NASA together have initiated OpenStack. The goal of OpenStack is to provide a massively scaled, multi-tenant cloud that can run on any hardware. Things to Watch: Cloud-based solutions integrate very well with the Big Data story and are very economical to implement. However, there are a few things one should be careful about when deploying Big Data on a cloud solution. Here is a list of a few things to watch: Data Integrity, Initial Cost, Recurring Cost, Performance, Data Access Security, Location, Compliance. Every company has a different approach to Big Data and different rules and regulations. Based on various factors, one can implement a custom Big Data solution on a cloud. Tomorrow: In tomorrow’s blog post we will discuss various operational databases supporting Big Data. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL
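To make the S3 entry above concrete, here is a minimal C# sketch of pushing a raw data file into S3 with the AWS SDK for .NET. The bucket name, key, and file path are hypothetical, and credentials are assumed to come from the SDK's standard configuration.

    using System;
    using Amazon;
    using Amazon.S3;
    using Amazon.S3.Model;

    class S3UploadSketch
    {
        static void Main()
        {
            using (var s3 = new AmazonS3Client(RegionEndpoint.USEast1))
            {
                // Hypothetical bucket, key and path - substitute your own.
                var request = new PutObjectRequest
                {
                    BucketName = "my-big-data-bucket",
                    Key = "raw/events-2013-10-11.log",
                    FilePath = @"C:\data\events-2013-10-11.log"
                };
                s3.PutObject(request); // S3 stores any volume of objects, one PUT at a time
                Console.WriteLine("Upload complete.");
            }
        }
    }

From there, a service such as Elastic MapReduce can read the same bucket directly, which is what makes S3 a natural landing zone for Big Data pipelines on AWS.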

    Read the article

  • What’s New from the Oracle Marketing Cloud at Oracle OpenWorld 2014

    - by Kathryn Perry
A Guest Post by Laura Vogel, Director, Oracle Marketing Cloud Events. Marketing—CX Central is your hub for all things Marketing related at OpenWorld in San Francisco, September 28-October 2, 2014. Learn how to personalize the modern marketing journey to improve customer loyalty. We’re hosting more than 60 breakout sessions, half of which will highlight customer success stories from marquee brands including Bizo, Comcast, Dell, Epson, John Deere, Lane Bryant, ReadyTalk and Shutterfly. Moscone West, Levels 2 and 3: To learn more about how modern marketing works, visit Moscone West, Levels 2 and 3, for exciting demos of each of the Oracle Marketing Cloud solutions (BlueKai, Compendium, Eloqua, Push I/O, and Responsys). You also can check out our stations for Vertical Marketing Best Practices, the Markie Awards, and more! CX Spotlight Sessions: “Accelerating Big Profits in Big Data,” Jeff Tanner, Baylor University; “Using Content Marketing to Impact Every Stage of the Buyer’s Journey,” Jennifer Agustin, Bizo; “Expanding Your Marketing with Proven Testing and Optimization,” Brian Border, Shutterfly, and Matthew Balthazor, Epson; “Modern Marketing: The New Digital Dialogue,” Cory Treffiletti, Oracle. A Special Marquee Session: Dell’s Hayden Mugford will speak on "The Digital Ecosystem: Driving Experience Through Contact Engagement.” She will highlight how the organization built a digital ecosystem that supports a behaviorally driven, multivehicle nurturing campaign. The Dell 1:1 Global Marketing team worked with multiple partners to innovate integrations with Oracle Eloqua, Oracle Real-Time Decisions for real-time decision logic, and a content management system (CMS) that enables 100 percent customized e-mails. The program doubled average order values for nurtured contacts versus non-nurtured, and tripled open and click-through rates versus push e-mail. It Wouldn’t Be an Oracle Marketing Cloud Event Without a Party! We’re hosting CX Central Fest: a unique customer experience specifically designed for attendees of CX Central. It will include a chance to rock out at a private concert featuring Los Angeles indie electronic pop group Capital Cities! Join us Tuesday, September 30, from 7-9 p.m. Other Oracle Marketing Cloud Session Highlights: Thought leadership by role; Exploring the benefits of moving to the Cloud; Product line roadmaps and innovations in Marketing; Technical deep dives for product lines within Marketing; Best practices and impactful business measurements; Solutions that are integrated across CX. Target Audience: Session content is geared toward professionals in Marketing, Marketing Operations, Marketing Demand Generation, Social: Chief Marketing Officers, Vice Presidents, Directors and Managers.
Outcomes: Customers attending Marketing—CX Central @ OpenWorld will be able to: Gain insight into delivering consistent cross-channel marketing; Discover how to provide the right information to the right customer at the right time and with the right channel; Get answers to burning questions and advice on business challenges; Hear from other Oracle customers about recommended best practices to help their organization move forward; Network and share ideas to help create a strategy for connecting with customers in better ways. Resources At a Glance: Register Now | Track Site—View Marketing Sessions | Focus on Session Doc | Downloadable Justification Email. OpenWorld is a fabulous way for you to see all that Oracle Marketing Cloud has to offer. Register today.

    Read the article

  • Management and Monitoring Tools for Windows Azure

    - by BuckWoody
With such a large platform, Windows Azure has a lot of moving parts. We’ve done our best to keep the interface as simple as possible, while giving you the most control and visibility we can. However, as with most Microsoft products, there are multiple ways to do something – and I’ve always found that to be a good strength. Depending on the situation, I might want a graphical interface, a command-line interface, or just an API so I can incorporate the management into my own tools, or have third-party companies write other tools. While by no means exhaustive, I thought I might put together a quick list of a few tools you can use to manage and monitor Windows Azure components, from our IaaS, SaaS and PaaS offerings. Some of the products focus on one area more than another, but all are available today. I’ll try and maintain this list to keep it current, but make sure you check the date of this post’s update – if it’s more than six months old, it’s most likely out of date. Things move fast in the cloud. The Windows Azure Management Portal: The primary tool for managing Windows Azure is our portal – most everything you need is there, from creating new services to querying a database. There are two versions as of this writing – a Silverlight client version, and a newer HTML5 version. The latter is being updated constantly to be in parity with the Silverlight client. There’s a balance in this portal between simplicity and power – we’re following the “less is more” approach, with increasing levels of detail as you work through the portal rather than overwhelming you with a single, long “more is more” page. You can find the Portal here: http://windowsazure.com (then click “Log In” and then “Portal”) Windows Azure Management API: You can also use programming tools to either write your own interface, or simply provide management functions directly within your solution. You have two options – you can use the more universal REST APIs, which are a bit more complex but work with any system that can write to them, or the more approachable .NET API calls in code. You can find the reference for the APIs here: http://msdn.microsoft.com/en-us/library/windowsazure/ee460799.aspx  All Class Libraries, for each part of Windows Azure: http://msdn.microsoft.com/en-us/library/ee393295.aspx  PowerShell Command-lets: PowerShell is one of the most powerful scripting languages I’ve used with Windows – and it’s baked into all of our products. When you need to work with multiple servers, scripting is really the only way to go, and the Windows Azure PowerShell Command-Lets allow you to work across most any part of the platform – and can even be used within the services themselves. You can do everything with them from creating a new IaaS, PaaS or SaaS service, to controlling them and even working with security and more. You can find more about the Command-Lets here: http://wappowershell.codeplex.com/documentation (older link, still works, will point you to the new ones as well) We have command-line utilities for other operating systems as well: https://www.windowsazure.com/en-us/manage/downloads/  Video walkthrough of using the Command-Lets: http://channel9.msdn.com/Events/BUILD/BUILD2011/SAC-859T  System Center: System Center is actually a suite of graphical tools you can use to manage, deploy, control, monitor and tune software from Microsoft and even other platforms.
This will be the primary tool we’ll recommend for managing a hybrid or contiguous management process – and as time goes on you’ll see more and more features put into System Center for the entire Windows Azure suite of products. You can find the Management Pack and README for it here: http://www.microsoft.com/en-us/download/details.aspx?id=11324  SQL Server Management Studio / Data Tools / Visual Studio: SQL Server has two built-in management and development tools, and since version 2008 R2 you can use them to manage Windows Azure Databases. Visual Studio also lets you connect to and manage portions of Windows Azure as well as Windows Azure Databases. You can read more about Visual Studio here: http://msdn.microsoft.com/en-us/library/windowsazure/ee405484  You can read more about the SQL tools here: http://msdn.microsoft.com/en-us/library/windowsazure/ee621784.aspx  Vendor-Provided Tools: Microsoft does not suggest or endorse a specific third-party product. We do, however, use them, and see lots of other customers use them. You can browse to these sites to learn more, and chat with their folks directly on how they support Windows Azure. Cerebrata: Tools for managing from the command-line, graphical diagnostics, graphical storage management - http://www.cerebrata.com/  Quest Cloud Tools: Monitoring, Storage Management, and costing tools - http://communities.quest.com/community/cloud-tools  Paraleap: Monitoring tool - http://www.paraleap.com/AzureWatch  Cloudgraphs: Monitoring tool - http://www.cloudgraphs.com/  Opstera: Monitoring for Windows Azure and a scale-out pattern manager - http://www.opstera.com/products/Azureops/  Compuware: SaaS performance monitoring, load testing - http://www.compuware.com/application-performance-management/gomez-apm-products.html  SOASTA: Penetration and security testing - http://www.soasta.com/cloudtest/enterprise/  LoadStorm: Load-testing tool - http://loadstorm.com/windows-azure  Open-Source Tools: This is probably the most specific set of tools, and the list I’ll have to maintain most often. Smaller projects have a way of coming and going, so I’ll try and make sure this list is current. Windows Azure MMC: (I actually use this one a lot) http://wapmmc.codeplex.com/  Windows Azure Diagnostics Monitor: http://archive.msdn.microsoft.com/wazdmon  Azure Application Monitor: http://azuremonitor.codeplex.com/  Azure Web Log: http://www.xentrik.net/software/azure_web_log.html  Cloud Ninja: Multi-tenant billing and performance monitor - http://cnmb.codeplex.com/  Cloud Samurai: Multi-tenant management - http://cloudsamurai.codeplex.com/  If you have additions to this list, please post them as a comment and I’ll research and then add them. Thanks!
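As a taste of the Management API option above, here is a minimal C# sketch that calls the Service Management REST API to list hosted services. The subscription ID and certificate thumbprint are placeholders; the call authenticates with a management certificate and requires the x-ms-version header.

    using System;
    using System.IO;
    using System.Net;
    using System.Security.Cryptography.X509Certificates;

    class AzureServiceList
    {
        static void Main()
        {
            // Placeholder subscription ID and certificate thumbprint.
            string subscriptionId = "11111111-2222-3333-4444-555555555555";
            string url = "https://management.core.windows.net/"
                       + subscriptionId + "/services/hostedservices";

            var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            store.Open(OpenFlags.ReadOnly);
            var certs = store.Certificates.Find(
                X509FindType.FindByThumbprint, "YOUR-MANAGEMENT-CERT-THUMBPRINT", false);
            store.Close();

            var request = (HttpWebRequest)WebRequest.Create(url);
            request.Headers.Add("x-ms-version", "2012-03-01"); // required API version header
            request.ClientCertificates.Add(certs[0]);          // management certificate auth

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                Console.WriteLine(reader.ReadToEnd()); // XML list of hosted services
            }
        }
    }

The .NET management libraries wrap these same REST endpoints, so anything you can do here you can also do in managed code.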

    Read the article

  • Launch Webcast Q&A: Oracle WebCenter Suite 11g - The Platform for the Modern User Experience

    - by howard.beader(at)oracle.com
Did you have a chance to watch the Oracle WebCenter Suite 11g Launch Webcast yet? Andy MacMillan presented some great information on the webcast and answered quite a few of your questions in the Q&A session as well. For your reading pleasure we have captured a number of the questions and answers, summarized below: Question: Can you tell me what our Portal strategy should be for integrating and extending our Oracle enterprise applications? Answer: We recommend that you look at this in two steps. The first would be to ensure that you have a good understanding of our common user experience architecture. Internally our product teams at Oracle are already investing in this quite heavily today for Fusion Applications, and this is driving natural convergence from a UX strategy standpoint. The second step would be to look at how best to componentize the back-office applications so that the business users across your organization can take advantage of these -- don't make it just about putting a new skin on top of what you already have from an application standpoint; instead, look at how best to embed the social computing capabilities as part of the solution for your business users. Question: We are currently using the BEA WebLogic Portal; should we stay on WLP, or should we be looking at moving to WebCenter, and when? Answer: Our strategy has been called "Continue & Converge". This theme means that you can continue to use WebLogic or Plumtree portals until your organization is ready to move to WebCenter, and in the meantime you can continue to deploy what you need to in your organization on WLP or WCI Portals with the full support of Oracle. In addition, WebCenter Services can be leveraged for social computing to complement what you are already doing today and enable your organization to take advantage of some of the latest and greatest social computing capabilities. We have migration scripts and conversion capabilities available as well as programs where Oracle can help you evaluate your options to decide how best to move forward. WebCenter provides the best of the best capabilities and will enable you to take advantage of new capabilities that may not exist in your current portal today. In the end, though, it's up to you as a customer as to when you want to make the transition to Oracle WebCenter Suite. Question: Can you tell me how Oracle is leveraging WebCenter internally and for its Application and Middleware product UX strategies? Answer: Internally, Oracle is leveraging WebCenter for our employees, and thus far we are seeing significant uptake, with our users taking advantage of the business activity streams, team spaces and collaboration capabilities. From a product strategy standpoint, our product teams are taking advantage of the common user experience architecture and leveraging WebCenter to provide social and collaborative capabilities to the Oracle Applications and to provide new types of composite applications with what is coming with Fusion Applications. WebCenter also provides a common user experience across all the products in the Oracle Fusion Middleware family as well. Question: Our organization is currently using SharePoint, but we are also an Oracle Applications customer; how should we be thinking about WebCenter as we move forward? Answer: Great question. Typically, we are seeing organizations using SharePoint for its core use cases of small-team collaboration and file server replacement.
WebCenter can connect to SharePoint as a content source quite easily, and it leverages the robust Oracle ECM product under WebCenter as well. In addition, SharePoint team sites can be connected to WebCenter utilizing our SharePoint connector. With Oracle WebCenter, though, we are really targeting business users and enterprise applications, thus effecting positive change in the processes that drive the business and improving productivity across your organization. Question: Are organizations today using WebCenter as a Web platform for externally facing public websites? Answer: Yes, we are seeing a convergence around web content management and portal types of websites, with customers converting them from just broadcasting content to a much richer, personalized experience, and also exposing back-office applications as well. Web Content Management capabilities are already embedded in WebCenter, so organizations can take advantage now of the benefits a personalized web experience provides for their customers. This is simply a short summary of a few of the questions addressed on the webcast; please tune in now to learn more about Oracle WebCenter, the user experience platform for the enterprise and the web! The Oracle WebCenter Suite 11g Launch Webcast can be found here

    Read the article

  • How to Crop Pictures in Word, Excel, and PowerPoint 2010

    - by DigitalGeekery
When you add pictures to your Office documents you might need to crop them to remove unwanted areas, or isolate a specific part. Today we’ll take a look at how to crop images in Office 2010. Note: We will show you examples in Word, but you can crop images in Word, Excel, and PowerPoint. To insert a picture into your Office document, click the Picture button on the Insert tab. The Picture Tools Format ribbon should now be active. If not, click on the image. New in Office 2010 is the ability to see the area of the photo that you are keeping, in addition to what will be cropped out. On the Format tab, click Crop. Click and drag inward any of the four corners to crop from any one side. Notice that you can still see the area to be cropped out, shown in translucent gray. Press and hold the CTRL key while you drag a corner cropping handle inward to crop equally on all four sides. To crop equally on right and left or the top and bottom, press and hold down the CTRL key while you drag the center cropping handle on either side inward. You can further adjust the cropping area by clicking and dragging the picture behind the cropping area. To accept the current dimensions and crop the photo, press Escape or click anywhere outside the cropping area. You can also manually crop the image to exact dimensions. This can be done by right-clicking on the image and entering the dimensions in the Width and Height boxes, or in the Size group on the Format tab. Crop to a Shape: Select your photo and click Crop from the Size group on the Format tab. Select Crop to Shape and choose any of the available shapes. Your photo will be cropped into that shape. Using Fit and Fill: If you wish to crop a photo but fill the shape, select Fill. When you choose this option, some edges of the picture might not display, but the original picture aspect ratio is maintained. If you wish to have all of the picture fit within a shape, choose Fit. The original picture aspect ratio will be maintained. Conclusion: Users moving from previous versions of Microsoft Office are sure to appreciate the improved cropping abilities in Office 2010, especially the ability to see what will and won’t be kept when you crop a photo.

    Read the article

  • Add the Vista Style Sidebar Back to Windows 7

    - by Mysticgeek
    If you are moving from Vista to Windows 7, you might miss the Sidebar which was introduced in Vista. Today we take a look at a couple options for getting a Sidebar back in Windows 7. Copy Files from Vista Note: In this example we are using 32-bit versions of Vista and Windows 7. Make sure you are logged in with Administrator credentials. If you have a Vista machine running, we can copy the Windows Sidebar files over to the Windows 7 machine. On the Vista machine navigate to C:\Program Files and copy the Windows Sidebar folder and all of its contents over to a flash drive or network location. On the Windows 7 machine go to C:\Program Files and rename the Windows Sidebar folder to something like Windows Sidebar_old. Now copy the Vista Windows Sidebar folder into C:\Program Files… Now you will have both folders…Windows Sidebar and Windows Sidebar_old in your C:\Program Files folder. Right-click on the desktop and select Gadgets. There you are…the Original Vista Sidebar is back and will act as it did in Vista. Move Sidebar Gadgets Another work around if you don’t have a copy of Vista, you can simply move the Desktop Gadgets you want over to the right side of the screen and they will stay there…no dock needed. Type gadgets into the Search box in the Windows Start Menu and click on Desktop Gadgets. Then drag the included Gadgets you want over to the right side of the screen. Or click on the link to Get more gadgets online to find more. Once you have them where you want, each time you reboot they will still be in the same location. This holds true no matter where you place them on your desktop as well. Install Desktop Sidebar If you want an enhanced sidebar that includes a lot of different features, and don’t have a copy of Vista, you might want to check out Desktop Sidebar Beta (link below). This is a freeware application that works with Windows XP, Vista, and Windows 7. After installation you can access it from the Start Menu… Here is how it will look after you launch it… It includes several pre-installed panels including a clock, Media Player, Search Bar, Slideshow, Messenger, Outlook inbox, Tasks, Quick Launch, Performance…and a lot more. It is highly customizable and allows you to change skins, add various levels of transparency, and a lot more. One caveat with going with Desktop Sidebar is we didn’t find a way to add Windows Gadgets to it (though there might be a plugin for it that we’re not aware of). But there are so many options, you may not mind. However, you can still use the desktop gadgets as you normally would in Windows 7. Believe it or not, some people actually prefer the Vista style Sidebar and would like it back in Windows 7. With these options you can get the Vista Sidebar back if you have a copy of Vista, place the Gadgets on the desktop, or go the freeware route. 
Download Desktop Sidebar (freeware)

    Read the article

  • What’s the use of code reuse?

    - by Tony Davis
All great developers write reusable code, don’t they? Well, maybe, but as with all statements regarding what “great” developers do or don’t do, it’s probably an over-simplification. A novice programmer, in particular, will encounter in the literature a general assumption of the importance of code reusability. They spend time worrying about DRY (don’t repeat yourself), moving logic into specific “helper” modules that they can then reuse, agonizing about the minutiae of the class structure, inheritance and interface design that will promote easy reuse. Unfortunately, writing code specifically for reuse often leads to complicated object hierarchies and inheritance models that are anything but reusable. If, instead, one strives to write simple code units that are highly maintainable and perform a single function in a concise, isolated fashion, then the potential for reuse simply “drops out” as a natural by-product. Programmers, of course, care about these principles, about encapsulation and clean interfaces that don’t expose inner workings and allow easy pluggability. This is great when it helps with the maintenance and development of code, but how often, in practice, do we actually reuse our code? Most DBAs and database developers are familiar with the practical reasons for the limited opportunities to reuse database code and its potential downsides. However, surely elsewhere in our code base, reuse happens often. After all, we can all name examples, such as date/time handling modules, which if we write with enough care we can plug in to many places. I spoke to a developer just yesterday who looked me in the eye and told me that in 30+ years as a developer (a successful one, I’d add), he’d never once reused his own code. As I sat blinking in disbelief, he explained that, of course, he always thought he would reuse it. He’d often agonized over its design, certain that he was creating code of great significance that he and other generations would reuse, with grateful tears misting their eyes. In fact, it never happened. He had in his head most of the algorithms he needed and would simply write the code from scratch each time, refining the algorithms and tailoring the code to meet the specific requirements. It was, he said, simply quicker to do that than dig out the old code, check it, correct the mistakes, and adapt it. Is this a common experience, or just a strange anomaly? Viewed in a certain light, building code with a focus on reusability seems to hark back to a past age where people built cars and music systems with the idea that someone else could and would replace and reuse the parts. Technology advances so rapidly that the next time you need the “same” code, it’s likely a new technique, or a whole new language, has emerged in the meantime, better equipped to tackle the task. Maybe we should be less fearful of the idea that we could write code well suited to the system requirements, but with little regard for reuse potential, and then rewrite a better version from scratch the next time.
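To put the “reuse drops out” idea in code: below is a deliberately small, hypothetical C# unit. It was written to do exactly one job, with no inheritance tree or plug-in framework designed in; if it ever gets reused, that is a by-product of its simplicity rather than of any grand design.

    using System;
    using System.Globalization;

    // A single-purpose unit: parse a UK-style date string.
    // Nothing here was engineered "for reuse" - it is just small and isolated.
    static class DateParsing
    {
        public static DateTime ParseUkDate(string text)
        {
            return DateTime.ParseExact(text, "dd/MM/yyyy", CultureInfo.InvariantCulture);
        }
    }

    class Program
    {
        static void Main()
        {
            Console.WriteLine(DateParsing.ParseUkDate("21/06/2013").ToString("yyyy-MM-dd"));
        }
    }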

    Read the article

  • The Evolution Of C#

    - by Paulo Morgado
The first release of C# (C# 1.0) was all about building a new language for managed code that appealed, mostly, to C++ and Java programmers. The second release (C# 2.0) was mostly about adding what there wasn’t time to build into the 1.0 release. The main feature of this release was Generics. The third release (C# 3.0) was all about reducing the impedance mismatch between general-purpose programming languages and databases. To achieve this goal, several functional programming features were added to the language, and LINQ was born. Going forward, new trends are showing up in the industry, and modern programming languages need to be more: Declarative. With imperative languages, although the eye is on the what, programs need to spell out the how. This leads to over-specification of the solution to the problem at hand, making it next to impossible for the execution engine to be smart about the execution of the program and optimize it to run more efficiently (given the hardware available, for example). Declarative languages, on the other hand, focus only on the what and leave the how to the execution engine. LINQ made C# more declarative by using higher-level constructs like orderby and group by that give the execution engine a much better chance of optimizing the execution (by parallelizing it, for example). Concurrent. Concurrency is hard, needs to be thought about, and is very hard to shoehorn into a programming language. Parallel.For (from the parallel extensions) looks like a parallel for because enough expressiveness has been built into C# 3.0 to allow this without having to commit to specific language syntax. Dynamic. There has been lots of debate on which are the better programming languages: static or dynamic. The fact is that both have good qualities, and users of both types of languages want to have it all. All these trends require a paradigm switch. C# is, in many ways, already a multi-paradigm language. It’s still very object oriented (class oriented, as some might say), but it can be argued that C# 3.0 has become a functional programming language, because it has all the cornerstones of what a functional programming language needs. Moving forward, it will have even more. Besides the influence of these trends, there was a decision to co-evolve the C# and Visual Basic programming languages. Since its inception, there has been some effort to position C# and Visual Basic against each other and to try to explain what should be done with each language or what kind of programmers use one or the other. Each language should be chosen based on the past experience and familiarity of the developer/team/project/company, and not on particular features. In the past, every time a feature was added to one language, the users of the other wanted that feature too. Going forward, when a feature is added to one language, the other will work hard to add the same feature. This doesn’t mean that XML literals will be added to C# (because almost the same can be achieved with LINQ to XML), but Visual Basic will have auto-implemented properties. Most of these features require or are built on top of features of the .NET Framework, and the focus for C# 4.0 was on dynamic programming. Not just dynamic types, but being able to talk with anything that isn’t a .NET class. Also introduced in C# 4.0 are covariance and contravariance for generic interfaces and delegates. Stay tuned for more on the new C# 4.0 features.
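For illustration, here is a small sketch of two of the points above: a declarative LINQ query (say what, not how) and C# 4.0 covariance on IEnumerable&lt;T&gt;. The sample data is invented.

    using System;
    using System.Collections.Generic;
    using System.Linq;

    class EvolutionSketch
    {
        static void Main()
        {
            var orders = new[]
            {
                new { Customer = "Ada",   Total = 120m },
                new { Customer = "Ada",   Total = 80m  },
                new { Customer = "Brian", Total = 200m }
            };

            // Declarative: orderby and group by describe the result,
            // leaving the execution strategy to the engine.
            var totals = from o in orders
                         group o by o.Customer into g
                         orderby g.Key
                         select new { Customer = g.Key, Sum = g.Sum(o => o.Total) };

            foreach (var t in totals)
                Console.WriteLine("{0}: {1}", t.Customer, t.Sum);

            // C# 4.0 covariance: an IEnumerable<string> can be treated
            // as an IEnumerable<object> without any conversion.
            IEnumerable<string> names = new[] { "Ada", "Brian" };
            IEnumerable<object> asObjects = names;
            Console.WriteLine(asObjects.Count());
        }
    }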

    Read the article

  • Best Practices for Handing over Legacy Code

    - by PersonalNexus
In a couple of months a colleague will be moving on to a new project and I will be inheriting one of his projects. To prepare, I have already ordered Michael Feathers’ Working Effectively with Legacy Code. But this book, as well as most questions on legacy code I have found so far, is concerned with the case of inheriting code as-is. In this case, however, I actually have access to the original developer and we do have some time for an orderly hand-over. Some background on the piece of code I will be inheriting: It’s functioning: There are no known bugs, but as performance requirements keep going up, some optimizations will become necessary in the not-too-distant future. Undocumented: There is pretty much zero documentation at the method and class level. What the code is supposed to do at a higher level, though, is well understood, because I have been writing against its API (as a black box) for years. Only higher-level integration tests: There are only integration tests testing proper interaction with other components via the API (again, black box). Very low-level, optimized for speed: Because this code is central to an entire system of applications, a lot of it has been optimized several times over the years and is extremely low-level (one part has its own memory manager for certain structs/records). Concurrent and lock-free: While I am very familiar with concurrent and lock-free programming and have actually contributed a few pieces to this code, this adds another layer of complexity. Large codebase: This particular project is more than ten thousand lines of code, so there is no way I will be able to have everything explained to me. Written in Delphi: I’m just going to put this out there, although I don’t believe the language to be germane to the question, as I believe this type of problem to be language-agnostic. I was wondering how the time until his departure would best be spent. Here are a couple of ideas: Get everything to build on my machine: Even though everything should be checked into source code control, who hasn’t forgotten to check in a file once in a while? So this should probably be the first order of business. More tests: While I would like more class-level unit tests, so that any bugs I introduce when making changes can be caught early on, the code as it is now is not testable (huge classes, long methods, too many mutual dependencies); see the characterization-test sketch after this list. What to document: I think for starters it would be best to focus documentation on those areas of the code that would otherwise be difficult to understand, e.g. because of their low-level/highly optimized nature. I am afraid there are a couple of things in there that might look ugly and in need of refactoring/rewriting, but are actually optimizations that were put in there for a good reason that I might miss (cf. Joel Spolsky, Things You Should Never Do, Part I). How to document: I think some class diagrams of the architecture and sequence diagrams of critical functions, accompanied by some prose, would be best. Who should document: I was wondering what would be better: to have him write the documentation, or to have him explain it to me so that I can write it myself. I am afraid that things that are obvious to him but not to me would otherwise not be covered properly. Refactoring using pair-programming: This might not be possible due to time constraints, but maybe I could refactor some of his code to make it more maintainable while he is still around to provide input on why things are the way they are.
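One concrete way to start on the "more tests" idea is a characterization test in Feathers' sense: it pins down what the code currently does, not what a spec says it should do. A hypothetical sketch in NUnit-style C#, with a stand-in class playing the part of the legacy code:

    using NUnit.Framework;

    // Stand-in for the real legacy class (hypothetical).
    public class PriceCalculator
    {
        public decimal Quote(int units, decimal unitPrice)
        {
            decimal total = units * unitPrice;
            return units >= 500 ? total * 0.9m : total; // legacy bulk-discount rule
        }
    }

    [TestFixture]
    public class PriceCalculatorCharacterization
    {
        [Test]
        public void BulkDiscount_CurrentBehavior_IsPinnedDown()
        {
            var calculator = new PriceCalculator();

            decimal price = calculator.Quote(units: 1000, unitPrice: 2.50m);

            // The expected value was captured by running the existing code once,
            // not derived from a spec - that is what makes it a characterization test.
            Assert.AreEqual(2250.00m, price);
        }
    }

A safety net of such tests makes the later performance work far less risky, even before any real unit tests exist.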
Please comment on and add to this. Since there isn't enough time to do all of this, I am particularly interested in how you would prioritize.

    Read the article

  • Converting an Oracle VM VirtualBox VM into an Oracle VM Server image

    - by wim.coekaerts
As we are working on tighter, seamless movement of VMs between the two products, here are a few simple steps to convert an existing Oracle VM VirtualBox image over.

(1) When creating a VM in VirtualBox, using Oracle Linux as an example, make sure that /etc/fstab only uses labels. Do not use hardcoded device names. Instead of an entry:

/dev/sda1 /u01 ext3 defaults 1 1

use:

LABEL=foo /u01 ext3 defaults 1 1

(for more info on labels: man e2label), or use a logical volume:

/dev/VolGroup00/LVfoo /u01 ext3 defaults 1 1

Doing so will make it easier to have an OS boot up on a different hypervisor with potentially different device names. For instance, the VirtualBox VM might expose a SCSI driver while in Oracle VM Server you might end up with an IDE disk; this then changes /dev/sda to /dev/hda.

(2) If you have a VM created that you want to convert, shut down the VM in VirtualBox and convert the image files. Go to the directory that contains your hard disk image files (.VirtualBox/HardDisks/* as an example) and for each of the virtual disks run the following command:

VBoxManage clonehd virtualdiskfilename.vdi system.img --format raw

where virtualdiskfilename.vdi is the original VBox VM file (this can also be a vmdk file) and system.img is the name of the virtual disk for Oracle VM. This can be any filename; I typically use system.img to specify the boot disk (as is common for Oracle VM template creation).

(3) Create a vm.cfg. To run a VM converted from VirtualBox, you have to create a vm.cfg for Oracle VM Server that creates an HVM guest. The easiest way is to start from a simple HVM vm.cfg and change it for your VM. I have an example here:

acpi = 1
apic = 1
builder = 'hvm'
device_model = '/usr/lib/xen/bin/qemu-dm'
disk = ['file:system.img,hda,w', 'file:oracle.img,hdb,w', ',hdc:cdrom,r',]
kernel = '/usr/lib/xen/boot/hvmloader'
memory = '1024'
name = 'vmname'
on_crash = 'restart'
on_reboot = 'restart'
pae = 1
serial = 'pty'
timer_mode = '0'
usbdevice = 'tablet'
vcpus = 1
vif = ['bridge=xenbr0,type=ioemu']
vif_other_config = []
vnc = 1
vncconsole = 1
vnclisten = '0.0.0.0'
vncpasswd = ''
vncunused = 1

If you take the above vm.cfg, all you need to do is: modify disk = (add your virtual disks in there); modify memory = (the amount of memory your VM needs); modify name = (enter a name for your VM); and modify vif = (you might want to replace bridge=xenbr0 with the bridge you want to use). If you want more than one vcpu or other changes, of course you have to make those as well.

(4) Copy this set of files onto your Oracle VM Server, or onto a webserver in a subdirectory, and import the template through Oracle VM Manager. You can also just start the VM using xm create vm.cfg if you like.

And that's it. As I said, we are working on automation around all this, but it is relatively trivial to convert VMs over as long as you take the basic issues into account, primarily the setup of the filesystems and the use of labels in /etc/fstab. There are other potential things to look at, such as network config. If you want to make that part clean, then prior to shutting down the VM change /etc/modprobe.conf and/or add the MAC address of the VM into the vm.cfg on the vif line. The good thing, at least with Linux, is that even though the virtual hardware changes, Linux will deal with it just fine (e1000 vs 8139 Realtek, IDE vs SCSI, etc.). Hope this helps.

    Read the article

  • How To Delete Built-in Windows 7 Power Plans (and Why You Probably Shouldn’t)

    - by The Geek
Do you actually use the Windows 7 power management features? If so, have you ever wanted to just delete one of the built-in power plans? Here’s how you can do so, and why you probably should leave it alone. Just in case you’re new to the party, we’re talking about the power plans that you see when you click on the battery/plug icon in the system tray. The problem is that one of the built-in plans always shows up there, even if you only use custom plans. When you go to “More power options” on the menu there, you’ll be taken to a list of them, but you’ll be unable to get rid of any of the built-in ones, even if you have your own. You can actually delete the power plans, but it will probably cause problems, so we highly recommend against it. If you still want to proceed, keep reading. Delete Built-in Power Plans in Windows 7: Open up an Administrator mode command prompt by right-clicking on the Command Prompt shortcut and choosing “Run as Administrator”, then type in the following command, which will show you the whole list of plans:

powercfg -list

Do you see that really long GUID code in the middle of each listing? That’s what we’re going to need for the next step. To make it easier, we’ll provide the codes here, just in case you don’t know how to copy to the clipboard from the command prompt:

Power Scheme GUID: 381b4222-f694-41f0-9685-ff5bb260df2e  (Balanced)
Power Scheme GUID: 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c  (High performance)
Power Scheme GUID: a1841308-3541-4fab-bc81-f71556f20b4a  (Power saver)

Before you do any deleting, what you’re going to want to do is export the plan to a file using the -export parameter. For some unknown reason, I used the .xml extension when I did this, though the file isn’t in XML format. Moving on… here’s the syntax of the command:

powercfg -export balanced.xml 381b4222-f694-41f0-9685-ff5bb260df2e

This will export the Balanced plan to the file balanced.xml. And now we can delete the plan by using the -delete parameter and the same GUID:

powercfg -delete 381b4222-f694-41f0-9685-ff5bb260df2e

If you want to import the plan again, you can use the -import parameter, though it has one weirdness: you have to specify the full path to the file, like this:

powercfg -import c:\balanced.xml

Using what you’ve learned, you can export each of the plans to a file, and then delete the ones you want to delete. Why Shouldn’t You Do This? Very simple. Stuff will break. On my test machine, for example, I removed all of the built-in plans and then imported them all back in, but I’m still getting an error anytime I try to access the panel to choose what the power buttons do. There are a lot more error messages, but I’m not going to waste your time with all of them. So if you want to delete the plans, do so at your own peril. At least you’ve been warned!

    Read the article

< Previous Page | 217 218 219 220 221 222 223 224 225 226 227 228  | Next Page >