Search Results

Search found 1406 results on 57 pages for 'dr watson'.


  • General guidelines / workflow to convert or transfer video "professionally"?

    - by cloneman
    I'm an IT "professional" who sometimes has to deal with small video conversion / video cutting projects, and I'd like to learn "the right way" to do this. Every time I search Google, the results are a mess of weird, low-maturity trialware and random forum threads from 3-4 years ago describing antiquated methods. The big question is the following: what are the general guidelines and tools for transcoding video into some efficient (lossless?) intermediary for editing purposes, with the intent of eventually re-encoding it afterwards? It seems to me like even the simplest of formats and tasks are a disaster of endless trial and error, or expertise known only to hardened experts who wield a Swiss Army knife of weird conversion tools, almost as if mounting an attack against the project. Here are a few cases in point: simple VOB files extracted from DVD footage can't be imported into Adobe Premiere directly. VirtualDub is an old piece of software people keep recommending, but it doesn't seem to support newer formats. I don't even know how to tell with certainty which codecs a video uses, whether the image is interlaced or not, or what resolution I'm dealing with.
    Problems: choosing a wrong interlace option, which diminishes quality; choosing a wrong pixel aspect ratio, which stretches the image; choosing a wrong "project type" in Premiere, causing footage to require scaling; being forced to use some weird program that has any number of negative side effects.
    What I'm looking for: books or "real knowledge" on format conversions and recognized tools that aren't random forum guides on how to deal with video formats; workflow guidelines for identifying a format and going from one format to another without the problems mentioned above; documentation on what programs like Adobe Premiere can and can't do with regard to formats, so that I don't use a wrench as a hammer.
    TL;DR: How should you convert or "prepare" a video file to ensure it will be supported by Premiere for editing? Is Premiere a suitable program to handle cropping and encoding when making a video montage from a variety of source formats, or should other tools be used? What are some good books that specifically deal with converting videos that use any number of codecs?
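    One concrete way to attack the identification problem (codec, resolution, interlacing, pixel aspect ratio) is ffprobe from the FFmpeg project. Below is a minimal sketch, not part of the original question, which assumes ffprobe is installed and on the PATH; the field names (codec_name, width, height, field_order, sample_aspect_ratio) are the ones ffprobe emits in its JSON output, and the script name is just an example.

        import json
        import subprocess
        import sys

        def probe(path):
            """Return ffprobe's JSON description of a media file (assumes ffprobe is on the PATH)."""
            out = subprocess.check_output(
                ["ffprobe", "-v", "error", "-print_format", "json",
                 "-show_format", "-show_streams", path])
            return json.loads(out)

        def describe(path):
            """Print codec, resolution, interlacing and aspect-ratio info for each video stream."""
            for s in probe(path).get("streams", []):
                if s.get("codec_type") != "video":
                    continue
                print("codec:         ", s.get("codec_name"))
                print("resolution:    ", "%sx%s" % (s.get("width"), s.get("height")))
                # 'progressive' means not interlaced; 'tt', 'bb', 'tb', 'bt' describe field order
                print("field order:   ", s.get("field_order", "unknown"))
                print("pixel aspect:  ", s.get("sample_aspect_ratio", "unknown"))
                print("display aspect:", s.get("display_aspect_ratio", "unknown"))

        if __name__ == "__main__":
            describe(sys.argv[1])

    Run it as "python identify.py clip.vob" (hypothetical file name); knowing these properties up front makes it much easier to pick matching sequence settings in Premiere instead of guessing.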


  • Letter to Ballmer: Making Better Consumer Devices

    - by andrewbrust
    Last year, I wrote Steve Ballmer an email, and he was kind enough to write me back.  The email contained a scan of a column I wrote praising Microsoft’s BI strategy.  His reply contained three simple words: “Super nice  thanks.” Well, now I’d like to write to Steve again, in an open letter format, and this time the love may be a bit tougher.  But I’m still super earnest. The past two days have been eventful ones for Microsoft: The company announced the departure of company veterans Robbie Bach and J Allard and the market announced Apple is now besting Microsoft in market capitalization. Plus, announcements were made that make it plain that Ballmer will, in effect, be running Microsoft’s Entertainment & Devices division himself. With that in mind, I’d like to offer my list of a dozen things I think Microsoft’s CEO should do to improve that division’s offerings and, hopefully, its bottom line. So here goes:   1. On Windows Phone 7, Stay the Course The press is teeming with headlines and reader comments proclaiming the death-before-arrival of Windows Phone 7.  That’s plain silly.  You’ve got the makings of a great and unique SmartPhone platform, and you’re the only company (even considering RIM) that can offer full fidelity Exchange integration, not to mention implementing Office on the device.  Let the existing team finish this puppy and ship it. And then have them pump out a few updates, over-the-air, quickly.  Show them that Google Android’s not the only product that can do good, rapid dot releases. And another thing: make sure your OEMs’ devices have flawless touch screens.  If they don’t, then you shouldn’t certify them for delivery to customers.  Period. Oh, and kill the Kin, quietly.  It was DOA, and you know it.   2. Move Media Center to the Xbox Platform Media Center is, at its core, a good product.  But delivering a media distribution and DVR platform on a sophisticated PC operating system like Windows 7 just creates too many moving parts.  Xbox already functions as the best Media Center extender device – it should actually be the hub as well. Media Center is mostly based on .NET code – and XNA is a .NET environment for Xbox – find a way to bridge that small gap and make Media Center a joy to work with instead of a frustration.  Beating Apple TV out of this sub-market is the lowest hanging fruit on the tree (goofy pun, but it’s true).   3. Integrate Media Center with Mediaroom, or Kill the Latter You have two media products with almost identical names.  One is for standalone DVRs and the other is for IPTV cable set tops with DVR capabilities.  Can we merge these please?  My previous request of putting Media Center on Xbox would seem to tie into this nicely, since you’ve announced plans to do that with Mediaroom already.   4. Fix the Red Ring of Death People love the Xbox, but they really don’t love sending their consoles back every 18-24 months, when they get a bunch of red lights flashing on power up.  You’ve handled this defect about as gracefully as possible, but it’s been around for a long time now and it doesn’t seem to be fixed yet.  You can do better.  In fact, you must do better, or you insult your customers.   5. Add Blu Ray to Xbox I know, streaming movies are the future; physical media is legacy technology.  So if that’s true, why did you back HD DVD so hard?  You know why: for now, the film studios won’t allow a large selection of new release, HD, surround sound content be distributed on any medium other than Blu Ray or cable pay per view/on-demand.  
    Don’t you want home theater buffs to see the Xbox as a fantastic device for their rigs?  Don’t you want to put PlayStation 3 out of its misery?  And if you follow my suggestions above (move Media Center to the Xbox and fix the Red Ring problem), you’d have it all sewn up.  Do I think Blu Ray functionality will move a lot of units?  No.  Do I think it would move more units among desperately needed, influential home theater consumers?  You bet.  And you might sell more ZunePass subscriptions in the process.  But while you’re at it, make the fan quieter, please.   6. Make More of Windows Home Server Home Server is a fantastic product.  And for reasons unknown to me, it seems like you’re letting it languish.  Development of the add-in ecosystem seems underfunded.  WHS’ unparalleled ease of use and reliability for home PC backup (and emergency restores) goes unsung.  Product cycles are slow.  Support for your OEMs, who are doing great work, especially in the green space with Atom CPUs, seems lacking.  You’ve married a trophy girl and you keep her cloistered at home!  That’s cruel, unusual and, um, incredibly ill-advised.  Make use of this ace card, and while you’re at it, give it real integration with Media Center.  The integration thus far is proof-of-concept quality.  You should go way past that – both products will benefit immeasurably.   7. Set Up a Partner Platform for Custom Installers There’s a whole sub-industry of companies that install, integrate and configure home theater, security and connected home products.  They have an industry group.  They are influential in the high end of the consumer electronics industry, and so are their customers.  They love Media Center and they love Windows Home Server.  But I have talked to several of them at the Consumer Electronics Show and they tell me you don’t love them.  They find it very difficult to do business with Microsoft, even though they want nothing more than to sell and evangelize your platform.  This is a travesty.  Please fix it.  Get Allison Watson and the Microsoft Partner Network on board and have her hire someone who knows how to run a channel program for consumer electronics companies.  Problem solved.  Markets expanded.   8. Make Your Own Hardware In other areas, I know you love your partners.  I help run one, so I appreciate that.  But when it came to Xbox and Zune, you built them yourself (albeit on a contract basis, which is fine).  Windows Phone 7 has a chance to work as an OEM play, but it would work better if you produced the devices.  At least consider building a reference device that sells alongside your OEMs’ offerings.  That’s what Google did with the Nexus One.  And while that phone was not itself a big seller, it catalyzed two wonderful things: (1) a quality bar was set and (2) partners exceeded it.  Before the Nexus One, the best Android handset out there was the Motorola Droid.  The Nexus One was better, and the HTC Droid Incredible and Evo 4G are now even better than Google’s phone, which is why Verizon and Sprint decided not to carry it.  Imagine if all Windows Phone 6.x devices were on par with the HTC HD2.  I tend to believe you’d have a lot bigger market share than you do now.   9. Continue with Your Retail Initiative From what I hear, it sounds like it’s going well.  And this goes right along with making your own hardware.  When you build it, they will come.  And then it makes the likes of Best Buy and Staples do better.   10. Make an Acquisition (or Two) TiVo and/or Moxi look ripe for the picking.  
With their ability to build stuff people love and your ability to run a business, you might just have something.  But do a better job than you did when you bought Danger.  Buy the ideas, not just the customers, eh?   11. Make Beautiful Stuff You’ve heard this one before, I know.  But I have some head-shrinking advice on this one.  You know that Apple obsesses over its industrial design.  You know that appeals to consumers.  But it seems you think doing so is Apple’s game exclusively and so you shouldn’t even try.  Bull dinky.  Come to New York and visit the Museum of Modern Art’s Architecture and Design gallery.  You’ll see that lots of companies and product categories have had very high design value well before Apple existed.  You can do this, and the Zune HD was a great start.  Now run with that.  Find those negative voices in your head that are telling you that you can’t and shut them up.  For good.   12. Burst the Bubble Some of the products you’ve built seem like they were conceived in a bizarro world.  That would appear to be the result of groupthink.  You must do better.  And there’s lots of people willing to advise you.  This includes just about everyone in the Regional Director program, and probably a bunch of MVPs.  Heck, I bet the guys at Engadget could help out too.  Imagine if you let them see the Kin before it shipped.  Talk to high-end gear consumers.  Talk to Best Buy and CostCo customers too.   Signing Off I hope this was of value to you.  As I wrote this I kept telling myself how obvious, even trite, some of these pieces of advice were and then, because of that, doubting they’d really help.  But I decided that they must not be obvious to Microsoft.  Sometimes when you get wrapped up in stuff, it’s hard to clear your head.  I think my head’s pretty clear here though (I’m wrapped up in other stuff), so maybe my perspective can help.  If not, well, then, I guess they all can’t be super nice.


  • What would cause ANY .NET application to crash immediately... except a project I create and Debug in

    - by blak3r
    My software recently got deployed to a customer who said that the application was crashing immediately after it started. After some initial debugging, the customer provided me with remote access to one of the computers which was unable to run the application. I found that the crash wasn't specific to my application: any application which depended on the .NET Framework crashed immediately. Conveniently, Visual Studio 2008 was installed, so I created a quick hello world application on it and clicked Debug. The application worked fine. But then when I tried to execute the generated binary, /bin/Debug/HelloWorld.exe, outside of Visual Studio, it crashed. List of things I've tried (UPDATED): I checked that "Everyone" has Read&Execute permissions for c:\Windows. To test that the problem was with the .NET Framework (and not my application), I attempted to download Paint.NET onto the computers; the setup frontend crashed in the same manner. Performed a repair of the .NET Framework as outlined in http://support.microsoft.com/kb/908077 (boy was this fun and time consuming). No luck. Installed .NET 3.5 SP1 (before, it just had .NET 3.5). Note: my application targets 2.0 so I did this more as a long shot... but I learned in the process that .NET 3.5 SP1 also updates the underlying frameworks. Ran Aaron Stebner's .NET Setup Verification Tool, which indicated that .NET was successfully installed (I forget if I checked all the versions, but at least 2.0 worked). Tested some mini hello world applications which were targeted for .NET 2.0 and .NET 3.5, and both crashed in the same way. Tried launching .NET apps via the WinDbg command line. Doing this did allow me to invoke my simple hello world applications. So a simple .NET hello world works when invoked by WinDbg or when launched via Debug in Visual Studio... but doesn't if I try to execute it standalone. I created a dump file using WinDbg. It wasn't all that revealing to me:
    FAULTING_IP: mscorwks!PEImage::GetEntryPointToken+21
    79f4ff9d f6401010 test byte ptr [eax+10h],10h
    EXCEPTION_RECORD: 0012f710 -- (.exr 0x12f710)
    ExceptionAddress: 79f4ff9d (mscorwks!PEImage::GetEntryPointToken+0x00000021)
    ExceptionCode: c0000005 (Access violation)
    ExceptionFlags: 00000000
    NumberParameters: 2
    Parameter[0]: 00000000
    Parameter[1]: 00000010
    Attempt to read from address 00000010
    FAULTING_THREAD: 00000b44
    PROCESS_NAME: MyProcess.exe
    ERROR_CODE: (NTSTATUS) 0x80000003 - {EXCEPTION} Breakpoint A breakpoint has been reached.
    EXCEPTION_CODE: (HRESULT) 0x80000003 (2147483651) - One or more arguments are invalid
    DETOURED_IMAGE: 1
    NTGLOBALFLAG: 0
    APPLICATION_VERIFIER_FLAGS: 0
    MANAGED_STACK: !dumpstack -EE
    OS Thread Id: 0xb44 (0)
    Current frame:
    ChildEBP RetAddr Caller,Callee
    EXCEPTION_OBJECT: !pe cb10b4
    Exception object: 00cb10b4
    Exception type: System.ExecutionEngineException
    Message: <none>
    InnerException: <none>
    StackTrace (generated): <none>
    StackTraceString: <none>
    HResult: 80131506
    MANAGED_OBJECT_NAME: System.ExecutionEngineException
    CONTEXT: 0012f72c -- (.cxr 0x12f72c)
    eax=00000000 ebx=00000000 ecx=00000000 edx=0000000e esi=001a1490 edi=00000001
    eip=79f4ff9d esp=0012f9f8 ebp=0012fa1c iopl=0 nv up ei pl zr na pe nc
    cs=001b ss=0023 ds=0023 es=0023 fs=003b gs=0000 efl=00010246
    mscorwks!PEImage::GetEntryPointToken+0x21:
    79f4ff9d f6401010 test byte ptr [eax+10h],10h ds:0023:00000010=??
    Resetting default scope
    READ_ADDRESS: 00000010
    FOLLOWUP_IP: mscorwks!PEImage::GetEntryPointToken+21
    79f4ff9d f6401010 test byte ptr [eax+10h],10h
    BUGCHECK_STR: APPLICATION_FAULT_NULL_CLASS_PTR_DEREFERENCE_SHUTDOWN
    PRIMARY_PROBLEM_CLASS: NULL_CLASS_PTR_DEREFERENCE_SHUTDOWN
    DEFAULT_BUCKET_ID: NULL_CLASS_PTR_DEREFERENCE_SHUTDOWN
    LAST_CONTROL_TRANSFER: from 79ef02b5 to 79f4ff9d
    STACK_TEXT:
    79f4ff9d mscorwks!PEImage::GetEntryPointToken+0x21
    79ef02b5 mscorwks!PEFile::GetEntryPointToken+0xa0
    79eefeaf mscorwks!SystemDomain::ExecuteMainMethod+0xd4
    79fb9793 mscorwks!ExecuteEXE+0x59
    79fb96df mscorwks!_CorExeMain+0x15c
    7900b1b3 mscoree!_CorExeMain+0x2c
    7c817077 kernel32!BaseProcessStart+0x23
    SYMBOL_STACK_INDEX: 0
    SYMBOL_NAME: mscorwks!PEImage::GetEntryPointToken+21
    FOLLOWUP_NAME: MachineOwner
    MODULE_NAME: mscorwks
    IMAGE_NAME: mscorwks.dll
    DEBUG_FLR_IMAGE_TIMESTAMP: 471ef729
    STACK_COMMAND: .cxr 0012F72C ; kb ; dds 12f9f8 ; kb
    FAILURE_BUCKET_ID: NULL_CLASS_PTR_DEREFERENCE_SHUTDOWN_80000003_mscorwks.dll!PEImage::GetEntryPointToken
    BUCKET_ID: APPLICATION_FAULT_NULL_CLASS_PTR_DEREFERENCE_SHUTDOWN_DETOURED_mscorwks!PEImage::GetEntryPointToken+21
    WATSON_STAGEONE_URL: http://watson.microsoft.com/StageOne/MyProcess_exe/2_4_4_39/4a8a192c/unknown/0_0_0_0/bbbbbbb4/80000003/00000000.htm?Retriage=1
    Followup: MachineOwner
    Edit 1: The event log details for this error say it's a .NET Runtime version 2.0.50727.3053 - Fatal Execution Engine Error (7A097706)(80131506). Edit 2 (10-7-09): This issue is still active. Edit 3 (3-29-10): This update is to let everyone know that I never did successfully solve the problem. The customer whose machine this was on lost interest in solving it and just reimaged the machine :(. Thanks for all the contributions though.


  • RSS Feeds currently on Simple-Talk

    - by Andrew Clarke
    There are a number of news-feeds for the Simple-Talk site, but for some reason they are well hidden. Whilst we set about reorganizing them, I thought it would be a good idea to list some of the more important ones. The most important one for almost all purposes is the Homepage RSS feed (the Main Site Feed representing the Homepage), which represents the blogs and articles that are placed on the homepage. It is good for most purposes, but it won't always have all the blogs, or it may occasionally miss an article.
    If you aren't interested in all the content, you can just use the RSS feeds that are more relevant to your interests (we'll be increasing these categories soon): the newsfeed for SQL articles, the .NET section newsfeed, the newsfeed for Red Gate books, the newsfeed for Opinion articles, and the SysAdmin section newsfeed.
    If you want a more refined feed, you can pick and choose from the feeds for each category to make up your custom news-feed. In the SQL section: SQL Training, Learn SQL Server, Database Administration, TSQL Programming, SQL Server Performance, Backup and Recovery, SQL Tools, SSIS, SSRS (Reporting Services). In .NET: ASP.NET, Windows Forms, .NET Framework, .NET Performance, Visual Studio, .NET tools. In Sysadmin: Exchange, General, Virtualisation, Unified Messaging, Powershell. In Opinion: Geek of the Week, Opinion Pieces. In Books: .NET Books, SQL Books, SysAdmin Books.
    All the blogs have feeds too. Although you can get all the blogs from the Main Blog Feed, you can also get individual RSS feeds: AdamRG's Blog, Alex.Davies's Blog, AliceE's Blog, Andrew Clarke's Blog, Andrew Hunter's Blog, Bart Read's Blog, Ben Adderson's Blog, BobCram's Blog, bradmcgehee's Blog, Brian Donahue's Blog, Charles Brown's Blog, Chris Massey's Blog, CliveT's Blog, Damon's Blog, David Atkinson's Blog, David Connell's Blog, Dr Dionysus's Blog, drsql's Blog, FatherJack's Blog, Flibble's Blog, Gareth Marlow's Blog, Helen Joyce's Blog, James's Blog, Jason Crease's Blog, John Magnabosco's Blog, Laila's Blog, Lionel's Blog, Matt Lee's Blog, mikef's Blog, Neil Davidson's Blog, Nigel Morse's Blog, Phil Factor's Blog, red@work's Blog, reka.burmeister's Blog, Richard Mitchell's Blog, RobbieT's Blog, RobertChipperfield's Blog, Rodney's Blog, Roger Hart's Blog, Simon Cooper's Blog, Simon Galbraith's Blog, TheFutureOfMonitoring's Blog, Tim Ford's Blog, Tom Crossman's Blog, Tony Davis's Blog.
    As well as these blogs, you also have the forums: SQL Server for Beginners Forum, Programming SQL Server Forum, Administering SQL Server Forum, .NET Framework Forum, Windows Forms Forum, ASP.NET Forum, ADO.NET Forum.
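    If you would rather merge a handful of these category feeds into a single custom reading list yourself, a short script can do it. The sketch below uses the Python feedparser library; the two feed URLs are placeholders only, not taken from the article, so substitute the real Simple-Talk feed addresses behind the links above.

        import time
        import feedparser  # pip install feedparser

        # Placeholder URLs -- replace with the actual Simple-Talk feed addresses you want to merge.
        FEEDS = [
            "https://www.simple-talk.com/feed/",
            "https://www.simple-talk.com/sql/feed/",
        ]

        def merged_headlines(feed_urls):
            """Fetch each feed and return (timestamp, title, link) tuples, newest first."""
            items = []
            for url in feed_urls:
                for entry in feedparser.parse(url).entries:
                    # published_parsed is a time.struct_time, which sorts chronologically
                    when = entry.get("published_parsed") or time.gmtime(0)
                    items.append((when, entry.get("title", "(untitled)"), entry.get("link", "")))
            return sorted(items, reverse=True)

        if __name__ == "__main__":
            for when, title, link in merged_headlines(FEEDS):
                print(time.strftime("%Y-%m-%d", when), title, link)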


  • Smarter, Faster, Cheaper: The Insurance Industry’s Dream

    - by Jenna Danko
    On June 3rd, I saw the Gaylord Resort Centre in Washington D.C. become the hub of C-level executives and managers of insurance carriers for the IASA 2013 Conference.  Insurance Accounting/Regulation and Technology sessions took the focus, but there were plenty of tertiary sessions for career development, which complemented the overall strong networking side of the conference.  As an exhibitor, Oracle, along with several hundred other product providers, welcomed the opportunity to display and demonstrate our solutions, and we were encouraged by the hustle and bustle of the exhibition floor.  The IASA organizers had pre-arranged fast-track tours whereby interested conference delegates could sign up for a series of like-themed presentations from vendors, giving them a level of 'Speed Dating' introductions to possible solutions and services.  Oracle participated in a number of these, which were very well subscribed.  Clearly, the conference had a strong business focus; however, attendees saw technology as a key enabler to get their processes done smarter, faster and cheaper.  As we navigated through the exhibition, it became clear from the inquiries that came to us that insurance carriers are gravitating to a number of focus areas:
    Navigating the maze of upcoming regulatory reporting changes. For US carriers with European holdings, Solvency II carries a myriad of rules and reporting requirements. Alignment across the globe of the Own Risk and Solvency Assessment (ORSA) processes brings to the fore the National Association of Insurance Commissioners' (NAIC) recent guidance manual publication.
    Doing more with less, and certainly expecting more from technology for fewer dollars. The overall cost of IT, in particular hardware, has dropped in real terms (though the appetite for more has risen: more CPU, more RAM, more storage), but software has seen less change. Clearly, customers expect either to pay less or to get a lot more from their software solutions for the same buck.
    Doing things smarter – a recognition that, with the advance of technology, to stand still is effectively to go backwards. Technology, and in particular its interaction with human business processes, has undergone incredible change over the past 5 years. Consumer usage (iPhones, etc.) has been at the forefront, but now, at the enterprise level, ever more effective technology exploitation is beginning to take place. Data, and in particular the knowledge gleaned from data, is refining and improving business processes.  Organizations are now consuming more data than ever before, and it is set to grow exponentially for some time to come.  Amassing large volumes of data is one thing, but effectively analyzing that data is another.  It is the results of such analysis that lead to improvements both in terms of insurance product offerings and the processes to support them.
    Regulatory compliance – damned if you do and damned if you don’t! Clearly, around the globe a lot is changing from a regulatory perspective, and it is evident that in terms of regulatory requirements, whilst there is a greater convergence across jurisdictions bringing uniformity, there is also a lot of work to be done in the next 5 years. Just like big data, hidden behind effective regulatory compliance there often lie golden nuggets that can give competitive advantages. 
From Oracle's perspective, our Rating Engine, Billing, Document Management and Insurance Analytics solutions on display served to strike up good conversations and, as is always the case at conferences, it was a great opportunity to meet and speak with existing Oracle customers that we might not have otherwise caught up with for a while. Fortunately, I was able to catch up on a few sessions at the close of the Exhibition.  The speaker quality was high and the audience asked challenging, but pertinent, questions.  During Dr. Jackie Freiberg’s keynote “Bye Bye Business as Usual,” the author discussed 8 strategies to help leaders create a culture where teams consistently deliver innovative ideas by disrupting the status quo.  The very first strategy: Get wired for innovation.  Freiberg admitted that folks in the insurance and financial services industry understand and know innovation is important, but oftentimes they are slow adopters.  Today, technology and innovation go hand in hand. In speaking to delegates during and after the conference, a high degree of satisfaction could be measured from their positive comments of speaker sessions and the exhibitors. I suspect many will be back in 2014 with Indianapolis as the conference location. Did you attend the IASA Conference in Washington D.C.?  If so, I would love to hear your comments. Andrew Collins is the Director, Solvency II of Oracle Financial Services. He can be reached at andrew.collins AT oracle.com.


  • PASS Summit 2010 Recap

    - by AjarnMark
    Last week I attended my eighth PASS Summit in nine years, and every year it is a fantastic event!  I was fortunate my first year to have a contact (Bill Graziano (blog | Twitter) from SQLTeam) that I was expecting to meet, and who got me started on a good track of making new contacts.  Each year I have made a few more, and renewed friendships from years past.  Many of the attendees agree that the pure networking opportunities are one of the best benefits of attending the Summit.  And there’s a lot of great technical stuff, too, some of the things that stick out for me this year include… Pre-Con Monday: PowerShell with Allen White (blog | Twitter).  This was the first time that I attended a pre-con.  For those not familiar with the concept, the regular sessions for the conference are 75-90 minutes long.  For an extra fee, you can attend a full-day session on a single topic during a pre- or post-conference training day.  I had been meaning for several months to dive in and learn PowerShell, but just never seemed to find (or make) the time for it, so when I saw this was one of the all-day sessions, and I was planning to be there on Monday anyway, I decided to go for it.  And it was well worth it!  I definitely came out of there with a good foundation to build my own PowerShell scripts, plus several sample scripts that he showed which already cover the first four or five things I was planning to do with PowerShell anyway.  This looks like the right tool for me to build an automated version of our software deployment process, which right now contains many repeated steps.  Thanks Allen! Service Broker with Denny Cherry (blog | Twitter).  I remembered reading Denny’s blog post on Using Service Broker instead of Replication, and ever since then I have been thinking about using this to populate a new reporting-focused Data Repository that we will be building in the near future.  When I saw he was doing this session, I thought it would be great to get more information and be able to ask the author questions.  When I brought this idea back to my boss, he really liked it, as we had previously been discussing doing nightly data loads, with an option to manually trigger a mid-day load if up-to-the-minute data was needed for something.  If we go the Service Broker route, we can keep the Repository current in near real-time.  Hooray! DBA Mythbusters with Paul Randal (blog | Twitter).  Even though I read every one of the posts in Paul’s blog series of the same name, I had to go see the legend in person.  It was great, and I still learned something new! How to Conduct Effective Meetings with Joe Webb (blog | Twitter).  I always like to sit in on a session that Joe does.  I met Joe several years ago when both he and Bill Graziano were on the PASS Board of Directors together, and we have kept in touch.  Joe is very well-spoken and has great experience with both SQL Server and business.  And we could certainly use some pointers at my work (probably yours, too) on making our meetings more effective and to run on-time.  Of course, now that I’m the Chapter Leader for the Professional Development virtual chapter, I also had to sit in on this ProfDev session and recruit Joe to do a presentation or two for the chapter next year. Query Optimization with David DeWitt.  Anyone who has seen Dr. 
David DeWitt present the 3rd keynote at a PASS Summit over the last three years knows what a great time it is to sit and listen to him make some really complicated and advanced topic easy to understand (although it still makes your head hurt).  It still amazes me that the simple two-table join query from pubs that he used in his example can possibly have 22 million possible physical query plans.  Ouch! Exhibit Hall:  This year I spent more serious time in the exhibit hall than any year past.  I have talked my boss into making a significant (for us) investment in monitoring tools next year, and this was a great opportunity to talk with all the big-hitters.  Readers of mine may recall that I fell in love with the SQL Sentry Power Suite several months ago and wrote a blog entry about it just from the trial version.  Well as things turned out, short-term budget priorities shifted, and we weren’t able to make the purchase then.  I have it in the budget for next year, but since I was going to the Summit, my boss wanted me to look at the other options to see if this was really the one that we wanted.  I spent a couple of hours talking with representatives from Red-Gate, Idera, Confio, and Quest about their offerings, and giving them each the same 3 scenarios that I wanted to be able to accomplish based on the questions and issues that arise in our company.  It was interesting to discover the different approaches or “world view” that each vendor takes to the subject of performance monitoring and troubleshooting.  I may write a separate article that goes into this in more depth, but the product that best aligned with our point of view, and met the current needs we have is still the SQL Sentry Power Suite.  I’m not saying that the others are bad or wrong or anything like that, just that the way they tackled the issue did not align as well with our particular needs as does SQL Sentry’s product.  And that was something I learned too, when you go shopping for these products, you really need to know what you want to get from them.  It’s best if you have a few example scenarios from work that you can use to test out how well each tool fits your particular needs. Overall, another GREAT event.  I can’t wait to get the DVDs so I can sit in on a bunch of other sessions that I couldn’t get to because I was in one of the ones above.  And I can hardly wait until next year!


  • Advice on learning programming languages and math.

    - by Joris Ooms
    I feel like I'm getting stuck lately when it comes to learning about programming-related things; I thought I'd ask a question here and write it all down in the hope of getting some pointers/advice from people. Perhaps writing it down helps me put things in perspective for myself as well. I study Interactive Multimedia Design. This course is based on two things: graphic design on one hand, and web development on the other. I have quite a decent knowledge of web-related languages (the usual HTML/JS/PHP) and I'll be getting a course on ASP.NET next year. In my free time, I have learnt how to work with CodeIgniter, as well as done some diving into Ruby (and Rails) and basic iOS programming. In my first year of college I also did a class on Java (19/20 on the end result). This grade doesn't really mean anything though; I have the basics of OOP down but Java-wise, we learnt next to nothing. Considering the time I have been programming in, for example, PHP... I can't say I'm bad at it. I'm definitely not good or great at it, but I'm decent. My teachers tell me I have the programming thing down. They just tell me I should keep on learning. So that's what I do, and I try to take in as much as possible; however, sometimes I'm unsure where to start and I have this tendency to always doubt myself. Now, for the 'question'. I want to get into iOS programming. I know iOS programming boils down to programming in Cocoa Touch and Objective-C. I also know Obj-C is a superset of C. I did a class on C a couple of years ago, but I failed miserably. I got stuck at pointers and never really understood them... until like a month ago, when I suddenly 'got' it. I have been working through a book on Objective-C for a week or so now, and I understand the basics (I'm at, like, chapter 6 or so). However, I keep running into similar problems to the ones I had when I did the C class: I suck at math. No, really. I come from a Latin-Modern Languages background in high school and I had nearly no math classes back then. I wanted to study Computer Science, but I failed there because of the miserable state of my mathematics knowledge. I can't explain why I'm suddenly talking about math here though, because it isn't directly related to programming... yet it is. For example, the examples in the book I'm reading now are about programming a fraction calculator. All good, I can do the programming when I get the formulas down... but it takes me a full day or more to actually get to that point. I also find it hard to come up with ideas for myself. I made one small iOS app the other day and it's just a button / label kind of thing. When I press the button, it generates a random number. That's really all I could come up with. Can you 'learn' that? It probably comes down to creativity, but evidently, I'm not too great at being creative. Are there any sites or resources out there that provide something like a basic list of things you can program when you're just starting out? Maybe I'm focusing on too many things at once. I want to keep my HTML/CSS at a decent level, while learning PHP and CodeIgniter, while diving into Ruby on Rails and learning Objective-C and the iOS SDK at the same time. I just want to be good at something, I guess. The problem is that I can't seem to be happy with my PHP stuff. I want more, something 'harder'; that's why I decided to pick up the iOS thing. Like I said, I have the basics of a lot of different languages down. I can program something simple in Java, in C, in Objective-C as of this week... but it ends there. 
Mostly because I can't come up with ideas for more complex applications, and also because I just doubt myself: 'Oh, that's too complex, I can never do that'. And then it ends there. To conclude my rant, let me basically rephrase my questions into a 'tl;dr' part. A. I want to get into iOS programming and I have basic knowledge of C/Objective-C. However, I struggle to come up with ideas of my own and implement them and I also suck at math which is something that isn't directly related to, yet often needed while programming. What can I do? B. I have an interest in a lot of different programming languages and I can't stop reading/learning. However, I don't feel like I'm good in anything. Should I perhaps focus on just one language for a year or longer, or keep taking it all in at the same time and hope I'll finally get them all down? C. Are there any resources out there that provide basic ideas of things I can program? I'm thinking about 'simple' command-line applications here to help me while studying C/Obj-C away from the whole iPhone SDK. Like I said, the examples in my book are mainly math-based (fraction calculator) and it's kinda hard. :( Thanks a lot for reading my post. I didn't plan it to be this long but oh well. Thanks in advance for any answers.


  • Use WLST to Delete All JMS Messages From a Destination

    - by james.bayer
    I got a question today about whether WebLogic Server has any tools to delete all messages from a JMS Queue.  It just so happens that the WLS Console has this capability already.  It’s available on the screen after the “Show Messages” button is clicked on a destination’s Monitoring tab as seen in the screen shot below. The console is great for something ad-hoc, but what if I want to automate this?  Well it just so happens that the console is just a WebLogic application layered on top of the JMX Management interface.  If you look at the MBean Reference, you’ll find a JMSDestinationRuntimeMBean that includes the operation deleteMessages that takes a JMS Message Selector as an argument.  If you pass an empty string, that is essentially a wild card that matches all messages. Coding a stand-alone JMX client for this is kind of lame, so let’s do something more suitable to scripting.  In addition to the console, WebLogic Scripting Tool (WLST) based on Jython is another way to browse and invoke MBeans, so an equivalent interactive shell session to delete messages from a destination would look like this: D:\Oracle\fmw11gr1ps3\user_projects\domains\hotspot_domain\bin>setDomainEnv.cmd D:\Oracle\fmw11gr1ps3\user_projects\domains\hotspot_domain>java weblogic.WLST   Initializing WebLogic Scripting Tool (WLST) ...   Welcome to WebLogic Server Administration Scripting Shell   Type help() for help on available commands   wls:/offline> connect('weblogic','welcome1','t3://localhost:7001') Connecting to t3://localhost:7001 with userid weblogic ... Successfully connected to Admin Server 'AdminServer' that belongs to domain 'hotspot_domain'.   Warning: An insecure protocol was used to connect to the server. To ensure on-the-wire security, the SSL port or Admin port should be used instead.   wls:/hotspot_domain/serverConfig> serverRuntime() Location changed to serverRuntime tree. This is a read-only tree with ServerRuntimeMBean as the root. 
For more help, use help(serverRuntime)   wls:/hotspot_domain/serverRuntime> cd('JMSRuntime/AdminServer.jms/JMSServers/JMSServer-0/Destinations/SystemModule-0!Queue-0') wls:/hotspot_domain/serverRuntime/JMSRuntime/AdminServer.jms/JMSServers/JMSServer-0/Destinations/SystemModule-0!Queue-0> ls() dr-- DurableSubscribers   -r-- BytesCurrentCount 0 -r-- BytesHighCount 174620 -r-- BytesPendingCount 0 -r-- BytesReceivedCount 253548 -r-- BytesThresholdTime 0 -r-- ConsumersCurrentCount 0 -r-- ConsumersHighCount 0 -r-- ConsumersTotalCount 0 -r-- ConsumptionPaused false -r-- ConsumptionPausedState Consumption-Enabled -r-- DestinationInfo javax.management.openmbean.CompositeDataSupport(compositeType=javax.management.openmbean.CompositeType(name=DestinationInfo,items=((itemName=ApplicationName,itemType=javax.management.openmbean.SimpleType(name=java.lang.String)),(itemName=ModuleName,itemType=javax.management.openmbean.SimpleType(name=java.lang.String)),(itemName openmbean.SimpleType(name=java.lang.Boolean)),(itemName=SerializedDestination,itemType=javax.management.openmbean.SimpleType(name=java.lang.String)),(itemName=ServerName,itemType=javax.management.openmbean.SimpleType(name=java.lang.String)),(itemName=Topic,itemType=javax.management.openmbean.SimpleType(name=java.lang.Boolean)),(itemName=VersionNumber,itemType=javax.management.op ule-0!Queue-0, Queue=true, SerializedDestination=rO0ABXNyACN3ZWJsb2dpYy5qbXMuY29tbW9uLkRlc3RpbmF0aW9uSW1wbFSmyJ1qZfv8DAAAeHB3kLZBABZTeXN0ZW1Nb2R1bGUtMCFRdWV1ZS0wAAtKTVNTZXJ2ZXItMAAOU3lzdGVtTW9kdWxlLTABAANBbGwCAlb6IS6T5qL/AAAACgEAC0FkbWluU2VydmVyAC2EGgJW+iEuk+ai/wAAAAsBAAtBZG1pblNlcnZlcgAthBoAAQAQX1dMU19BZG1pblNlcnZlcng=, ServerName=JMSServer-0, Topic=false, VersionNumber=1}) -r-- DestinationType Queue -r-- DurableSubscribers null -r-- InsertionPaused false -r-- InsertionPausedState Insertion-Enabled -r-- MessagesCurrentCount 0 -r-- MessagesDeletedCurrentCount 3 -r-- MessagesHighCount 2 -r-- MessagesMovedCurrentCount 0 -r-- MessagesPendingCount 0 -r-- MessagesReceivedCount 3 -r-- MessagesThresholdTime 0 -r-- Name SystemModule-0!Queue-0 -r-- Paused false -r-- ProductionPaused false -r-- ProductionPausedState Production-Enabled -r-- State advertised_in_cluster_jndi -r-- Type JMSDestinationRuntime   -r-x closeCursor Void : String(cursorHandle) -r-x deleteMessages Integer : String(selector) -r-x getCursorEndPosition Long : String(cursorHandle) -r-x getCursorSize Long : String(cursorHandle) -r-x getCursorStartPosition Long : String(cursorHandle) -r-x getItems javax.management.openmbean.CompositeData[] : String(cursorHandle),Long(start),Integer(count) -r-x getMessage javax.management.openmbean.CompositeData : String(cursorHandle),Long(messageHandle) -r-x getMessage javax.management.openmbean.CompositeData : String(cursorHandle),String(messageID) -r-x getMessage javax.management.openmbean.CompositeData : String(messageID) -r-x getMessages String : String(selector),Integer(timeout) -r-x getMessages String : String(selector),Integer(timeout),Integer(state) -r-x getNext javax.management.openmbean.CompositeData[] : String(cursorHandle),Integer(count) -r-x getPrevious javax.management.openmbean.CompositeData[] : String(cursorHandle),Integer(count) -r-x importMessages Void : javax.management.openmbean.CompositeData[],Boolean(replaceOnly) -r-x moveMessages Integer : String(java.lang.String),javax.management.openmbean.CompositeData,Integer(java.lang.Integer) -r-x moveMessages Integer : String(selector),javax.management.openmbean.CompositeData -r-x pause Void : -r-x pauseConsumption 
Void : -r-x pauseInsertion Void : -r-x pauseProduction Void : -r-x preDeregister Void : -r-x resume Void : -r-x resumeConsumption Void : -r-x resumeInsertion Void : -r-x resumeProduction Void : -r-x sort Long : String(cursorHandle),Long(start),String[](fields),Boolean[](ascending)   wls:/hotspot_domain/serverRuntime/JMSRuntime/AdminServer.jms/JMSServers/JMSServer-0/Destinations/SystemModule-0!Queue-0> cmo.deleteMessages('') 2 where the domain name is “hotspot_domain”, the JMS Server name is “JMSServer-0”, the Queue name is “Queue-0” and the System Module is named “SystemModule-0”.  To invoke the operation, I use the “cmo” object, which is the “Current Management Object” that represents the currently navigated to MBean.  The 2 indicates that two messages were deleted.  Combining this WLST code with a recent post by my colleague Steve that shows you how to use an encrypted file to store the authentication credentials, you could easily turn this into a secure automated script.  If you need help with that step, a long while back I blogged about some WLST basics.  Happy scripting.
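    For repeated use, the same steps can be collapsed into a small non-interactive WLST (Jython) script. This is only a sketch based on the session above; the URL, credentials and the JMS server/module/queue names are the ones from this example and would need to be changed for a real environment.

        # purge_queue.py -- run with: java weblogic.WLST purge_queue.py
        # Sketch only: connection details and the MBean path are taken from the example above.

        url      = 't3://localhost:7001'
        username = 'weblogic'
        password = 'welcome1'
        destPath = 'JMSRuntime/AdminServer.jms/JMSServers/JMSServer-0/Destinations/SystemModule-0!Queue-0'

        connect(username, password, url)     # WLST built-in: connect to the Admin Server
        serverRuntime()                      # switch to the read-only runtime MBean tree
        cd(destPath)                         # navigate to the destination's runtime MBean
        deleted = cmo.deleteMessages('')     # empty selector = wildcard, deletes every message
        print 'Deleted %s message(s) from %s' % (deleted, destPath)
        disconnect()
        exit()

    As mentioned above, storing the credentials in an encrypted file rather than hard-coding them would make this safe to schedule.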


  • Scripting out Contained Database Users

    - by Argenis
      Today’s blog post comes from a Twitter thread on which @SQLSoldier, @sqlstudent144 and @SQLTaiob were discussing the internals of contained database users. Unless you have been living under a rock, you’ve heard about the concept of contained users within a SQL Server database (hit the link if you have not). In this article I’d like to show you that you can, indeed, script out contained database users and recreate them on another database, as either contained users or as good old fashioned logins/server principals as well. Why would this be useful? Well, because you would not need to know the password for the user in order to recreate it on another instance. I know there is a limited number of scenarios where this would be necessary, but nonetheless I figured I’d throw this blog post to show how it can be done. A more obscure use case: with the password hash (which I’m about to show you how to obtain) you could also crack the password using a utility like hashcat, as highlighted on this SQLServerCentral article. The Investigation SQL Server uses System Base Tables to save the password hashes of logins and contained database users. For logins it uses sys.sysxlgns, whereas for contained database users it leverages sys.sysowners. I’ll show you what I do to figure this stuff out: I create a login/contained user, and then I immediately browse the transaction log with, for example, fn_dblog. It’s pretty obvious that only two base tables touched by the operation are sys.sysxlgns, and also sys.sysprivs – the latter is used to track permissions. If I connect to the DAC on my instance, I can query for the password hash of this login I’ve just created. A few interesting things about this hash. This was taken on my laptop, and I happen to be running SQL Server 2014 RTM CU2, which is the latest public build of SQL Server 2014 as of time of writing. In 2008 R2 and prior versions (back to 2000), the password hashes would start with 0x0100. The reason why this changed is because starting with SQL Server 2012 password hashes are kept using a SHA512 algorithm, as opposed to SHA-1 (used since 2000) or Snefru (used in 6.5 and 7.0). SHA-1 is nowadays deemed unsafe and is very easy to crack. For regular SQL logins, this information is exposed through the sys.sql_logins catalog view, so there is really no need to connect to the DAC to grab an SID/password hash pair. For contained database users, there is (currently) no method of obtaining SID or password hashes without connecting to the DAC. If we create a contained database user, this is what we get from the transaction log: Note that the System Base Table used in this case is sys.sysowners. sys.sysprivs is used as well, and again this is to track permissions. To query sys.sysowners, you would have to connect to the DAC, as I mentioned previously. And this is what you would get: There are other ways to figure out what SQL Server uses under the hood to store contained database user password hashes, like looking at the execution plan for a query to sys.dm_db_uncontained_entities (Thanks, Robert Davis!) SIDs, Logins, Contained Users, and Why You Care…Or Not. One of the reasons behind the existence of Contained Users was the concept of portability of databases: it is really painful to maintain Server Principals (Logins) synced across most shared-nothing SQL Server HA/DR technologies (Mirroring, Availability Groups, and Log Shipping). 
    Oftentimes you would need the Security Identifier (SID) of these logins to match across instances, and that meant that you had to fetch whatever SID was assigned to the login on the principal instance so you could recreate it on a secondary. With contained users you normally wouldn’t care about SIDs, as the users are always available (and synced, as long as synchronization takes place) across instances. Now you might be presented with a particular requirement that SIDs be synced between logins on certain instances and contained database users on other databases. How would you go about creating a contained database user with a specific SID? The answer is that you can’t do it directly, but there’s a little trick that would allow you to do it. Create a login with a specified SID and password hash, create a user for that server principal on a partially contained database, then migrate that user to contained using the system stored procedure sp_migrate_user_to_contained, then drop the login.
    CREATE LOGIN <login_name> WITH PASSWORD = <password_hash> HASHED, SID = <sid>;
    GO
    USE <partially_contained_db>;
    GO
    CREATE USER <user_name> FROM LOGIN <login_name>;
    GO
    EXEC sp_migrate_user_to_contained @username = <user_name>, @rename = N'keep_name', @disablelogin = N'disable_login';
    GO
    DROP LOGIN <login_name>;
    GO
    Here’s how this skeleton looks in action: And now I have a contained user with a specified SID and password hash. In my example above, I renamed the user after migrating it to contained so that it is, hopefully, easier to understand. Enjoy!


  • The Virtues and Challenges of Implementing Basel III: What Every CFO and CRO Needs To Know

    - by Jenna Danko
    The Basel Committee on Banking Supervision (BCBS) is a group tasked with providing thought-leadership to the global banking industry.  Over the years, the BCBS has released volumes of guidance in an effort to promote stability within the financial sector.  By effectively communicating best-practices, the Basel Committee has influenced financial regulations worldwide.  Basel regulations are intended to help banks: More easily absorb shocks due to various forms of financial-economic stress Improve risk management and governance Enhance regulatory reporting and transparency In June 2011, the BCBS released Basel III: A global regulatory framework for more resilient banks and banking systems.  This new set of regulations included many enhancements to previous rules and will have both short and long term impacts on the banking industry.  Some of the key features of Basel III include: A stronger capital base More stringent capital standards and higher capital requirements Introduction of capital buffers  Additional risk coverage Enhanced quantification of counterparty credit risk Credit valuation adjustments  Wrong  way risk  Asset Value Correlation Multiplier for large financial institutions Liquidity management and monitoring Introduction of leverage ratio Even more rigorous data requirements To implement these features banks need to embark on a journey replete with challenges. These can be categorized into three key areas: Data, Models and Compliance. Data Challenges Data quality - All standard dimensions of Data Quality (DQ) have to be demonstrated.  Manual approaches are now considered too cumbersome and automation has become the norm. Data lineage - Data lineage has to be documented and demonstrated.  The PPT / Excel approach to documentation is being replaced by metadata tools.  Data lineage has become dynamic due to a variety of factors, making static documentation out-dated quickly.  Data dictionaries - A strong and clean business glossary is needed with proper identification of business owners for the data.  Data integrity - A strong, scalable architecture with work flow tools helps demonstrate data integrity.  Manual touch points have to be minimized.   Data relevance/coverage - Data must be relevant to all portfolios and storage devices must allow for sufficient data retention.  Coverage of both on and off balance sheet exposures is critical.   Model Challenges Model development - Requires highly trained resources with both quantitative and subject matter expertise. Model validation - All Basel models need to be validated. This requires additional resources with skills that may not be readily available in the marketplace.  Model documentation - All models need to be adequately documented.  Creation of document templates and model development processes/procedures is key. Risk and finance integration - This integration is necessary for Basel as the Allowance for Loan and Lease Losses (ALLL) is calculated by Finance, yet Expected Loss (EL) is calculated by Risk Management – and they need to somehow be equal.  This is tricky at best from an implementation perspective.  Compliance Challenges Rules interpretation - Some Basel III requirements leave room for interpretation.  A misinterpretation of regulations can lead to delays in Basel compliance and undesired reprimands from supervisory authorities. Gap identification and remediation - Internal identification and remediation of gaps ensures smoother Basel compliance and audit processes.  
However business lines are challenged by the competing priorities which arise from regulatory compliance and business as usual work.  Qualification readiness - Providing internal and external auditors with robust evidence of a thorough examination of the readiness to proceed to parallel run and Basel qualification  In light of new regulations like Basel III and local variations such as the Dodd Frank Act (DFA) and Comprehensive Capital Analysis and Review (CCAR) in the US, banks are now forced to ask themselves many difficult questions.  For example, executives must consider: How will Basel III play into their Risk Appetite? How will they create project plans for Basel III when they haven’t yet finished implementing Basel II? How will new regulations impact capital structure including profitability and capital distributions to shareholders? After all, new regulations often lead to diminished profitability as well as an assortment of implementation problems as we discussed earlier in this note.  However, by requiring banks to focus on premium growth, regulators increase the potential for long-term profitability and sustainability.  And a more stable banking system: Increases consumer confidence which in turn supports banking activity  Ensures that adequate funding is available for individuals and companies Puts regulators at ease, allowing bankers to focus on banking Stability is intended to bring long-term profitability to banks.  Therefore, it is important that every banking institution takes the steps necessary to properly manage, monitor and disclose its risks.  This can be done with the assistance and oversight of an independent regulatory authority.  A spectrum of banks exist today wherein some continue to debate and negotiate with regulators over the implementation of new requirements, while others are simply choosing to embrace them for the benefits I highlighted above. Do share with me how your institution is coping with and embracing these new regulations within your bank. Dr. Varun Agarwal is a Principal in the Banking Practice for Capgemini Financial Services.  He has over 19 years experience in areas that span from enterprise risk management, credit, market, and to country risk management; financial modeling and valuation; and international financial markets research and analyses.


  • Declarative Architectures in Infrastructure as a Service (IaaS)

    - by BuckWoody
    I deal with computing architectures by first laying out requirements, and then laying in any constraints for its success. Only then do I bring in computing elements to apply to the system. As an example, a requirement might be "world-wide availability" and a constraint might be "with less than 80ms response time and full HA" or something similar. Then I can choose from the best fit of technologies, which range from full-up on-premises computing to IaaS, PaaS or SaaS. I also deal in abstraction layers - on-premises systems are fully under your control; in IaaS the hardware is abstracted (but not the OS, scale, runtimes and so on); in PaaS the hardware and the OS are abstracted and you focus on code and data only; and in SaaS everything is abstracted - you merely purchase the function you want (like an e-mail server or some such) and simply use it. When you think about solutions this way, the architecture becomes the primary factor in your decision. It's problem-first architecting, and then laying in whatever technology or vendor best fixes the problem. To that end, most architects design a solution using a graphical tool (I use Visio) and then create documents that let the rest of the team (and business) know what is required. It's the template, or recipe, for the solution. This is extremely easy to do for SaaS - you merely point out what the needs are, research the vendor and present the findings (and bill) to the business. IT might not even be involved there. In PaaS it's not much more complicated - you use the same Application Lifecycle Management and design tools you always have for code, such as Visual Studio or some other process and toolset, and you can "stamp out" the application in multiple locations, update it and so on. IaaS is another story. Here you have multiple machines, operating systems, patches, virus scanning, run-times, scale-patterns and tools and much more that you have to deal with, since essentially it's just an in-house system being hosted by someone else. You can certainly automate builds of servers - we do this as technical professionals every day. From Windows to Linux, it's simple enough to create a "build script" that makes a system just like the one we made yesterday. What is more problematic is being able to tie those systems together in a coherent way (as a solution) and then stamp that out repeatedly, especially when you might want to deploy that solution on-premises, or in one cloud vendor or another. Lately I've been working with a company called RightScale that does exactly this. I'll point you to their site for more info, but the general idea is that you document out your intent for a set of servers, and it will deploy them to on-premises clouds, Windows Azure, and other cloud providers all from the same script. In other words, it doesn't contain the images or anything like that - it contains the scripts to build them on-premises or on a cloud vendor like Microsoft. Using a tool like this, you combine the steps of designing a system (all the way down to passwords and accounts if you wish), and then the document drives the distribution and implementation of that intent. As time goes on and more and more companies implement solutions on various providers (perhaps for HA and DR), this becomes a compelling investigation. The RightScale information is here, if you want to investigate it further. Yes, there are other methods I've found, but most are tied to a single kind of cloud, and I'm not into vendor lock-in. 
    Poppa Bear Level - Hands-on: Evaluate RightScale at no cost. Just bring your Windows Azure credentials and follow these tutorials:
    - Sign Up for Windows Azure
    - Add Windows Azure to a RightScale Account
    - Windows Azure Virtual Machines 3-tier Deployment
    Momma Bear Level - Just the Right level... ;0)
    - Windows Azure Evaluation Guide - if you are new to Windows Azure Virtual Machines and new to RightScale, we recommend that you read the entire evaluation guide to gain a more complete understanding of the Windows Azure + RightScale solution.
    - Windows Azure Support Page @ support.rightscale.com - FAQs, tutorials, etc. for Windows Azure Virtual Machines (Work in Progress)
    Baby Bear Level - Marketing
    - Windows Azure Page @ www.rightscale.com - find overview information including solution briefs and presentation & demonstration videos
    - Scale and Automate Applications on Windows Azure Solution Brief - how RightScale makes Windows Azure Virtual Machines even better
    - SQL Server on Windows Azure Solution Brief - Run Highly Available SQL Server on Windows Azure Virtual Machines
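    To make the "document your intent, then let a tool stamp it out" idea above a bit more concrete, here is a minimal sketch in TypeScript. It is purely illustrative: the ServerSpec shape, the roles, sizes, and the deploySolution() helper are invented for this example and are not RightScale's (or any vendor's) actual template format or API.

        // Illustrative only: a declarative "intent" for a small solution,
        // plus a stand-in driver that would interpret it per target cloud.
        interface ServerSpec {
          role: string;            // e.g. "web", "app", "db"
          image: string;           // base OS image to build from
          size: string;            // instance size in the target environment
          count: number;           // how many identical servers to stamp out
          buildScripts: string[];  // the "build script" steps mentioned above
        }

        const solution: ServerSpec[] = [
          { role: "web", image: "windows-2008r2", size: "medium", count: 2,
            buildScripts: ["install-iis", "deploy-site"] },
          { role: "db", image: "windows-2008r2", size: "large", count: 1,
            buildScripts: ["install-sql", "restore-schema"] },
        ];

        // The same intent document drives every target; only the driver changes.
        function deploySolution(target: "on-premises" | "azure" | "other",
                                specs: ServerSpec[]): void {
          for (const s of specs) {
            console.log(`[${target}] building ${s.count} x ${s.role} (${s.size}) from ${s.image}`);
            s.buildScripts.forEach(step => console.log(`  running ${step}`));
          }
        }

        deploySolution("azure", solution);

    The point of the sketch is only the shape of the approach: the solution is described once, declaratively, and the knowledge of how to realize it on a given provider lives in the tooling, not in the document.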

    Read the article

  • Silverlight Cream Monday WP7 App Review # 1

    - by Dave Campbell
    I'm going to try something here... if it seems useful, I'll continue, if it doesn't, I'll stop... so give me feedback! There are *lots* of Apps in the WP7 Marketplace, and heaven help me, but the Marketplace sucks for finding stuff. I won't rehash what's already been said in the blogs, but I agree with one and all. I went out last Saturday to find 2 apps that I knew were released, and couldn't do so on my device. Even in the Zune app, it took quite a while to find them... ok, I'll back off a bit, because I just found out I can do 'Search' now if I know the name... I didn't think that was working before. So my thought is on Monday (like today), I will post a review of 5 apps/games I either use or have played with on my device. These are strictly my opinions, you understand, but hey... it's better than a poke in the eye with an iPhone!

    A few disclaimers:
    1. Feel free to write me about your app and tell me about it.
    2. While it would be very cool to receive a whole bunch of xap files to review, at this point, for technical reasons, I'm unable to side-load my device.
    3. Since I plan on only doing this one day a week, and only 5, I may never get caught up, so if you send me some info, be patient.
    4. Re: games ... remember I'm old... I'm from the era of Colossal Cave and Zork. Duke Nukem 2D and Captain Comic were awesome. I don't own an XBOX or any other game system, so take game reviews from my perspective -- who knows, it may be refreshing :)
    5. I won't pay for an app or game just to try it. If you expect me to test-drive your app, it's going to have to have a Free Trial.

    In this Issue:

    Jingo! is the first app I bought, just to see what the experience was like. It's very much like a game we used to play in school in the Army in 1971 on paper we passed around. Sort of a cross between hangman and Mastermind, you try to figure out the hidden word in 5 tries. You get really good at 5-letter words after a while. I like this because you have to think, and you're not pressured by a clock. Jingo! is by James Furdell and is $1.99

    I reviewed René Schulte's Pictures Lab a while back, and have not changed my mind. This is an excellent app for playing with any photo on your device... one you've just taken, one you've synced from your PC, or one you've saved from email. I like this because you can get some cool effects for your photos, and it just works. Pictures Lab is by Schulte Software Development and is $1.99

    Since I work as a consultant, and from home, I wanted something I could track my time with. I've test-driven all the contenders I could find so far on the phone, and so far I like ONTRACK! the best. If asked, I have some suggestions, but it's probably just the way I work or think. What I do like is I can tap a project to start/stop/restart a counter, and at the end of the day it shows me how much time I've been working. If there's a way to make an adjustment in case you forget to tap the counter, I don't know how to do it, and that's my biggest complaint. I like this because you can get a daily readout which you can also email as a spreadsheet. The daily results display is very good. ONTRACK! is by Qmino and is $2.99

    Remember Item 4 above... I've been playing guitar for 48 years... obviously since before the invention of 'tuners', so I'm not as dependent upon these as some folks are. I've tried some in the past and have always felt I can do just as good by ear (I have perfect relative pitch). So, I gave this app by András Velvárt a dance just to see how it works, and it is surprisingly good.
    If you're used to one of the stage tuners this may take a little getting used to, but it does the job. The difference with this one is there is no real 'null' point inside which you can think your guitar is in tune. The soundwave stays visible on the device, and if it's moving to the right, your string is flat; if it's moving to the left, your string is sharp. Getting it exact might be tricky, but it is exact! If you need to rely on a tuner, this is a good choice in my opinion, exactly because of the sensitivity... tune up with this and you're dead-on. Guitar Tuner is by Kinabalu Innovation Limited and is $0.99

    Popper 2 is the WP7 version of a wildly popular game by Bill Reiss named Dr. Popper. You can get a trial, or you can now get a free lite version of the game. Popper 2 is a fast-paced bubble breaker game. I find it something fun to play when I just want to buzz out, but maybe the best review is that my daughter didn't want to give my phone back when I showed it to her, and always wants to grab my phone to play 'that game'. A fun distraction with great graphics and a great price. Popper 2 is by Blue Rose Systems, LLC and is $1.29

    Let me know what you think of the idea of doing reviews, or the layout/whatever, and Stay in the 'Light!

    Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group

    Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Updated Agenda for OTN Architect Day Los Angeles (Oct 25)

    - by Bob Rhubart
    Here's the latest information on the session schedule and content for Oracle Technology Network Architect Day in Los Angeles on October 25, 2012. Registration is open, but seating is limited.

    When: Thursday, October 25, 2012, 8:30am – 5:00pm
    Where: Sofitel Los Angeles, 8555 Beverly Boulevard, Los Angeles, CA 90048

    Agenda

    8:30 am - 9:00 am | Registration and Continental Breakfast
    9:00 am - 9:15 am | Welcome and Opening Comments | Bob Rhubart | Beverly Ballroom
    9:15 am - 10:00 am | Engineered Systems: Oracle's Vision for the Future | Ralf Dossmann | Beverly Ballroom
    Oracle's Exadata and Exalogic are impressive products in their own right. But working in combination they deliver unparalleled transaction processing performance with up to a 30x increase over existing legacy systems, with the lowest cost of ownership over a 3 or 5 year basis than any other hardware. In this session you'll learn how to leverage Oracle's Engineered Systems within your enterprise to deliver record-breaking performance at the lowest TCO.
    10:00 am - 10:30 am | Monitoring and Managing Applications in the Cloud | Basheer Khan | Beverly Ballroom
    Oracle offers a broad portfolio of software and hardware products and services to enable public, private and hybrid clouds to power the enterprise. However, enterprise cloud computing presents new management challenges that need to be addressed to realize the economic benefits of cloud computing. In this session you will learn about the methods and tools you can use to proactively monitor your end-to-end Oracle Applications environment in the cloud, define service-level objectives, gain insight into your end users, and troubleshoot performance problems from a single console.
    10:30 am - 10:45 am | Break
    10:45 am - 11:30 am | Breakout Sessions (pick one)
    - Cloud Computing - Making IT Simple | Dr. James Baty | Beverly Ballroom
    The road to Cloud Computing is not without a few bumps. This session will help to smooth out your journey by tackling some of the potential complications. We'll examine whether standardization is a prerequisite for the Cloud. We'll look at why refactoring isn't just for application code. We'll check out deployable entities and their simplification via higher levels of abstraction. And we'll close out the session with a look at engineered systems and modular clouds.
    - Innovations in Grid Computing with Oracle Coherence | Ashok Aletty | Hollywood Room
    Learn how Oracle Coherence can increase the availability, scalability and performance of your existing applications with its advanced low-latency data-grid technologies. Also hear some interesting industry-specific use cases that customers had implemented and how Oracle is integrating Coherence into its Enterprise Java stack.
    11:30 am - 12:15 pm | Breakout Sessions (pick one)
    - Enterprise Strategy for Cloud Security | Dave Chappelle | Beverly Ballroom
    Security is high on the list of concerns for many organizations as they evaluate their cloud computing options. This session will examine security in the context of the various forms of cloud computing. We'll consider technical and non-technical aspects of security, and discuss several strategies for cloud computing, from both the consumer and producer perspectives.
    - Oracle Enterprise Manager | Perren Walker | Hollywood Room
    This session examines new Oracle Enterprise Manager monitoring, administration, and management features for Oracle Exalogic. It focuses on two management themes: cloud management related to virtualization and applications-to-disk management. For private cloud management, it discusses virtualization management features providing an enhanced set of application deployment capabilities enabling IaaS as well as PaaS interactions. Then from an end-to-end perspective, it covers the specific capabilities and—where applicable—best practices for machine, cloud, middleware, and application administration.
    12:15 pm - 1:15 pm | Lunch | Beverly Ballroom Lounge
    1:15 pm - 2:00 pm | Panel Discussion - Q&A with session speakers | Beverly Ballroom
    2:00 pm - 2:45 pm | Breakout Sessions (pick one)
    - Oracle Cloud Reference Architecture | Anbu Krishnaswamy | Beverly Ballroom
    Cloud initiatives are beginning to dominate enterprise IT roadmaps. Successful adoption of Cloud and the subsequent governance challenges warrant a Cloud reference architecture that is applied consistently across the enterprise. This presentation will answer the important questions: what exactly is a Cloud, why you need it, what changes it will bring to the enterprise, and what the key capabilities of a Cloud infrastructure are - using Oracle's Cloud Reference Architecture, which is part of the IT Strategies from Oracle (ITSO) Cloud Enterprise Technology Strategy (ETS).
    - 21st Century SOA | Jeff Davies | Hollywood Room
    Service Oriented Architecture has evolved from concept to reality in the last decade. The right methodology coupled with mature SOA technologies has helped customers demonstrate success in both innovation and ROI. In this session you will learn how Oracle SOA Suite's orchestration, virtualization, and governance capabilities provide the infrastructure to run mission critical business and system applications. We'll also take a special look at the convergence of SOA & BPM using Oracle's Unified technology stack.
    2:45 pm - 3:00 pm | Break
    3:00 pm - 4:00 pm | Roundtable Discussion | Beverly Ballroom
    4:00 pm - 4:15 pm | Closing Comments & Readouts from Roundtables | Beverly Ballroom
    4:15 pm - 5:00 pm | Networking / Reception | Beverly Ballroom Lounge

    Note: Session schedule and content subject to change.

    Read the article

  • JavaOne 2012 Sunday Strategy Keynote

    - by Janice J. Heiss
    At the Sunday Strategy Keynote, held at the Masonic Auditorium, Hasan Rizvi, EVP, Middleware and Java Development, stated that the theme for this year's JavaOne is: “Make the future Java”-- meaning that Java continues in its role as the most popular, complete, productive, secure, and innovative development platform. But it also means, he qualified, the process by which we make the future Java -- an open, transparent, collaborative, and community-driven evolution. "Many of you have bet your businesses and your careers on Java, and we have bet our business on Java," he said.

    Rizvi detailed the three factors they consider critical to the success of Java--technology innovation, community participation, and Oracle's leadership/stewardship. He offered a scorecard in these three realms over the past year--with OS X and Linux ARM support on Java SE, open sourcing of JavaFX by the end of the year, the release of the Java Embedded Suite 7.0 middleware platform, and multiple releases on the Java EE side. The JCP process continues, with new JSR activity, and JUGs show a 25% increase in participation since last year. Oracle, meanwhile, continues its commitment to both technology and community development/outreach--with four regional JavaOne conferences last year in various parts of the world, as well as the release of Java Magazine, with over 120,000 current subscribers.

    Georges Saab, VP Development, Java SE, next reviewed features of Java SE 7--the first major revision to the platform under Oracle's stewardship, which has included near-monthly update releases offering hundreds of fixes, performance enhancements, and new features. Saab indicated that developers, ISVs, and hosting providers have all been rapid adopters of the platform. He also noted that Oracle's entire Fusion middleware stack is supported on SE 7. The supported platforms for SE 7 have also increased--from Windows, Linux, and Solaris, to OS X, Linux ARM, and the emerging ARM micro-server market. "In the last year, we've added as many new platforms for Java, as were added in the previous decade," said Saab.

    Saab also explored the upcoming JDK 8 release--including Project Lambda, Project Nashorn (a modern implementation of JavaScript running on the JVM), and others. He noted that Nashorn functionality had already been used internally in NetBeans 7.3, and announced that they were planning to contribute the implementation to OpenJDK.

    Nandini Ramani, VP Development, Java Client, ME and Card, discussed the latest news pertaining to JavaFX 2.0--releases on Windows, OS X, and Linux, release of the FX Scene Builder tool, the JavaFX WebView component in NetBeans 7.3, and an OpenJFX project in OpenJDK. Ramani announced, as of Sunday, the availability for download of JavaFX on Linux ARM (developer preview), as well as Scene Builder on Linux. She noted that for next year's JDK 8 release, JavaFX will offer 3D, as well as third-party component integration. Avinder Brar, Senior Software Engineer, Navis, and Dierk König, Canoo Fellow, next took the stage and demonstrated all that JavaFX offers, with a feature-rich, animation-rich, real-time cargo management application that employs Canoo's just open-sourced Dolphin technology.

    Saab also explored Java SE 9 and beyond--Jigsaw modularity, the Penrose Project for interoperability with OSGi, improved multi-tenancy for Java in the cloud, and Project Sumatra.
    Phil Rogers, HSA Foundation President and AMD Corporate Fellow, explored heterogeneous computing platforms that combine the CPU and the parallel processor of the GPU into a single piece of silicon and shared memory—a hardware technology driven by such advanced functionalities as HD video, face recognition, and cloud workloads. Project Sumatra is an OpenJDK project targeted at bringing Java to such heterogeneous platforms--with hardware and software experts working together to modify the JVM for these advanced applications and platforms.

    Ramani next discussed the latest with Java in the embedded space--"the Internet of things" and M2M--declaring this to be "the next IT revolution," with Java as the ideal technology for the ecosystem. Last week, Oracle released Java ME Embedded 3.2 (for micro-controllers and low-power devices), and Java Embedded Suite 7.0 (a middleware stack based on Java SE 7). Axel Hansmann, VP Strategy and Marketing, Cinterion, explored his company's use of Java in M2M, and their new release of EHS5, the world's smallest 3G-capable M2M module, running Java ME Embedded. Hansmann explained that Java offers them the ability to create a "simple to use, scalable, coherent, end-to-end layer" for such diverse edge devices.

    Marc Brule, Chief Financial Officer, Royal Canadian Mint, also explored the fascinating use-case of JavaCard in his country's MintChip e-cash technology--deployable on smartphones, USB device, computer, tablet, or cloud. In parting, Ramani encouraged developers to download the latest releases of Java Embedded, and try them out.

    Cameron Purdy, VP, Fusion Middleware Development and Java EE, summarized the latest developments and announcements in the Enterprise space--greater developer productivity in Java EE 6 (with more on the way in EE 7), portability between platforms, vendors, and even cloud-to-cloud portability. The earliest version of the Java EE 7 SDK is now available for download--in GlassFish 4--with WebSocket support, better JSON support, and more. The final release is scheduled for April of 2013. Nicole Otto, Senior Director, Consumer Digital Technology, Nike, explored her company's Java-technology-driven enterprise ecosystem for all things sports, including the NikeFuel accelerometer wrist band. Looking beyond Java EE 7, Purdy mentioned NoSQL database functionality for EE 8, the concurrency utilities (possibly in EE 7), some of the Avatar projects in EE 7, some in EE 8, multi-tenancy for the cloud, supporting SaaS applications, and more.

    Rizvi ended by introducing Dr. Robert Ballard, oceanographer and National Geographic Explorer in Residence--part of Oracle's philanthropic relationship with the National Geographic Society to fund K-12 education around ocean science and conservation. Ballard is best known for having discovered the wreckage of the Titanic. He offered a fascinating video and overview of the cutting edge technology used in such deep-sea explorations, noting that in his early days, high-bandwidth exploration meant that you'd go down in a submarine and "stick your face up against the window." Now, it's a remotely operated, technology telepresence--"I think of my Hercules vehicle as my equivalent of a Na'vi. When I go beneath the sea, I actually send my spirit." Using high bandwidth satellite links, such amazing explorations can now occur via smartphone, laptop, or whatever platform. Ballard's team regularly offers live feeds and programming out to schools and the world, spanning 188 countries--with embedded educators as part of the expeditions.
    It's technology at its finest, inspiring the next generation of scientists and explorers!

    Read the article

  • Mobile Apps for Oracle E-Business Suite

    - by Carlos Chang
    Crosspost from the mobile apps blog.  TL;DR: Oracle E-Business Suite is now building mobile apps with Oracle Mobile Application Framework (MAF). Believe it! Build iOS and Android apps with one code base and get it done! By Steven Chan (Oracle Development)

    Many things have changed in the mobile space over the last few years. Here's an update on our strategy for mobile apps for the E-Business Suite.

    Mobile app strategy
    We're building our family of mobile apps for the E-Business Suite using Oracle Mobile Application Framework. This framework allows us to write a single application that can be run on Apple iOS and Google Android platforms. Mobile apps for the E-Business Suite will share a common look-and-feel. The E-Business Suite is a suite of over 200 product modules spanning Financials, Supply Chain, Human Resources, and many other areas. Our mobile app strategy is to release standalone apps for specific product modules. Our Oracle Timecards app, which allows users to create and submit timecards, is an example of a standalone app. Some common functions that span multiple product areas will have dedicated apps, too. An example of this is our Oracle Approvals app, which allows users to review and approve requests for expenses, requisitions, purchase orders, recruitment vacancies and offers, and more. You can read more about our Oracle Mobile Approvals app here: Now Available: Oracle Mobile Approvals for iOS. Our goal is to support smaller screens (e.g. smartphones) as well as larger screens (e.g. tablets), with the smaller screen versions generally delivered first. Where possible, we will deliver these as universal apps. An example is our Oracle Mobile Field Service app, which allows field service technicians to remotely access customer, product, service request, and task-related information. This app can run on a smartphone, while providing a richer experience for tablets.

    Deploying EBS mobile apps
    The mobile apps themselves (i.e. client-side components) can be downloaded by end-users from Apple iTunes today. Android versions will be available from Google Play. You can monitor this blog for Android-related updates. Where possible, our mobile apps should be deployable with a minimum of server-side changes. These changes will generally involve a consolidated server-side patch for technology-stack components, and possibly a server-side patch for the functional product module. Updates to existing mobile apps may require new server-side components to enable all of the latest mobile functionality. All EBS product modules are certified for internal intranet deployments (i.e. used by employees within an organization's firewall). Only a subset of EBS products such as iRecruitment are certified to be deployed externally (i.e. used by non-employees outside of an organization's firewall). Today, many organizations running the E-Business Suite do not expose their EBS environment externally, and all of the mobile apps that we're building are intended for internal employee use. Recognizing this, our mobile apps are currently designed for users who are connected to the organization's intranet via VPN. We expect that this may change in future updates to our mobile apps.

    Mobile apps and internationalization
    The initial releases of our mobile apps will be in English. Later updates will include translations for all left-to-right languages supported by the E-Business Suite. Right-to-left languages will not be translated.
    Customizing apps for enterprise deployments
    The current generation of mobile apps for Oracle E-Business Suite cannot be customized. We are evaluating options for limited customizations, including corporate branding with logos, corporate color schemes, and others. This is a potentially-complex area with many tricky implications for deployment and maintenance. We would be interested in hearing your requirements for customizations in enterprise deployments.

    Prerequisites
    - Apple iOS 7 and higher
    - Android 4.1 (API level 16) and higher, with minimum CPU/memory configurations listed here
    - EBS 12.1: EBS 12.1.3 Family Packs for the related product module
    - EBS 12.2.3

    References
    - Oracle E-Business Suite Mobile Apps, Release 12.1 and 12.2 Documentation (Note 1641772.1)
    - Oracle E-Business Suite Mobile Apps Administrator's Guide, Release 12.1 and 12.2 (Note 1642431.1)
    - Follow @OracleMobile on Twitter
    - Oracle Mobile Blog is here.

    Read the article

  • Gamification = -10#/3mo

    - by erikanollwebb
    One of the purposes of gamification of anything is to see if you can modify the behavior of the user. In the enterprise, that might mean getting sales people to enter more information into a CRM system, encouraging employees to update their HR records, motivating people to participate in forums and discussions, or process invoices more quickly.  Wikipedia defines behavior modification as "the traditional term for the use of empirically demonstrated behavior change techniques to increase or decrease the frequency of behaviors, such as altering an individual's behaviors and reactions to stimuli through positive and negative reinforcement of adaptive behavior and/or the reduction of behavior through its extinction, punishment and/or satiation."  Gamification is just a way to modify someone's behavior using game mechanics. And the magic question is always whether it works. So I thought I would present my own little experiment from the last few months.  This spring, I upgraded to a Samsung Galaxy 4.  It's a pretty sweet phone in many ways, but one of the little extras I discovered was a built in app called S Health. S Health is an app that you can use to track calories, weight, exercise and it has a built in pedometer. I looked at it when I got the phone, but assumed you had to turn it on to use it so I didn't look at it much.  But sometime in July, I realized that in fact, it just ran in the background and was quietly tracking my steps, with a goal of 10,000 per day.  10,000 steps per day is this magic number recommended by the Surgeon General and the American Heart Association.  Dr. Oz pushes it as the goal for daily exercise.  It's about 5 miles of walking. I'm generally not the kind of person who always has my phone with me.  I leave it in my purse and pull it out when I need it.  But then I realized that meant I wasn't getting a good measure of my steps.  I decided to do a little experiment, and carry it with me as much as possible for a week.  That's when I discovered the gamification that changed my life over the last 3 months.  When I hit 10,000 steps, the app jingled out a little "success!" tune and I got a badge.  I was hooked.  I started carrying my phone.  I started making sure I had shoes I could walk in with me.  I started walking at lunch time, because I realized how often I sat at my desk for 8-10 hours every day without moving.  I started pestering my husband to walk with me after work because I hadn't hit my 10,000 yet, leading him at one point to say "I'm not as much a slave to that badge as you are!"  I started looking at parking lots differently.  Can't get a space up close?  No worries, just that many steps toward my 10,000.  I even tried to see if there was a second power user level at 15,000 or 20,000 (*sadly, no).  If I was close at the end of the day, I have done laps around my house until I got my badge.  I have walked around the block one more time to get my badge.  I have mentally chastised myself when I forgot to put my phone in my pocket because I don't know how many steps I got.  The badge below I got when my boss and I were in New York City and we walked around the block of our hotel just to watch the badge pop up. There are a bunch of tools out on the market now that have similar ideas for helping you to track your exercise, make it social.  There are apps (my favorite is still Zombies, Run!).  You could buy a FitBit or UP by Jawbone.   Interactive fitness makes the Expresso stationary bike with built in video games.  
All designed to help you be more aware of your activity and keep you engaged and motivated.  And the idea is to help you change your behavior. I know someone who would spend extra time and work hard on the Expresso because he had built up strategies for how to kill the most dragons while he was riding to get more points.  When the machine broke down, he didn't ride a different bike because it just wasn't that interesting. But for me, just the simple jingle and badge have been all I needed.  I admit, I still giggle gleefully when I hear the tune sing out from my pocket. After a few weeks, I noticed I had dropped a few pounds.  Not a lot, just 2-3.  But then I was really hooked.  I started making a point both to eat a little less and hit 10,000 steps as much as I could.  I bemoaned that during the floods in Boulder, I wasn't hitting my 10,000 steps.  And now, a few months later, I'm almost 10 lbs lighter. All for 1 badge a day. So yes, simple gamification can increase motivation and engagement.  And that can lead to changes in behavior.  Now the job is to apply that to the enterprise space in a meaningful and engaging way. 

    Read the article

  • Top Questions and Answers for Plugging into Oracle Database as a Service

    - by David Swanger
    Yesterday we hosted a comprehensive online forum that shared a complete path to help your organization design, deploy, and deliver a Database as a Service cloud. If you missed the online forum, you can watch it on demand by registering here. We received numerous questions. Below are highlights of the most informative:

    DBaaS requires a lengthy and careful design effort. What are the minimum requirements for setting up a scaled-down environment to test it out?
    You should have an OEM 12c environment for DBaaS administration and then a target database deployment platform that has the key characteristics of what your production environment will look like. This could be a single server, or it could be a small pool of hosts if your production DBaaS will be larger and you want to test a more robust / real world configuration with Zones and Pools or DR capabilities, for example.

    How does this benefit companies having their own data center?
    This allows companies to transform their internal IT to a service delivery model for the database. The benefits to the company are significant cost savings, improved business agility and reduced risk. The benefits to the consumers (internal) of services are much faster provisioning and faster response to changes in business requirements.

    From a deployment perspective, is DBaaS solely the DBA's job?
    The best deployment model enables the DBA (or end-user) to control the entire process. All resources required to deploy the service are pre-provisioned, and there are no external dependencies (on network, storage, or sysadmin teams). The service is created either via a self-service portal or by the DBA.

    The purpose of self service seems to be that the end user does not rely on the DBA. I just need to give him a template. He decides how much AMM he needs. Why should I set it one by one? That doesn't seem to be the purpose of self service.
    Most customers we have worked with define a standardized service catalog, with a few (2 to 5) different classes of service. For each of these classes, there is a pre-defined deployment template, and the user has the ability to select from some pre-defined service sizes. The administrator only has to create this catalog once. Each user then simply selects from the options offered in the catalog.
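    As a side illustration of that "create the catalog once, pick from it thereafter" answer, here is a small sketch in TypeScript. The class names, templates, sizes, and numbers are all invented for the example; they are not Oracle's actual catalog definitions.

        // Illustration only: a standardized service catalog with a few classes
        // of service, each mapped to a pre-defined template and fixed sizes.
        interface ServiceSize {
          name: string;
          cpus: number;
          memoryGb: number;
          storageGb: number;
        }

        interface ServiceClass {
          name: string;          // e.g. "Bronze", "Gold"
          template: string;      // the pre-defined deployment template it maps to
          sizes: ServiceSize[];  // the only sizes a self-service user may pick
        }

        // Defined once by the administrator...
        const catalog: ServiceClass[] = [
          { name: "Bronze", template: "single-instance-db", sizes: [
              { name: "small", cpus: 2, memoryGb: 8, storageGb: 100 } ] },
          { name: "Gold", template: "ha-db", sizes: [
              { name: "medium", cpus: 8, memoryGb: 64, storageGb: 500 },
              { name: "large", cpus: 16, memoryGb: 128, storageGb: 1000 } ] },
        ];

        // ...then each user simply selects from the options offered in the catalog.
        function requestDatabase(className: string, sizeName: string): void {
          const cls = catalog.find(c => c.name === className);
          const size = cls?.sizes.find(s => s.name === sizeName);
          if (!cls || !size) throw new Error("Not in the catalog - no custom requests");
          console.log(`Provisioning ${cls.template} (${size.cpus} CPUs, ${size.memoryGb} GB RAM)`);
        }

        requestDatabase("Gold", "medium");

    The design point is simply that everything a self-service user can ask for is pre-defined by the administrator, which is what keeps provisioning fast and standardized.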
    Looking at the DBaaS service definition, it seems to be no different from a service definition provided by a well-defined DBA team. Why do you attribute it to DBaaS?
    There are a couple of perspectives. First, some organizations might already be operating with a high level of standardization and a higher level of maturity from an ITIL or Service Management perspective. Their journey to DBaaS could be shorter and their Service Definition will evolve less, but they still might need to add capabilities such as Self Service and Metering/Chargeback. Other organizations are still operating in highly siloed environments with little automation, and their formal Service Definition (if they have one) will be a lot less mature today. Therefore their future state DBaaS will look a lot different from their current state, as will their Service Definition.

    How does database as a service impact or help with "Click to Compute" or deploying "Database in cloud infrastructure"?
    DBaaS enables Click to Compute. Oracle DBaaS can be implemented using three architecture models: Oracle Multitenant 12c, native consolidation using Oracle Database, and consolidation using virtualization in infrastructure cloud. As the Deploy session showed, you get higher consolidation density and efficiency using Multitenant, and higher isolation using infrastructure cloud. Depending upon your business needs, DBaaS can be implemented using any of these models.

    How exactly is DBaaS different from a traditional database? Storage/OS/DB all together to 'transparently' provide service to applications? Will there be across-database access by application/user?
    Some key differences are: 1) The services run on a shared platform. 2) The services can be rapidly provisioned (< 15 minutes). 3) The services are dynamic and can be relocated, grown, or shrunk as needed to meet business needs rapidly and without disruption. 4) The user is able to provision the services directly from a standardized service catalog.

    With 24x7x365 databases it's difficult to find off-peak hours to do basic admin tasks such as gathering stats, running backups, and batch jobs. How does pluggable database handle this, and the different needs and patching downtime of the apps the databases might be serving?
    You can gather stats in Oracle Multitenant the same way you had been in regular databases. Regarding patching/upgrading, Oracle Multitenant makes patch/upgrade very efficient in that you can pre-provision a new version/patched multitenant db in a different ORACLE_HOME and then unplug a PDB from its CDB and plug it into the newer/patched CDB in seconds.

    Thanks for all the great questions! If you'd like to learn more and missed the online forum, you can watch it on demand here.

    Read the article

  • "Guiding" a Domain Expert to Retire from Programming

    - by James Kolpack
    I've got a friend who does IT at a local non-profit where they're using a custom web application which is no longer supported by the company who built it. (Out of business, support was too expensive, I'm not sure...) Development on this app started around 10+ years ago so the technologies being harnessed are pretty out of date now - classic ASP using VBScript and SQL Server 2000. The application domain is in the realm of government bookkeeping - so even though the development team is long gone, there are often new requirements of this software.

    Enter... the domain expert. This is a middle-aged accounting whiz without much (or any?) prior development experience. He studied the pages, code and queries and learned how to ape the style of the original team which, believe me, is mediocre at best. He's very clever and very tenacious but has no experience in software beyond what he's picked up from this app. Otherwise, he's a pleasant guy to talk to and definitely knows his domain.

    My friend in IT, and probably his superiors in the company, want him out of the code. They view him as wasting his expertise on coding tasks he shouldn't be doing. My friend got me involved with a few small contracts which I handled without much problem - other than somewhat of a communication barrier with the domain expert. He explained the requirements very quickly, assuming prior knowledge of the domain which I do not have. This is partially his normal style, and I think maybe a bit of resentment from my involvement. So, I think he feels like the owner of the code and has entrenched himself in a development position.

    So... his coding technique. One of his latest endeavors was to make a page that only he could reach (theoretically - the security model for the system is wretched) where he can enter a raw SQL query, run it, and save the query to run again later. A report that I worked on had been originally implemented by him using 6 distinct queries, 3 or 4 temp tables to coordinate the data between the queries, and the final result obtained by importing the data from the final query into Access and doing a pivot and some formatting. It worked - well, some of the results were incorrect - but at what a cost! (I implemented the report in a single query with at least 1/10th the amount of code.) He edits code in Notepad. He doesn't seem to know about online reference material for the languages.

    I recently read an article on Dr. Dobb's titled "What Makes Bad Programmers Different" - and instantly thought of our domain expert. From the article:
    - Their code is large, messy, and bug laden.
    - They have very superficial knowledge of their problem domain and their tools.
    - Their code has a lot of copy/paste and they have very little interest in techniques that reduce it.
    - They fail to account for edge cases, while inefficiently dealing with the general case.
    - They never have time to comment their code or break it into smaller pieces.
    - Empirical evidence plays little role in their decisions.
    5.5 out of 6.

    My friend is wanting me to argue the case to their management - specifically, I got this email from their manager to respond to: "...Also, I need to talk to you about what effect there is from Domain Expert continuing to make edits to the live environment. If that is a problem for you I need to know so I can have his access blocked. Some examples would help." In my opinion, from a technical standpoint, it's dangerous to have him making changes without any oversight.
    On the other hand, I'm just doing one-off contracts at this point and don't have much desire to get involved deeply enough that I'm essentially arguing as one of the Bobs from Office Space. I'd like to help my friend out - but I feel like I'm getting in the middle of a political battle. More importantly - if I do get involved and suggest that his editing privileges be removed, it needs to be handled carefully so that he doesn't feel belittled. He is beyond a doubt the foremost expert on this system.

    I'm hoping this is familiar territory for some other StackExchangers, because I'm feeling a little bewildered. How should I respond? Should I argue that he shouldn't be allowed to touch the code? Should I phrase it as "no single developer, no matter how experienced, should be working on production code unchecked"? Should I argue to keep him involved with the code, but with a review process? Should I say "glad I could help, but uh, I'm busy now!"? Other options? Thanks a bunch!

    Read the article

  • Building The Right SharePoint Team For Your Organization

    - by Mark Rackley
    I see the question posted fairly often asking what kind of SharePoint team an organization should have. How many people do I need? What roles do I need to fill? What is best for my organization? Well, just like every other answer in SharePoint, the correct answer is “it depends”. Do you ever get sick of hearing that??? I know I do… So, let me give you my thoughts and opinions based upon my experience and what I’ve seen, and let you come to your own conclusions.

    What are the possible SharePoint roles?
    I guess the first thing you need to understand are the different roles that exist in SharePoint (and there are LOTS). Remember, SharePoint is a massive beast and you will NOT find one person who can do it all. If you are hoping to find that person you will be sorely disappointed. For the most part this is true in SharePoint 2007 and 2010. However, generally things are improved in 2010 and easier for junior individuals to grasp.

    SharePoint Administrator
    The absolutely positively only role that you should not be without, no matter the size of your organization or SharePoint deployment, is a SharePoint administrator. These guys are essential to keeping things running and figuring out what’s wrong when things aren’t running well. These unsung heroes do more before 10 am than I do all day. The bad thing is, when these guys are awesome, you don’t even know they exist because everything is running so smoothly. You should definitely invest some time and money here to make sure you have some competent if not rockstar help. You need an admin who truly loves SharePoint and will go that extra mile when necessary. Let me give you a real world example of what I’m talking about:

    We have a rockstar admin… and I’m sure she’s sick of my throwing her name around so she’ll just have to live with remaining anonymous in this post… sorry Lori… Anyway! A couple of weeks ago our Server teams came to us and said, “Hi Lori, I’m finalizing the MOSS servers and doing updates that require a restart; can I restart them?” Seems like a harmless request from your server team, does it not? Sure, go ahead and apply the patches and reboot during our scheduled maintenance window. No problem? right? Sounded fair to me… but no…. not to our fearless SharePoint admin… “I need a complete list of patches that will be applied. There is an update that is out there that will break SharePoint… KB973917 is the patch that has been shown to cause issues.” What? You mean Microsoft released a patch that would actually adversely affect SharePoint? If we did NOT have a rockstar admin, our server team would have applied these patches, and then when some problem occurred in SharePoint we’d have to go through the fun task of tracking down exactly what caused the issue and resolve it. How much time would that have taken? If you have a junior SharePoint admin or an admin who’s not out there staying on top of what’s going on, you could have spent days tracking down something so simple as applying a patch you should not have applied.

    I will even go as far as to say the only SharePoint rockstar you NEED in your organization is a SharePoint admin. You can always outsource really complicated development projects or bring in a rockstar contractor every now and then to make sure you aren’t way off track in other areas. For your day-to-day sanity and to keep SharePoint running smoothly, you need an awesome Admin. Some rockstars in this category are: Ben Curry, Mike Watson, Joel Oleson, Todd Klindt, Shane Young, John Ferringer, Sean McDonough, and of course Lori Gowin.
    SharePoint Developer
    Another essential role for your SharePoint deployment is a SharePoint developer. Things do start to get a little hazy here and there are many flavors of “developers”. Are you writing custom code? Using SharePoint Designer? What about SharePoint Branding? Are all of these considered developers? I would say yes. Are they interchangeable? I’d say no. Development in SharePoint is such a large beast in itself. I would say that it’s not so large that you can’t know it all well, but it is so large that there are many people who specialize in one particular category. If you are lucky enough to have someone on staff who knows it all well, you better make sure they are well taken care of, because those guys are ready-made to move over to a consulting role and charge you 3 times what you are probably paying them. :) Some of the all-around rockstars are Eric Shupps, Andrew Connell (go Razorbacks), Rob Foster, Paul Schaeflein, and Todd Bleeker.

    SharePoint Power User/No-Code Solutions Developer
    These SharePoint Swiss Army Knives are essential for quick wins in your organization. These people can twist the out-of-the-box functionality to make it do things you would not even imagine. Give these guys SharePoint Designer, jQuery, InfoPath, and a little time and they will create views, dashboards, and KPI’s that will blow your mind away and give your execs the “wow” they are looking for. Not only can they deliver that wow factor, but they can mashup, merge, and really help make your SharePoint application usable and deliver an overall better user experience. Before you hand off a project to your SharePoint Custom Code developer, let one of these rockstars look at it and show you what they can do (in probably less time). I would say the second most important role you can fill in your organization is one of these guys. Rockstars in this category are Christina Wheeler, Laura Rogers, Jennifer Mason, and Mark Miller.

    SharePoint Developer – Custom Code
    If you want to really integrate SharePoint into your legacy systems, or really twist it and make it bend to your will, you are going to have to open up Visual Studio and write some custom code. Remember, SharePoint is essentially just a big, huge, ginormous .NET application, so you CAN write code to make it do ANYTHING, but do you really want to spend the time and effort to do so? At some point with every other form of SharePoint development you are going to run into SOME limitation (SPD Workflows is the big one that comes to mind). If you truly want to knock down all the walls then custom development is the way to go. PLEASE keep in mind when you are looking for a custom code developer that a .NET developer does NOT equal a SharePoint developer. Just SOME of the things these guys write are:
    - Custom Workflows
    - Custom Web Parts
    - Web Service functionality
    - Import data from legacy systems
    - Export data to legacy systems
    - Custom Actions
    - Event Receivers
    - Service Applications (2010)
    These guys are also the ones generally responsible for packaging everything up into solution packages (you are doing that, right?). Rockstars in this category are Phil Wicklund, Christina Wheeler, Geoff Varosky, and Brian Jackett.

    SharePoint Branding
    “But it LOOKS like SharePoint!” Somebody call the WAAAAAAAAAAAAHMbulance… Themes, Master Pages, Page Layouts, Zones, and over 2000 styles in CSS… these guys not only have to be comfortable with all of SharePoint’s quirks and pain points when branding, but they have to know it TWICE for publishing and non-publishing sites.
    Not only that, but these guys really need to have an eye for graphic design and be able to translate the ramblings of business into something visually stunning. They also have to be comfortable with XSLT, XML, and be able to hand off what they do to your custom developers for them to package as solutions (which you are doing, right?). These rockstars include Heather Waterman, Cathy Dew, and Marcy Kellar.

    SharePoint Architect
    SharePoint Architects are generally SharePoint Admins or Developers who have moved into more of a BA role? Is that fair to say? These guys really have a grasp and understanding for what SharePoint IS and what it can do. These guys help you structure your farms to meet your needs and help you design your applications the correct way. It’s always a good idea to bring in a rockstar SharePoint Architect to do a sanity check and make sure you aren’t doing anything stupid. Most organizations probably do not have a rockstar architect on staff. These guys are generally brought in at the deployment of a farm, upgrade of a farm, or for large development projects. I personally also find architects very useful for sitting down with the business to translate their needs into what SharePoint can do. A good architect will be able to pick out what can be done out-of-the-box and what has to be custom built and hand those requirements to the development staff. Architects can generally fill in as an admin or a developer when needed. Some rockstar architects are Rick Taylor, Dan Usher, Bill English, Spence Harbar, Neil Hodgkins, Eric Harlan, and Bjørn Furuknap.

    Other Roles / Specialties
    On top of all these other roles you also get these people who specialize in things like Reporting, BDC (BCS in 2010), Search, Performance, Security, Project Management, etc... etc... etc... Again, most organizations will not have one of these gurus on staff; they’ll just pay out the nose for them when they need them. :)

    SharePoint End User
    Everyone else in your organization that touches SharePoint falls into this category. What they actually DO in SharePoint is determined by your governance and what permissions you give these guys. Hopefully you have these guys on a fairly short leash and are NOT giving them access to tools like SharePoint Designer. Sadly, end users are the ones who truly make your deployment a success by using it, but are also your biggest enemy in breaking it. :) We love you guys… really!!!

    Okay, all that’s fine and dandy, but what should MY SharePoint team look like?
    It depends! Okay… Are you just doing out of the box team sites with no custom development? Then you are probably fine with a great Admin team and a great No-Code Solution Development team. How many people do you need? Depends on how busy you can keep them. Sorry, can’t answer the question about numbers without knowing your specific needs. I can just tell you who you MIGHT need and what they will do for you. I’ll leave you with what my ideal SharePoint Team would look like for a particular scenario:

    Farm / Organization Structure
    - Dev, QA, and 2 Production Farms
    - 5000 – 10000 Users
    - Custom Development and Integration with legacy systems
    - Team Sites, My Sites, Intranet, Document libraries and overall company collaboration

    Team
    - Rockstar SharePoint Administrator
    - 2-3 junior SharePoint Administrators
    - SharePoint Architect / Lead Developer
    - 2 Power User / No-Code Solution Developers
    - 2-3 Custom Code developers
    - Branding expert

    With a team of that size and skill set, they should be able to keep a substantial SharePoint deployment running smoothly and meet your business needs. This does NOT mean that you would not need to bring in contract help from time to time when you need an uber specialist in one area. Also, this team assumes there will be ongoing development for the life of your SharePoint farm. If you are just going to be doing sporadic custom development, it might make sense to partner with an awesome firm that specializes in that sort of work (I can give you the name of a couple if you are interested). Again though, the size of your team depends on the number of requests you are receiving and how much active deployment you are doing. So, don't bring in a team that looks like this and then yell at me because they are sitting around with nothing to do or are so overwhelmed that nothing is getting done. I do URGE you to take the proper time to assess your needs and determine what team is BEST for your organization. Also, PLEASE PLEASE PLEASE do not skimp on the talent. When it comes to SharePoint you really do get what you pay for when it comes to employees, contractors, and software. SharePoint can become absolutely critical to your business, and because you skimped on hiring a developer, he created a web part that brings down the farm because he doesn't know what he's doing; or you hire an admin who thinks it's fine to stick everything in the same Content Database and then can't figure out why people are complaining. SharePoint can be an enormous blessing to an organization or its biggest curse. Spend the time and money to do it right, or be prepared to spend even more time and money later to fix it.

    Read the article

  • ASP.NET MVC Consume JSONResult in Bing Maps API

    - by rockinthesixstring
    I know there are a few topics on this, but I seem to be fumbling my way through with no results. I'm trying to use a controller to return JSON results to my Bing Maps functions. Here's what I have for my controller (yes, it is properly returning JSON data):

    Function Regions() As JsonResult
        Dim rj As New List(Of RtnJson)()
        rj.Add(New RtnJson("135 Bow Meadows Drive, Cochrane, Alberta", "desc", "title"))
        rj.Add(New RtnJson("12 Bowridge Dr NW, Calgary, Alberta, Canada", "desc2", "title2"))
        Return Json(rj, JsonRequestBehavior.AllowGet)
    End Function

    Then in my script I have this, but it's not working.

    <script type="text/javascript">
        var map = null;
        var centerLat = 51.045;
        var centerLon = -114.05722;
        var json_object = $.getJSON("<%: Url.Action("Regions", "Events")%>");

        function LoadMap() {
            map = new VEMap('bingMap');
            map.LoadMap(new VELatLong(centerLat, centerLon), 10);
            $.each(json_object, function () {
                alert(this.address); //this alert is returning "address is undefined"
                StartGeocoding(this.address, this.title, this.desc);
            });
        }

        function StartGeocoding(address, title, desc) {
            map.Find(null,        // what
                address,          // where
                null,             // VEFindType (always VEFindType.Businesses)
                null,             // VEShapeLayer (base by default)
                null,             // start index for results (0 by default)
                null,             // max number of results (default is 10)
                null,             // show results? (default is true)
                null,             // create pushpin for what results? (ignored since what is null)
                true,             // use default disambiguation? (default is true)
                false,            // set best map view? (default is true)
                GeocodeCallback); // call back function
        }

        function GeocodeCallback(shapeLayer, findResults, places, moreResults, errorMsg) {
            var bestPlace = places[0];

            // Add pushpin to the *best* place
            var location = bestPlace.LatLong;
            var newShape = new VEShape(VEShapeType.Pushpin, location);
            var desc = "Latitude: " + location.Latitude + "<br>Longitude:" + location.Longitude;
            newShape.SetDescription(desc);
            newShape.SetTitle(bestPlace.Name);
            map.AddShape(newShape);
        }

        $(document).ready(function () {
            LoadMap();
        });
    </script>
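    For what it's worth, the usual culprit in code like this is that $.getJSON() returns immediately; its return value is the request object, not the parsed data, so iterating over it yields undefined properties. Below is a sketch, written as TypeScript with declare stubs for the page globals and with "/Events/Regions" standing in for the Url.Action helper, of how the load could be restructured around the success callback. The property names address, title, and desc are assumed to match how RtnJson is serialized, which is worth checking in the raw response.

        // Sketch only: restructure the load so the data is consumed inside
        // the $.getJSON success callback rather than from its return value.
        declare const $: any;
        declare const VEMap: any;
        declare const VELatLong: any;
        declare function StartGeocoding(address: string, title: string, desc: string): void;

        let map: any = null;
        const centerLat = 51.045;
        const centerLon = -114.05722;

        function loadMap(): void {
            map = new VEMap("bingMap");
            map.LoadMap(new VELatLong(centerLat, centerLon), 10);

            // Fetch the JSON *and* wait for it: the array is only available here.
            $.getJSON("/Events/Regions", function (regions: any[]) {
                $.each(regions, function (_i: number, r: any) {
                    // r.address / r.title / r.desc are assumed property names -
                    // verify the casing in the actual JSON payload.
                    StartGeocoding(r.address, r.title, r.desc);
                });
            });
        }

        $(document).ready(loadMap);

    The key change is simply that nothing touches the returned data until the callback fires, after the map has been loaded.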

    Read the article

  • Help Understanding the Mork File Format

    - by Sumit Ghosh
    Hi, I have a name value pair in a Java HashMap and this in continuation to my earlier question - here NickName=,LastModifiedDate=4ac18267,FaxNumberType=,BirthMonth=,LastName=,HomePhone=,WorkCountry=,HomePhoneType=,PreferMailFormat=0,CellularNumber=,FamilyName=,[email protected],AnniversaryMonth=,HomeCity=,WorkState=,HomeCountry=,PhoneticFirstName=,PhoneticLastName=,HomeState=,WorkAddress=,WebPage1=,WebPage2=,HomeAddress2=,WorkZipCode=,_AimScreenName=,AnniversaryYear=,WorkPhoneType=,Notes=,WorkAddress2=,WorkPhone=,Custom3=,Custom4=,Custom1=,Custom2=,PagerNumber=,AnniversaryDay=,WorkCity=,AllowRemoteContent=0,CellularNumberType=,FaxNumber=,PopularityIndex=2,FirstName=,SpouseName=,CardType=,Department=,Company=,HomeAddress=,BirthDay=,SecondEmail=,RecordKey=1,DisplayName=,DefaultEmail=,DefaultAddress=,BirthYear=,Category=,PagerNumberType=,[email protected],JobTitle=,HomeZipCode=, NickName=,LastModifiedDate=0,FaxNumberType=,BirthMonth=,LastName=Ghosh,HomePhone=+504-9907-1342,WorkCountry=USA,HomePhoneType=,PreferMailFormat=2,CellularNumber=512-282-2512,FamilyName=,[email protected],AnniversaryMonth=,HomeCity=Siguatepeque,WorkState=TX,HomeCountry=Honduras,PhoneticFirstName=,PhoneticLastName=,HomeState=Comayagua,WorkAddress=9309 HeatherwoodDr,WebPage1=http://www.mpcsol.com,WebPage2=http://www.jesuslovesthelittlechildren.org,HomeAddress2=VillaAlicia,WorkZipCode=78748,_AimScreenName=rentaprogrammer,AnniversaryYear=,WorkPhoneType=,Notes=Some notes go here.,WorkAddress2=Apartment 1,WorkPhone=512-282-2509,Custom3=Faith,Custom4=Timothy,Custom1=Hannah,Custom2=John,PagerNumber=512-282-2511,AnniversaryDay=,WorkCity=Austin,AllowRemoteContent=1,CellularNumberType=,FaxNumber=512-282-2510,PopularityIndex=0,FirstName=Sumit,SpouseName=,CardType=,Department=Programming,Company=MPC Solutions,HomeAddress=Two Blocks Past Oxen Team,BirthDay=,[email protected],RecordKey=2,DisplayName=Sumit,DefaultEmail=,DefaultAddress=,BirthYear=,Category=,PagerNumberType=,[email protected],JobTitle=Programmer,HomeZipCode=NA, NickName=,LastModifiedDate=0,FaxNumberType=,BirthMonth=,LastName=,HomePhone=,WorkCountry=,HomePhoneType=,PreferMailFormat=0,CellularNumber=,FamilyName=,[email protected],AnniversaryMonth=,HomeCity=,WorkState=,HomeCountry=,PhoneticFirstName=,PhoneticLastName=,HomeState=,WorkAddress=,WebPage1=,WebPage2=,HomeAddress2=,WorkZipCode=,_AimScreenName=,AnniversaryYear=,WorkPhoneType=,Notes=,WorkAddress2=,WorkPhone=,Custom3=,Custom4=,Custom1=,Custom2=,PagerNumber=,AnniversaryDay=,WorkCity=,AllowRemoteContent=0,CellularNumberType=,FaxNumber=,PopularityIndex=0,FirstName=,SpouseName=,CardType=,Department=,Company=,HomeAddress=,BirthDay=,SecondEmail=,RecordKey=3,DisplayName=,DefaultEmail=,DefaultAddress=,BirthYear=,Category=,PagerNumberType=,[email protected],JobTitle=,HomeZipCode=, I want to write it to a Mork file , using the Mork file format, can someone tell me how to decode the name value pair to this format given below. 
<(A9=3)(81=)([email protected])(80=0)(85=2)(86=4ac18267)(83=1) (87=Sumit)(88=Ghosh)(89=Sumit)([email protected])(8B [email protected])(8C=512-282-2509)(8D=+504-9907-1342)(8E=512-282-2510) (8F=512-282-2511)(90=512-282-2512)(91=Two Blocks Past Oxen Team)(92 =Villa Alicia)(93=Siguatepeque)(94=Comayagua)(95=NA)(96=Honduras) (97=9309 Heatherwood Dr)(98=Apartment 1)(99=Austin)(9A=TX)(9B=78748) (9C=USA)(9D=Programmer)(9E=Programming)(9F=MPC Solutions)(A0 =rentaprogrammer)(A1=http://www.mpcsol.com)(A2 =http://www.jesuslovesthelittlechildren.org)(A3=Hannah)(A4=John) (A5=Faith)(A6=Timothy)(A7=Some notes go here.)(A8 [email protected])> {1:^80 {(k^C0:c)(s=9)} [1:^82(^BF=3)] [1(^83=)(^84=)(^85=)(^86=)(^87=)(^88=)(^89^82)(^8A^82)(^8B=)(^8C=) (^8D=)(^8E=0)(^8F=2)(^90=0)(^91=)(^92=)(^93=)(^94=)(^95=)(^96=) (^97=)(^98=)(^99=)(^9A=)(^9B=)(^9C=)(^9D=)(^9E=)(^9F=)(^A0=)(^A1=) (^A2=)(^A3=)(^A4=)(^A5=)(^A6=)(^A7=)(^A8=)(^A9=)(^AA=)(^AB=)(^AC=) (^AD=)(^AE=)(^AF=)(^B0=)(^B1=)(^B2=)(^B3=)(^B4=)(^B5=)(^B6=)(^B7=) (^B8=)(^B9=)(^BA=)(^BB=)(^BC^86)(^BD=1)] [2(^83^87)(^84^88)(^85=)(^86=)(^87^89)(^88=)(^89^8A)(^8A^8A)(^8B^8B) (^8C=)(^8D=)(^8E=2)(^8F=0)(^90=1)(^91^8C)(^92^8D)(^93^8E)(^94^8F) (^95^90)(^96=)(^97=)(^98=)(^99=)(^9A=)(^9B^91)(^9C^92)(^9D^93)(^9E^94) (^9F=NA)(^A0^96)(^A1^97)(^A2^98)(^A3^99)(^A4=TX)(^A5^9B)(^A6^9C) (^A7^9D)(^A8^9E)(^A9^9F)(^AA^A0)(^AB=)(^AC=)(^AD=)(^AE=)(^AF=)(^B0=) (^B1=)(^B2^A1)(^B3^A2)(^B4=)(^B5=)(^B6=)(^B7^A3)(^B8^A4)(^B9^A5) (^BA^A6)(^BB^A7)(^BC=0)(^BD=2)] [3(^83=)(^84=)(^85=)(^86=)(^87=)(^88=)(^89^A8)(^8A^A8)(^8B=)(^8C=) (^8D=)(^8E=0)(^8F=0)(^90=0)(^91=)(^92=)(^93=)(^94=)(^95=)(^96=) (^97=)(^98=)(^99=)(^9A=)(^9B=)(^9C=)(^9D=)(^9E=)(^9F=)(^A0=)(^A1=) (^A2=)(^A3=)(^A4=)(^A5=)(^A6=)(^A7=)(^A8=)(^A9=)(^AA=)(^AB=)(^AC=) (^AD=)(^AE=)(^AF=)(^B0=)(^B1=)(^B2=)(^B3=)(^B4=)(^B5=)(^B6=)(^B7=) (^B8=)(^B9=)(^BA=)(^BB=)(^BC=0)(^BD=3)]}
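    For orientation only (not a complete answer): in the target text above, the <( ... )> block is a dictionary that assigns a hex id to each distinct literal, and each [ ... ] row then refers to a column atom and a value atom as (^column^value) cells, with (^column=) for empty values. A toy sketch of that intern-then-reference idea follows, in TypeScript; the starting id, the merged column/value dictionary, and the missing escaping, scoping, and transaction-group syntax are all simplifications, so this is not a compliant Mork writer.

        // Toy sketch only: shows the "intern every literal, then reference it by
        // a hex id" structure visible above. Real Mork also needs value escaping,
        // separate column/atom scopes, table definitions and change groups.
        function toyMork(records: Array<Record<string, string>>): string {
          let nextId = 0x80;                        // starting id mimics the sample; an assumption
          const atoms = new Map<string, string>();  // literal -> hex id
          const intern = (s: string): string => {
            if (!atoms.has(s)) {
              atoms.set(s, nextId.toString(16).toUpperCase());
              nextId += 1;
            }
            return atoms.get(s)!;
          };

          // Build the rows first so every column name and non-empty value is interned.
          const rows = records.map((rec, i) => {
            const cells = Object.entries(rec).map(([col, val]) =>
              val === "" ? `(^${intern(col)}=)` : `(^${intern(col)}^${intern(val)})`);
            return `[${i + 1}${cells.join("")}]`;
          });

          // Dictionary of (ID=literal) entries, like the <(87=Sumit)...> block above.
          const dict = "<" + Array.from(atoms.entries())
            .map(([s, id]) => `(${id}=${s})`).join("") + ">";

          return dict + "\n" + rows.join("\n");
        }

        // e.g. toyMork([{ FirstName: "Sumit", LastName: "Ghosh", NickName: "" }]);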

  • SVN: Create a dump file of a folder

    - by DarkJaff
    I'm trying to create a dump file of a folder in my SVN repository.(My goal is to import this dump on another repository, but that's another story). I've read like 20 pages about this and they all tell me to use svndumpfilter but I can't seem to make it work. Here is my command: C:\>svnadmin dump d:/SvnData/TestingSVN/ | svndumpfilter include /TestingSVN/Trunk/Fms/ > d:\FMS.txt The output in the command line is this strange thing: Including prefixes: '/TestingSVN/Trunk/Fms' * Dumped revision 0. Revision 0 com*m iDtutmepde da sr e0v.isi n 1. Revision 1 committed as 1. * Dumped revision 2. Revision 2 committed a*s D2u.mpe revision 3. Revisio*n D3u mcpoemdm irtetveids iaosn 34.. Revision* 4D ucmopmemdi trteevdi saiso n4 .5. Revision 5 com*m iDtutmepde da sr e5v.isi n 6. Revision 6 commi*t tDeudm paesd 6r.evi ion 7. Revisio*n D7u mcpoemdm irtetveids iaosn 78.. Revision *8 Dcuommpmeidt treedv iassi o8n. 9. Revision 9* cDoummmpietdt erde vaiss i9o.n 1 . Revisi*o nD u1m0p ecdo mrmeivtitseido na s1 11.0 . Revision 11 *c oDmummiptetde dr eavsi s1i1o.n 1 . Revision 12 committed* aDsu m1p2e.d r vision 13. Revision 13 committ*e dD uamsp e1d3 .rev sion 14. Revision 14 commit*t eDdu mapse d1 4r.evi ion 15. Revision 15 committed as 15. * Dumped revision 16. Revision 16 committed as 16. Dropped 83 nodes: '/Branches' '/Branches/305' '/Branches/305/New Text Document.txt' '/Fms' '/Fms/ADPropertySheet.cpp'** etc. for 83 nodes... Also, the dump file itself is only 3 kb and contain no real data, only things like that (this is not the complete dump, just a sample) SVN-fs-dump-format-version: 2 UUID: 592fc9f0-5994-e841-a4dc-653714c95216 Revision-number: 0 Prop-content-length: 56 Content-length: 56 K 8 svn:date V 27 2009-06-19T15:05:52.001352Z PROPS-END Revision-number: 1 Prop-content-length: 112 Content-length: 112 K 7 svn:log V 38 This is an empty revision for padding. K 8 svn:date V 27 2009-06-19T15:11:29.378511Z PROPS-END Can anybody help me sort this out?
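    For what it's worth, the interleaved "Dumped revision / Revision committed" text looks like the progress output of svnadmin dump and svndumpfilter being written to the same console at the same time rather than an actual failure, and the dropped-node list suggests the paths in the dump start at the repository root (e.g. '/Fms'), without the '/TestingSVN' repository name or a '/Trunk' component. A possible variant to try (untested, and assuming the folder really does sit at the repository root) is to split the dump and the filter into two steps and use a prefix relative to the repository root:

        svnadmin dump d:/SvnData/TestingSVN > d:\full.dump
        svndumpfilter include Fms < d:\full.dump > d:\FMS.dump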

  • EJB client can't find a DataSource that tests successfully in WebLogic admin console

    - by suszterpatt
    Disclaimer: I'm completely new to JEE/EJB and all that, so bear with me. I have a simple EJB that I can successfully deploy using JDeveloper 11g's integrated WebLogic server and a remote database connection (JDBC). I have a DataSource named "PGY2" defined in WebLogic, and I can test it successfully from the admin console. Here's the code of the client I'm trying to test it with (generated entirely by JDev except for the three method calls on adminManager): public class AdminManagerClient { public static void main(String [] args) { try { final Context context = getInitialContext(); AdminManager adminManager = (AdminManager)context.lookup("Uran-AdminManager#hu.elte.pgy2.BACNAAI.UranEJB.AdminManager"); adminManager.addAdmin("root", "root", "Kovács Isten"); adminManager.addStudent("BACNAAI", "matt", "B Cs", 2005); adminManager.addTeacher("SIPKABT", "patt", "S P", "numanal", "Dr."); } catch (Exception ex) { ex.printStackTrace(); } } private static Context getInitialContext() throws NamingException { Hashtable env = new Hashtable(); // WebLogic Server 10.x connection details env.put( Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory" ); env.put(Context.PROVIDER_URL, "t3://127.0.0.1:7101"); return new InitialContext( env ); } } But when I try to run this, I get the following error on the line with adminManager.addAdmin (that is, after the lookup): javax.ejb.EJBException: EJB Exception: ; nested exception is: Exception [EclipseLink-4002] (Eclipse Persistence Services - 1.0.2 (Build 20081024)): org.eclipse.persistence.exceptions.DatabaseException Internal Exception: java.sql.SQLException: Internal error: Cannot obtain XAConnection Creation of XAConnection for pool PGY2 failed after waitSecs:30 : java.sql.SQLException: Data Source PGY2 does not exist. Why can't the client find the data source, and how do I make it find it? EDIT: I took a closer look at WebLogic's output during deployment, and I found this. I have no idea what it means, but it may be relevant: <2010.05.20. 0:50:43 CEST> <Error> <Deployer> <BEA-149231> <Unable to set the activation state to true for the application 'PGY2'. weblogic.application.ModuleException: at weblogic.jdbc.module.JDBCModule.activate(JDBCModule.java:349) at weblogic.application.internal.flow.ModuleListenerInvoker.activate(ModuleListenerInvoker.java:107) at weblogic.application.internal.flow.DeploymentCallbackFlow$2.next(DeploymentCallbackFlow.java:411) at weblogic.application.utils.StateMachineDriver.nextState(StateMachineDriver.java:37) at weblogic.application.internal.flow.DeploymentCallbackFlow.activate(DeploymentCallbackFlow.java:74) Truncated. see log file for complete stacktrace weblogic.common.ResourceException: is already bound at weblogic.jdbc.common.internal.RmiDataSource.start(RmiDataSource.java:387) at weblogic.jdbc.common.internal.DataSourceManager.createAndStartDataSource(DataSourceManager.java:136) at weblogic.jdbc.common.internal.DataSourceManager.createAndStartDataSource(DataSourceManager.java:97) at weblogic.jdbc.module.JDBCModule.activate(JDBCModule.java:346) at weblogic.application.internal.flow.ModuleListenerInvoker.activate(ModuleListenerInvoker.java:107) Truncated. see log file for complete stacktrace >
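    The "Data Source PGY2 does not exist" message combined with the deployment warning "is already bound" may indicate a JNDI naming mismatch: the persistence unit appears to look for a data source named PGY2, while the server may have bound it under a different JNDI name (or a JDBC module with the same name may be packaged both inside the application and on the server). A small check of which name is actually bound, using the same connection details as the client above, could look like the sketch below; the names "PGY2" and "jdbc/PGY2" are assumptions, so substitute whatever JNDI name the WebLogic console shows for the data source.

        import java.sql.Connection;
        import java.util.Hashtable;
        import javax.naming.Context;
        import javax.naming.InitialContext;
        import javax.sql.DataSource;

        // Minimal sketch, not a fix: verify which JNDI name the data source is actually bound under,
        // then make sure the persistence unit (EclipseLink configuration) refers to the same name.
        public class DataSourceLookupCheck {
            public static void main(String[] args) throws Exception {
                Hashtable<String, String> env = new Hashtable<>();
                env.put(Context.INITIAL_CONTEXT_FACTORY, "weblogic.jndi.WLInitialContextFactory");
                env.put(Context.PROVIDER_URL, "t3://127.0.0.1:7101");
                Context ctx = new InitialContext(env);

                // "PGY2" is an assumption; try the JNDI name configured in the admin console, e.g. "jdbc/PGY2".
                DataSource ds = (DataSource) ctx.lookup("PGY2");
                try (Connection c = ds.getConnection()) {
                    System.out.println("Lookup succeeded, connection open: " + !c.isClosed());
                }
            }
        }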

  • Uses of a C++ Arithmetic Promotion Header

    - by OlduvaiHand
    I've been playing around with a set of templates for determining the correct promotion type given two primitive types in C++. The idea is that if you define a custom numeric template, you could use these to determine the return type of, say, the operator+ function based on the class passed to the templates. For example: // Custom numeric class template <class T> struct Complex { Complex(T real, T imag) : r(real), i(imag) {} T r, i; // Other implementation stuff }; // Generic arithmetic promotion template template <class T, class U> struct ArithmeticPromotion { typedef typename X type; // I realize this is incorrect, but the point is it would // figure out what X would be via trait testing, etc }; // Specialization of arithmetic promotion template template <> class ArithmeticPromotion<long long, unsigned long> { typedef typename unsigned long long type; } // Arithmetic promotion template actually being used template <class T, class U> Complex<typename ArithmeticPromotion<T, U>::type> operator+ (Complex<T>& lhs, Complex<U>& rhs) { return Complex<typename ArithmeticPromotion<T, U>::type>(lhs.r + rhs.r, lhs.i + rhs.i); } If you use these promotion templates, you can more or less treat your user defined types as if they're primitives with the same promotion rules being applied to them. So, I guess the question I have is would this be something that could be useful? And if so, what sorts of common tasks would you want templated out for ease of use? I'm working on the assumption that just having the promotion templates alone would be insufficient for practical adoption. Incidentally, Boost has something similar in its math/tools/promotion header, but it's really more for getting values ready to be passed to the standard C math functions (that expect either 2 ints or 2 doubles) and bypasses all of the integral types. Is something that simple preferable to having complete control over how your objects are being converted? TL;DR: What sorts of helper templates would you expect to find in an arithmetic promotion header beyond the machinery that does the promotion itself?
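    If a C++11 compiler is available, one lightweight alternative for the generic case is to let the compiler compute the promoted type of the expression T + U directly instead of enumerating specializations for every pair of primitives; std::common_type is the closely related standard facility. The snippet below is only a sketch of that idea, not a replacement for the full trait-testing machinery the question describes.

        #include <type_traits>
        #include <utility>

        // Sketch (C++11): derive the promoted type from the type of the expression "T + U" itself.
        template <class T, class U>
        struct ArithmeticPromotion {
            using type = decltype(std::declval<T>() + std::declval<U>());
        };

        // std::common_type<T, U>::type is a closely related standard alternative.

        static_assert(std::is_same<ArithmeticPromotion<int, double>::type, double>::value,
                      "int + double promotes to double");

        int main() { return 0; }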

  • Model-View-Controller in JavaScript

    - by Casey Hope
    tl;dr: How does one implement MVC in JavaScript in a clean way? I'm trying to implement MVC in JavaScript. I have googled and reorganized with my code countless times but have not found a suitable solution. (The code just doesn't "feel right".) Here's how I'm going about it right now. It's incredibly complicated and is a pain to work with (but still better than the pile of code I had before). It has ugly workarounds that sort of defeat the purpose of MVC. And behold, the mess, if you're really brave: // Create a "main model" var main = Model0(); function Model0() { // Create an associated view and store its methods in "view" var view = View0(); // Create a submodel and pass it a function // that will "subviewify" the submodel's view var model1 = Model1(function (subview) { view.subviewify(subview); }); // Return model methods that can be used by // the controller (the onchange handlers) return { 'updateModel1': function (newValue) { model1.update(newValue); } }; } function Model1(makeSubView) { var info = ''; // Make an associated view and attach the view // to the parent view using the passed function var view = View1(); makeSubView(view.__view); // Dirty dirty // Return model methods that can be used by // the parent model (and so the controller) return { 'update': function (newValue) { info = newValue; // Notify the view of the new information view.events.value(info); } }; } function View0() { var thing = document.getElementById('theDiv'); var input = document.getElementById('theInput'); // This is the "controller", bear with me input.onchange = function () { // Ugly, uses a global to contact the model main.updateModel1(this.value); }; return { 'events': {}, // Adds a subview to this view. 'subviewify': function (subview) { thing.appendChild(subview); } }; } // This is a subview. function View1() { var element = document.createElement('div'); return { 'events': { // When the value changes this is // called so the view can be updated 'value': function (newValue) { element.innerHTML = newValue; } }, // ..Expose the DOM representation of the subview // so it can be attached to a parent view '__view': element }; } How does one implement MVC in JavaScript in a cleaner way? How can I improve this system? Or is this the completely wrong way to go, should I follow another pattern?
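    One commonly suggested way to avoid both the global reference from the view back to the model and the '__view' workaround is to keep the model completely unaware of views and expose a small subscribe/notify API instead. The sketch below only illustrates that observer-style wiring, reusing the theDiv and theInput element IDs from the code above; it is not a complete framework.

        // Observer-style MVC sketch: the model notifies subscribers, views subscribe, and the
        // controller translates DOM events into model updates. No view reaches into another view.
        function makeModel() {
            var value = '';
            var listeners = [];
            return {
                set: function (newValue) {
                    value = newValue;
                    listeners.forEach(function (fn) { fn(value); }); // notify every subscribed view
                },
                onChange: function (fn) { listeners.push(fn); }
            };
        }

        function makeView(model, containerId) {
            var element = document.createElement('div');
            document.getElementById(containerId).appendChild(element);
            model.onChange(function (newValue) { element.textContent = newValue; });
        }

        function makeController(model, inputId) {
            document.getElementById(inputId).onchange = function () {
                model.set(this.value); // the controller talks to the model, never to the view's DOM
            };
        }

        // Wiring, assuming #theDiv and #theInput exist as in the question:
        var model = makeModel();
        makeView(model, 'theDiv');
        makeController(model, 'theInput');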
