Search Results

Search found 491 results on 20 pages for 'gareth jones'.


  • RSS Feeds currently on Simple-Talk

    - by Andrew Clarke
    There are a number of news-feeds for the Simple-Talk site, but for some reason they are well hidden. Whilst we set about reorganizing them, I thought it would be a good idea to list some of the more important ones. The most important one for almost all purposes is the Homepage RSS feed, which carries the blogs and articles that are placed on the homepage:
    Main Site Feed (representing the Homepage) - good for most purposes, but it won't always have all the blogs and may occasionally miss an article.
    If you aren't interested in all the content, you can just use the RSS feeds that are more relevant to your interests. (We'll be increasing these categories soon.)
    The newsfeed for SQL articles | The .NET section newsfeed | The newsfeed for Red Gate books | The newsfeed for Opinion articles | The SysAdmin section newsfeed
    If you want a more refined feed, you can pick and choose from these feeds for each category so as to make up your own custom news-feed:
    In the SQL section: SQL Training | Learn SQL Server | Database Administration | TSQL Programming | SQL Server Performance | Backup and Recovery | SQL Tools | SSIS | SSRS (Reporting Services)
    In .NET: ASP.NET | Windows Forms | .NET Framework | .NET Performance | Visual Studio | .NET tools
    In Sysadmin: Exchange | General | Virtualisation | Unified Messaging | Powershell
    In Opinion: Geek of the Week | Opinion Pieces
    In Books: .NET Books | SQL Books | SysAdmin Books
    And all the blogs have feeds. So although you can get all the blogs from the Main Blog Feed, you can also subscribe to individual RSS feeds:
    AdamRG's Blog | Alex.Davies's Blog | AliceE's Blog | Andrew Clarke's Blog | Andrew Hunter's Blog | Bart Read's Blog | Ben Adderson's Blog | BobCram's Blog | bradmcgehee's Blog | Brian Donahue's Blog | Charles Brown's Blog | Chris Massey's Blog | CliveT's Blog | Damon's Blog | David Atkinson's Blog | David Connell's Blog | Dr Dionysus's Blog | drsql's Blog | FatherJack's Blog | Flibble's Blog | Gareth Marlow's Blog | Helen Joyce's Blog | James's Blog | Jason Crease's Blog | John Magnabosco's Blog | Laila's Blog | Lionel's Blog | Matt Lee's Blog | mikef's Blog | Neil Davidson's Blog | Nigel Morse's Blog | Phil Factor's Blog | red@work's Blog | reka.burmeister's Blog | Richard Mitchell's Blog | RobbieT's Blog | RobertChipperfield's Blog | Rodney's Blog | Roger Hart's Blog | Simon Cooper's Blog | Simon Galbraith's Blog | TheFutureOfMonitoring's Blog | Tim Ford's Blog | Tom Crossman's Blog | Tony Davis's Blog
    As well as these blogs, you also have the forums:
    SQL Server for Beginners Forum | Programming SQL Server Forum | Administering SQL Server Forum | .NET Framework Forum | Windows Forms Forum | ASP.NET Forum | ADO.NET Forum

    Read the article

  • Session Report - Java on the Raspberry Pi

    - by Janice J. Heiss
    At mid-day on Wednesday, the always colorful Oracle Evangelist Simon Ritter demonstrated Java on the Raspberry Pi at his session, “Do You Like Coffee with Your Dessert?”. The Raspberry Pi is a credit card-sized single-board computer developed in the UK with the intention of stimulating the teaching of basic computer science in schools. “I don't think there is a single feature that makes the Raspberry Pi significant,” observed Ritter, “but a combination of things really makes it stand out. First, it's $35 for what is effectively a completely usable computer. You do have to add a power supply, SD card for storage and maybe a screen, keyboard and mouse, but this is still way cheaper than a typical PC. The choice of an ARM (Advanced RISC Machine, originally Acorn RISC Machine) processor is noteworthy, because it avoids problems like cooling (no heat sink or fan) and can use a USB power brick. When you add in the enormous community support, it offers a great platform for teaching everyone about computing.”
    Some 200 enthusiastic attendees were present at the session, which had the feel of Simon Ritter sharing a fun toy with friends. The main point of the session was to show what Oracle is doing to support Java on the Raspberry Pi in a way that is entertaining and fun. Ritter pointed out that, in addition to being great for teaching, the Pi is an excellent introduction to the ARM architecture; it runs Java well today and will run it better once there is official hard float support. The possibilities are vast.
    Ritter explained that the Raspberry Pi Project started in 2006 with the goal of devising a computer to inspire children; it drew inspiration from the BBC Micro literacy project of 1981, which produced a series of microcomputers created by the Acorn Computer company. The Pi was officially launched on February 29, 2012, with a first production run of 10,000 boards. There were 100,000 pre-orders in one day; currently about 4,000 boards are produced a day. Ritter described the specification as follows:
    * CPU: ARM 11 core running at 700MHz (Broadcom SoC package); can now be overclocked to 1GHz (without breaking the warranty!)
    * Memory: 256Mb
    * I/O: HDMI and composite video; 2 x USB ports (Model B only); Ethernet (Model B only); header pins for GPIO, UART, SPI and I2C
    He took attendees through a brief history of the ARM architecture:
    * Acorn BBC Micro (6502-based) - not powerful enough for Acorn's plans for a business computer
    * Berkeley RISC Project - the UNIX kernel only used 30% of the instruction set of the Motorola 68000; more registers, fewer instructions (register windows); one chip architecture to come from this was SPARC
    * Acorn RISC Machine (ARM) - 32-bit data, 26-bit address space, 27 registers; the first machine was the Acorn Archimedes
    * Spin-off from Acorn: Advanced RISC Machines
    Next he presented ARM's features:
    * 32-bit RISC architecture - ARM accounts for 75% of embedded 32-bit CPUs today; 6.1 billion chips sold last year (zero manufactured by ARM itself)
    * Abstract architecture and microprocessor core designs - the Raspberry Pi is ARM11, using the ARMv6 instruction set
    * Low power consumption - good for mobile devices; the Raspberry Pi can be powered from a 700mA 5V-only PSU and does not require a heatsink or fan
    He described current ARM technology:
    * ARMv6 - ARM 11, ARM Cortex-M
    * ARMv7 - ARM Cortex-A, ARM Cortex-M, ARM Cortex-R
    * ARMv8 (announced) - will support 64-bit data and addressing
    He next gave the Java specifics for ARM floating point operations:
    * Despite being an ARMv6 processor, the Pi does include an FPU; an FPU only became standard as of ARMv7
    * The FPU (Hard Float, or HF) is much faster than a software library
    * Linux distros and the Oracle JVM for ARM assume no HF on ARMv6, so special builds of both are needed; a Raspbian distro build is now available, and the Oracle JVM is in the works, release date TBD
    He also covered the “Not So RISC” performance improvements: DSP enhancements, Jazelle, Thumb / Thumb2 / ThumbEE, floating point (VFP), NEON, and security enhancements (TrustZone).
    He spent a few minutes going over the challenges of using Java on the Raspberry Pi, covering sound, vision, serial (TTL UART), USB and GPIO. On implementing sound with Java he pointed out:
    * Sound drivers are now included in new distros
    * Java Sound API - remember to add audio to the user's groups; some bits work, others not so much
    * Playing a (correctly formatted) WAV file works; using MIDI hangs trying to open a synthesizer
    * FreeTTS text-to-speech should work once sound works properly
    He turned to JavaFX on the Raspberry Pi:
    * Currently internal builds only - will be released as a technology preview soon
    * Work involves an optimal implementation of the Prism graphics engine (X11?)
    * Once the JavaFX implementation is completed there will be little of concern to developers - it's just Java (WORA)
    He explained the basics of the serial port: the UART provides TTL-level signals (3.3V), while RS-232 uses 12V signals, so use a MAX3232 chip to convert; use this for access to the serial console.
    He summarized his key points: the Raspberry Pi is a very cool (and cheap) computer that is great for teaching, a great introduction to ARM that works very well with Java and will work better in the future. The opportunities are limitless. For further info, check out the Raspberry Pi User Guide by Eben Upton and Gareth Halfacree. From there, Ritter tried out several fun demos, some of which worked better than others, but all of which were greeted with considerable enthusiasm, support and good humor (even when he ran into some glitches). All in all, this was a fun and lively session.

    Read the article

  • Chicago SQL Saturday

    - by Johnm
    This past Saturday, April 17, 2010, I journeyed north to the great city of Chicago for some SQL Server fun, learning and fellowship. The Chicago edition of this grassroots phenomenon was the 31st scheduled SQL Saturday since the program's birth in late 2007. The Chicago SQL Saturday consisted of four tracks with eight sessions each and was a very energetic and fast-paced day for the roughly 300 SQL Server enthusiasts in attendance. The speaker lineup included national notables such as Kevin Kline, Brent Ozar, and Brad McGehee. My hometown of Indianapolis was well represented in the speaker lineup with Arie Jones, Aaron King and Derek Comingore.
    The day began with a very humorous keynote by Kevin Kline and Brent Ozar, who emphasized the importance of community events such as SQL Saturday and the monthly user group meetings. They also brilliantly included the impact that getting involved in the SQL community through social media can have on your professional career.
    My approach to the day was to try to experience as much of the event as I could, so there were very few sessions that I attended for their full duration. I leaped from session to session like a bumblebee, gleaning bits of nectar from each one. Amid these leaps I took the opportunity to briefly chat with some of the in-the-queue speakers as well as other attendees who wandered the hallways. I especially enjoyed a great discussion with Devin Knight about his plans regarding the upcoming Jacksonville SQL Saturday, as well as an interesting SQL interpretation of the Iron Chef, which I think would catch on like wildfire.
    There were two sessions that stood out as exceptional - so much so that I could not pull myself away:
    Kevin Kline presented on "SQL Server Internals and Architecture". This session could be classified as one intended for the beginner; Kevin even personally warned me of such as I entered the room. I am a believer in revisiting the basics regardless of your level of mastery, so I entered this session in that spirit. It was a very clear and precise presentation, masterfully illustrated and demonstrated.
    Brad McGehee presented on "How and When to Use Indexed Views". This was a topic that I had recently been exploring and was considering for use in an integration project. Brad effectively communicated the complexity of this feature and what is involved in gaining its full benefit. It was clear at the conclusion of this session that it was not the right feature for my specific needs.
    Overall, the event was a great success. The use of volunteers, from an attendee's perspective, was masterful. The only recommendation I would make for the next Chicago SQL Saturday would be to include more time between sessions, to permit some networking among the attendees, one-on-one questions for speakers, and visits to the sponsor booths. Congratulations to Wendy Pastrick, Ted Krueger, and Aaron Lowe for their efforts and a very successful SQL Saturday!

    Read the article

  • SQL SERVER – What the Business Says Is Not What the Business Wants

    - by pinaldave
    This blog post is written in response to T-SQL Tuesday hosted by Steve Jones. Steve raised a very interesting question that every DBA and Database Developer has faced at some point. When I read the topic, I felt that I could write several different examples here. Today, I will cover one scenario which seems quite amusing.
    Shrinking Database
    Earlier this year, I was working on a SQL Server performance tuning consultancy engagement when I faced a very interesting situation. No matter how much I attempted to reduce the fragmentation, I always ended up with heavy fragmentation on the server. After careful research, I figured out that one of the jobs was continuously shrinking the database - which is a very bad practice. I have blogged about my experience over here: SQL SERVER - SHRINKDATABASE For Every Database in the SQL Server. I removed the incorrect shrinking process right away; once it was removed, everything continued working as it should. After a couple of days, I learned that one of their DBAs had put back the same DBCC process. I requested the Senior DBA to find out what was going on, and he came up with the following reason: "Business Requirement." I could not believe it! Now it was time for me to go deep into the subject; moreover, it had become necessary to understand the need. After talking to the people concerned, I understood what they needed. Please read the exact business need in their own language.
    The Shrinking "Business Need"
    "We shrink the database because if we take a backup after shrinking the database, the size of the backup is smaller. Once we take the backup, we have to send it to our remote location site. Our business requirement is that we always need to make sure that the file is smallest when we transfer it to the remote server."
    The backup is not affected in any way by whether or not you shrink the database - the size of the backup will be the same. After a couple of tests, they agreed with me. Shrinking will, however, create performance issues, as it introduces heavy fragmentation in the database.
    The Real Solution
    The real business need was the smallest possible backup file. We finally implemented a quick solution which they are still using to date: compressed backup. I have written about this subject in detail a few years back: SQL SERVER - 2008 - Introduction to New Feature of Backup Compression. Compressed backup not only creates a smaller file but also increases the speed of the backup operation.
    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Best Practices, Pinal Dave, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology
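    For illustration only (this sketch is my addition, not from the original post, and the database and file names are hypothetical), a compressed backup in SQL Server 2008 and later is just one extra option on the BACKUP statement:
    -- Take a compressed full backup; WITH COMPRESSION is the key option
    BACKUP DATABASE AdventureWorks
    TO DISK = N'D:\Backups\AdventureWorks.bak'
    WITH COMPRESSION, INIT;
    On editions that support it, the resulting .bak file is typically much smaller than an uncompressed backup - exactly what a "smallest file to ship to the remote site" requirement needs, without touching the database files themselves.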

    Read the article

  • Exceptional DBA Awards 2011

    - by Rebecca Amos
    From today, we’re accepting nominations for the 2011 Exceptional DBA Awards. DBAs make a vital contribution to the running of the companies they work for, and the Exceptional DBA Awards aim to acknowledge this and make this contribution more widely known. Check out our new website for all the info: www.exceptionaldba.com  Being an exceptional DBA doesn’t mean you have to sleep at the office, or know everything there is to know about SQL Server; who ever could? It means that you make an effort to make your servers secure and reliable, and to make your users’ lives easier. Maybe you’ve helped a junior colleague learn something new about server backups? Or cancelled your coffee break to get a database back online? Or contributed to a forum post on performance monitoring? All of these actions show that you might be an exceptional DBA. So have a think about the tasks you do every day that already make you exceptional – and then get started on your entry! You just need to answer a few questions on our website about your experience as a DBA, some of your biggest achievements, and any other activities you participate in within the SQL Server community. Anyone who is currently working as a SQL Server database administrator can enter, or be nominated by someone else. We’ve got four fantastic judges for the Awards, who you’ll be familiar with already: Brent Ozar, Brad McGehee, Rodney Landrum and Steve Jones. They’ll pick five finalists, and then we’ll ask the SQL Server community to vote for their winner. Not only could you win the respect and recognition of peers and colleagues, but the prizes also include full conference registration for the 2011 PASS Summit in Seattle (where the awards ceremony will take place), four nights' hotel accommodation, and $300 towards travel expenses. The winner will get a copy of Red Gate’s SQL DBA Bundle – and they’ll also be featured here, on Simple-Talk. So what are you waiting for? Chances are you’ve already made a small effort for someone today that means you might be an exceptional DBA. Visit the website now, and start writing your entry – or nominate your favourite DBA to enter: www.exceptionaldba.com

    Read the article

  • Plan Operator Tuesday round-up

    - by Rob Farley
    Eighteen posts for T-SQL Tuesday #43 this month, discussing Plan Operators. I put them together and made the following clickable plan. It’s 1000px wide, so I hope you have a monitor wide enough. Let me explain this plan for you (people’s names are the links to the articles on their blogs – the same links as in the plan above). It was clearly a SELECT statement. Wayne Sheffield (@dbawayne) wrote about that, so we start with a SELECT physical operator, leveraging the logical operator Wayne Sheffield. The SELECT operator calls the Paul White operator, discussed by Jason Brimhall (@sqlrnnr) in his post. The Paul White operator is quite remarkable, and can consume three streams of data. Let’s look at those streams. The first pulls data from a Table Scan – Boris Hristov (@borishristov)’s post – using parallel threads (Bradley Ball – @sqlballs) that pull the data eagerly through a Table Spool (Oliver Asmus – @oliverasmus). A scalar operation is also performed on it, thanks to Jeffrey Verheul (@devjef)’s Compute Scalar operator. The second stream of data applies Evil (I figured that must mean a procedural TVF, but could’ve been anything), courtesy of Jason Strate (@stratesql). It performs this Evil on the merging of parallel streams (Steve Jones – @way0utwest), which suck data out of a Switch (Paul White – @sql_kiwi). This Switch operator is consuming data from up to four lookups, thanks to Kalen Delaney (@sqlqueen), Rick Krueger (@dataogre), Mickey Stuewe (@sqlmickey) and Kathi Kellenberger (@auntkathi). Unfortunately Kathi’s name is a bit long and has been truncated, just like in real plans. The last stream performs a join of two others via a Nested Loop (Matan Yungman – @matanyungman). One pulls data from a Spool (my post – @rob_farley) populated from a Table Scan (Jon Morisi). The other applies a catchall operator (the catchall is because Tamera Clark (@tameraclark) didn’t specify any particular operator, and a catchall is what gets shown when SSMS doesn’t know what to show. Surprisingly, it’s showing the yellow one, which is about cursors. Hopefully that’s not what Tamera planned, but anyway...) to the output from an Index Seek operator (Sebastian Meine – @sqlity). Lastly, I think everyone put in 110% effort, so that’s what all the operators cost. That didn’t leave anything for me, unfortunately, but that’s okay. Also, because he decided to use the Paul White operator, Jason Brimhall gets 0%, and his 110% was given to Paul’s Switch operator post. I hope you’ve enjoyed this T-SQL Tuesday, and have learned something extra about Plan Operators. Keep your eye out for next month’s one by watching the Twitter Hashtag #tsql2sday, and why not contribute a post to the party? Big thanks to Adam Machanic as usual for starting all this. @rob_farley

    Read the article

  • Amazon.com Cutting Off Colorado Affiliates

    - by Joe Mayo
    I received an email from Amazon.com today, essentially cutting off my affiliate status because I'm in Colorado. Colorado recently passed legislation that requires retailers to either collect sales tax for on-line transactions or engage in an onerous process that makes you wish you had collected sales tax.  After I Tweeted this, Mike Jones tweeted a link to the legislation.  Here's an excerpt from Amazon.com's email: "Dear Colorado-based Amazon Associate: We are writing from the Amazon Associates Program to inform you that the Colorado government recently enacted a law to impose sales tax regulations on online retailers. The regulations are burdensome and no other state has similar rules. The new regulations do not require online retailers to collect sales tax. Instead, they are clearly intended to increase the compliance burden to a point where online retailers will be induced to "voluntarily" collect Colorado sales tax -- a course we won't take. We and many others strongly opposed this legislation, known as HB 10-1193, but it was enacted anyway. Regrettably, as a result of the new law, we have decided to stop advertising through Associates based in Colorado. We plan to continue to sell to Colorado residents, however, and will advertise through other channels, including through Associates based in other states. There is a right way for Colorado to pursue its revenue goals, but this new law is a wrong way. As we repeatedly communicated to Colorado legislators, including those who sponsored and supported the new law, we are not opposed to collecting sales tax within a constitutionally-permissible system applied even-handedly. The US Supreme Court has defined what would be constitutional, and if Colorado would repeal the current law or follow the constitutional approach to collection, we would welcome the opportunity to reinstate Colorado-based Associates. You may express your views of Colorado's new law to members of the General Assembly and to Governor Ritter, who signed the bill. Your Associates account has been closed as of March 8, 2010, and we will no longer pay advertising fees for customers you refer to Amazon.com after that date. Please be assured that all qualifying advertising fees earned prior to March 8, 2010, will be processed and paid in accordance with our regular payment schedule. Based on your account closure date of March 8, any final payments will be paid by May 31, 2010. We have enjoyed working with you and other Colorado-based participants in the Amazon Associates Program, and wish you all the best in your future.   Best Regards,   The Amazon Associates Team"

    Read the article

  • links for 2011-01-06

    - by Bob Rhubart
    Coming to your town: Oracle Enterprise Cloud Summit
    During these full-day events, cloud experts will share real-world best practices, reference architectures, detailed customer case studies, and more. Events scheduled in cities around the world. (tags: oracle otn cloud event)
    Webcast: Security and Compliance for Private Cloud Consolidation
    Roxana Bradescu, Senior Director for Oracle Database Security Products, discusses Oracle Database Security Solutions to securely consolidate data and meet compliance requirements within private cloud computing environments. Thursday, January 13, 2011. 10am PST | 1pm EST (tags: oracle cloud security)
    Answering Questions about Mobile Devices | The AppsLab
    "How do the numbers of Android and iOS users compare? How often are people switching? Where are all these BlackBerry and Nokia users? Do they plan to jump to Android or iOS? What about webOS? Is it relevant?" Some answers in this AppsLab survey. (tags: oracle otn enterprise2.0 mobilecomputing iphone blackberry android)
    Webcast: Achieve 24/7 Cloud Availability Without Expensive Redundancy
    Ashish Ray and Matthew Baier discuss Oracle’s Maximum Availability Architecture and Oracle Database 11g. (tags: oracle cloud highavailability webcast)
    Converting a PV vm back into an HVM vm (Wim Coekaerts Blog)
    "I wanted to convert one of my VMs that was based on a paravirt kernel into a vm that just boots as a regular hardware virt VM with a standard x86-64 kernel...It took me a little while to figure out the fastest way so now that I have it pretty much down I wanted to share the steps." - Wim Coekaerts (tags: oracle otn virtualization oraclevm)
    @OTN_Garage: Resources for VirtualBox 4.0
    Rick "@OTN_Garage" Ramsey shares links to several resources for those with a VirtualBox jones. (tags: oracle otn virtualization virtualbox)
    'Federal Service Bus' Helps Belgian Government Speak a Common Language - SOA in Action Blog
    "The first SOA-enabled application was developed in less than two months and was fully operational in approximately 10 weeks. In addition, new FSB modules are reusable for other Belgian e-government applications, saving both time and taxpayer dollars." - Joe McKendrick (tags: soa oracle)
    Show Notes: Architects in the Cloud (ArchBeat Podcast)
    The complete 4-part interview with Stephen G. Bennett and Archie Reed, the authors of "Silver Clouds, Dark Linings: A Concise Guide to Cloud Computing," is now available. (tags: oracle otn cloud podcast archbeat)

    Read the article

  • Unable to add users to Microsoft Dynamics CRM 4.0 after database restore

    - by Wes Weeks
    I was working with a client in our multi-tenant CRM environment who was doing a database migration into CRM. As part of the process, a backup of their Organization_MSCRM database was taken just prior to starting the migration, in case it needed to be restored and the migration run a second time. In this case it did, so I restored the database and let the client know he should be good to go. A few hours later I received a call that they were unable to add some new users: the users would appear as available when using the add multiple user wizard, but anyone added would not actually be added to CRM. It also came up that these users had been added to CRM initially AFTER the database backup had been taken.
    I turned on tracing and tried to add the users through both the single user form and the multiple user interface, and was unable to do so. The error message in the logs wasn't much help:
    Unexpected error adding user [email protected]: Microsoft.Crm.CrmException: INVALID_WRPC_TOKEN: Validate WRPC Token: WRPCTokenState=Invalid, TOKEN_EXPIRY=4320, IGNORE_TOKEN=False
    Searching on Google or Bing didn't offer any assistance. Apparently it is not a very common problem, or no one has been able to resolve it.
    I did some searching in the MSCRM_CONFIG database and found that there are several user tables there. After getting my head around the structure, I found that there were entries here for users that were not part of the restored DB. It seems that new users are added to both Organization_MSCRM and MSCRM_CONFIG, and after the restore these were out of sync. I needed to remove the extra entries in order to fix the problem. Restoring the MSCRM_CONFIG database was not an option, as other clients could have been adding users at this point and a restore would risk breaking their instances of CRM. Long story short, I was finally able to generate a script to remove the bad entries, and when I tried to add users again, I was successful. In case someone else out there finds themselves in a similar situation, here is the script I used to delete the bad entries.
    DECLARE @UsersToDelete TABLE
    (
      UserId uniqueidentifier
    )
    
    INSERT INTO @UsersToDelete (UserId)
    SELECT UserId
    FROM [MSCRM_CONFIG].[dbo].[SystemUserOrganizations]
    WHERE CrmUserId NOT IN (SELECT SystemUserId FROM Organization_MSCRM.dbo.SystemUserBase)
      AND OrganizationId = '00000000-643F-E011-0000-0050568572A1' -- Id from the Organization table for this instance
    
    DELETE FROM [MSCRM_CONFIG].[dbo].[SystemUserAuthentication]
    WHERE UserId IN (SELECT UserId FROM @UsersToDelete)
    
    DELETE FROM [MSCRM_CONFIG].[dbo].[SystemUserOrganizations]
    WHERE UserId IN (SELECT UserId FROM @UsersToDelete)
    
    DELETE FROM [MSCRM_CONFIG].[dbo].[SystemUser]
    WHERE Id IN (SELECT UserId FROM @UsersToDelete)
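    A cautious aside (my addition, not part of the original post): when running a destructive cleanup like this against a shared configuration database, it can be worth wrapping the deletes in an explicit transaction, so the affected row counts can be checked before anything is made permanent. A minimal sketch:
    BEGIN TRAN;
    -- run the three DELETE statements above, noting the row count after each:
    -- SELECT @@ROWCOUNT;
    -- if the counts match the number of orphaned users you expected:
    COMMIT TRAN;
    -- otherwise: ROLLBACK TRAN;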

    Read the article

  • links for 2011-01-31

    - by Bob Rhubart
    Do (Software) Architects Architect?
    "The first question is 'Why is architect being used as a verb?' The Merriam-Webster dictionary does not contain a definition of architect as a verb, nor do many other recognized dictionaries." -- TheCPUWizard (tags: softwarearchitecture)
    Oracle Business Intelligence Blog: Gartner Magic Quadrant for BI Platforms 2011
    "Oracle customers indicate they deploy the Oracle Business Intelligence Suite Enterprise Edition (OBIEE) platform to support among the most complex deployments in our survey." - Gartner (tags: oracle businessintelligence gartner)
    Oracle BI Server Modeling, Part 1 - Designing a Query Factory (Oracle BI Foundation)
    Bob Ertl lays the groundwork for Business Intelligence modeling concepts with a look at "the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing." (tags: oracle otn businessintelligence)
    Tom Graves: Modelling people in enterprise-architecture
    Tom says: "One of the key characteristics of ‘crossing the chasm’ to a viable whole-of-enterprise architecture is the explicit inclusion of people. In short, we need to be able to model and map where people fit in relation to the architecture. But there’s a catch. A big catch." (tags: entarch)
    Java developer webcasts for customers and partners (SOA Partner Community Blog)
    Jurgen Kress shares info on several upcoming online events focused on WebLogic. (tags: weblogic oracle otn soa)
    Business SOA: Data Services are bogus, Information services are real
    Steve Jones says: "The other day when I was talking about MDM a bright spark pointed out that I hated data services but wasn't MDM just about data services?" (tags: SOA MDM)
    Andrejus Baranovskis's Blog: Configuring Missing Contribution Folders for Oracle UCM 11g and WebCenter 11g PS3
    Andrejus says: "After doing some research on UCM, we found that the Folders_g component must be configured in UCM, for Contribution Folders to be enabled." (tags: oracle otn oracleace UCM webcenter enterprise2.0)
    Wim Coekaerts: Converting an Oracle VM VirtualBox VM into an Oracle VM Server image
    Wim Coekaerts offers a few simple steps to convert an existing Oracle VM VirtualBox image. (tags: oracle otn virtualization virtualbox)
    Stefan Hinker: Secure Deployment of Oracle VM Server for SPARC
    This new paper from Stefan Hinker will help you understand the general security concerns in virtualized environments as well as the specific additional threats that arise out of them. (tags: oracle otn SPARC virtualization enterprisearchitecture)
    The EA Roadmap to Rationalize, Standardize, and Consolidate the IT Portfolio
    Enterprise IT is in a state of constant evolution. As a result, business processes and technologies become increasingly more difficult to change and more costly to keep up-to-date. (tags: entarch oracle otn)

    Read the article

  • SQL Saturday 43 (Redmond, WA) Review

    - by BuckWoody
    Last Saturday (June 12th) we held a “SQL Saturday” (more about those here) event in Redmond, Washington. The event was held at the Microsoft campus, at the Mixer in our new location called the “Commons”. This is a mall-like area that we have on campus, and the Mixer is a large building with lots of meeting rooms, so it made a perfect location for the event. There was a sign to find the parking, and once there they had a sign to show how to get to the building. Since it’s a secure facility, Greg Larsen and crew had a person manning the door so that even late arrivals could get in.
    We had about 400 sign up for the event, and a little over 300 attend (official numbers later). I think we would have had a lot more, but the sun was out – and you just can’t underestimate the effect of that here in the Pacific Northwest. We joke a lot about not seeing the sun much, but when a day like the one we had on Saturday comes around, and on a weekend at that, you’d cancel your wedding to go outside and play in the sun. And your spouse would agree with you for doing it.
    We had some top-notch speakers, including Clifford Dibble and Kalen Delaney. The food was great, we had multiple sponsors (including Confio, who seem to be at all of these) and the attendees were from all over the professional spectrum, from developers to BI to DBAs. Everyone I saw was very engaged, and when I visited room-to-room I saw almost no one in the halls – everyone was in the sessions. I also saw a much larger Microsoft presence this year, especially from Dan Jones’ team. I had a great turnout at my session, and yes, I was wearing an Oracle staff shirt. I did that because I wanted to show that the session I gave on “SQL Server for the Oracle DBA” was non-marketing – I couldn’t exactly bash Oracle wearing their colors!
    These events are amazing. I can’t emphasize enough how much I appreciate the volunteers and how much work they put into these events, and to you for coming. If you’re reading this and you haven’t attended one yet, definitely find out if there is one in your area – and if not, start one. It’s a lot of work, but it’s totally worth it.

    Read the article


  • SQL Query to update parent record with child record values

    - by Wells
    I need to create a trigger that fires when a child record (Codes) is added, updated or deleted. The trigger stuffs a string of comma-separated Code values from all child records (Codes) into a single field in the parent record (Projects) of the added, updated or deleted child record. I am stuck on writing a correct query to retrieve the Code values from just those child records that are the children of a single parent record.
    -- Create the test tables
    CREATE TABLE projects (
      ProjectId varchar(16) PRIMARY KEY,
      ProjectName varchar(100),
      Codestring nvarchar(100)
    )
    GO
    CREATE TABLE prcodes (
      CodeId varchar(16) PRIMARY KEY,
      Code varchar(4),
      ProjectId varchar(16)
    )
    GO
    -- Add sample data to tables: two projects records, one with 3 child records, the other with 2.
    INSERT INTO projects (ProjectId, ProjectName)
    SELECT '101','Smith' UNION ALL
    SELECT '102','Jones'
    GO
    INSERT INTO prcodes (CodeId, Code, ProjectId)
    SELECT 'A1','Blue', '101' UNION ALL
    SELECT 'A2','Pink', '101' UNION ALL
    SELECT 'A3','Gray', '101' UNION ALL
    SELECT 'A4','Blue', '102' UNION ALL
    SELECT 'A5','Gray', '102'
    GO
    I am stuck on how to create a correct UPDATE query. Can you help fix this query?
    -- Partially working, but stuffs all values, not just values from child (prcodes) records of the parent (projects)
    UPDATE proj
    SET proj.Codestring = (SELECT STUFF((SELECT ',' + prc.Code
                                         FROM projects proj
                                         INNER JOIN prcodes prc ON proj.ProjectId = prc.ProjectId
                                         ORDER BY 1 ASC
                                         FOR XML PATH('')), 1, 1, ''))
    The result I get for the Codestring field in Projects is:
    ProjectId  ProjectName  Codestring
    101        Smith        Blue,Blue,Gray,Gray,Pink ...
    But the result I need for the Codestring field in Projects is:
    ProjectId  ProjectName  Codestring
    101        Smith        Blue,Pink,Gray ...
    Here is my start on the trigger. The UPDATE query, above, will be added to this trigger. Can you help me complete the trigger creation query?
    CREATE TRIGGER Update_Codestring ON prcodes
    AFTER INSERT, UPDATE, DELETE
    AS
    WITH CTE AS (
      SELECT ProjectId FROM inserted
      UNION
      SELECT ProjectId FROM deleted
    )
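    For what it's worth, here is an illustrative sketch of one way the correlation could be fixed (my addition, not part of the original question): the inner SELECT should reference the outer row's ProjectId rather than re-joining the whole projects table.
    UPDATE proj
    SET proj.Codestring = STUFF((SELECT ',' + prc.Code
                                 FROM prcodes prc
                                 WHERE prc.ProjectId = proj.ProjectId  -- correlate to the outer row
                                 ORDER BY prc.CodeId
                                 FOR XML PATH('')), 1, 1, '')
    FROM projects proj
    With the sample data above this yields 'Blue,Pink,Gray' for project 101 and 'Blue,Gray' for 102 (ordered by CodeId).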

    Read the article

  • Pigs in Socks?

    - by MightyZot
    My wonderful wife Annie surprised me with a cruise to Cozumel for my fortieth birthday. I love to travel. Every trip is ripe with adventure, crazy things to see and experience. For example, on the way to Mobile, Alabama to catch our boat, some dude hauling a mobile home lost a window and we drove through a cloud of busting glass going 80 miles per hour! The night before the cruise, we stayed in the Malaga Inn and I crawled UNDER the hotel to look at an old civil war bunker. WOAH! Then, on the way to and from Cozumel, the boat plowed through two beautiful and slightly violent storms. But the adventures you have while travelling often pale in comparison to the cult of personalities you meet along the way. :)
    We met many cool people during our travels and we made some new friends. Todd and Andrea are in the publishing business (www.myneworleans.com) and teaching, respectively. Erika is a teacher too, and Matt has a pig on his foot. This story is about the pig. Without that pig on Matt's foot, we probably would have hit a buoy and drowned. Alright, so…this pig on Matt's foot…this is no henna tatt, this is a man's tattoo. Apparently, getting tattoos on your feet is very painful because there is very little muscle and fat and lots of nifty nerves to tell you that you might be doing something stupid. Pig and rooster tattoos carry special meaning for sailors of old. According to some sources, having a tattoo of a pig or rooster on one foot or the other will keep you from drowning. There are many great musings as to why a pig and a rooster might save your life. The most plausible, in my opinion, is that pigs and roosters were common livestock tagging along with the crew. Since they were shipped in wooden crates, pigs and roosters were often counted amongst the survivors when ships succumbed to Davy Jones' Locker. I didn't spend a whole lot of time researching the pig and the rooster, so take these musings with a grain of salt. And I was not able to find a lot of what you might consider credible history regarding the tradition. What I did find was a comfort, or solace, in the maritime tradition.
    Seems like raw traditions like the pig and the rooster are in danger of getting lost in a sea of non-permanence. I mean, what traditions are us old programmers and techies leaving behind for future generations? Makes me wonder what Ward Christensen has tattooed on his left foot. I guess my choice would have to be a Commodore 64. (I met Ward, by the way, in an elevator after he received his Dvorak awards in 1992. He was a very unassuming individual sporting business casual, and was very much a "sailor" of an old-school programmer. I can't remember his exact words, but I think they were essentially that he felt it odd that he was getting an award for just doing his work. I'm sure that Ward doesn't know this…he couldn't have set a more positive example for a young 22-year-old programmer. Thanks Ward!)

    Read the article

  • SQL SERVER – Merge Operations – Insert, Update, Delete in Single Execution

    - by pinaldave
    This blog post is written in response to T-SQL Tuesday hosted by Jorge Segarra (aka SQLChicken). I have been very active in using these MERGE operations in my development. However, I have found out from my consultancy work and friends that these amazing operations are not utilized by them most of the time. Here is my attempt to bring the necessity of using the MERGE operation to the surface one more time.
    MERGE is a new feature that provides an efficient way to perform multiple DML operations. In earlier versions of SQL Server, we had to write separate statements to INSERT, UPDATE, or DELETE data based on certain conditions; however, at present, by using the MERGE statement, we can include the logic of such data changes in one statement that even checks when the data is matched and then just updates it, and similarly, when the data is unmatched, inserts it. One of the most important advantages of the MERGE statement is that the data is read and processed only once. In earlier versions, three different statements had to be written to process three different activities (INSERT, UPDATE or DELETE); however, by using the MERGE statement, all the update activities can be done in one pass of the database table.
    I have written about these MERGE operations earlier in my blog post over here: SQL SERVER - 2008 - Introduction to Merge Statement - One Statement for INSERT, UPDATE, DELETE. I was asked by one of the readers how we know that this operator does everything in a single pass and does not call the Merge operator multiple times. Let us run the same example which I have used earlier; I am listing it here again for convenience.
    -- Let's create StudentDetails and StudentTotalMarks and insert some records.
    USE tempdb
    GO
    CREATE TABLE StudentDetails
    (
      StudentID INTEGER PRIMARY KEY,
      StudentName VARCHAR(15)
    )
    GO
    INSERT INTO StudentDetails VALUES(1,'SMITH')
    INSERT INTO StudentDetails VALUES(2,'ALLEN')
    INSERT INTO StudentDetails VALUES(3,'JONES')
    INSERT INTO StudentDetails VALUES(4,'MARTIN')
    INSERT INTO StudentDetails VALUES(5,'JAMES')
    GO
    CREATE TABLE StudentTotalMarks
    (
      StudentID INTEGER REFERENCES StudentDetails,
      StudentMarks INTEGER
    )
    GO
    INSERT INTO StudentTotalMarks VALUES(1,230)
    INSERT INTO StudentTotalMarks VALUES(2,255)
    INSERT INTO StudentTotalMarks VALUES(3,200)
    GO
    -- Select from tables
    SELECT * FROM StudentDetails
    GO
    SELECT * FROM StudentTotalMarks
    GO
    -- Merge statement
    MERGE StudentTotalMarks AS stm
    USING (SELECT StudentID, StudentName FROM StudentDetails) AS sd
    ON stm.StudentID = sd.StudentID
    WHEN MATCHED AND stm.StudentMarks > 250 THEN DELETE
    WHEN MATCHED THEN UPDATE SET stm.StudentMarks = stm.StudentMarks + 25
    WHEN NOT MATCHED THEN INSERT(StudentID, StudentMarks) VALUES(sd.StudentID, 25);
    GO
    -- Select from tables
    SELECT * FROM StudentDetails
    GO
    SELECT * FROM StudentTotalMarks
    GO
    -- Clean up (drop the referencing table first)
    DROP TABLE StudentTotalMarks
    GO
    DROP TABLE StudentDetails
    GO
    The MERGE performs very well and the following result is obtained. Let us check the execution plan for the Merge operator. You can click on the following image to enlarge it.
    Let us evaluate the execution plan for the Table Merge operator only. We can clearly see that the Number of Executions property shows the value 1, which makes it quite clear that in a single pass the MERGE operation completes the Insert, Update and Delete operations. I strongly suggest you all use this operation, if possible, in your development. I have seen this operation implemented in many data warehousing applications.
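    As a small illustrative addition (my sketch, not from the original post), the OUTPUT clause can make the single-pass behaviour visible from the result set itself, by reporting which action MERGE took for each source row:
    -- Same MERGE as above, with OUTPUT reporting the per-row action
    MERGE StudentTotalMarks AS stm
    USING (SELECT StudentID, StudentName FROM StudentDetails) AS sd
    ON stm.StudentID = sd.StudentID
    WHEN MATCHED AND stm.StudentMarks > 250 THEN DELETE
    WHEN MATCHED THEN UPDATE SET stm.StudentMarks = stm.StudentMarks + 25
    WHEN NOT MATCHED THEN INSERT(StudentID, StudentMarks) VALUES(sd.StudentID, 25)
    OUTPUT $action AS MergeAction,
           ISNULL(inserted.StudentID, deleted.StudentID) AS StudentID;
    Each row of the OUTPUT result shows INSERT, UPDATE or DELETE, so you can see all three DML activities coming out of the one statement.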
Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Joins, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Merge

    Read the article

  • My new laptop - with a really nice battery option

    - by Rob Farley
    It was about time I got a new laptop, and so I made a phone call to Dell to discuss my options. I decided not to get an SSD from them, because I’d rather choose one myself – the sales guy tells me that changing the HD doesn’t void my warranty, so that’s good (incidentally, I’d love to hear people’s recommendations for which SSD to get for my laptop). Unfortunately this machine only has one HD slot, but I figure that I’ll put lots of stuff onto external disks anyway. The machine I got was a Dell Studio XPS 16. It’s red (which suits my company), but also has the Intel® Core™ i7-820QM processor, which has 4 cores/8 threads. It makes for a pretty Task Manager, but nothing like the one I saw at SQLBits last year (at 96 cores), or the one that my good friend James Rowland-Jones writes about here.
    But the reason for this post is actually something in the software that comes with the machine – you know, the stuff that most people uninstall at the earliest opportunity. I had just reinstalled the operating system and was going through the utilities to get the drivers up to date, when I noticed that one of the Dell applications included an option to disable battery charging. So I installed it. And sure enough, I can tell the battery not to charge now. Clearly Dell sees it as a temporary option, one designed for when you’re on a plane. But I most often use my laptop with the power plugged in, which means I don’t need to have my battery continually topping itself up. So I really love this option, but I feel like it could go a little further. I’d like “Not Charging” to be the default option, letting me enable charging only when I want it (which should theoretically make my battery last longer). I also intend to work out how this option works, so that I can script it and put it into my StartUp options (so it can be the default setting). Actually – if someone has already worked this out and can tell me what it does, then please feel free to let me know.
    Even better would be an external switch. I had a switch on my old laptop (a Dell Latitude) for WiFi, so that I could turn it off before I turned on the computer (this laptop doesn’t give me that option – no physical switch for flight mode). I guess it just means I’ll get used to leaving the WiFi off by default and turning it on when I want it – I might save myself some battery power that way too.
    Soon I’ll need to take the plunge and sync my iPhone with the new laptop. I’m a little worried that I might lose something – Apple’s messages about how my stuff will be wiped and replaced with what’s on the PC don’t fill me with confidence, as it’s a new PC that doesn’t have stuff on it. But having a new machine is definitely a nice experience, and one that I can recommend. I’m sure when I get around to buying an SSD I’ll feel like it’s shiny and new all over again!

    Read the article

  • Write TSQL, win a Kindle.

    - by Fatherjack
    So recently Red Gate launched sqlmonitormetrics.red-gate.com and showed the world how to embed your own scripts harmoniously in a third-party tool to get the details that you want about your SQL Server performance. The site has a way to submit your own metrics and take a copy of the ones that other people have submitted, to build a library of code for keeping track of key metrics of your servers’ performance.
    There have been several submissions already, but they have now launched a competition to provide an incentive for you to get creative and show us what you can do with a bit of T-SQL and the SQL Monitor framework*.
    What’s it worth? Well, if you are one of the 3 winners then you get to choose either a Kindle Fire or $199.
    How do you win? Simply write the T-SQL for a SQL Monitor custom metric, along with the relevant description and introduction for it, and submit it via sqlmonitormetrics.red-gate.com before 14th Sept 2012; then sit back and wait while the judges review your code and your aims in writing the metric.
    Who are the judges and how will they judge the metrics? There are two judges for this competition, Steve Jones (Microsoft SQL Server MVP, co-founder of SQLServerCentral.com, author, blogger etc) and Jonathan Allen (um, yeah, Steve has done all the good stuff, I’m here by good fortune). We will be looking to rate the metrics on each of 3 criteria:
    how the metric can help with performance tuning SQL Server.
    how having the metric running enables DBAs to meet best practice.
    how interesting / original the idea for the metric is.
    Our combined decision will be final etc etc **
    What happens to my metric? Any metrics submitted to the competition will be automatically entered into the site library and become available for sharing once the competition is over. You’ll get full credit for metrics you submit regardless of the competition results. You can enter as many metrics as you like.
    How long does it take? Honestly? Once you have the T-SQL sorted, then so long as you can type your name and your email address you are done: http://sqlmonitormetrics.red-gate.com/share-a-metric/
    What can I monitor? If you really really want a Kindle or $199 (and let’s face it, who doesn’t?) and are momentarily stuck for inspiration, take a look at these example custom metrics that have been written by Stuart Ainsworth, Fabiano Amorim, TJay Belt, Louis Davidson, Grant Fritchey, Brad McGehee and me to start the library off. There are some great pieces of T-SQL in those metrics, gathering important stats about how SQL Server is performing.
    * – framework may not be the best word here but I was under pressure and couldn’t think of a better one. If you prefer try ‘engine’, or ‘application’? I don’t know, pick something that makes sense to you.
    ** – for the full (legal) version of the rules check the details on sqlmonitormetrics.red-gate.com or send us an email if you want any point clarified.
    Disclaimer – Jonathan is a Friend of Red Gate and as such, whenever they are discussed, will have a generally positive disposition towards Red Gate tools. Other tools are often available and you should always try others before you come back and buy the Red Gate ones. All code in this blog is provided “as is” and no guarantee, warranty or accuracy is applicable or inferred; run the code on a test server and be sure to understand it before you run it on a server that means a lot to you or your manager.
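    If you want a feel for the shape of a metric before you start, here is a minimal sketch (my own example, not one of the library submissions): a SQL Monitor custom metric is essentially a T-SQL query that returns a single numeric value, which the tool then collects on a schedule. This one counts sessions that are currently blocked:
    -- Single-value query suitable for a custom metric:
    -- number of requests currently waiting on another session
    SELECT COUNT(*)
    FROM sys.dm_exec_requests
    WHERE blocking_session_id <> 0;
    Collected every minute or so, a metric like this gives you a trend line for blocking on the server and something concrete to alert on.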

    Read the article

  • The Oracle Retail Week Awards - most exciting awards yet?

    - by sarah.taylor(at)oracle.com
    Last night's annual Oracle Retail Week Awards saw the UK's top retailers come together to celebrate the very best of our industry over the last year. The Grosvenor House Hotel on Park Lane in London was the setting for an exciting ceremony which this year marked several significant milestones in British - and global - retail. Check out our videos about the event at our Oracle Retail YouTube channel, and see if you were snapped by our photographer on our Oracle Retail Facebook page.
    There were some extremely hot contests for many of this year's awards - and all very deserving winners. The entries have demonstrated beyond doubt that retailers have striven to push their standards up yet again in all areas over the past year. The judging panel includes some of the most prestigious names in the retail industry - to impress the panel enough to win an award is a substantial achievement. This year the panel included the likes of Andy Clarke - Chief Executive of ASDA Group; Mark Newton Jones - CEO of Shop Direct Group; Richard Pennycook - the finance director at Morrisons; Rob Templeman - Chief Executive of Debenhams; and Stephen Sunnucks - the president of Gap Europe. These are retail veterans who have each helped to shape the British High Street over the last decade. It was great to chat with many of them in the Oracle VIP area last night.
    For me, last night's highlight was honouring both Sir Stuart Rose and Sir Terry Leahy for their contributions to the retail industry. Both have set the standards in retailing over the last twenty years and taken their respective businesses from strength to strength, demonstrating that there is always a need for innovation even in larger businesses, and that a business has to adapt quickly to new technology in order to stay competitive. Sir Terry Leahy's retirement this year marks the end of an era of global expansion for the Tesco group and a milestone in the progression of British retail. Sir Terry has helped steer Tesco through nearly 20 years of change, with 14 years as Chief Executive. During this time he led the drive for international expansion and an aggressive campaign to increase market share. He has led the way for High Street retailers in adapting to the rise of internet retailing and nurtured a very successful home delivery service. More recently he has pioneered the notion of cross-channel retailing with the introduction of Tesco apps for the iPhone and Android mobile phones, allowing customers to scan barcodes of items to add to a shopping list which they can then either refer to in store or order for delivery.
    John Lewis Partnership was a very deserving winner of The Oracle Retailer of the Year award for their overall dedication to excellent retailing practices. The business also won the American Express Marketing/Advertising Campaign of the Year award for its memorable 'Never Knowingly Undersold' advert series, which included a very successful viral video and radio campaign with Fyfe Dangerfield's cover of Billy Joel's 'She's Always a Woman' used for the adverts.
    Store Design of the Year was another exciting category, with Topshop taking the accolade for its flagship Oxford Street store in London, which combines boutique concession-style stalls with high fashion displays and exclusive collections from leading designers. The store even has its own hairdressers and food hall, making it a truly all-inclusive fashion retail experience and a global landmark for any self-respecting international fashion shopper.
Over the next few weeks we'll be exploring some of the winning entries in more detail here on the blog, so keep an eye out for some unique insights into how the winning retailers have made such remarkable achievements. 

    Read the article

  • Bitmask data insertions in SSDT Post-Deployment scripts

    - by jamiet
    On my current project we are using SQL Server Data Tools (SSDT) to manage our database schema, and one of the tasks we often need to do is insert data into that schema once deployed; the typical method employed to do this is to leverage Post-Deployment scripts, and that is exactly what we are doing. Our requirement is a little different though: our data is split up into various buckets that we need to selectively deploy on a case-by-case basis. I was going to use a SQLCMD variable for each bucket (defaulted to some value other than “Yes”) to define whether it should be deployed or not, so we could use something like this in our Post-Deployment script:
    IF ($(DeployBucket1Flag) = 'Yes')
    BEGIN
       :r .\Bucket1.data.sql
    END
    IF ($(DeployBucket2Flag) = 'Yes')
    BEGIN
       :r .\Bucket2.data.sql
    END
    IF ($(DeployBucket3Flag) = 'Yes')
    BEGIN
       :r .\Bucket3.data.sql
    END
    That works fine and is, I’m sure, a very common technique for doing this. It is however slightly ugly, because we have to litter our deployment with various SQLCMD variables. My colleague James Rowland-Jones (whom I’m sure many of you know) suggested another technique – bitmasks. I won’t go into detail about how this works (James has already done that at Using a Bitmask - a practical example) but I’ll summarise by saying that you can deploy different combinations of the buckets simply by supplying a different numerical value for a single SQLCMD variable. Each bit of that value’s binary representation signifies whether a particular bucket should be deployed or not. This is better demonstrated using the following simple script (which can be easily leveraged inside your Post-Deployment scripts):
    /*
    $(DeployData) is a SQLCMD variable that would, if you were using this in SSDT, be declared in the
    SQLCMD variables section of your project file. It should contain a numerical value, defaulted to 0.
    In this example I have declared it using a :setvar statement. Test the effect of different values
    by changing the :setvar statement accordingly.
    Examples:
      :setvar DeployData 1   will deploy bucket 1
      :setvar DeployData 2   will deploy bucket 2
      :setvar DeployData 3   will deploy buckets 1 & 2
      :setvar DeployData 6   will deploy buckets 2 & 3
      :setvar DeployData 31  will deploy buckets 1, 2, 3, 4 & 5
    */
    :setvar DeployData 0
    DECLARE @bitmask VARBINARY(MAX) = CONVERT(VARBINARY, $(DeployData));
    IF (@bitmask & 1 = 1)
    BEGIN
        PRINT 'Bucket 1 insertions';
    END
    IF (@bitmask & 2 = 2)
    BEGIN
        PRINT 'Bucket 2 insertions';
    END
    IF (@bitmask & 4 = 4)
    BEGIN
        PRINT 'Bucket 3 insertions';
    END
    IF (@bitmask & 8 = 8)
    BEGIN
        PRINT 'Bucket 4 insertions';
    END
    IF (@bitmask & 16 = 16)
    BEGIN
        PRINT 'Bucket 5 insertions';
    END
    An example of running this using DeployData=6: the binary representation of 6 is 110. The second and third significant bits of that binary number are set to 1, and hence buckets 2 and 3 are “activated”.
    Hope that makes sense and is useful to some of you!
    @Jamiet
    P.S. I used the awesome HTML Copy feature of Visual Studio’s Productivity Power Tools in order to format the T-SQL code above for this blog post.
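    To make the bit arithmetic concrete (a worked example of my own, following the same scheme): to deploy buckets 1, 3 and 5 only, you set the bits worth 1, 4 and 16, so the value to supply is 1 + 4 + 16 = 21 (binary 10101). When testing by hand that is :setvar DeployData 21; at deployment time you would pass DeployData=21 the same way the DeployData=6 example above is run.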

    Read the article

  • Oracle Tutor: Installing Is Not Implementing or Why CIOs should care about End User Adoption

    - by emily.chorba(at)oracle.com
    Eighteen months ago I showed the Tutor and UPK Productive Day One overview to a CIO friend of mine. He works in a manufacturing business which had recently been purchased by a global conglomerate. He had a major implementation coming up, but said that the corporate team would be coming in to handle the project. I asked about their end user training approach, but it was unclear to him at the time.

    We were in touch over the course of the implementation project. The major activities were data conversion, how-to workshops, General Ledger realignment, and report definition. The message was "Here's how we do it at corporate, and here's how you are going to do it." In short, it was an application software installation. The corporate team had experience and confidence, and the effort through go-live was smooth.

    Some weeks after cutover, problems with customer orders began to surface. Orders could not be fulfilled in a timely fashion. The problem got worse, and the corporate emergency team was called in. After many days of analysis, the issue was tracked down and resolved, but by then there were weeks of backorders, and their customer base was impacted in a significant way. It took three months of constant handholding of customers by the sales force for goodwill to be reestablished, and this itself diminished a new product sales push.

    I learned of these results in a recent conversation with the CIO. I asked him what the solution to the problem was, and he replied that it was twofold. The first component was a lack of understanding by customer service reps about how a particular data item in order entry was to be filled in, resulting in discrepant order data. The second component was that product planners were using this data, along with data from other sources, to fill in a spreadsheet based on the abandoned system. This spreadsheet was the primary input for planning data. The result of these two inaccuracies was that key parts were not being ordered to effectively meet demand, and the lead time for finished goods was pushed out by weeks.

    I reminded him about the Productive Day One approach and its focus on methodology and tools for end user training. A more collaborative solution workshop would have identified proper applications use in the new environment. Using UPK to document correct transaction entry would have provided effective guidelines to the CSRs for data entry. Using Oracle Tutor to document the manual tasks would have eliminated the use of an out-of-date spreadsheet. As we talked this over, he said, "I wish I knew when I started what I know now."

    Effective end user adoption is the most critical and most overlooked success factor in applications implementations. When the switch is thrown at go-live, employees need to know how to use the new systems to do their jobs. Their jobs are made up of manual steps and systems steps which must be performed in the right order for the implementing organization to operate smoothly. Use Tutor to document the manual policies and procedures, use UPK to document the systems tasks, and develop this documentation in conjunction with a solution workshop. This is the path to effective end user training material and a smooth implementation.

    Learn More

    For more information about Tutor, visit Oracle.com or the Tutor Blog. Post your questions at the Tutor Forum.

    Chuck Jones, Product Manager, Oracle Tutor and BPM

    Read the article

  • PASS Summit 2011 – Part II

    - by Tara Kizer
    I arrived in Seattle last Monday afternoon to attend PASS Summit 2011. I had really wanted to attend Gail Shaw's (blog|twitter) and Grant Fritchey's (blog|twitter) pre-conference seminar "All About Execution Plans" on Monday, but that would have meant flying out on Sunday, which I couldn't do. On Tuesday, I attended Allan Hirt's (blog|twitter) pre-conference seminar entitled "A Deep Dive into AlwaysOn: Failover Clustering and Availability Groups". Allan is a great speaker, and his seminar was packed with demos and information about AlwaysOn in SQL Server 2012. Unfortunately, I have lost my notes from this seminar and the presentation materials are only available on the pre-con DVD. Hmpf!

    On Wednesday, I attended Gail Shaw's "Bad Plan! Sit!", Andrew Kelly's (blog|twitter) "SQL 2008 Query Statistics", Dan Jones' (blog|twitter) "Improving your PowerShell Productivity", and Brent Ozar's (blog|twitter) "BLITZ! The SQL – More One Hour SQL Server Takeovers". In Gail's session, she went over how to fix bad plans and bad query patterns. First of all: update your stale statistics!

    How to fix bad plans:
    - Use local variables – the optimizer can't sniff them, so it'll optimize for an "average" value
    - Use RECOMPILE (at the query or stored procedure level) – at the cost of a CPU hit
    - Use the OPTIMIZE FOR hint – specify the most common value you'll pass

    How to fix bad query patterns:
    - Don't use them – ha!
    - Catch-all queries: use dynamic SQL, or OPTION (RECOMPILE)
    - Multiple execution paths: split into multiple stored procedures, or OPTION (RECOMPILE)
    - Modifying parameter values: use local variables, split into outer and inner procedures, or OPTION (RECOMPILE)

    She also went into "last resort" and "very last resort" options, but those are risky unless you know what you are doing. For the average Joe, she wouldn't recommend these. Examples are query hints and plan guides.

    While I enjoyed Andrew's session, I didn't take any notes as it was familiar material. Andrew is a great speaker though, and I'd highly recommend attending his sessions in the future.

    Next up was Dan's PowerShell session. I need to look into profiles, manifests, function modules, and function import scripts more, as I just didn't quite grasp these concepts. I am attending a PowerShell training class at the end of November, so maybe that'll help clear it up. I really enjoyed the Excel integration demo - it was very cool watching PowerShell build the spreadsheet in real time. I must look into this more! On a side note, I am jealous of Dan's hair. Fabulous hair!

    Brent's session showed us how to quickly gather information about a server whose database administration duties you will be taking over. He wrote a script to do a fast health check and then later wrapped it into a stored procedure, sp_Blitz. I can't wait to use this at my work, even on systems where I've been the primary DBA for years - maybe there's something I've overlooked. We are using EPM to help standardize our environment and uncover problems, but sp_Blitz will definitely still help us out. He even provides a cloud-based update feature, sp_BlitzUpdate, so you don't have to constantly update sp_Blitz when he makes a change. I think I'll utilize his update code for some other challenges that we face at my work.
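    To make Gail's plan fixes concrete, here is a minimal T-SQL sketch of all three (the table, column, and parameter names are my own illustration, not from her session):

        -- The classic parameter sniffing setup: the cached plan is compiled
        -- for whichever @CustomerId happens to be passed first.
        CREATE PROCEDURE dbo.GetOrders @CustomerId INT
        AS
        BEGIN
            -- Fix 1: copy to a local variable - the optimizer can't sniff it,
            -- so it optimizes for an "average" value instead.
            DECLARE @LocalCustomerId INT = @CustomerId;
            SELECT OrderId, OrderDate
            FROM dbo.Orders
            WHERE CustomerId = @LocalCustomerId;

            -- Fix 2: recompile this statement on every execution (CPU hit).
            SELECT OrderId, OrderDate
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId
            OPTION (RECOMPILE);

            -- Fix 3: always optimize for the most common value you'll pass
            -- (42 is a stand-in for whatever that value is in your data).
            SELECT OrderId, OrderDate
            FROM dbo.Orders
            WHERE CustomerId = @CustomerId
            OPTION (OPTIMIZE FOR (@CustomerId = 42));
        END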

    Read the article

  • Social Targeting: This One's Just for You

    - by Mike Stiles
    Think of social targeting in terms of the archery competition we just saw in the Olympics. If someone loaded up 5 arrows and shot them straight up into the air all at once, hoping some would land near the target, the world would have united in laughter. But sadly for hysterical YouTube video viewing, that's not what happened. The archers sought to maximize every arrow by zeroing in on the spot that would bring them the most points. Marketers have always sought to do the same. But they can only work with the tools that are available. A firm grasp of the desired target does little good if the ad products aren't there to deliver that target.

    On the social side, both Facebook and Twitter have taken steps to enhance targeting for marketers. And why not? As the demand to monetize only goes up, they're quite motivated to leverage and deliver their incredible user bases in ways that make economic sense for advertisers.

    You could already target keywords on Twitter with promoted accounts, and get promoted tweets into search; they would surface for your followers and some users that Twitter thought were like them. Now you can go beyond keywords and target Twitter users based on 350 interests in 25 categories. How does a user wind up in one of these categories? Twitter looks at that user's tweets, it looks at whom they follow, and it runs the data through some sort of Twitter secret sauce. The result is, you have a much clearer shot at Twitter users who are most likely to welcome and be responsive to your tweets. And beyond the 350 interests, you can also create custom segments that find users who resemble followers of whatever Twitter handle you give it.

    That means you can now use boring tweets to sell like a madman, right? Not quite. This ad product is still quality-based, meaning if you're not putting out tweets that lead to interest and thus engagement, that tweet will earn a low quality score and wind up costing you more under Twitter's auction system to maintain. That means, as the old knight in "Indiana Jones and the Last Crusade" cautions, "choose wisely" when targeting based on these interests and categories, to make sure your interests truly do line up with theirs.

    On the Facebook side, they're rolling out ad targeting that uses email addresses, phone numbers, game and app developers' user IDs, and eventually addresses for you bigger brands. Why? Because you marketers asked for it. Here you were with this amazing customer list but no way to reach those same customers should they be on Facebook. Now you can find and communicate with customers you gathered outside of social, and use Facebook to do it. Fair to say such users are a sensible target and will be responsive to your message, since they've already bought something from you.

    And no, you're not giving your customer info to Facebook. They'll use something called "hashing" to make sure you don't see Facebook user data (beyond email, phone number, address, or user ID), and Facebook can't see your customer data.

    The end result: social becomes far more workable and more valuable to marketers when it delivers on the promise that made it so exciting in the first place. That promise is the ability to move past casting wide nets to the masses and toward concentrating marketing dollars efficiently on the targets most likely to yield results.
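    For the technically curious: the post doesn't spell out Facebook's exact matching scheme, but the general "hashing" idea is straightforward - both sides hash a normalized match key (such as an email address) and compare only the hashes, so neither side ever sees the other's raw data. A generic C# illustration of the idea (SHA-256 and the normalization rules here are my assumptions, not Facebook's published spec):

        using System;
        using System.Security.Cryptography;
        using System.Text;

        static class MatchKey
        {
            // Hash a normalized email so two parties can compare customer
            // lists without ever exchanging the raw addresses.
            public static string HashEmail(string email)
            {
                string normalized = email.Trim().ToLowerInvariant();
                using (var sha256 = SHA256.Create())
                {
                    byte[] hash = sha256.ComputeHash(Encoding.UTF8.GetBytes(normalized));
                    var sb = new StringBuilder(hash.Length * 2);
                    foreach (byte b in hash)
                        sb.Append(b.ToString("x2"));
                    return sb.ToString();
                }
            }
        }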

    Read the article

  • Problem with Json Date format when calling cross-domain proxy

    - by Christo Fur
    I am using a proxy service to allow my client-side JavaScript to talk to a service on another domain. The proxy is a simple ashx file which simply gets the request and forwards it on to the service on the other domain:

        string requestData;
        using (var sr = new System.IO.StreamReader(context.Request.InputStream))
        {
            requestData = sr.ReadToEnd();
        }
        string data = HttpUtility.UrlDecode(requestData);
        using (var client = new WebClient())
        {
            client.BaseAddress = serviceUrl;
            client.Headers.Add("Content-Type", "application/json");
            response = client.UploadString(new Uri(webserviceUrl), data);
        }

    The client JavaScript calling this proxy looks like this:

        function TestMethod() {
            $.ajax({
                type: "POST",
                url: "/custommodules/configuratorproxyservice.ashx?m=TestMethod",
                contentType: "application/json; charset=utf-8",
                data: JSON.parse('{"testObj":{"Name":"jo","Ref":"jones","LastModified":"\/Date(-62135596800000+0000)\/"}}'),
                dataType: "json",
                success: AjaxSucceeded,
                error: AjaxFailed
            });

            function AjaxSucceeded(result) {
                alert(result);
            }

            function AjaxFailed(result) {
                alert(result.status + ' - ' + result.statusText);
            }
        }

    This works fine until I have to pass a date, at which point I get a Bad Request error when the proxy tries to call the service. I did have this working at one point but have now lost it. I have tried using JSON.parse on the object before sending, and JSON.stringify, but no joy. Has anyone got any ideas what I am missing?
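    An educated guess at the culprit (not confirmed in the original question): HttpUtility.UrlDecode treats '+' as an encoded space, so the "+0000" inside "\/Date(-62135596800000+0000)\/" gets mangled before the proxy forwards it. A second issue is that passing a parsed object as data makes jQuery form-encode it rather than send a JSON body. A minimal sketch of the fix under those assumptions - send the raw JSON string from the client:

        // Pass the JSON text itself (or JSON.stringify an object) so jQuery
        // sends a real JSON body. The double backslash keeps the WCF-style
        // "\/Date(...)\/" escape intact on the wire.
        $.ajax({
            type: "POST",
            url: "/custommodules/configuratorproxyservice.ashx?m=TestMethod",
            contentType: "application/json; charset=utf-8",
            data: '{"testObj":{"Name":"jo","Ref":"jones","LastModified":"\\/Date(-62135596800000+0000)\\/"}}',
            dataType: "json",
            success: AjaxSucceeded,
            error: AjaxFailed
        });

    ...and in the proxy, forward requestData exactly as read from the input stream, dropping the HttpUtility.UrlDecode call: the body is JSON, not URL-encoded form data, so decoding it only corrupts '+' characters.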

    Read the article

  • Why isn't my WPF Datagrid showing data?

    - by Edward Tanguay
    This walkthrough says you can create a WPF datagrid in one line but doesn't give a full example. So I created an example using a generic list and connected it to the WPF datagrid, but it doesn't show any data. What do I need to change in the code below to get it to show data in the datagrid?

    ANSWER: This code works now.

    XAML:

        <Window x:Class="TestDatagrid345.Window1"
                xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                xmlns:toolkit="http://schemas.microsoft.com/wpf/2008/toolkit"
                xmlns:local="clr-namespace:TestDatagrid345"
                Title="Window1" Height="300" Width="300"
                Loaded="Window_Loaded">
            <StackPanel>
                <toolkit:DataGrid ItemsSource="{Binding}"/>
            </StackPanel>
        </Window>

    Code behind:

        using System.Collections.Generic;
        using System.Windows;

        namespace TestDatagrid345
        {
            public partial class Window1 : Window
            {
                private List<Customer> _customers = new List<Customer>();
                public List<Customer> Customers { get { return _customers; } }

                public Window1()
                {
                    InitializeComponent();
                }

                private void Window_Loaded(object sender, RoutedEventArgs e)
                {
                    DataContext = Customers;
                    Customers.Add(new Customer { FirstName = "Tom", LastName = "Jones" });
                    Customers.Add(new Customer { FirstName = "Joe", LastName = "Thompson" });
                    Customers.Add(new Customer { FirstName = "Jill", LastName = "Smith" });
                }
            }
        }
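    For completeness, the example assumes a trivial Customer class along these lines (it isn't shown in the original answer, so this is a reconstruction from the properties used above):

        namespace TestDatagrid345
        {
            // Simple POCO the grid binds to; with AutoGenerateColumns (the
            // default), the DataGrid creates a column per public property.
            public class Customer
            {
                public string FirstName { get; set; }
                public string LastName { get; set; }
            }
        }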

    Read the article

  • ASP.NET Ajax - Asynch request has separate session???

    - by Marcus King
    We are writing a search application that saves the search criteria to session state and executes the search inside of an ASP.NET UpdatePanel. Sometimes when we execute multiple searches in succession, the 2nd or 3rd search will return results from the first set of search criteria. Example: in our first search we do a look-up on "John Smith" - John Smith results are displayed. In the second search we do a look-up on "Bob Jones" - John Smith results are displayed.

    We save all of the search criteria in session state, as I said, and read it from session state inside of the ajax request to format the DB query. When we put breakpoints in VS everything behaves as normal, but without them we get the original search criteria and results. My guess is that because the criteria are saved in session, the ajax request somehow gets its own session, saves the criteria to that, and then retrieves the criteria from that session every time, while the non-async requests are able to see when the criteria are modified and save the changes to state accordingly - and because they are from two different sessions there is a disparity between what is saved and what is read.

    EDIT: To elaborate, there was a suggestion of appending the search criteria to the query string, which is normally good practice and I agree that's how it should be done, but given our requirements I don't see it as viable. They want the user to fill out the input controls and hit search with no page reload; the only thing the user sees is a progress indicator on the page, and they still have the ability to navigate and use other features on the current page. If I were to add criteria to the query string I would have to make another request, causing the whole page to load, which depending on the search criteria can take a really long time. This is why we are using an ajax call to perform the search and why we aren't causing another full page request. I hope this clarifies the situation.
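    One quick way to test the "separate session" theory: log the session ID on every request and compare async and full postbacks. A diagnostic sketch for the page's code-behind (the logging target and placement are illustrative; ScriptManager, IsInAsyncPostBack, and Session.SessionID are standard ASP.NET members):

        protected void Page_Load(object sender, EventArgs e)
        {
            // If async postbacks log a different SessionID than full
            // postbacks, the requests really are landing in separate
            // sessions (e.g. a lost session cookie); if the ID is the
            // same, look instead at when the criteria are written vs. read.
            var scriptManager = ScriptManager.GetCurrent(this);
            bool isAsync = scriptManager != null && scriptManager.IsInAsyncPostBack;
            System.Diagnostics.Trace.WriteLine(
                string.Format("SessionID={0}, AsyncPostBack={1}",
                              Session.SessionID, isAsync));
        }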

    Read the article
