Search Results

Search found 2145 results on 86 pages for 'training'.


  • Release 17 is here!

    - by Cheryl
    Our training development team has been busy updating courses to keep pace with the new release of CRM On Demand. Release 17 is here! And I heard recently that it's one of our biggest releases ever. A lot of new features and functionality for you to take advantage of - too much for me to cover in this blog post. But I thought I'd tell you about a few of my favorites - be sure to take a look at the What's New in Release 17 recording to see the full list, because I'm only going to touch on a few.

    Create your own look - okay, I'm starting with the fun stuff. There is a new customizable themes feature so that you can change the look of the application: colors, logo, the shape of the tabs. And it's really easy. There's also a whole new library of ready-made themes to pick from if you just want to go with one of those. Use this new feature to match the look of your company logo and color scheme. Or blaze new trails. You can create the look for the whole company, or a different look for each CRM On Demand role. This might especially come in handy if you're using the Partner Relationship Management (PRM) capabilities of CRM On Demand - you can create themes for your partner-facing roles to provide branded partner portals.

    Speaking of PRM - there are enhancements in this release to help companies better manage their partner relationships. A new Deal Registration object, which is separate from the Opportunity record, and better Special Pricing Request and Marketing Development Fund Request processes give a lot more flexibility in how companies can build and manage their relationships with partners.

    Some new options for Forecasts in Release 17, too. You can now have more than one type of forecast generated each forecast period. For example, you might need to see a forecast of the total opportunity revenue for your sales team, as well as one that breaks down revenue by product. The forecast definition now lets you do that. Other options allow you to make submitting forecasts easier, split opportunity revenue across the team, and forecast that split appropriately. And - look for the new Forecast subject area in Answers, for building custom forecast reports.

    Ever wish you could use Workflow Rules to automatically reassign leads if they haven't been followed up on... or to email a manager if the status of a service request isn't changed after a specified period of time? Then check out the new Wait action for workflows. I think you'll be happy.

    Ok, enough for today. There is a lot to Release 17 that I didn't mention - a lot has been added for our Life Science industry edition, some new data visibility options, a new Data Loader tool, and more. Stay tuned for more blog posts about these and other Release 17 features in the coming weeks. In the meantime, don't forget about all of the resources we have for you to learn more (see my Learning About Release 17 blog post for details).

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do – don’t despair. All sessions are recorded and will be available soon. Keep an eye on the 24 Hours of PASS page for details. And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken into 12 hours over two days, so check out the schedule to see if there’s a session that interests you and fits your schedule. I’m pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!)

    Multidimensional Thinking: The Presentation

    My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract: “Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.” If you watched the presentation and want a copy of the slides, you can download a copy here. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did!

    Kimball All the Way

    There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources, which gets into much more detail about all kinds of scenarios (well beyond the basics!), is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn’t click with me initially. I’ve had years of practice since then, and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right!

    Schema Generation

    One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method you’re comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this. Yes, the subject was Integration Services, but as part of that presentation I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more Webcasts to finish the story begun with the data – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.
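    To make the star-schema idea concrete, here is a minimal sketch in T-SQL. The table and column names are illustrative only (they are not from the presentation or the webcasts), but the shape - surrogate keys on dimensions, foreign keys and numeric measures on the fact table - is the Kimball pattern discussed above:

        -- Dimensions carry descriptive attributes, keyed by surrogate keys.
        CREATE TABLE DimDate (
            DateKey      int          NOT NULL PRIMARY KEY,  -- e.g. 20101115
            FullDate     date         NOT NULL,
            CalendarYear smallint     NOT NULL,
            MonthName    varchar(10)  NOT NULL
        );

        CREATE TABLE DimProduct (
            ProductKey  int           IDENTITY(1,1) NOT NULL PRIMARY KEY,
            ProductName nvarchar(50)  NOT NULL,
            Category    nvarchar(30)  NOT NULL
        );

        -- The fact table records measures at a declared grain:
        -- here, one row per product per day.
        CREATE TABLE FactSales (
            DateKey     int   NOT NULL REFERENCES DimDate (DateKey),
            ProductKey  int   NOT NULL REFERENCES DimProduct (ProductKey),
            SalesAmount money NOT NULL,
            SalesQty    int   NOT NULL
        );

    Business questions then become simple joins from the fact table out to the dimensions that slice it (by month, by category, and so on).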

    Read the article

  • Evaluating Oracle Data Mining Has Never Been Easier - Evaluation "Kit" Available

    - by chberger
    Now you can quickly and easily get set up to start using Oracle Data Mining for evaluation purposes. Just go to the Oracle Technology Network (OTN) and follow these simple steps.

    Oracle Data Mining Evaluation "Kit" Instructions

    Step 1: Download and install Oracle Database 11g Release 2. Anyone can download and install the Oracle Database for free for evaluation purposes; see the OTN web site for details. The 11.2.0.1.0 database is the minimum, 11.2.0.2 is better, and naturally 11.2.0.3 is best if you are a current customer and on active support. Either 32-bit or 64-bit is fine. 4 GB of RAM or more works fine for SQL Developer and the Oracle Data Miner GUI extension. Downloading the database and installing it should take just about an hour or so, depending on your network and computer. For more instructions on setting up Oracle Data Mining see: http://www.oracle.com/technetwork/database/options/odm/dataminerworkflow-168677.html. When you install the Oracle Database, the sample Examples data should also be installed, e.g. Release 2 Examples win32_11gR2_examples.zip (565,154,740 bytes), which contains examples of how to use the Oracle Database. Download it if you are new to Oracle and want to try some of the examples presented in the documentation.

    Step 2: Install SQL Developer 3.1 (the Oracle Data Mining extension installs automatically).

    Step 3: Follow the four free step-by-step Oracle-by-Example e-training lessons:

    Setting Up Oracle Data Miner 11g Release 2 - covers the process of setting up Oracle Data Miner 11g Release 2 for use within Oracle SQL Developer 3.0.
    Using Oracle Data Miner 11g Release 2 - covers the use of Oracle Data Miner to perform data mining against Oracle Database 11g Release 2. In this lesson, you examine and solve a data mining business problem by using the Oracle Data Miner graphical user interface (GUI).
    Star Schema Mining Using Oracle Data Miner - covers the use of Oracle Data Miner to perform star schema mining against Oracle Database 11g Release 2.
    Text Mining Using Oracle Data Miner - covers the use of Oracle Data Miner to perform text mining against Oracle Database 11g Release 2.

    That’s it! Easy, fun and the fastest way to get started evaluating Oracle Data Mining. Enjoy! Charlie
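    One extra sanity check you may find handy after Step 1 (my own suggestion - it is not part of the official instructions): confirm from SQL Developer or SQL*Plus that the Data Mining option is present in your database before starting the tutorials. On an 11.2 Enterprise Edition database the query below should return TRUE; if it does not, revisit the installation:

        -- Check that the Data Mining option is enabled in this database.
        SELECT parameter, value
        FROM   v$option
        WHERE  parameter = 'Data Mining';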

    Read the article

  • ORA-4031 Troubleshooting

    - by [email protected]
    QUICKLINK: Note 396940.1 Troubleshooting and Diagnosing ORA-4031 Error
    Note 1087773.1: ORA-4031 Diagnostics Tools [Video]

    Have you observed an ORA-04031 error reported in your alert log? An ORA-4031 error is raised when memory is unavailable for use or reuse in the System Global Area (SGA). The error message will indicate the memory pool getting errors and high-level information about what kind of allocation failed and how much memory was unavailable. The challenge with ORA-4031 analysis is that the error and associated trace are for a "victim" of the problem. The failing code ran into the memory limitation, but in almost all cases it was not part of the root problem.

    Looking for the best way to diagnose? When an ORA-4031 error occurs, a trace file is generated and noted in the alert log if the process experiencing the error is a background process. User processes may experience errors without reports in the alert log or traces generated. The V$SHARED_POOL_RESERVED view will show reports of misses for memory over the life of the database. Diagnostic scripts are available in Note 430473.1 to help in analysis of the problem. There is also a training video on using and interpreting the script data in Note 1087773.1.

    11g Diagnosability

    Starting with Oracle Database 11g Release 1, the Diagnosability infrastructure was introduced, which places traces and core files into a location controlled by the DIAGNOSTIC_DEST initialization parameter when an incident, such as an ORA-4031, occurs. For earlier versions, the trace file will be written to either USER_DUMP_DEST (if the error was caught in a user process) or BACKGROUND_DUMP_DEST (if the error was caught in a background process like PMON or SMON). The trace file contains vital information about what led to the error condition. See Note 443529.1: 11g Quick Steps to Package and Send Critical Error Diagnostic Information to Support [Video].

    Oracle Configuration Manager (OCM)

    Oracle Configuration Manager (OCM) works with My Oracle Support to enable a proactive support capability that helps you organize, collect and manage your Oracle configurations. See the Oracle Configuration Manager Quick Start Guide, Note 548815.1: My Oracle Support Configuration Management FAQ, and Note 250434.1: BULLETIN: Learn More About My Oracle Support Configuration Manager.

    Common Causes/Solutions

    The ORA-4031 can occur for many different reasons. Some possible causes are: SGA components too small for the workload; auto-tuning issues; fragmentation due to application design; and bugs/leaks in memory allocations. For more on the ORA-4031 and how it affects the SGA, see Note 396940.1 Troubleshooting and Diagnosing ORA-4031 Error.

    Because of the multiple potential causes, it is important to gather enough diagnostics so that an appropriate solution can be identified. However, most commonly the cause is associated with configuration tuning. Ensuring that MEMORY_TARGET or SGA_TARGET is large enough to accommodate the workload can get around many scenarios. The default trace associated with the error provides very high-level information about the memory problem and the "victim" that ran into the issue. The data in the default trace is not going to point to the root cause of the problem.

    When migrating from 9i to 10g and higher, it is necessary to increase the size of the Shared Pool due to changes in the basic design of the shared memory area. See Note 270935.1 Shared pool sizing in 10g.

    NOTE: Diagnostics on the errors should be investigated as close to the time of the error(s) as possible. If you must restart a database, it is not feasible to diagnose the problem until the database has matured and/or started seeing the problems again.

    Note 801787.1 Common Cause for ORA-4031 in 10gR2, Excess "KGH: NO ACCESS" Memory Allocation

    ***For reference to the content in this blog, refer to Note 1088239.1 Master Note for Diagnosing ORA-4031
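    As a first-pass check of the kind the notes above walk through in detail, the V$SHARED_POOL_RESERVED view mentioned earlier can be queried directly (a quick sketch; the diagnostic scripts in Note 430473.1 are far more thorough):

        -- Allocation failures against the shared pool reserved area
        -- over the life of the instance. A growing REQUEST_FAILURES
        -- count indicates ORA-4031-style memory pressure.
        SELECT free_space, avg_free_size, request_failures, last_failure_size
        FROM   v$shared_pool_reserved;

        -- Free memory per SGA pool; persistently low shared pool free
        -- memory often points to the sizing causes described above.
        SELECT pool, name, bytes
        FROM   v$sgastat
        WHERE  name = 'free memory';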

    Read the article

  • Retail CEO Interviews

    - by David Dorf
    Businessweek's 2012 Interview Issue has interviews with three retail CEOs that are worth a quick read. I copied some excerpts below, but please follow the links to the entire interviews.

    Ron Johnson, CEO, JCPenney

    Take me through your merchandising. One of the things I learned from Steve [Jobs]—Steve said three times in his life he had the chance to be part of the change of an interface. If you change the interface, you can dramatically change the entire experience of the product. For Steve, that was the mouse, the scroll wheel on the iPod, and then the [touch]screen. What we’re trying to do here is change the interface of retail. What we call that is the street, and you’re standing in the middle of it. When you walk into a store today, you’re overwhelmed by merchandise. There is a narrow aisle. Typically, it’s filled with product on tables and you’re overwhelmed with the noise of signs and promotions. Especially in the age of the Internet, the idea of going to a very large store and having so much abundance is actually not very appealing. The first thing you find here is you’re inspired. I have used the mannequins. The street is actually this new navigation path for a retail store. So if you come in here—you’ll notice that these aisles are 14 feet wide. These are wider than Nordstrom’s (JWN). Slide show of JCPenney store.

    Walter Robb, co-CEO, Whole Foods

    What did you learn from the recent recession about selling groceries? It was a lot of humble pie, because our sales experienced a drop that I have never seen in 32 years of retail. Customers left us in droves. We also learned that there were some very loyal customers who loved Whole Foods (WFM), people who said, “I like what you stand for. I like coming here. I like this experience.” That was very affirming. I think the realization was that we’ve got some customers, and we need to make sure we know who they are. So instead of chasing every customer out there, we started doing customer discussion groups. We were growing for growth’s sake, which is not a good strategy. We were chasing the rainbow. We cut the growth in half overnight and said, “All right, slow down. Let’s make sure we’re doing this better and more thoroughly and more thoughtfully.” This company is a mission-based company. This company started to change the world by bringing healthier food to the world. It’s not about the money, it’s about the impact, and this company is back on track as a result of those experiences. Video of Whole Foods store tour.

    Kay Krill, CEO, Ann Taylor

    You’ve worked in retail all your life. What drew you to it? I graduated from college, and I did not know what I wanted to do. Macy’s (M) came to campus to interview for their training program, and I thought, “Let me give it a try.” I got the job and fell in love with the industry. The president of Macy’s at the time said, “If you don’t wake up every morning dying to go to work, then retailing is not for you; it has to be in your blood.” It was in my blood. I love the fact that every day is different. You can get to be creative one day, financial the next day, marketing the next. I love going to stores. I love talking to associates. I love talking to clients. There’s not a predictable day.

    Read the article

  • Oracle celebrates a successful Oracle CloudWorld in Bogotá

    - by yaldahhakim
    Written by: Diana Tamayo Tovar

    Oracle CloudWorld Bogotá began with scattered showers, rain and strong winds, inviting Colombians to spend a whole day in the social, mobile and complete world of Oracle Cloud. The event took place on November 6th with 807 attendees, 15 media representatives and 65 partners, who gathered to share the business value of Cloud along with Oracle executives and Colombian market leaders. Line-of-business leaders in sales and marketing, customer service and support, HR and talent management, and finance and operations shared their ideas with Colombian customers, giving them a chance to learn, discover and engage with the tools, trends and concepts of Cloud.

    The highlights of the event included keynote speakers such as Bob Evans, Chief Communications Officer, and a customer testimonial session with top business leaders from the Colombian insurance, finance, retail, communications and health industries, who shared their innovation experiences and success stories on workforce empowerment, talent management, cloud security, social engagement and productivity, providing best-case scenarios of how Oracle has helped them transform their business with technologies like cloud, social collaboration and mobile applications. The keynote session was preceded by a customer success story from one of the largest virtual network operators in the country, providing an interesting case study of mobile banking innovation and a great customer testimonial of the importance of cross-industry strategies and cloud technology.

    The event provided five different tracks on the main trends of how companies communicate and engage with different audiences, providing a different perspective on the importance of empowering brands through their customers and of trusting and investing in technology for growth, while Oracle University shared their knowledge with “Oracle Cloud Fundamentals”, a training lesson covering Java Cloud, Database Cloud and other Oracle Cloud product technologies and solutions. The rainy-day scenario included side shows of aerial acrobatics and speed-painting performances to recreate the environment of modern and flexible Cloud solutions in a colorful and creative way.

    Oracle CloudWorld Bogotá was a great opportunity to dispel cloud myths and address the main concerns of Colombian customers towards cloud, considering IDC Latin America studies stating that 93% of Colombian business leaders are interested in cloud but only 47% understand its business value. Spending a day in the cloud with 6 demoground stations, conference sessions, interesting case studies and customer testimonials will surely widen the endless market opportunities for Colombian customers, leaving them amazed with how Oracle Cloud works towards integration with other environments, non-Oracle applications, social media and mobile devices with bulletproof security infrastructure.

    Read the article

  • Windows Azure Emulators On Your Desktop

    - by BuckWoody
    Many people feel they have to set up a full Azure subscription online to try out and develop on Windows Azure. But you don’t have to do that right away. In fact, you can download the Windows Azure Compute Emulator – a “cloud development environment” – right on your desktop. No, it’s not for production use, and no, you won’t have other people using your system as a cloud provider, and yes, there are some differences with production Windows Azure, but you’ll be able to code, run, test, diagnose, watch, change and configure code without having any connection to the Internet at all. The best thing about this approach is that when you are ready to deploy the code you’ve been testing, a few clicks deploy it to your subscription once you create one.

    So what deep magic does it take to run such a thing right on your laptop or even a Virtual PC? Well, it’s actually not all that difficult. You simply download and install the Windows Azure SDK (you can even get a free version of Visual Studio for it to run on – you’re welcome) from here: http://msdn.microsoft.com/en-us/windowsazure/cc974146.aspx

    This SDK will also install the Windows Azure Compute Emulator and the Windows Azure Storage Emulator – and then you’re all set. Right-click the icon for Visual Studio and select “Run as Administrator”. Now open a new “Cloud” type of project, and add the Web and Worker Roles that you want to code. When you’re done with your design, press F5 to start the desktop version of Azure.

    Want to learn more about what’s happening underneath? Right-click the tray icon with the Azure logo, and select the two emulators to see what they are doing. In the configuration files, you’ll see a “Use Development Storage” setting. You can call the BLOB, Table or Queue storage and it will all run on your desktop. When you’re ready to deploy everything to Windows Azure, you simply change the configuration settings and add the storage keys and so on that you need.

    Want to learn more about all this?

    Overview of the Windows Azure Compute Emulator: http://msdn.microsoft.com/en-us/library/gg432968.aspx
    Overview of the Windows Azure Storage Emulator: http://msdn.microsoft.com/en-us/library/gg432983.aspx
    January 2011 Training Kit: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=413E88F8-5966-4A83-B309-53B7B77EDF78&displaylang=en

    Read the article

  • Apps UX Unveils New Face of Fusion at OpenWorld 2012

    - by Kathy.Miedema
    By Kathy Miedema, Oracle Applications User Experience

    The Oracle Applications User Experience (UX) team is getting ready to unveil the new face of Oracle Fusion Applications at Oracle OpenWorld 2012 in San Francisco next week.

    Photos by Martin Taylor, Oracle Applications User Experience. Jeremy Ashley, Vice President of Oracle Applications User Experience, shows the new face of Fusion Applications to a group of trainers at Oracle’s headquarters in Redwood Shores, Calif.

    Our team spent the past 6 months working on this project, which embraces simplicity with a modern, productive user experience that aims to help our applications customers rapidly scale deployment of essential self-service tasks and speed adoption by users who need quick access to do quick-entry tasks. We spent the week before OpenWorld at Oracle headquarters in Redwood Shores, conducting training sessions with Fusion UX Advocates (FXA), Oracle UX Sales Ambassadors (SAMBA), and members of the Oracle Usability Advisory Board (OUAB). We showed the new face of Fusion to customers, partners, ACE Directors, and people from our own sales organization. Next week during OpenWorld, they will be showing demos alongside our team members. To find them, look for the Usable Apps t-shirt.

    You can also get a look at the new face of Fusion during OpenWorld at the following sessions and demopods:

    GEN9433 - General Session: Oracle Fusion Applications—Overview, Strategy, and Roadmap. Presenter: Chris Leone, Senior Vice President, Oracle. Monday, Oct. 1, 10:45 a.m. – 11:45 a.m. in Moscone West 2002/2004, and Wednesday, Oct. 3, 10:15 a.m. – 11:15 a.m. in Moscone West 2002/2004.

    CON9407 - Oracle Fusion Customer Relationship Management: Overview/Strategy/Customer Experiences/Roadmap. Presenter: Anthony Lye, Senior Vice President, Oracle. Monday, Oct. 1, 3:15 – 4:15 p.m. in Moscone West 2008.

    CON9438 - Oracle Fusion Applications: Transforming Insight into Action. Presenters: Jeremy Ashley, Vice President Applications User Experience, Oracle; Katie Candland, Director Applications User Experience, Oracle; Basheer Khan, founder and CEO of Innowave Technology, an Oracle ACE Director for both Fusion Middleware and Applications, and a Fusion UX Advocate. Tuesday, Oct. 2, 10:15 a.m. - 11:15 a.m. in Moscone West 2007.

    CON9467 - Oracle’s Roadmap to a Simple, Modern User Experience. Presenter: Jeremy Ashley, Vice President Applications User Experience, Oracle. Wednesday, Oct. 3, 3:30 p.m. - 4:30 p.m. in Moscone West 3002/3004.

    On the demogrounds: Come to the Apps UX pods for a look at enterprise applications on mobile devices such as smart phones and the iPad, and stay for a demo of the new face of Oracle Fusion Applications. Our demopods will also feature some of the cutting-edge tools in Oracle’s arsenal of usability evaluation methods. The Exhibition Hall at Oracle OpenWorld 2012 will be open Monday through Wednesday, Oct. 1-3. The demogrounds for Oracle Applications are located on the lower level of Moscone West in San Francisco. Hours for the Exhibition Hall are: Monday, 10 a.m. to 6 p.m.; Tuesday, 9:45 a.m. to 6 p.m.; Wednesday, 9:45 a.m. to 4 p.m.

    Read the article

  • T-SQL in Chicago – the LobsterPot teams with DataEducation

    - by Rob Farley
    In May, I’ll be in the US. I have board meetings for PASS at the SQLRally event in Dallas, and then I’m going to be spending a bit of time in Chicago. The big news is that while I’m in Chicago (May 14-16), I’m going to teach my “Advanced T-SQL Querying and Reporting: Building Effectiveness” course. This is a course that I’ve been teaching since the 2005 days, and have modified over time for 2008 and 2012. It’s very much my most popular course, and I love teaching it. Let me tell you why.

    For years, I wrote queries and thought I was good at it. I was a developer. I’d written a lot of C (and other, more fun languages like Prolog and Lisp) at university, and then got into the ‘real world’ and coded in VB, PL/SQL, and so on through to C#, and saw SQL (whichever database system it was) as just a way of getting the data back. I could write a query to return just about whatever data I wanted, and that was good. I was better at it than the people around me, and that helped. (It didn’t help my progression into management, then it just became a frustration, but for the most part, it was good to know that I was good at this particular thing.)

    But then I discovered the other side of querying – the execution plan. I started to learn about the translation from what I’d written into the plan, and this impacted my query-writing significantly. I look back at the queries I wrote before I understood this, and shudder. I wrote queries that were correct, but often a long way from effective. I’d done query tuning, but had largely done it without considering the plan, just inferring what indexes would help.

    This is not a performance-tuning course. It’s focused on the T-SQL that you read and write. But performance is a significant and recurring theme. Effective T-SQL has to be about performance – it’s the biggest way that a query becomes effective. There are other aspects too though – such as using constructs better. For example – I can write code that modifies data nicely, but if I haven’t learned about the MERGE statement and the way that it can impact things, I’m missing a few tricks (see the small sketch at the end of this post).

    If you’re going to do this course, a good place to be is the situation I was in a few years before I wrote this course. You’re probably comfortable with writing T-SQL queries. You know how to make a SELECT statement do what you need it to, but feel there has to be a better way. You can write JOINs easily, and understand how to use LEFT JOIN to make sure you don’t filter out rows from the first table, but you’re coding blind.

    The first module I cover is on Query Execution. Take a look at the Course Outline at Data Education’s website. The first part of the first module is on the components of a SELECT statement (where I make you think harder about GROUP BY than you probably have before), but then we jump straight into Execution Plans. Some stuff on indexes is in there too, as is simplification and SARGability. Some of this is stuff that you may have heard me present on at conferences, but here you have me for three days straight. I’m sure you can imagine that we revisit these topics throughout the rest of the course as well, and you’d be right.

    In the second and third modules we look at a bunch of other aspects, including some of the T-SQL constructs that lots of people don’t know, and various other things that can help your T-SQL be, well, more effective. I’ve had quite a lot of people do this course and be itching to get back to work even on the first day. That’s not a comment about the jokes I tell, but because people want to look at the queries they run.

    LobsterPot Solutions is thrilled to be partnering with Data Education to bring this training to Chicago. Visit their website to register for the course. @rob_farley
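    For readers who haven’t met MERGE yet, here is a small illustrative sketch (my own example with made-up table names, not taken from the course materials): a single statement that updates the rows that already exist in the target and inserts the ones that don’t, instead of separate UPDATE and INSERT passes.

        -- Upsert customers from a staging table in one statement.
        MERGE dbo.Customer AS tgt
        USING dbo.CustomerStaging AS src
            ON tgt.CustomerID = src.CustomerID
        WHEN MATCHED THEN
            UPDATE SET tgt.Name  = src.Name,
                       tgt.Phone = src.Phone
        WHEN NOT MATCHED BY TARGET THEN
            INSERT (CustomerID, Name, Phone)
            VALUES (src.CustomerID, src.Name, src.Phone);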

    Read the article

  • Securing Flexfield Value Sets in EBS 12.2

    - by Sara Woodhull
    Release 12.2 includes a new feature: flexfield value set security. This new feature gives you additional options for ensuring that different administrators have non-overlapping responsibilities, which in turn provides checks and balances for sensitive activities. Separation of Duties (SoD) is one of the key concepts of internal controls and is a requirement for many regulations, including:

    Sarbanes-Oxley (SOX) Act
    Health Insurance Portability and Accountability Act (HIPAA)
    European Union Data Protection Directive

    Its primary intent is to put barriers in place to prevent fraud or theft by an individual acting alone. Implementing Separation of Duties requires minimizing the possibility that users could modify data across application functions where the users should not normally have access. For flexfields and report parameters in Oracle E-Business Suite, values in value sets can affect functionality such as the rollup of accounting data, job grades used at a company, and so on. Controlling access to the creation or modification of value set values can be an important piece of implementing Separation of Duties in an organization.

    New Flexfield Value Set Security feature

    Flexfield value set security allows system administrators to restrict users from viewing, adding or updating values in specific value sets. Value set security enables role-based separation of duties for key flexfields, descriptive flexfields, and report parameters. For example, you can set up value set security such that certain users can view or insert values for any value set used by the Accounting Flexfield but no other value sets, while other users can view and update values for value sets used for any flexfields in Oracle HRMS. You can also segregate access by Operating Unit as well as by role or responsibility. Value set security uses a combination of data security and role-based access control in Oracle User Management.

    Flexfield value set security provides a level of security that is different from the previously-existing and similarly-named features in Oracle E-Business Suite:

    Function security controls whether a user has access to a specific page or form, as well as what operations the user can do in that screen.
    Flexfield value security controls what values a user can enter into a flexfield segment or report parameter (by responsibility) during routine data entry in many transaction screens across Oracle E-Business Suite.
    Flexfield value set security (this feature, new in Release 12.2) controls who can view, insert, or update values for a particular value set (by flexfield, report, or value set) in the Segment Values form (FNDFFMSV).

    The effect of flexfield value set security is that a user of the Segment Values form will only be able to view those value sets for which the user has been granted access. Further, the user will be able to insert or update/disable values in that value set if the user has been granted privileges to do so. Flexfield value set security affects independent, dependent, and certain table-validated value sets for flexfields and report parameters.

    Initial State of the Feature upon Upgrade

    Because this is a new security feature, it is turned on by default. When you initially install or upgrade to Release 12.2.2, no users are allowed to view, insert or update any value set values (users may even think that their values are missing or invalid because they cannot see the values). You must explicitly set up access for specific users by enabling appropriate grants and roles for those users. We recommend using flexfield value set security as part of a comprehensive Separation of Duties strategy. However, if you choose not to implement flexfield value set security upon upgrading to or installing Release 12.2, you can enable backwards compatibility after you upgrade, so that users can access any value sets if they have access to the Values form.

    The feature does not affect day-to-day transactions that use flexfields. However, you must either set up specific grants and roles or enable backwards compatibility before users can create new values or update or disable existing values.

    For more information, see:

    Release 12.2 Flexfield Value Set Security Documentation Update for Patch 17305947:R12.FND.C (Document 1589204.1)
    R12.2 TOI: Implement and Use Application Object Library (AOL) - Flexfields Security and Separation of Duties for Value Sets (recorded training)

    Read the article

  • Moving the Oracle User Experience Forward with the New Release 7 Simplified UI for Oracle Sales Cloud

    - by mvaughan
    By Kathy Miedema, Oracle Applications User Experience

    In September 2013, Release 7 for Oracle Cloud Applications became generally available for Oracle Sales Cloud and HCM Cloud. This significant release allowed the Oracle Applications User Experience (UX) team to finally talk freely about Simplified UI, a user experience project in the works since Oracle OpenWorld 2012. Simplified UI represents the direction that the Oracle user experience – for all of its enterprise applications – is heading. Oracle’s Apps UX team began by building a Simplified UI for sales representatives. You can find that today in Release 7, and it was demoed extensively during OpenWorld 2013 in San Francisco.

    Screenshot: Opportunities in the new Simplified UI for Oracle Sales Cloud, a user interface built for sales reps.

    Analyst Rebecca Wettemann, vice president of Nucleus Research, saw Simplified UI at Oracle OpenWorld 2013 and talked about it with CRM Buyer in “Oracle Revs Its Cloud Engines for a Better Customer Experience.” Wettemann said there are distinct themes to the latest release: "One is usability. Oracle Sales Cloud, for example, is designed to have zero training for onboarding sales reps, which it does," she explained. "It is quite impressive, actually -- the intuitive nature of the application and the design work they have done with this goal in mind." The software uses as few buttons and fields as possible, she pointed out. "The sales rep doesn't have to ask, 'what is the next step?' because she can see what it is." In fact, there are three themes driving the usability that Wettemann noted. They are simplicity, mobility, and extensibility, and we write more about them on the Usable Apps web site. These three themes embody the strategy for Oracle’s cloud applications user experiences.

    Simplified UI for Oracle Sales Cloud

    In developing a Simplified UI for Oracle Sales Cloud, Oracle’s UX team concentrated on the tasks that sales reps need to do most frequently and that are most important. “Knowing that the majority of their work lives are spent on the road and on the go, they need to be able to quickly get in and qualify and convert their leads, monitor and progress their opportunities, update their customer and contact information, and manage their schedule,” Jeremy Ashley, Vice President of the Applications UX team, said. Ashley said the Apps UX team has a good reason for creating a Simplified UI that focuses on self-service. “Sales people spend the day selling stuff,” he said. “The only reason they use software is because the company wants to track what they’re doing.” Traditional systems of tracking that information include filling in a spreadsheet of leads or sales. Oracle wants to automate this process for the salesperson, and enable that person to keep everyone who needs to know up-to-date easily and quickly. Simplified UI addresses that problem by providing light-touch input. “It has to be useful to the salesperson,” Ashley said about the Sales Cloud user experience. Simplified UI can tell sales reps about key opportunities, or provide information about a contact in just a click or two. Customer information is accessible quickly and easily with Simplified UI for the Oracle Sales Cloud.

    Simplified UI for Sales Cloud can also be extended easily, Ashley said. Users usually just need to add various business fields or create and modify analytical reports. The way that Simplified UI is constructed allows extensibility to happen by hiding or showing a few necessary fields. The Settings user interface, starting in Release 7, allows for the simple configuration of the most important visual elements. “With Sales Cloud, we identified a need to make the application useful and very simple,” Ashley said. Simplified UI meets that need.

    Where can you find out more?

    To find out more about Simplified UI and Oracle’s ongoing investment in applications user experience innovations, come to one of our sessions at a user group conference near you. Stay tuned to the Voice of User Experience (VoX) blog – the next post will be about Simplified UI and HCM Cloud.

    Read the article

  • Internships at Oracle &ndash; a truly multicultural experience!

    - by cristian.condurache(at)oracle.com
    Hello everybody!!! Our names are Lena and Laura, we both study in the same Grande Ecole in France, IPAG, and we are about to complete our 16-week internship at Oracle in the UK. Below is a summary of our experience!

    My name is Lena. I am 20 years old and joined Oracle UK in September 2010 – more specifically, I joined the EMEA Graduate Recruitment Team (EMEA stands for Europe, Middle East and Africa), and I have learned a lot about working life. It was a really good experience, which made me realize that I will soon be looking for a full-time position in a company, in less than 3 years. I am glad to have had this first experience in Oracle. First of all because it's a very welcoming company which treats interns as employees and gives them the opportunity to show their potential. I also discovered that it is nice to work in a company where everybody knows everybody, and where the atmosphere is really good. The multicultural aspect is one of the most important and beautiful elements of Oracle. It gives you the opportunity to have contacts in many parts of the world and discover a lot of nice people. During my internship I learned a lot about Recruitment. I discovered I want to work in a Human Resources role after I graduate. I like the contact I will have with candidates and the fact that I have to be in touch with managers and understand their needs. I would be glad to work for the company in the near future. I would like to thank all my team members for welcoming me like they did. It was a real pleasure to share this experience in Oracle and in this team, and I hope to return after I graduate.

    Hi all! I am Laura. My wish for this internship was to focus on training of personal skills for employees and, at the same time of course, on the company’s development... and I did it in the OTD team (EMEA Organization Talent Development Team). I could not have done something better than this! It was truly instructive. I learnt how to work in such a big international company, the values and the rules to follow, and how to interact and be part of the organisation. In Oracle, there are so many different aspects of every department, so many possibilities in HR as well as in Finance or Sales... The jobs are very varied and the employees’ cultures are also really different thanks to this international and multicultural company. I am working with OTD for the entire EMEA region, having many of my colleagues in other countries, with other cultures, other ways to work, and other ways to think... this is so inspiring! Oracle offers the best environment to learn about a job, as well as to learn about work life in such large companies. This company is about new technologies; it always goes fast, and everything changes quickly! You have to be aware of these changes and keep track of the wishes of customers. For OTD of course, these customers are the employees. Looking back, I have learnt more than I would have ever thought, and I know that it is what I want to do... And now I hope to come back again! I want to thank all my team for welcoming me and integrating me with such happiness. I will truly miss them!!

    If you have any questions related to this article feel free to contact [email protected]. You can find our job opportunities via http://campus.oracle.com.

    Read the article

  • Duke's Choice Award Ceremony

    - by Tori Wieldt
    The 2012 Duke's Choice Awards winners and their creative, Java-based technologies and Java community contributions were honored after the Sunday night JavaOne keynotes. Sharat Chander, Group Director for Java Technology Outreach, presented the awards. "Having the community participate directly in both submission and selection truly shows how we are driving exposure of the innovation happening in the Java community," he said.

    Apache Software Foundation Hadoop Project - The Apache Software Foundation’s Hadoop project, written in Java, provides a framework for distributed processing of big data sets across clusters of computers, ranging from a few servers to thousands of machines. This harnessing of large data pools allows organizations to better understand and improve their business.

    AgroSense Project - Improving farming methods to feed a hungry world is the goal of AgroSense, an open source farm information management system built in Java and the NetBeans platform. AgroSense enables farmers, agribusinesses, suppliers and others to develop modular applications that will easily exchange information through a common underlying NetBeans framework.

    JDuchess - Rather than focus on a specific geographic area like most Java User Groups (JUGs), JDuchess fosters the participation of women in the Java community worldwide. The group has more than 500 members in 60 countries, and provides a platform through which women can connect with each other and get involved in all aspects of the Java community.

    Jelastic, Inc. - Moving existing Java applications to the cloud can be a daunting task, but startup Jelastic, Inc. offers the first all-Java platform-as-a-service (PaaS) that enables existing Java applications to be deployed in the cloud without code changes or lock-in.

    Liquid Robotics - Liquid Robotics is an ocean data services provider whose Wave Glider technology collects information from the world’s oceans for application in government, science and commercial applications. The organization features the “father of Java” James Gosling as its chief software architect.

    London Java Community - The second user group receiving a Duke’s Choice Award this year, the London Java Community (LJC) and its users have been active in the OpenJDK, the Java Community Process (JCP) and other efforts within the global Java community.

    NATO - The first-ever Community Choice Award goes to the MASE Integrated Console Environment (MICE) in use at NATO. Built in Java on the NetBeans platform, MICE provides a high-performance visualization environment for conducting air defense and battle-space operations.

    Parleys.com - E-learning specialist Parleys.com, based in Brussels, Belgium, uses Java technologies to bring online classes and full IT conferences to desktops, laptops, tablets and mobile devices. Parleys.com has hosted more than 1,700 conferences—including Devoxx and JavaOne—for more than 800,000 unique visitors.

    Student: Nokia Developer Group - This year’s student winner, Ram Kashyap, is the founder and president of the Nokia Student Network, and was profiled in the “The New Java Developers” feature in the March/April 2012 issue of Java Magazine. Since then, Ram has maintained a hectic pace, graduating from the People’s Education Society Institute of Technology in Bangalore, India, while working on a Java mobile startup and training students on Java ME.

    United Nations High Commissioner for Refugees - The United Nations High Commissioner for Refugees (UNHCR) is on the front lines of crises around the world, from civil wars to natural disasters. To help facilitate its mission of humanitarian relief, the UNHCR has developed a light-client Java application on the NetBeans platform. The Level One registration tool enables the UNHCR to collect information on the number of refugees and their water, food, housing, health, and other needs in the field, and combines that with geocoding information from various sources. This enables the UNHCR to deliver the appropriate kind and amount of assistance where it is needed.

    You can read more about the winners in the current issue of Java Magazine.

    Read the article

  • IPgallery banks on Solaris SPARC

    - by Frederic Pariente
    IPgallery is a global supplier of converged legacy and Next Generation Networks (NGN) products and solutions, including core network components and cloud-based Value Added Services (VAS) for voice, video and data sessions. IPgallery enables network operators and service providers to offer advanced converged voice, chat, video/content services and rich unified social communications in combined legacy (fixed/mobile), Over-the-Top (OTT) and Social Community (SC) environments for home and business customers. Technically speaking, this offer is a scalable and robust telco solution enabling operators to offer new services while controlling operating expenses (OPEX). In its solutions, IPgallery leverages the following Oracle components: Oracle Solaris, Netra T4 and SPARC T4, in order to provide a competitive and scalable solution without the price tag often associated with high-end systems.

    Oracle Solaris Binary Application Guarantee

    A unique feature of Oracle Solaris is the guaranteed binary compatibility between releases of the Solaris OS. That means that if a binary application runs on Solaris 2.6 or later, it will run on the latest release of Oracle Solaris. IPgallery developed their application on Solaris 9 and Solaris 10, then runs it on Solaris 11, without any code modification or rebuild. The Solaris Binary Application Guarantee helps IPgallery protect their long-term investment in the development, training and maintenance of their applications.

    Oracle Solaris Image Packaging System (IPS)

    IPS is a new repository-based package management system that comes with Oracle Solaris 11. It provides a framework for complete software life-cycle management such as installation, upgrade and removal of software packages. IPgallery leverages this new packaging system in order to speed up and simplify software installation for the R&D and production environments. Notably, they use IPS to deliver Solaris Studio 12.3 packages as part of the rapid installation process of R&D environments, and during the production software deployment phase they ensure software package integrity using the built-in verification feature. Solaris IPS thus improves IPgallery's time-to-market with faster, more reliable software installation and deployment in production environments.

    Extreme Network Performance

    IPgallery saw a huge improvement in application performance, both in CPU and I/O, when running on the SPARC T4 architecture compared to UltraSPARC T2 servers. The same application (with the same activation environment) running on T2 consumes 40%-50% of the CPU, while it consumes only 10% of the CPU on T4. The testing environment comprised Softswitch (call management), TappS (Telecom Application Server) and Billing Server running on the same machine and initiating various services at a capacity of 1000 CAPS (Call Attempts Per Second). In addition, tests showed a huge improvement in the performance of the TCP/IP stack, which reduces network-layer processing and, in the end, call-attempt latency. Finally, there is a huge improvement within the file system and disk I/O operations; they ran all tests with maximum logging capability and it didn't influence any benchmark values.

    "Due to the huge improvements in performance and capacity using the T4-1 architecture, IPgallery has engineered the solution with less hardware. This means instead of deploying the solution on six T2-based machines, we will deploy on 2 redundant machines while utilizing Oracle Solaris Zones and Oracle VM for higher availability and virtualization" - Shimon Lichter, VP R&D, IPgallery

    In conclusion, using the unique combination of Oracle Solaris and SPARC technologies, IPgallery is able to offer solutions with much lower TCO, while providing a higher level of service capacity, scalability and resiliency. This low-OPEX solution enables the operator, the end-customer, to deliver a high-quality service while maintaining high profitability.

    Read the article

  • Lost in Translation – Common Mistakes Interpreting Patterns – Mark Simpson, Griffiths-Waite @ SOA, Cloud & Service Technology Symposium 2012

    - by JuergenKress
    ORACLE PROMOTIONAL DISCOUNT: For an exclusive Oracle discount, enter promo code DJMXZ370. For details please visit the registration page.

    The International SOA, Cloud + Service Technology Symposium is a yearly event that features the top experts and authors from around the world, providing a series of keynotes, talks, demonstrations, and panels, as well as training and certification workshops - all dedicated to empowering IT professionals to realize modern service technologies and practices in the real world. Click here for a two-page printable conference overview (PDF).

    Speaker: Mark Simpson, Griffiths-Waite

    Mark has been specialising in Oracle technology for 13 years, the last 10 of these with Griffiths Waite. Mark leads our SOA technology practice (covering SOA, Business Process Management and Enterprise Architecture). He is a much sought-after presenter on the Oracle and SOA conference circuits, and a respected authority on these technologies. Mark has advised a host of leading UK organisations on the deployment of BPM / SOA solutions. Working closely with Oracle US Product Development, Mark has contributed to Oracle's SOA Methodology and Oracle's SOA Maturity Model.

    Lost in Translation – Common Mistakes Interpreting Patterns

    Learn how small misinterpretations of high-level design patterns can have large and costly project ramifications. Good SOA design benefits from the use of a reference architecture and standardised design patterns. However, both of these concepts give an abstracted view of the intended solution, which needs to be interpreted to become realised. A reference implementation is important to demonstrate how key design guidelines can be implemented in the toolset of choice, but the main success factor is how these are used through the build and post-live phases of the project. This session will introduce practical design patterns with supporting implementation examples that, if used correctly, will give long-term benefit. We will highlight implementations where misinterpretations or misalignment from pattern aims have led to issues post-implementation. The session will add depth to the pattern discussions you are already having, enabling confidence in proceeding to the next level of realisation whilst considering how they may be implemented within your solution and chosen toolset.

    September 25, 2012 - 13:55

    KEYNOTES & SPEAKERS: More than 80 international subject matter experts will be speaking at the Symposium; over 50% of the agenda has not yet been finalized, and many more speakers are to come. View the partial program calendars on the Conference Agenda page.

    CONFERENCE THEMES & TRACKS:

    Cloud Computing Architecture & Patterns
    New SOA & Service-Orientation Practices & Models
    Emerging Service Technology Innovation
    Service Modeling & Analysis Techniques
    Service Infrastructure & Virtualization
    Cloud-based Enterprise Architecture
    Business Planning for Cloud Computing Projects
    Real World Case Studies
    Semantic Web Technologies (with & without the Cloud)
    Governance Frameworks for SOA and/or Cloud Computing Projects
    Service Engineering & Service Programming Techniques
    Interactive Services & the Human Factor
    New REST & Web Services Tools & Techniques

    Oracle Specialized SOA & BPM Partners: Oracle Specialized partners have proven their skills by certifications and customer references. To find a local Specialized partner please visit http://solutions.oracle.com

    SOA & BPM Partner Community: For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Read the article

  • Getting Started with StreamInsight 2.1

    - by Roman Schindlauer
    If you're just beginning to get familiar with StreamInsight, you may be looking for a way to get started. What are the basics? How can I get my first StreamInsight application running so I can see how it works? Where is the 'front door' that will get me going? If that describes you, then this blog entry might be just what you need. If you're already a StreamInsight wiz, keep reading anyway - you may find some helpful links here that you weren't aware of. But here's what we'd like from you experienced readers in particular: if you know of other good resources that we missed, please feel free to add them in the comments below. We appreciate you sharing your expertise. The Book The basic documentation for StreamInsight is located in the MSDN Library (Microsoft StreamInsight 2.1). You'll notice that previous versions of StreamInsight are still there (1.2 and 2.0), but if you're just getting started you can stick to the 2.1 section. The documentation has been organized to function as reference material, which is fine after you're familiar with the technology. But if you're trying to learn the basics, you might want to take a different path instead of just starting at the top. The following is one map you can use. What Is StreamInsight? Here is a sequence of topics that should give you a good overview of what StreamInsight is and how it works: Overview answers the question, "what is it?" StreamInsight Server Architecture gives you a quick look at a high-level architectural drawing StreamInsight Concepts lays out an overview of the basic components Deploying StreamInsight Entities to a StreamInsight Server describes the mechanics of how these components work together Getting an Example Running Once you have this background, go ahead and install StreamInsight and get a basic example up and running: Installation download and install the software StreamInsight Examples walk through a set of 3 simple StreamInsight applications that work together to demonstrate what you learned in the topics above; you can copy and paste the code into Visual Studio, compile, and run That's it - you now have a real, functioning StreamInsight system! Now that you have a handle on the basics, you might want to start digging deeper. 
Digging Deeper
Here's a suggested path through the documentation to help you understand the next layer of StreamInsight technologies:
- Using Event Sources and Event Sinks - sources supply data and sinks consume it; this topic gives you an overview of how they work
- Publishing and Connecting to the StreamInsight Server - practical details on how to set up a StreamInsight server
- A Hitchhiker’s Guide to StreamInsight 2.1 Queries - queries are the heart of how StreamInsight performs data analytics, and this whitepaper will help you really understand how they work
- Using StreamInsight LINQ - root through this section for technical details on specific query components
- Using the StreamInsight Event Flow Debugger - in addition to troubleshooting, the debugger is a great way to learn more about what goes on inside a StreamInsight application

And Even Deeper
Finally, to get a handle on some of the more complex things you can do with StreamInsight, dig into these:
- Input and Output Adapters - adapters can be useful for handling more complex sources and sinks
- Building Resilient StreamInsight Applications - a resilient application is able to recover from system failures
- Operations - this section will help you monitor and troubleshoot a running StreamInsight system

The StreamInsight Community
As you're designing and developing your StreamInsight solutions, you probably will find it helpful to see working examples or to learn tips and tricks from others. Or maybe you need a place to post a vexing question. Here are some community resources that we have found useful. If you know of others, please add them in the comments below.

Code samples and tools
- Official StreamInsight code samples
- Introduction to LinqPad Driver for StreamInsight 2.1 - LinqPad is a very useful tool for developing queries
- The following case studies are based on earlier versions of StreamInsight, but they are still useful examples:
  - Microsoft Media Analytics - real-time monitoring and analytics
  - Edgenet - responding to information from multiple sources
  - ICONICS - managing energy usage

Blogs
- Microsoft StreamInsight
- Ruminations of J.net
- Richard Seroter's Architecture Musings
- Pluralsight

Forums
- MSDN StreamInsight Forum
- Stack Overflow

Training
- Microsoft StreamInsight Fundamentals ("Introducing StreamInsight" is free) from Pluralsight

Twitter
- @streaminsight

You’re a StreamInsight Expert
That should get you going. Please add any other resources you have found useful in the comments below.

Regards,
The StreamInsight Team

    Read the article

  • Upgrades in 5 Easy Pieces

    - by Anne R.
    Even though there are a few select tasks that I have to do once or twice a year, I can’t remember how to do them! Or where to find the bits and pieces to complete the task. So I love it when someone consolidates everything under one spot. That’s what the CRM On Demand team has done with the upgrade information. Specifically, they have:
- Provided a “one-stop” area for managing upgrades at your company.
- Broken down the upgrade process into 5 (yes, 5) steps.
- Explained when and how to perform each step, with dates specific to your pod.
- Included details about each step, visible by expanding the step.
- Translated the steps into 11 languages.
- Added a list of release-specific resources with links from the page.

Now, just head for the Training and Support portal, click the Release Info tab, and walk through the “5 Essential Steps to a Successful Upgrade.” Before you continue, though, select your language from the drop-down list on the Release Info page. CRM On Demand now has the upgrade steps translated into 11 languages. On the Step page, you can expand each section in sequence and follow the more detailed instructions that appear. This will ensure that you’ve covered all your bases for each upgrade. Here’s a shortened version of the information that you’ll find:

1. Verify your Primary Contact Information. Have you checked your primary contact information to make sure you’re being notified of all upgrade information? Or do you want more users to receive upgrade announcements? This section provides you with the navigation path to do that in CRM On Demand.

2. Review your Key Upgrade Dates. If you expand this step, a nice table appears with your critical dates for the various milestones. IMPORTANT: When your CRM On Demand pod has been officially added to the upgrade schedule, closer to the release date itself, this table will display your specific timetable.

3. Migrate your Customizations from the Staging Environment before the Snapshot Date. Oracle refreshes the Staging data with a copy of your Production data made on the Production Snapshot Date. So this section lists considerations relevant to this step. It also reminds you of the 2-week period when you should not be making any changes in your Staging environment.

4. Conduct your Upgrade Validation on the Staging Environment. When the Customer Validation Testing period begins, you need to log in to your Staging Environment to validate that your key business processes and customizations continue to behave as expected. If your company utilizes Web Services, Web Links, Web Applets or Workflow, focus on testing these first. You generally have about two weeks for testing. If you run into problems during this time, follow the instructions shown in this section for logging a service request. It describes exactly how to fill out the fields in the SR for the fastest resolution.

5. Conduct "White Glove" Testing in your Upgraded Production Environment. Before users start using the upgrade, you should access a few tabs and reports. Doing this actually warms up the cache so that frequently used pages and reports will come up at normal speed on Monday morning, when users log in to the upgraded system. Resources listed under this step help you in further preparing for the upgrade. There’s also a new Documentation section on the right with links to these release-specific resources.
Very nice, I commented, when discussing these improvements with the “responsible party.” She confirmed that, yes, they tried to consolidate the upgrade information, translate it for better communication, simplify it into 5 easy pieces, and drive admins responsible for handling upgrades to this one site instead of sending out elaborate emails. Yes, I just love it when someone practically reaches out and holds my hand through a process. Next best thing to a wizard!

    Read the article

  • SQL SERVER – What are Actions in SSAS and How to Make a Reporting Action

    - by Pinal Dave
    Actions are used for customized browsing and drilling of data for the end user. An action is an event that a user can raise while accessing the cube data. Actions are used in cube browsers like Excel and are triggered when a user in a client tool clicks on a particular member, level, dimension, cell, or maybe the cube itself. For example, a user might be able to see a Reporting Services report, open a web page, or drill through to detailed information related to the cube data. Analysis Services supports three types of actions:
- Report
- Drill-through
- Standard

In this blog post, I will explain the Report action. The objective of this action is to return a report with details of the product where the sales amount is greater than 10,000 in the cube browser analysis. You need to create a basic cube first, with the facts and dimensions you want in the analysis. The following are the steps to create a reporting action.

1. Go to SQL Server Data Tools and open the Analysis Services project. Navigate to Actions and click on New Reporting Action.

2. Specify the name of the action and choose the target type as Attribute Members, since we have to create the action on the members of an attribute.

3. Specify the target object of your report action. The target object is the dimension or attribute on which you want the report to appear; in our case it is the product name.

4. Next, define the condition on which you want the report link to appear. This is an optional feature. In this example we specify a condition which checks whether the sales amount is greater than 10,000, so that the link appears only for those products where the condition is met (a sketch of such an expression appears at the end of this post).

5. Next, specify the server name on which the report is present, the report path, and the report format in which you want the report to appear.

6. Additionally, you can specify the parameters. As with the conditional expression, the parameters should be valid MDX expressions, and each parameter name should be the same as the one defined in the report.

7. Deploy your solution after you are done specifying parameters, and go to the cube browser.

8. Click on the Analyze in Excel button; this will open your cube in Excel.

9. Make an analysis which shows product names and their sales amounts.

10. Right-click on a product where the sales amount is greater than 10,000 and you will see the reporting action link.

11. Clicking on the link will take you to the URL of your Reporting Services report. I created this report using the report project wizard in SQL Server Data Tools.

So, this is how we can launch reports from a cube browser. Similarly, you can open web pages, run applications, and perform a number of other tasks. Koenig Solutions offers SSAS training which covers all of Analysis Services, including reporting actions, in great detail. In my next blog post I will talk about drill-through actions.

Author: Namita Sharma, Senior Corporate Trainer at Koenig Solutions.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SSAS
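For steps 4 and 6 above, here is a hedged sketch of what the MDX expressions might look like; the measure, dimension, and attribute names are illustrative assumptions, not taken from the original post:

    -- Hypothetical condition expression (step 4): show the report link
    -- only where the member's sales amount exceeds 10,000.
    [Measures].[Sales Amount] > 10000

    -- Hypothetical parameter expression (step 6): pass the selected
    -- product's name to the report, escaped for use in the report URL.
    UrlEscapeFragment( [Product].[Product Name].CurrentMember.Name )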

    Read the article

  • C#: A "Dumbed-Down" C++?

    - by James Michael Hare
    I was spending a lovely day this last weekend watching my sons play outside, in one of the better weekends we've had here in Saint Louis for quite some time. Whilst watching them - and making sure no limbs were broken and no eyes were poked out with sticks, among other various potential injuries - I was perusing (in the correct sense of the word) this month's MSDN magazine to get a sense of the latest VS2010 features in both the IDE and in languages. When I got to the back pages, I saw a wonderful article by David S. Platt entitled "In Praise of Dumbing Down" (msdn.microsoft.com/en-us/magazine/ee336129.aspx). The title captivated me, and I read it and found myself agreeing with it completely, especially as it related to my first post on divorcing C++ as my favorite language.

Unfortunately, as Mr. Platt mentions, the term dumbing-down has negative connotations, but it is really and truly a good thing. You are, in essence, taking something that is extremely complex and reducing it to something that is much easier to use and far less error prone. Adding safeties to power tools and anti-kick mechanisms to chainsaws are in some sense "dumbing them down" for the common user -- but that also makes them safer and more accessible for the common user. This was exactly my point with C++ and C#. I did not mean to infer that C++ was not a useful or good language, but that in a very high percentage of cases it is too complex and error prone for the job at hand.

Choosing the correct programming language for a job is a lot like choosing any other tool for a task. For example, if I want to dig a French drain in my lawn, I can attempt to use a huge tractor-like backhoe, and the job would be done far quicker than if I were to dig it by hand. I can't deny that the backhoe has the raw power and speed to perform. But you also cannot deny that my chances of injury, or of severing utility lines or other resources, climb at a rate inversely proportional to the amount of training I may have on that machinery.

Is C++ a powerful tool? Oh yes, and it's great for those tasks where speed and performance are paramount. But for most of us, it's the wrong tool. And keep in mind, I say this even though I have 17 years of experience in using it and consider myself highly adept at utilizing its features, both in the standard libraries, the STL, and in supplemental libraries such as BOOST -- which, although they greatly help with adding powerful features quickly, do very little to curb the relative dangers of the language.

So, you may say, the fault is in the developer: if the developer had higher skills, or if we only hired C++ experts, this would not be an issue. Now, I will concede there is some truth to this. Obviously, the more highly skilled the C++ developers you hire, the better the chance they will produce highly performant and error-free code. However, what good is that to the average development shop that cannot afford a full stable of C++ experts?

That's my point with C#: it's like a kinder, gentler C++. It gives you nearly the same speed, and in many ways even more power than C++, and it gives you a much softer cushion for novices to fall against if they code less than optimally. A bug is a bug, of course, in any language, but C# does a good job of hiding and taking on the task of handling almost all of the resource issues that make C++ so tricky. For my money, C# is much more maintainable, more feature-rich, second only slightly in performance, faster to market, and -- last but not least -- safer and easier to use.
That's why, where I work, I much prefer to see the developers moving to C#.  The quantity of bugs is much lower, and we don't need to hire "experts" to achieve the same results since the language itself handles those resource pitfalls so prevalent in poorly written C++ code.  C++ will still have its place in the world, and I'm sure I'll still use it now and again where it is truly the correct tool for the job, but for nearly every other project C# is a wonderfully "dumbed-down" version of C++ -- in the very best sense -- and to me, that's the smart choice.

    Read the article

  • What is the right way to process inconsistent data files?

    - by Tahabi
    I'm working at a company that uses Excel files to store product data, specifically test results from products before they are shipped out. There are a few thousand spreadsheets with anywhere from 50-100 relevant data points per file. Over the years, the schema for the spreadsheets has changed significantly, but not unidirectionally - in the sense that changes often get reverted and then re-added in the space of a few dozen to a few hundred files. My project is to convert about 8000 of these spreadsheets into a database that can be queried. I'm using MongoDB to deal with the inconsistency in the data, and Python.

My question is: what is the "right" or canonical way to deal with the huge variance in my source files? I've written a data structure which stores the data I want for the latest template, which will be the final template used going forward, but that only helps for a few hundred files historically. Brute-forcing a solution would mean writing similar data structures for each version/template - which means potentially writing hundreds of schemas with dozens of fields each. This seems very inefficient, especially when sometimes a change in the template is as little as moving a single line of data one row down, or splitting what used to be one data field into two data fields.

A slightly more elegant solution I have in mind would be writing schemas for all the variants I can find for pre-defined groups in the source files, and then writing a function to match a particular series of files with the series of variants that fits that set of files. This is because, more often than not, most of the file will remain consistent over a long period, only marred by one or two errant sections - but which section is the inconsistent one varies from period to period. For example, say a file has four sections with three data fields each, represented by four Python dictionaries with three keys each:

- For files 7000-7250, sections 1-3 are consistent, but section 4 is shifted one row down.
- For files 7251-7500, sections 1-3 are consistent, section 4 is one row down, and a section 5 appears.
- For files 7501-7635, sections 1 and 3 are consistent, but section 2 has five data fields instead of three, section 5 disappears, and section 4 is still shifted down one row.
- For files 7636-7800, section 1 is consistent, section 4 shifts back up, section 2 returns to three fields, but section 3 is removed entirely.
- Files 7800-8000 have everything in order.

The proposed function would take the file number and match it to a dictionary representing the data mappings for different variants of each section (a sketch of this idea appears below). For example, a section_four_variants dictionary might have two members, one for the shifted-down version and one for the normal version; a section_two_variants might have three- and five-field members; and so on. The script would then read the matchings, load the correct mapping, extract the data, and insert it into the database.

Is this an accepted/right way to go about solving this problem? Should I structure things differently? I don't know what to search Google for either to see what other solutions might be, though I believe the problem lies in the domain of ETL processing. I also have no formal CS training aside from what I've taught myself over the years. If this is not the right forum for this question, please tell me where to move it, if at all. Any help is most appreciated. Thank you.
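A minimal sketch of the proposed variant-matching approach, assuming hypothetical section names, field mappings, and range boundaries (none of these come from the actual spreadsheets):

    # Minimal sketch of the variant-matching idea. Section names, field
    # mappings, and range boundaries are illustrative assumptions only.

    # Each variant maps a field name to the (row, column) offset where it
    # sits inside that section of the spreadsheet.
    SECTION_FOUR_VARIANTS = {
        "normal":  {"voltage": (0, 1), "current": (1, 1), "wattage": (2, 1)},
        "shifted": {"voltage": (1, 1), "current": (2, 1), "wattage": (3, 1)},
    }

    # File-number ranges map to the variant in force over that range.
    SECTION_FOUR_BY_RANGE = [
        (range(0, 7000),    "normal"),
        (range(7000, 7636), "shifted"),   # shifted one row down
        (range(7636, 8001), "normal"),    # shifted back up
    ]

    def variant_for(file_number, range_table):
        """Return the variant name in force for a given file number."""
        for file_range, variant in range_table:
            if file_number in file_range:
                return variant
        raise ValueError("no variant defined for file %d" % file_number)

    def extract_section_four(file_number, rows):
        """Pull section-four fields out of a spreadsheet's rows."""
        variant = variant_for(file_number, SECTION_FOUR_BY_RANGE)
        mapping = SECTION_FOUR_VARIANTS[variant]
        return {field: rows[r][c] for field, (r, c) in mapping.items()}

Each additional section would get its own variants dictionary and range table, so a new template change adds one variant entry and one range row rather than a whole new schema.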

    Read the article

  • Oracle Data Integration 12c: Perspectives of Industry Experts, Customers and Partners

    - by Irem Radzik
    As you may have seen from our recent blog posts on Oracle Data Integrator 12c and Oracle GoldenGate 12c, we are very excited to share with you the great new features the 12c release brings to Oracle’s data integration solutions. And, fortunately, we are not alone in this sentiment. Since the press announcement October 17th, which incorporates our customers' and experts' testimonials, we have seen positive comments in leading technology publications and social media as well. Here are some examples:

In CIO and PCWorld you can find Joab Jackson’s article, Oracle Data Integrator 12c ready for real-time analysis, in which he wrote about the tight integration between Oracle Data Integrator and Oracle GoldenGate. He noted: “Heeding the call from enterprise customers who clamor for more immediacy in their data-driven reports, Oracle has updated its data-integration software portfolio so that it can more rapidly deliver data to data warehouses and analysis applications.”

Integration Developer News’ Vance McCarthy wrote the article Oracle Ships ‘Future Proofs’ Integration Tools for Traditional, Cloud, Big Data, Real-Time Projects and mentioned that “Oracle Data Integrator 12c and Oracle GoldenGate 12c sport a wide range of improvements to let devs more easily deliver data integration for cloud, analytics, big data and other new projects that leverage multiple datasets for business.”

InformationWeek’s Doug Henschen gave a great overview of several key features, including the new flow-based UI in Oracle Data Integrator. Doug said, “Oracle Data Integrator 12c introduces a complete makeover of the job-building experience, while real-time oriented GoldenGate 12c introduces performance gains.”

Database Trends and Applications’ article Oracle Strengthens Data Integration with Release of Oracle Data Integrator 12c and Oracle GoldenGate 12c highlighted the productivity aspect of the new solution: “tight integration between Oracle Data Integrator 12c and Oracle GoldenGate 12c enables developers to leverage Oracle GoldenGate’s low overhead, real-time change data capture completely within the Oracle Data Integrator Studio without additional training”.

We are also thrilled about what our customers and partners have to say about our products and the new release. And we are equally excited to share those perspectives with you in our upcoming launch video webcast on November 12th. SolarWorld Industries America’s Senior Database Manager, Russ Toyama, will join our executives in our studio in Redwood Shores to discuss GoldenGate’s core benefits and the new release, while Surren Partharb, CTO of Strategic Technology Services for BT, and Mark Rittman, CTO of Rittman Mead, will provide their comments via interviews conducted in the UK. This interactive panel discussion in the video webcast will unveil the new release with the expertise of our development executives and great insight from our customers and partners. In addition, our product experts will be available online to answer chat questions.

This is really a great opportunity to learn how Oracle's data integration offering has changed the integration and replication technology space with the new release, and established itself as the new leader. If you have not registered for this free event yet, you can do so via this link. We will run the live event at 8am PT/4pm GMT, followed by a replay of the event with live chat for Q&A at 10am PT/6pm GMT.
The replay will be available on-demand for those who register but cannot attend either session on November 12th.

    Read the article

  • Best of OTN - Week of Oct 21st

    - by CassandraClark-OTN
    This week's Best of OTN - for you, the best devs, DBAs, sysadmins and architects out there! In these weekly posts the OTN team will highlight the top content from each community: Architect, Database, Systems and Java. Since we'll be publishing this on Fridays, we'll also mix in a little fun!

Architect Community Top Content
- The Road Ahead for WebLogic 12c | Edwin Biemond - Oracle ACE Edwin Biemond shares his thoughts on the announced new features in Oracle WebLogic 12.1.3 & 12.1.4 and compares those upcoming releases to Oracle WebLogic 12.1.2.
- A Roadmap for SOA Development and Delivery | Mark Nelson - Do you know the way to S-O-A? Mark Nelson does. His latest blog post, part of an ongoing series, will help to keep you from getting lost along the way.
- Updated ODI Statement of Direction | Robert Schweighardt - Heads up, Oracle Data Integrator fans! A new statement of product direction document is available, offering an overview of the strategic product plans for Oracle’s data integration products for bulk data movement and transformation, specifically Oracle Data Integrator (ODI) and Oracle Warehouse Builder (OWB).
Bob Rhubart, Architect Community Manager
Friday Funny - "Some people approach every problem with an open mouth." — Adlai E. Stevenson (October 23, 1835 – June 14, 1914), 23rd Vice President of the United States

Database Community Top Content
- Pre-Built Developer VMs (for Oracle VM VirtualBox) - Heard all the chatter about Oracle VirtualBox? Over 1 million downloads per week, and look: pre-built virtual appliances designed specifically for developers.
- Video: Big Data, or BIG DATA? - Oracle ACE Director Ben Prusinski explains the differences.
- Webcast Series - Developing Applications in Oracle's Public Cloud - Time to get started on developing and deploying cloud applications by moving to the cloud. Good friend Gene Eun from Oracle's Cloud team posted this two-part Webcast series that has an overview and demonstration of the Oracle Database Cloud Service. Check out the demos on how to migrate your data to the cloud, extend your application with interactive reporting, and create and access RESTful Web services. Registration required, but so worth it!
Laura Ramsey, Database Community Manager

Systems Community Top Content
- Video: What Kind of Scalability is Better, Horizontal or Vertical? - Rick Ramsey asks the question "Is Oracle's approach to large vertically scaled servers at odds with today's trend of combining lots and lots of small, low-cost server systems with networking to build a cloud, or is it a better approach?" Michael Palmeter, Director of Solaris Product Management, and Renato Ribeiro, Director of Product Management for SPARC Servers, discuss.
- Video: An Engineer Takes a Minute to Explain Cloud - Bart Smaalders, long-time Oracle Solaris core engineer, takes a minute to explain cloud from a sysadmin point of view.
- Hands-On Lab: How to Deploy and Manage a Private IaaS Cloud - Soup to nuts. This lab shows you how to set up and manage a private cloud with Oracle Enterprise Manager Cloud Control 12c in an Infrastructure as a Service (IaaS) model. You will first configure the IaaS cloud as the cloud administrator and then deploy guest virtual machines (VMs) as a self-service user.
Rick Ramsey, Systems Community Manager
Friday Funny - Video: Drunk Airline Pilot - Dean Martin - Foster Brooks

Java Community Top Content
- Video: NightHacking Interview with James Gosling - James Gosling, the Father of Java, discusses robotics, Java and how to keep his autonomous WaveGliders in the ocean for weeks at a time. Live from Hawaii.
- Video: Raspberry Pi Developer Challenge: Remote Controller - A developer who knew nothing about Java Embedded or Raspberry Pi shows how he can now control a robot with his phone. The project was built during the Java Embedded Challenge for Raspberry Pi at JavaOne 2013.
- Java EE 7 Certification Survey - Participants Needed - Help us define how to serve your training and certification needs for Java EE 7.
Tori Wieldt, Java Community Manager
Friday Funny - Programmers have a strong sensitivity to Yak's pheromone. Causes irresistible desire to shave said Yak. Thanks, @rickasaurus!

To follow and take part in the conversation, follow/like one or all of the resources below: OTN TechBlog, The Java Source Blog, The OTN Garage Blog, The OTN ArchBeat Blog, @oracletechnet, @java, @OTN_Garage, @OTNArchBeat, @OracleDBDev, OTN, I Love Java, OTN Garage, OTN ArchBeat, Oracle DB Dev, OTN Java

    Read the article

  • Some OBI EE Tricks and Tips in the Admin Tool By Gerry Langton

    - by hamsun
    How to set the log level from a Session variable Initialization block

As we know, it is normal to set the log level non-zero for a particular user when we wish to debug problems. However, sometimes it is inconvenient to go into each user’s properties in the Admin tool and update the log level. So I am showing a method which allows the log level to be set for all users via a session initialization block. This is particularly useful for anyone wanting an alternative way to set the log level. The screen shots shown use the OBIEE 11g SampleApp demo but are applicable to any environment.

Open the appropriate RPD in on-line mode and navigate to Manage > Variables. Select Session Initialization Blocks, right-click in the white space and create a New Initialization Block. I called the Initialization block Set_Loglevel. Now click on ‘Edit Data Source’ to enter the SQL.

Choose the ‘Use OBI EE Server’ option for the SQL. This means that the SQL provided must use tables which have been defined in the Physical layer of the RPD, and whilst there is no need to provide a connection pool, you must work in on-line mode. The SQL can access any of the RPD tables and is purely used to return a value of 2 (a sketch appears at the end of this post). The ‘Test’ button confirms that the SQL is valid.

Next, click on the ‘Edit Data Target’ button to add the LOGLEVEL variable to the initialization block. Check the ‘Enable any user to set the value’ option so that this will work for any user. Click OK; a warning message will display because LOGLEVEL is a system session variable. Click ‘Yes’, then click ‘OK’ to save the Initialization block, and check in the on-line changes.

To test that LOGLEVEL has been set, log in to OBIEE using an administrative login (e.g. weblogic) and reload server metadata, either from the Analysis editor or from the Administration > Reload Files and Metadata link. Run a query, then navigate to Administration > Manage Sessions and click ‘View Log’ for the query just issued (which should be approximately the last in the list). A log file should exist and, with LOGLEVEL set to 2, should include both the logical and physical SQL. If more diagnostic information is required, set LOGLEVEL to a higher value.

If logging is required only for a particular analysis, an alternative method can be used directly from the Analysis editor. Edit the analysis for which debugging is required and click on the Advanced tab. Scroll down to the Advanced SQL clauses section and enter the following in the Prefix box: SET VARIABLE LOGLEVEL = 2; Click the ‘Apply SQL’ button. The SET VARIABLE statement will now prefix the analysis’s logical SQL, so that any time this analysis is run it will produce a log.

You can find information about training for Oracle BI EE products here or in the OU Learning Paths. Please send me an email at [email protected] if you have any further questions.

About the Author: Gerry Langton started at Siebel Systems in 1999 working as a technical instructor teaching both Siebel application development and also Siebel Analytics (which subsequently became Oracle BI EE). From 2006 Gerry has worked as Senior Principal Instructor within Oracle University, specialising in Oracle BI EE, Oracle BI Publisher and Oracle Data Warehouse development for BI.
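For reference, a minimal sketch of the two pieces of SQL involved. The subject area name "A - Sample Sales" is an assumption based on the SampleApp demo - substitute any subject area defined in your own RPD, since the statement only needs to return the constant 2:

    -- Hypothetical logical SQL for the init block's 'Edit Data Source'
    -- step (with the 'Use OBI EE Server' option); it exists purely to
    -- return the value 2 that gets assigned to LOGLEVEL.
    SELECT 2 FROM "A - Sample Sales"

    -- Per-analysis alternative, entered in the Advanced tab's Prefix box:
    SET VARIABLE LOGLEVEL = 2;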

    Read the article

  • What Counts For A DBA: ESP

    - by Louis Davidson
    Now I don’t want to get religious here, and I’m not going to, but what I’m going to describe in this ‘What Counts for a DBA’ installment sometimes feels like magic. Often I will spend hours thinking about the solution to a design issue or coding problem, working diligently to try to come up with a solution, and then finally just give up with the feeling that I’m not even qualified to be a data entry clerk, much less a data architect. At this point I often take a walk (or sometimes a nap), and then it hits me. I realize that I have the answer just sitting in my brain, ready to implement. This phenomenon is not limited to walks either; it can happen almost any time after I stop obsessing about a problem. I call this phenomenon ESP (or Extra-Sensory Programming). Another term for this could be ‘sleeping on it’, and while the idiom tends to mean letting time pass to actively think about a problem, sleeping on a problem also lets you relax and let your brain do the work.

I first noticed this back in my college days when I would play video games for hours on end. We would get stuck deep in some dungeon, unable to find a way out, playing for days on end until we were beaten down tired. Once we gave up and walked away, the solution would usually be there waiting for one of us when we came back to play the next day. Sometimes it would come in the form of a dream, and sometimes the problem would simply be easy to solve when we started to play again. While it worked great for video games, it never occurred when I studied English Literature for hours on end, or even when I worked the same sort of frustrating hours attempting to solve a homework problem in Calculus. I believe the difference was that I was passionate about the video games, and certainly far less so about homework where people used the word “thou” instead of “you”, or x to represent a number.

This phenomenon occurs somewhat more often in my current work as a professional data programmer, because I am very passionate about SQL and love those aspects of my career choice. Every day that I get to draw a new data model to solve a customer issue, or write a complex SELECT statement to ferret out the answer to a complex data question, is a great day. I hope it is the same for any reader of this blog. But, unfortunately, while the day on the whole is great, a heck of a lot of noise is generated in work life. There are the typical project deadlines, along with the requisite project manager sitting on your shoulders shouting slogans to try to make you go faster. Add in office politics, and the occasional family issues that permeate the mind, and you lose the ability to think deeply about any problem, not to mention occasionally forgetting your own name. These office realities, coupled with a difficult SQL problem staring at you from your widescreen monitor, will slowly suck the life force out of your body, making it seem impossible to solve the problem.

This is when the walk starts; or a nap. Maybe you hide from the madness under your desk like George Costanza hides from Steinbrenner on Seinfeld. Forget about the problem. Free your mind from the insanity of the problem and your surroundings. Then let your training and education deep in your brain take over and see if it will passively do the rest for you.
If you don’t end up with a solution, the worst case scenario is that you have had a bit of exercise or rest, and you won’t have heard the phrase “better is the enemy of good enough” even once…which certainly will do your brain some good. Once you stop whipping your brain for information, inspiration may just strike, and instead of a humdrum solution you find one you hadn’t even considered, almost magically. So, my beloved manager, next time you have an urgent deadline and you come across me taking a nap, creep away quietly, because I’m working, doing some extra-sensory programming.

    Read the article

  • PASS Summit 2011 – Part II

    - by Tara Kizer
    I arrived in Seattle last Monday afternoon to attend PASS Summit 2011. I had really wanted to attend Gail Shaw’s (blog|twitter) and Grant Fritchey’s (blog|twitter) pre-conference seminar “All About Execution Plans” on Monday, but that would have meant flying out on Sunday, which I couldn’t do. On Tuesday, I attended Allan Hirt’s (blog|twitter) pre-conference seminar entitled “A Deep Dive into AlwaysOn: Failover Clustering and Availability Groups”. Allan is a great speaker, and his seminar was packed with demos and information about AlwaysOn in SQL Server 2012. Unfortunately, I have lost my notes from this seminar and the presentation materials are only available on the pre-con DVD. Hmpf!

On Wednesday, I attended Gail Shaw’s “Bad Plan! Sit!”, Andrew Kelly’s (blog|twitter) “SQL 2008 Query Statistics”, Dan Jones’ (blog|twitter) “Improving your PowerShell Productivity”, and Brent Ozar’s (blog|twitter) “BLITZ! The SQL – More One Hour SQL Server Takeovers”. In Gail’s session, she went over how to fix bad plans and bad query patterns. First: update your stale statistics!

How to fix bad plans (a sketch of the first two fixes appears at the end of this post):
- Use local variables – the optimizer can’t sniff them, so it’ll optimize for the “average” value
- Use RECOMPILE (at the query or stored procedure level) – costs CPU on each execution
- OPTIMIZE FOR hint – specify the most common value you’ll pass

How to fix bad query patterns (don’t use them – ha!):
- Catch-all queries: use dynamic SQL, or OPTION (RECOMPILE)
- Multiple execution paths: split into multiple stored procedures, or OPTION (RECOMPILE)
- Modifying parameter values: use local variables, split into outer and inner procedures, or OPTION (RECOMPILE)

She also went into “last resort” and “very last resort” options, but those are risky unless you know what you are doing. For the average Joe, she wouldn’t recommend these. Examples are query hints and plan guides.

While I enjoyed Andrew’s session, I didn’t take any notes as it was familiar material. Andrew is a great speaker though, and I’d highly recommend attending his sessions in the future.

Next up was Dan’s PowerShell session. I need to look into profiles, manifests, function modules, and function import scripts more, as I just didn’t quite grasp these concepts. I am attending a PowerShell training class at the end of November, so maybe that’ll help clear it up. I really enjoyed the Excel integration demo. It was very cool watching PowerShell build the spreadsheet in real-time. I must look into this more! On a side note, I am jealous of Dan’s hair. Fabulous hair!

Brent’s session showed us how to quickly gather information about a server whose database administration duties you will be taking over. He wrote a script to do a fast health check and then later wrapped it into a stored procedure, sp_Blitz. I can’t wait to use this at my work, even on systems where I’ve been the primary DBA for years; maybe there’s something I’ve overlooked. We are using EPM to help standardize our environment and uncover problems, but sp_Blitz will definitely still help us out. He even provides a cloud-based update feature, sp_BlitzUpdate, so you don’t have to constantly update sp_Blitz when he makes a change. I think I’ll utilize his update code for some other challenges that we face at my work.
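To make the first two fixes from Gail’s list concrete, here is a hedged T-SQL sketch; the procedure, table, column, and the “most common value” are all hypothetical, not taken from her session:

    -- Hypothetical parameter-sniffing-prone procedure, shown with two of
    -- the fixes from the list above; names and values are illustrative.
    CREATE PROCEDURE dbo.GetOrdersByRegion
        @Region varchar(30)
    AS
    BEGIN
        SET NOCOUNT ON;

        -- Fix 1: OPTION (RECOMPILE) builds a fresh plan on each call,
        -- using the actual parameter value (at a CPU cost per execution).
        SELECT OrderID, OrderDate, TotalDue
        FROM dbo.Orders
        WHERE Region = @Region
        OPTION (RECOMPILE);

        -- Fix 2: OPTIMIZE FOR compiles one plan tuned for the most
        -- common value callers pass, regardless of what was sniffed.
        SELECT OrderID, OrderDate, TotalDue
        FROM dbo.Orders
        WHERE Region = @Region
        OPTION (OPTIMIZE FOR (@Region = 'West'));
    END;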

    Read the article

< Previous Page | 70 71 72 73 74 75 76 77 78 79 80 81  | Next Page >