Search Results

Search found 14506 results for 'document scanner'.

  • How to Crop Pictures in Word, Excel, and PowerPoint 2010

    - by DigitalGeekery
    When you add pictures to your Office documents, you might need to crop them to remove unwanted areas or isolate a specific part. Today we’ll take a look at how to crop images in Office 2010. Note: We will show you examples in Word, but you can crop images in Word, Excel, and PowerPoint.

    To insert a picture into your Office document, click the Picture button on the Insert tab. The Picture Tools Format ribbon should now be active. If not, click on the image. New in Office 2010 is the ability to see the area of the photo that you are keeping in addition to what will be cropped out. On the Format tab, click Crop. Click and drag inward any of the four corner handles to crop from any one side. Notice you can still see the area to be cropped out, shown in translucent gray. Press and hold the CTRL key while you drag a corner cropping handle inward to crop equally on all four sides. To crop equally on the right and left or the top and bottom, press and hold down the CTRL key while you drag the center cropping handle on either side inward. You can further adjust the cropping area by clicking and dragging the picture behind the cropping area. To accept the current dimensions and crop the photo, press Escape or click anywhere outside the cropping area. You can also crop the image to exact dimensions: right-click on the image and enter the dimensions in the Width and Height boxes, or use the Size group on the Format tab.

    Crop to a Shape: Select your photo and click Crop from the Size group on the Format tab. Select Crop to Shape and choose any of the available shapes. Your photo will be cropped into that shape.

    Using Fit and Fill: If you wish to crop a photo but fill the shape, select Fill. When you choose this option, some edges of the picture might not display, but the original picture aspect ratio is maintained. If you wish to have all of the picture fit within a shape, choose Fit. The original picture aspect ratio will be maintained.

    Conclusion: Users moving from previous versions of Microsoft Office are sure to appreciate the improved cropping abilities in Office 2010, especially the ability to see what will and won’t be kept when you crop a photo.
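    If you ever need to apply the same crop to pictures across a whole deck, the crop settings are also scriptable. Here is a minimal sketch using the third-party python-pptx library (my own illustration, not from the article, and it covers only the PowerPoint case); each crop property is the fraction of the picture hidden from that edge:

        # Sketch: batch-crop every picture in a deck (python-pptx assumed).
        from pptx import Presentation
        from pptx.enum.shapes import MSO_SHAPE_TYPE

        prs = Presentation("slides.pptx")  # hypothetical input file
        for slide in prs.slides:
            for shape in slide.shapes:
                if shape.shape_type == MSO_SHAPE_TYPE.PICTURE:
                    # Trim 10% from left and right, like CTRL-dragging a side handle
                    shape.crop_left = 0.10
                    shape.crop_right = 0.10
        prs.save("slides-cropped.pptx")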

    Read the article

  • Integrating Windows Form Click Once Application into SharePoint 2007 – Part 1 of 2

    - by Kelly Jones
    Last year, I had the opportunity to build a solution that involved integrating a Windows Forms application into a SharePoint 2007 site (WSS version 3.0). In this post, I’ll lay out our architecture thinking, and in part two, I’ll describe the technical details.

    Business Case: Our challenge was this: we needed an easy way for a small group of our users to upload documents in batches.  They also needed to quickly set the metadata values, as well as set security on individual files. Using the out-of-the-box uploads just didn’t fit.  The single file upload allows setting the metadata, but our users would be uploading dozens of files.  The multiple upload would allow our users to upload batches of files, but it doesn’t allow them to set the metadata during upload.  Also, neither upload method allows the users to set the permissions on the file.

    Our Solution: We looked into building a web control of some kind, but ruled that out due to security complexities (if I remember correctly).  Another option would have been using a technology like Silverlight (or Flash?), but our team didn’t have the skills necessary to build with these. So, after looking at what was technically possible, and also what skills our team had, we settled on a Windows Forms application.  We also decided to deliver it to the clients via ClickOnce, so we would have the ability to easily update the application in the future.

    Lessons Learned: After deploying our solution, we’ve learned a few lessons.  First, you’ll need to have the .NET Framework installed on the client computers.  We knew this, but we still ran into issues making sure our users had the proper framework version installed.  Second, we had issues with authentication.  Our issues were due to our testing domain being a separate Active Directory domain from the domain that our end users and their workstations were members of.  (See my earlier post about Clearing Saved Passwords for the fix to our problem.) Our third issue was how we dealt with uploading files that were named the same.  Our application would replace the existing file with the new file, which is the way we expected it to work.  However, our users wanted to upload weekly reports, named the same as the previous week’s.  We solved this by using folders within the document library to keep each set of reports separate from previous weeks.

    One last thing to consider before implementing a solution like this is what browsers and platforms your users will be working from.  We only needed to support IE and Windows, which works fine.  However, if you need to support Firefox, there are add-ons that allow ClickOnce to work with Firefox.  This is still a Windows-only solution, though.  In order to support Macs, you’d have to focus on either browser techniques (AJAX?) or Silverlight/Flash.

    Summary: Our users are happy with the ClickOnce app.  It allowed them to move all of their content to our SharePoint site in under a couple of hours, which they were thrilled with.  We’re happy because we can easily deploy updates, our development time was small, and we met all of our business requirements.

    Read the article

  • SQL SERVER – How to Compare the Schema of Two Databases with Schema Compare

    - by Pinal Dave
    Earlier I wrote about An Efficiency Tool to Compare and Synchronize SQL Server Databases, and it was very well received. Since that blog post, I have received quite a few questions asking how, just like data, we can also compare schemas and synchronize them. If you think about comparing schemas manually, it is almost impossible to do so. The table schema is just one concern; if you really want to cover all the schema objects of a database (triggers, views, stored procedures and everything else), it is just an impossible task. If you are a developer or database administrator who works in a production environment, then you know that there are many different occasions when we have to compare the schemas of two databases. Before deploying any changes to the production server, I personally like to make a note of every single schema change and document it, so that in case of any issue I can always go back and refer to my documentation. As discussed earlier, it is absolutely impossible to do this task without the help of third-party tools. I personally use Devart’s dbForge Schema Compare for this task. This is an extremely easy tool. Let us see how it works.

    First, I have two different databases: a) AdventureWorks2012 and b) AdventureWorks2012-V1. There are in total three differences between these databases: one of the tables has an additional column, one of the tables has a new index, and one of the stored procedures has changed.

    Now let us see how dbForge Schema Compare works in this scenario. First, open dbForge Schema Compare Studio. Click on New Schema Comparison. It will bring you to the following screen, where we select the databases to compare. I have selected the AdventureWorks2012 and AdventureWorks2012-V1 databases. On the next screen we can verify various options, but for this demonstration we will keep them as they are. We will not change anything in the schema mapping screen, as in our case it is not required, but generally, if you are comparing across schemas, you may need it. The next screen is the most important one, as on it we select which kinds of objects we want to compare. You can see the options which are available to select; the screen lets you select objects from SQL Server 2000 through SQL Server 2012. Once you click Compare on the previous screen, it will bring you to this screen, which essentially displays the differences between the two databases we selected earlier. As mentioned above, there are three differences in the database, and the same are listed here: two of the changes belong to tables and one change belongs to a procedure.

    Let us click each of them one by one to see the differences. In the very first one, we can see that there is an additional column in one database which did not exist earlier. In the next example, we can see that the AdventureWorks2012 database has an additional index. The following example is very interesting: in this case, we changed the definition of a stored procedure, and the result pane shows the changed definition. dbForge Schema Compare very effectively identifies the changes in schema and lists them neatly for developers. Here is one more screen: this software not only compares the schema but also provides the options to update or drop objects as you choose. I think this is a brilliant option.

    Well, I have been using Schema Compare for quite a while and have found it very useful. Here are a few of the things which dbForge Schema Compare can do for developers and DBAs:
    • Compare and synchronize SQL Server database schemas
    • Compare schemas of a live database and a SQL Server backup
    • Generate comparison reports in Excel and HTML formats
    • Eliminate mistakes in schema change propagation across environments
    • Track production database changes and customizations
    • Automate migration of schema changes using the command line interface

    I suggest that you try out dbForge Schema Compare and let me know what you think of this product. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQL Utility, T SQL
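    If you want a feel for why hand-rolling this is a losing game, here is a minimal Python sketch (pyodbc and a trusted local connection are my assumptions, not part of the article) that diffs only column definitions between the two example databases; indexes, procedures, triggers and everything else would each need similar queries, which is exactly the work the tool does for you:

        # Sketch: naive column-level schema diff via INFORMATION_SCHEMA (columns only).
        import pyodbc

        def columns(conn, db):
            # One (schema, table, column, type) tuple per column in the given database
            sql = (f"SELECT TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME, DATA_TYPE "
                   f"FROM [{db}].INFORMATION_SCHEMA.COLUMNS")
            return {tuple(row) for row in conn.execute(sql).fetchall()}

        conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                              "SERVER=localhost;Trusted_Connection=yes")
        a = columns(conn, "AdventureWorks2012")
        b = columns(conn, "AdventureWorks2012-V1")
        for row in sorted(a - b):
            print("Only in AdventureWorks2012:   ", row)
        for row in sorted(b - a):
            print("Only in AdventureWorks2012-V1:", row)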

    Read the article

  • AutoFit in PowerPoint: Turn it OFF

    - by Daniel Moth
    Once a feature has shipped, it is very hard to eliminate it from the next release. If I were in charge of the PowerPoint product, I would not hesitate for a second to remove the dreadful AutoFit feature. Fortunately, AutoFit can be turned off on a slide-by-slide basis and, even better, globally: go to the PowerPoint "Options" and under "Proofing" find the "AutoCorrect Options…" button, which brings up the dialog where you need to uncheck the last two checkboxes (see the screenshot to the right).

    AutoFit is the ability for the user to keep hitting the Enter key as they type more and more text into a slide and have it magically still fit, by shrinking the space between the lines and then the text font size. It is the root of all slide evil. It encourages people to think of a slide as a Word document (which may be your goal, if you are presenting to execs in Microsoft, but that is a different story). AutoFit is the reason you fall asleep in presentations. AutoFit causes too much text to appear on a slide, which by extension causes the following:

    • When the slide appears, the text is so small that it is not readable by everyone in the audience. They dismiss the presenter as someone who does not care for them, and then they stop paying attention.
    • If the text is readable but there is too much of it (hence the AutoFit feature kicked in when the slide was authored), the audience is busy reading the slide and not paying attention to the presenter. Humans can either listen well or read well, but not both at the same time. So when they are done reading, they feel that they missed whatever the speaker was saying, and they "switch off" for the rest of the slide until the next slide kicks in, which is the natural point for them to pick up paying attention again.
    • Every slide ends up with different sized text. The less visual consistency between slides, the more your presentation feels unprofessional. You can do better than dismiss the (subconscious) negative effect a deck with inconsistent slides has on an audience.

    In contrast, the absence of AutoFit:
    • Leads to consistency among all slides in a deck with regard to the amount of text and the size of said text.
    • Ensures the text is readable by everyone in the audience (presuming the PowerPoint template is designed for the room where the presentation is delivered).
    • Encourages the presenter to create slides with the minimum necessary text to help the audience understand the basic structure, flow, and key points of the presentation. The "meat" of the presentation is delivered verbally by the presenter themselves, which is why they are in the room in the first place.
    • Following on from the previous point, the audience can at a quick glance consume the text on the slide when it appears and then concentrate entirely on the presenter and what they have to say.

    You could argue that everything above has nothing to do with the AutoFit feature and everything to do with the advice to keep slide content short. You would be right, but the on-by-default AutoFit feature is the one that stops most people from seeing and embracing that truth. In other words, the slides are the tool that aids the presenter in delivering their message, instead of the presenter being the tool that advances the slides which hold the message. To get there, embrace terse slides: the first step is to turn off this horrible feature (which was probably introduced due to the misuse of this tool within Microsoft). The next steps are described in my next post. Comments about this post are welcome at the original blog.
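    The Options dialog fixes the default for slides you type from now on; for an existing deck, the same setting can also be flipped programmatically. A minimal sketch using the third-party python-pptx library (my own addition - the post itself only covers the dialog):

        # Sketch: switch off autofit on every text frame in a deck (python-pptx assumed).
        from pptx import Presentation
        from pptx.enum.text import MSO_AUTO_SIZE

        prs = Presentation("deck.pptx")  # hypothetical deck
        for slide in prs.slides:
            for shape in slide.shapes:
                if shape.has_text_frame:
                    # NONE disables shrink-text-on-overflow, the behavior argued against above
                    shape.text_frame.auto_size = MSO_AUTO_SIZE.NONE
        prs.save("deck-no-autofit.pptx")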

    Read the article

  • How to Watch NCAA March Madness Online

    - by DigitalGeekery
    You’ve filled out your brackets and now you are ready for one of America’s most popular sporting events. But what if you are stuck at work or away from your TV? Or your local affiliate is showing a different game? Today we show you how to catch all the March Madness online.

    March Madness on Demand: You’ll need a broadband connection and 512 MB of RAM or higher, with cookies and JavaScript enabled in your browser. March Madness on Demand offers two viewing options, a Standard Player and a High Quality player. The High Quality player is not, unfortunately, high definition.

    Standard Player Requirements: Windows XP/Vista/7 or Mac OS X; IE 6+ (we also successfully tested it in Firefox, Chrome, and Opera); Adobe Flash Player 9 or higher.

    High Quality Player Requirements: 2.4 GHz Pentium 4 or Intel-based Macintosh; Mac OS 10.4.8+ (Intel-based); Windows XP SP2, Vista, Server 2003, Server 2008, or Windows 7; Firefox 1.5+ or IE 6/7/8; Silverlight 3 browser plug-in.

    Watching March Madness on Demand: Go to the March Madness on Demand website (link below). Check the “Watch in High Quality” section to see if your browser is ready and compatible for the High Quality viewer. If not, you’ll see a message indicating either that your browser and system are incompatible, or that you need to install Silverlight. To install Silverlight, click on the “Get HQ” button and follow the prompts to download and install Silverlight. To launch the player, click the large red “Launch Player” button. At the top of the screen, you’ll see the current and upcoming games. Click on “Watch Now” below to begin watching. At the bottom left is where you click to watch with the High Quality player. If too many people are watching the High Quality player, you’ll see the following message and have to go back to the Standard Player. At the lower right are volume controls, a “Full Screen” button, and a “Share” button which allows you to share the game you are watching on various social networking sites like Facebook and Twitter. Perhaps most importantly for those who want to steal a bit of viewing time while at work is the “Boss Button” at the top right. Clicking on the “Boss Button” will open a fake Office document so it may appear at first glance like you’re actually doing legitimate work. To return to the game, click anywhere on the screen with your mouse.

    You’ll be able to catch every single game of the tournament, from the first round all the way through the championship, with March Madness on Demand. If your computer and Internet connection can handle it, you can even watch multiple games at the same time by opening March Madness on Demand in multiple browser windows.

    Watch March Madness online

    Read the article

  • Oracle Text query parser

    - by Roger Ford
    Oracle Text provides a rich query syntax which enables powerful text searches. However, this syntax isn't intended for use by inexperienced end-users. If you provide a simple search box in your application, you probably want users to be able to type "Google-like" searches into the box, and have your application convert that into something that Oracle Text understands.

    For example, if your user types "windows nt networking" then you probably want to convert this into something like "windows ACCUM nt ACCUM networking". But beware - "NT" is a reserved word, and needs to be escaped. So let's escape all words: "{windows} ACCUM {nt} ACCUM {networking}". That's fine - until you start introducing wild cards. Then you must escape only non-wildcarded searches: "win% ACCUM {nt} ACCUM {networking}". There are quite a few other "gotchas" that you might encounter along the way.

    Then there's the issue of scoring. Given a query for "oracle text query syntax", it would be nice if we could score a full phrase match higher than a hit where all four words are present but not in a phrase. And then perhaps lower than that would be a document where three of the four terms are present. Progressive relaxation helps you with this, but you need to code the "progression" yourself in most cases.

    To help with this, I've developed a query parser which will take queries in Google-like syntax and convert them into Oracle Text queries. It's designed to be as flexible as possible, and will generate either simple queries or progressive relaxation queries. The input string will typically just be a string of words, such as "oracle text query syntax", but the grammar does allow for more complex expressions:

      word : score will be improved if word exists
      +word : word must exist
      -word : word CANNOT exist
      "phrase words" : words treated as phrase (may be preceded by + or -)
      field:(expression) : find expression (which allows +, - and phrase as above) within "field"

    So for example if I searched for

      +"oracle text" query +syntax -ctxcat

    then the results would have to contain the phrase "oracle text" and the word syntax. Any documents mentioning ctxcat would be excluded from the results.

    All the instructions are in the top of the file (see "Downloads" at the bottom of this blog entry). Please download the file, read the instructions, then try it out by running "parser.pls" in either SQL*Plus or SQL Developer. I am also uploading a test file "test.sql". You can run this and/or modify it to run your own tests or run against your own text index. test.sql is designed to be run from SQL*Plus and may not produce useful output in SQL Developer (or it may, I haven't tried it).

    I'm putting the code up here for testing and comments. I don't consider it "production ready" at this point, but would welcome feedback. I'm particularly interested in comments such as:

      "The instructions are unclear - I couldn't figure out how to do XXX"
      "It didn't work in my environment" (please provide as many details as possible)
      "We can't use it in our application" (why not?)
      "It needs to support XXX feature"
      "It produced an invalid query output when I fed in XXXX"

    Downloads: parser.pls test.sql
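    To make the escaping rules above concrete, here is a minimal sketch (in Python rather than the PL/SQL of parser.pls, and handling single words only - no phrases, +/- operators, fields, or progressive relaxation):

        # Sketch: convert a bare list of words into an escaped ACCUM query.
        def to_oracle_text(user_query: str) -> str:
            terms = []
            for word in user_query.split():
                if any(ch in word for ch in "%_"):
                    terms.append(word)              # wildcarded terms must NOT be escaped
                else:
                    terms.append("{" + word + "}")  # braces escape reserved words such as NT
            return " ACCUM ".join(terms)

        assert to_oracle_text("windows nt networking") == "{windows} ACCUM {nt} ACCUM {networking}"
        assert to_oracle_text("win% nt networking") == "win% ACCUM {nt} ACCUM {networking}"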

    Read the article

  • Getting UPK data into Excel

    - by maria.cozzolino(at)oracle.com
    Did you ever want someone to review your UPK outline outside of the Developer? You can send your outline to an Excel report, which can be distributed through email. Depending on how much additional data you want with your outline, there are two ways you can do this task.

    Basic data: You can print a listing of all the items in the outline.
    • With your outline open, choose File/Print...
    • Choose the "Save document as" command on the right, and choose Excel (or xlsx).
    • HINT: If you have not expanded your entire outline, it's faster to use the commands in the Developer to expand the entire outline. However, you can expand specific sections by clicking on them in the print preview.
    • NOTE: If you have the Details view displayed rather than the Player view, you can print all the data that appears in that view.

    Advanced data: If you desire a more detailed report, you can use the HP Quality Center publishing style, which also creates an Excel file. This style contains a default set of fields for use with Quality Center, but any of the metadata fields can be added to the report, and it can be used for more than just importing into HP Quality Center.

    To add additional columns to the HP Quality Center publishing style:
    1. Make a copy of the publishing style. This process ensures that you have a good copy to revert to if something goes wrong with your customizations, and also allows you to keep your modifications when the software is upgraded.
    2. Open the copy of the columnspec.xml file in your favorite XML editor - I use Notepad. (This file is located in a language-specific folder in the HP Quality Center publishing style.)
    3. Scroll down the columnspec file until you find the column to include. All the metadata fields that can be added to the report are listed in the columnspec file - you just need to tell the system to include the columns.
    4. Each column is defined in its own section of the file.
    5. Change the value for "col export" to "yes". This will include the column in the Excel file.
    6. If desired, change the value for "Play_ModesColHeader" to be whatever name you wish to appear in the Excel column heading.
    7. Save the columnspec file.
    8. Save the publishing style package.

    Now, when you publish for HP Quality Center, you will see your newly added columns. You can refer to the section on Customizing HP Quality Center Output in the Content Deployment Guide for additional customization details. Happy customization!

    I'd be interested in hearing what other uses you have for Excel reporting. Wishing you and yours a happy and healthy New Year! ~~Maria Cozzolino, Manager of Software Requirements and UI
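    If you have many columns to switch on, the edit in steps 4-6 can be scripted instead of done by hand. A rough sketch with Python's standard xml.etree - note that the element and attribute names here are guesses based on the attributes mentioned above, so verify them against your actual columnspec.xml before relying on this:

        # Sketch: flip every column's export attribute to "yes" (names are assumptions).
        import xml.etree.ElementTree as ET

        tree = ET.parse("columnspec.xml")
        for col in tree.iter("col"):      # assumed element name, per the 'col export' attribute
            if col.get("export") == "no":
                col.set("export", "yes")  # include this column in the Excel output
        tree.write("columnspec.xml", encoding="utf-8", xml_declaration=True)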

    Read the article

  • Install Oracle Configuration Manager's Standalone Collector

    - by Get Proactive Customer Adoption Team
    The Why and the How

    If you have heard of Oracle Configuration Manager (OCM) but haven’t installed it, I’m guessing this is for one of two reasons: either you don’t know how it helps you, or you don’t know how to install it. I’ll address both of those reasons today. First, let’s take a quick look at how My Oracle Support and Oracle Configuration Manager work together, to gain a good understanding of what their differences and roles are before we tackle the install.

    Oracle Configuration Manager is the tool that actually performs the data collection task. You deploy this lightweight piece of software into your system to collect configuration information about the system, and OCM uploads that data to Oracle’s customer configuration repository. Oracle Support Engineers then have the configuration data available when you file a service request. You can also view the data through My Oracle Support. The real value is that the data Oracle Configuration Manager collects can help you avoid problems and get your Service Requests solved more quickly. When you view the information in My Oracle Support’s user interface to OCM, it may help you avoid situations that create problems. The proactive tools included in Oracle Configuration Manager help you avoid issues before they occur. You also save time because you didn’t need to open a service request. For example, you can use this capability when you need to compare your system configuration at two points in time, or monitor the system’s health. If you make the configuration data available to Oracle Support Engineers, then when you need to open a Service Request the data helps them diagnose and resolve your critical system issues more quickly, which means you get answers more quickly too.

    Quick Installation Process Overview

    Before we dive into the step-by-step details, let me provide a quick overview. For some of you, this will be all you need. Log in to My Oracle Support and download the data collector from the Collector tab. If you don’t see the Collector tab, click the More tab to gain access. On the Collector tab, you will find a drop-down list showing which platforms are available. You can also see more ways the Collector can help you if you click through the carousel of benefits. After you download the software for your platform, use FTP to move that file (.zip) from your PC to the server that hosts the Oracle software. Once you have that file on the server, locate the $ORACLE_HOME directory, and unzip the file within that directory. You can then use the command line tool to start the installation process. The installation process requires:
    • your My Oracle Support credential (Support Identifier, username, and password)
    • a proxy specification (host IP address, port number, username, and password)

    Installation Step-by-Step
    1. Download the collector zip file from My Oracle Support and place it into your $ORACLE_HOME.
    2. Unzip the zip file you downloaded from My Oracle Support - this will create a directory named CCR with several subdirectories.
    3. Using the command line, go to $ORACLE_HOME/CCR/bin and run the command setupCCR.
    4. Provide your My Oracle Support credential: login, password, and Support Identifier.
    The installer will start deploying the collector application. You have installed the Collector.

    Post Installation

    Now that you have installed successfully, the scheduler is ready to collect configuration information for the software available in your Oracle Home. By default, the first collection will take place the day after the installation.

    If you want to run an instrumentation script to start the configuration collection of your Oracle Database server, E-Business Suite, or Enterprise Manager, you will find more details in the Installation and Administration Guide for My Oracle Support Configuration Manager.

    Related documents available on My Oracle Support:
    • Oracle Configuration Manager Installation and Administration Guide [ID 728989.5]
    • Oracle Configuration Manager Prerequisites [ID 728473.5]
    • Oracle Configuration Manager Network Connectivity Test [ID 728970.5]
    • Oracle Configuration Manager Collection Overview [ID 728985.5]
    • Oracle Configuration Manager Security Overview [ID 728982.5]
    • Oracle Software Configuration Manager: Disconnected Mode Collection [ID 453412.1]

    Read the article

  • List of available whitepapers as at 04 May 2010

    - by Anthony Shorten
    The following table lists the whitepapers available, from My Oracle Support, for any Oracle Utilities Application Framework based product:

    • 559880.1 - ConfigLab Design Guidelines: Whitepaper outlining how to design and implement a ConfigLab solution.
    • 560367.1 - Technical Best Practices for Oracle Utilities Application Framework Based Products: Whitepaper summarizing common technical best practices used by partners, implementation teams and customers.
    • 560382.1 - Performance Troubleshooting Guideline Series: A set of whitepapers on tracking performance at each tier in the framework. The individual whitepapers are as follows:
       - Concepts - general concepts and performance troubleshooting processes.
       - Client Troubleshooting - general troubleshooting of the browser client with common issues and resolutions.
       - Network Troubleshooting - general troubleshooting of the network with common issues and resolutions.
       - Web Application Server Troubleshooting - general troubleshooting of the web application server with common issues and resolutions.
       - Server Troubleshooting - general troubleshooting of the operating system with common issues and resolutions.
       - Database Troubleshooting - general troubleshooting of the database with common issues and resolutions.
       - Batch Troubleshooting - general troubleshooting of the background processing component of the product with common issues and resolutions.
    • 560401.1 - Software Configuration Management Series: A set of whitepapers on how to manage customization (code and data) using the tools provided with the framework. The individual whitepapers are as follows:
       - Concepts - general concepts and introduction.
       - Environment Management - principles and techniques for creating and managing environments.
       - Version Management - integration of version control and version management of configuration items.
       - Release Management - packaging configuration items into a release.
       - Distribution - distribution and installation of releases across environments.
       - Change Management - generic change management processes for product implementations.
       - Status Accounting - status reporting techniques using product facilities.
       - Defect Management - generic defect management processes for product implementations.
       - Implementing Single Fixes - discussion of the single fix architecture and how to use it in an implementation.
       - Implementing Service Packs - discussion of service packs and how to use them in an implementation.
       - Implementing Upgrades - discussion of the upgrade process and common techniques for minimizing the impact of upgrades.
    • 773473.1 - Oracle Utilities Application Framework Security Overview: Whitepaper summarizing the security facilities in the framework. Updated for OUAF 4.0.1.
    • 774783.1 - LDAP Integration for Oracle Utilities Application Framework based products: Whitepaper summarizing how to integrate an external LDAP based security repository with the framework.
    • 789060.1 - Oracle Utilities Application Framework Integration Overview: Whitepaper summarizing all the various common integration techniques used with the product (with case studies).
    • 799912.1 - Single Sign On Integration for Oracle Utilities Application Framework based products: Whitepaper outlining a generic process for integrating an SSO product with the framework.
    • 807068.1 - Oracle Utilities Application Framework Architecture Guidelines: This whitepaper outlines the different variations of architecture that can be considered. Each variation includes advice on configuration and other considerations.
    • 836362.1 - Batch Best Practices for Oracle Utilities Application Framework based products: This whitepaper outlines the common and best practices implemented by sites all over the world. Updated for OUAF 4.0.1.
    • 856854.1 - Technical Best Practices V1 Addendum: Addendum to Technical Best Practices for Oracle Utilities Application Framework Based Products containing only V1.x specific advice.
    • 942074.1 - XAI Best Practices: This whitepaper outlines the common integration tasks and best practices for the Web Services Integration provided by the Oracle Utilities Application Framework. Updated for OUAF 4.0.1.
    • 970785.1 - Oracle Identity Manager Integration Overview: This whitepaper outlines the principles of the prebuilt integration between Oracle Utilities Application Framework based products and Oracle Identity Manager, used to provision user and user group security information.
    • 1068958.1 - Production Environment Configuration Guidelines (New!): Whitepaper outlining common production level settings for the products.

    Read the article

  • Tuesday at OpenWorld: Identity Management

    - by Tanu Sood
    At Oracle OpenWorld? From keynotes and general sessions to product deep dives and executive events, this Tuesday is full of informational, educational, and networking opportunities for you. Here’s a quick run-down of what’s happening today:

    Tuesday, October 2, 2012

    KEYNOTE: The Oracle Cloud: Oracle’s Cloud Platform and Applications Strategy. 8:00 a.m. – 9:45 a.m., Moscone North, Hall D. Leading customers will join Oracle Executive Vice President Thomas Kurian to discuss how Oracle’s innovative cloud solutions are transforming how they manage their business, excite and retain their employees, and deliver great customer experiences through Oracle Cloud.

    GENERAL SESSION: Oracle Fusion Middleware Strategies Driving Business Innovation. 10:15 a.m. – 11:15 a.m., Moscone North, Hall D. Join Hasan Rizvi, Executive Vice President of Product, in this strategy and roadmap session to hear how developers leverage new innovations in their applications and customers achieve their business innovation goals with Oracle Fusion Middleware.

    CON9437: Mobile Access Management. 10:15 a.m. – 11:15 a.m., Moscone West 3022. The session will feature Identity Management evangelists from companies like Intuit, NetApp, and Toyota discussing how to extend your existing identity management infrastructure and policies to securely and seamlessly enable mobile user access.

    CON9162: Oracle Fusion Middleware: Meet This Year’s Most Impressive Customer Projects. 11:45 a.m. – 12:45 p.m., Moscone West, 3001. Hear from the winners of the 2012 Oracle Fusion Middleware Innovation Awards and see which customers are taking home a trophy for the 2012 Oracle Fusion Middleware Innovation Award. Read more about the Innovation Awards here.

    CON9491: Enhancing the End-User Experience with Oracle Identity Governance Applications. 11:45 a.m. – 12:45 p.m., Moscone West 3008. Join experts from Visa and Oracle as they explore how Oracle Identity Governance solutions deliver complete identity administration and governance solutions with support for emerging requirements like cloud identities and mobile devices.

    CON9447: Enabling Access for Hundreds of Millions of Users. 1:15 p.m. – 2:15 p.m., Moscone West 3008. Dealing with scale problems? Looking to address identity management requirements with a million or so users in mind? Then take note of Cisco’s implementation. Join this session to hear first-hand how Cisco tackled identity management and scaled their implementation to bolster security and enforce compliance.

    CON9465: Next Generation Directory – Oracle Unified Directory. 5:00 p.m. – 6:00 p.m., Moscone West 3008. Get a 360-degree perspective from a solution provider, an implementation services partner, and a customer in this session to learn how the latest Oracle Unified Directory solutions can help you build a directory infrastructure that is optimized to support cloud, mobile, and social networking and yet delivers on scale and performance.

    EVENTS: Executive Edge @ OpenWorld: Chief Security Officer (CSO) Summit. 10:00 a.m. – 3:00 p.m. If you are attending the Executive Edge at OpenWorld, be sure to check out the sessions at the Chief Security Officer Summit. Former Senior Counsel for the National Security Agency, Joel Brenner, will be speaking about his new book "America the Vulnerable". In addition, PwC will present a panel discussion on "Crisis Management to Business Advantage: Security Leadership". See below for the complete agenda.

    PRODUCT DEMOS: And don’t forget to see Oracle Identity Management solutions in action at the Oracle OpenWorld DEMOgrounds.
    Exhibition hall hours: Monday, October 1: 9:30 a.m.–6:00 p.m. (dedicated hours 9:30 a.m.–10:45 a.m.); Tuesday, October 2: 9:45 a.m.–6:00 p.m. (dedicated hours 2:15 p.m.–2:45 p.m.); Wednesday, October 3: 9:45 a.m.–4:00 p.m. (dedicated hours 2:15 p.m.–3:30 p.m.).

    All demos are located in Moscone South, Right:
    • Access Management: Complete and Scalable Access Management - S-218
    • Access Management: Federating and Leveraging Social Identities - S-220
    • Access Management: Mobile Access Management - S-219
    • Access Management: Real-Time Authorizations - S-217
    • Access Management: Secure SOA and Web Services Security - S-223
    • Identity Governance: Modern Administration and Tooling - S-210
    • Identity Management Monitoring with Oracle Enterprise Manager - S-212
    • Oracle Directory Services Plus: Performant, Cloud-Ready - S-222
    • Oracle Identity Management: Closed-Loop Access Certification - S-221

    For a complete listing, keep the Focus on Identity Management document handy. And don’t forget to converse with us while at OpenWorld @oracleidm. We look forward to hearing from you.

    Read the article

  • Prepping the Raspberry Pi for Java Excellence (part 1)

    - by HecklerMark
    I've only recently been able to begin working seriously with my first Raspberry Pi, received months ago but hastily shelved in preparation for JavaOne. The Raspberry Pi and other diminutive computing platforms offer a glimpse of the potential of what is often referred to as the embedded space, the "Internet of Things" (IoT), or Machine to Machine (M2M) computing. I have a few different configurations I want to use for multiple Raspberry Pis, but for each of them, I'll need to perform the following common steps to prepare them for their various tasks:

    1. Load an OS onto an SD card
    2. Get the Pi connected to the network
    3. Load a JDK

    I've been very happy to see good friend and JFXtras teammate Gerrit Grunwald document how to do these things on his blog (link to article here - check it out!), but I ran into some issues configuring wi-fi that caused me some needless grief. Not knowing if any of the pitfalls were caused by my slightly-older version of the Pi and not being able to find anything specific online to help me get past it, I kept chipping away at it until I broke through. The purpose of this post is to (hopefully) help someone else recognize the same issues if/when they encounter them and work past them quickly.

    There is a great resource page here that covers several ways to get the OS on an SD card, but here is what I did (on a Mac):

    1. Plug the SD card into a reader on/in the Mac
    2. Format it (FAT32)
    3. Unmount it: diskutil unmountDisk diskn (where n is the disk number representing the SD card)
    4. Transfer the disk image for Debian to the SD card: dd if=2012-08-08-wheezy-armel.img of=/dev/diskn bs=1m
    5. Eject the card from the Mac: diskutil eject diskn

    There are other ways, but this is fairly quick and painless, especially after you do it several times. Yes, I had to do that dance repeatedly (minus formatting) due to the wi-fi issues, as it kept killing the ability of the Pi to boot. You should be able to dramatically reduce the number of OS loads you do, though, if you do a few things with regard to your wi-fi.

    Firstly, I strongly recommend you purchase the Edimax EW-7811Un wi-fi adapter. This adapter/chipset has been proven with the Raspberry Pi, it's tiny, and it's cheap. Avoid unnecessary aggravation and buy this one!

    Secondly, visit this page for a script and instructions regarding how to configure your new wi-fi adapter with your Pi. Here is the rub, though: there is a missing step. At least for my combination of Pi version, OS version, and uncanny gift of timing and luck there was. :-) Here is the sequence of steps I used to make the magic happen:

    1. Plug your newly-minted SD card (with OS) into your Pi and connect a network cable (for internet connectivity).
    2. Boot your Pi. On the first boot, do the following things:
       - Opt to have it use all space on the SD card (will require a reboot eventually)
       - Disable overscan
       - Set your timezone
       - Enable the ssh server
       - Update raspi-config
    3. Reboot your Pi. This will reconfigure the SD card to use all space (see above).
    4. After you log in (UID: pi, password: raspberry), upgrade your OS. This was the missing step for me that put a merciful end to the repeated SD card re-imaging and made the wi-fi configuration trivial. To do so, just type sudo apt-get upgrade and give it several minutes to complete. Pour yourself a cup of coffee and congratulate yourself on the time you've just saved. ;-)
    5. With the OS upgrade finished, now you can follow Mr. Engman's directions (to the letter, please see link above), download his script, and let it work its magic.
One aside: I plugged the little power-sipping Edimax directly into the Pi and it worked perfectly. No powered hub needed, at least in my configuration. To recap, that OS upgrade (at least at this point, with this combination of OS/drivers/Pi version) is absolutely essential for a smooth experience. Miss that step, and you're in for hours of "fun". Save yourself! I'll pick up next time with more of the Java side of the RasPi configuration, but as they say, you have to cross the moat to get into the castle. Hopefully, this will help you do just that. Until next time! All the best, Mark 

    Read the article

  • Pimp my Silverlight Firestarter

    - by mbcrump
    So Silverlight Firestarter is over and you're sitting on your couch thinking… what now? Well, it's time to pimp what you've got. So how exactly can you pimp the Silverlight Firestarter? Read below and you will find out:

    1) Pimp the videos: First we are going to use a program named Juice to download all of the Silverlight Firestarter videos. Point your browser to http://juicereceiver.sourceforge.net/ and download the application. It works on Mac, Linux, and PC. After it is downloaded, add an RSS feed by clicking the button highlighted below. At this point, add the following URL inside the textbox and hit Save: http://channel9.msdn.com/Series/Silverlight-Firestarter/RSS

    This RSS feed includes all the Silverlight Firestarter labs and presentations listed below:
    • The Future of Silverlight
    • Data Binding Strategies with Silverlight and WP7
    • Building Compelling Apps with WCF using REST and LINQ
    • Building Feature Rich Business Apps Today with RIA Services
    • MVVM: Why and How? Tips and Patterns using MVVM and Service Patterns with Silverlight and WP7
    • Tips and Tricks for a Great Installation Experience
    • Tune Your Application: Profiling and Performance Tips
    • Performance Tips for Silverlight Windows Phone 7

    Select all the videos and click the Download button located below (has blue arrow). Once all the videos are downloaded, you will have about 4.64 GB of Silverlight fun. You can now move these videos to your media server and watch them with whatever device you want. Put it on an iPad, iPhone.. emm wait, I mean WP7 or WMC7.

    2) Pimp the Training Material: Download the offline installer for the labs here. This will give you almost a gig of free training materials. Here are the topics covered:

    Level 100: Getting Started
    • Lab 01 - WinForms and Silverlight
    • Lab 02 - ASP.NET and Silverlight
    • Lab 03 - XAML and Controls
    • Lab 04 - Data Binding
    Level 200: Ready for More
    • Lab 05 - Migrating Apps to Out-of-Browser
    • Lab 06 - Great UX with Blend
    • Lab 07 - Web Services and Silverlight
    • Lab 08 - Using WCF RIA Services
    Level 300: Take me Further
    • Lab 09 - Deep Dive into Out-of-Browser
    • Lab 10 - Silverlight Patterns: Using MVVM
    • Lab 11 - Silverlight and Windows Phone 7

    You will notice that it installs Firestarter to the default of C:\Firestarter, so you will have to navigate to that folder and double-click on Default.htm to get started. If you followed part one of the pimping guide, then you will already have all the videos on your PC. Once you go into a lab, you will get a lab document and source at the bottom of the article. Instead of opening the source folder in a web browser, you can just copy the folder C:\Firestarter\Labs into your Visual Studio 2010 project folder. This will save a lot of time later.

    3) Pimp my Silverlight 5 Knowledge: Keep reading as much as possible, and remember that the Silverlight 5 beta should arrive in Q1 of 2011 and the final release at the end of 2011. Here are 5 great blog posts on Silverlight 5:
    • Scott Gu's Blog
    • Mary Jo's Article on Silverlight 5
    • The Future of Silverlight (Official)
    • Kunal Chowdhury's Blog
    • Tim Heuer's Blog

    That's all I've got for now. Have fun with all the new Silverlight content.

    Read the article

  • How the "migrations" approach makes database continuous integration possible

    - by David Atkinson
    Testing a database upgrade script as part of a continuous integration process will only work if there is an easy way to automate the generation of the upgrade scripts. There are two common approaches to managing upgrade scripts. The first is to maintain a set of scripts as-you-go-along. Many SQL developers I've encountered will store these in a folder prefixed numerically to ensure they are ordered as they are intended to be run. Occasionally there is an accompanying document or a batch file that ensures that the scripts are run in the defined order. Writing these scripts during the course of development requires discipline. It's all too easy to load up the table designer and to make a change directly to the development database, rather than to save off the ALTER statement that is required when the same change is made to production. This discipline can add considerable overhead to the development process. However, come the end of the project, everything is ready for final testing and deployment. The second development paradigm is to not do the above. Changes are made to the development database without considering the incremental update scripts required to effect the changes. At the end of the project, the SQL developer or DBA, is tasked to work out what changes have been made, and to hand-craft the upgrade scripts retrospectively. The end of the project is the wrong time to be doing this, as the pressure is mounting to ship the product. And where data deployment is involved, it is prudent not to feel rushed. Schema comparison tools such as SQL Compare have made this latter technique more bearable. These tools work by analyzing the before and after states of a database schema, and calculating the SQL required to transition the database. Problem solved? Not entirely. Schema comparison tools are huge time savers, but they have their limitations. There are certain changes that can be made to a database that can't be determined purely from observing the static schema states. If a column is split, how do we determine the algorithm required to copy the data into the new columns? If a NOT NULL column is added without a default, how do we populate the new field for existing records in the target? If we rename a table, how do we know we've done a rename, as we could equally have dropped a table and created a new one? All the above are examples of situations where developer intent is required to supplement the script generation engine. SQL Source Control 3 and SQL Compare 10 introduced a new feature, migration scripts, allowing developers to add custom scripts to replace the default script generation behavior. These scripts are committed to source control alongside the schema changes, and are associated with one or more changesets. Before this capability was introduced, any schema change that required additional developer intent would break any attempt at auto-generation of the upgrade script, rendering deployment testing as part of continuous integration useless. SQL Compare will now generate upgrade scripts not only using its diffing engine, but also using the knowledge supplied by developers in the guise of migration scripts. In future posts I will describe the necessary command line syntax to leverage this feature as part of an automated build process such as continuous integration.
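    As an aside, the discipline of the first approach - numerically prefixed scripts run in order - is simple enough to automate with a few lines of scripting, which is what makes it amenable to continuous integration in the first place. A minimal illustrative runner (Python with pyodbc, and a bookkeeping table, are my assumptions here; this is not how SQL Compare itself works):

        # Sketch: apply numbered migration scripts in order, recording what has run.
        import pyodbc
        from pathlib import Path

        conn = pyodbc.connect("DRIVER={ODBC Driver 17 for SQL Server};"
                              "SERVER=localhost;DATABASE=Target;Trusted_Connection=yes",
                              autocommit=True)
        applied = {row[0] for row in conn.execute("SELECT script_name FROM schema_version")}
        for script in sorted(Path("migrations").glob("*.sql")):  # numeric prefix gives the order
            if script.name not in applied:
                # Note: scripts containing GO batch separators would need splitting first
                conn.execute(script.read_text())
                conn.execute("INSERT INTO schema_version (script_name) VALUES (?)", script.name)
                print("applied", script.name)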

    Read the article

  • A Guide to Fusion SCM at Oracle OpenWorld 2012

    - by Pam Petropoulos
    Are you attending next week’s Oracle OpenWorld 2012 conference? Then you won’t want to miss the Fusion SCM activities and customer presenters from leading companies like Boeing and Fideltronik. Below you’ll find a day-by-day guide to the various Fusion SCM sessions, demos, and activities during OpenWorld 2012, September 30 – October 4 in San Francisco, CA.

    Tuesday, October 2

    All of the Fusion SCM sessions during OpenWorld will take place in various rooms at Moscone West, a convenience you are sure to appreciate, as will your feet. The first session, at 10:15 – 11:15 am (Moscone West, Room 2006), entitled “Oracle Fusion Supply Chain Management: Overview, Strategy, Customer Experiences, and Roadmap”, provides an overview of Fusion Supply Chain Management applications and will discuss Fusion SCM strategy, future roadmap, and highlights of customer examples. The next session, at 11:45 am – 12:45 pm (Moscone West, Room 2022), entitled “Enabling Trusted Enterprise Product Data with Oracle Fusion Product Hub”, may be the session for you if you’re struggling to achieve consistent, high-quality product data that provides significant business value. This session will discuss how Oracle Fusion Product Hub and Oracle Enterprise Data Quality can help you achieve this vision. A customer presenter from Fideltronik will share their experiences with Oracle Fusion Product Hub. At the end of the day, unwind at the Supply Chain Management customer reception from 6:00 – 8:00 pm at the Roe Lounge, located at 651 Howard Street. Registration is required. Click here for details.

    Wednesday, October 3

    Wednesday is a busy day, with three Fusion SCM sessions on the agenda. Start your day at 10:15 am at the “Oracle Fusion Supply Chain Management: Customer Adoption and Experiences” session (Moscone West, Room 2003). This must-see session will showcase customer speakers from The Boeing Company and Fideltronik, each of whom will share their company’s experiences in selecting and implementing Fusion SCM applications. If you’re wondering how Fusion SCM applications can coexist with your existing Oracle applications, then you’ll want to sit in on the 3:30 pm session entitled “Oracle Fusion Supply Chain Management: Coexistence with Other Oracle Applications” (Moscone West, Room 2003). Stick around until 5:00 pm for the final Fusion SCM session of the day, entitled “Responsive Fulfillment with Oracle Fusion Supply Chain Management” (Moscone West, Room 2001). This session will showcase Oracle Fusion Distributed Order Orchestration and Oracle Fusion Global Order Promising and how they are changing the way companies manage order fulfillment. In addition to discussing the current business challenges, product capabilities, value propositions, industry applicability, and future roadmap, this session will also feature a customer presenter from The Boeing Company.

    Thursday, October 4

    If you are a retail customer, we highly recommend that you attend the final Fusion SCM session of the week at 12:45 pm, entitled “Multichannel Fulfillment Excellence in the Direct-to-Consumer Market” (Moscone West, Room 2024). Retailers will learn how they can transform their supply chains to meet the ever-increasing demands of buy-anywhere/get-anywhere cross-channel requirements with Fusion Distributed Order Orchestration and Oracle Fusion Product Hub.

    Throughout the week, you’ll also want to visit the Fusion SCM demo pods at the Demogrounds in Moscone West so you can see demos of these Fusion applications.
Visit pod W-005 for Fusion Distributed Order Orchestration, W-008 for Fusion Inventory and Cost Management, and W-006 for Fusion Product Hub. Click here for the Demogrounds map. A reminder that you can also pre-register for these sessions to secure your spot. Visit the Schedule Builder to pre-enroll for these sessions. Finally, you'll also want to check out the Fusion SCM FocusOn document which includes additional keynote and general sessions that you may want to attend throughout the week.   We look forward to seeing you in San Francisco next week.

    Read the article

  • Personal Technology – Excel Tip: Comparing Excel Files

    - by Pinal Dave
    This guest post is by Vinod Kumar. Vinod Kumar has worked with SQL Server extensively since joining the industry over a decade ago. Having worked on various versions from SQL Server 7.0, Oracle 7.3, and other database technologies, he now works with the Microsoft Technology Center (MTC) as a Technology Architect. Let us read the blog post in Vinod’s own voice.

I have been writing Excel tips on my blog and thought it would be great to share one interesting tip here as a guest post. Assume a situation where you want to compare multiple Excel files. Here is a typical scenario I have encountered as a common activity: you send an Excel file with tons of data, formulae, and multiple sheets, and you ask a colleague to validate the file and, if required, change its content for correctness. After receiving the file back, you want to know what changes that person made to your document. A cool new addition to Excel 2013 can help you achieve this task.

To get to this option, click the INQUIRE tab. In case you don’t have the INQUIRE tab, check the Options using INQUIRE blog post, where we discuss all the other options of the INQUIRE tab. Once you are on the INQUIRE tab, select the “Compare Files” button as shown in the figure above. This brings up a dialog as below. If you are on Windows 8 or Windows 7, you can instead search for an application called “Spreadsheet Compare 2013”. Ultimately, both options lead us to the same application. If you are using the standalone app, once the app initializes, click the “Compare files” option from the toolbar. Make sure to give it two different Excel files, as shown in the figure above.

After selecting the Excel sheets, you can see that the Compare tool has a number of other options to play with; we will talk about some of them later in this post. Just below the toolbar is a colorful side-by-side comparison of both our Excel sheets. We can also see the various tabs from each file. There is a meaning to each color code, which is discussed next.

As you saw above, the color coding has a meaning. The bottom pane lists each color code and, most importantly, each of the changes in the side-by-side comparison. The detailed information shown below can be exported using the “Export Results” option from the toolbar as a separate Excel workbook, or copied to the clipboard to be used later.

The final piece of the puzzle is a graphical view of these difference results, grouped by category. We cannot drill down per se, but this is a great way to see, for example, that the most changes are to “Cell Formats”, while only a few “Calculated Values” have changed.

The INQUIRE option and the Spreadsheet Compare 2013 tool are part of Excel 2013. As you explore the new version of Excel, there are many such hidden features worth exploring. Do let us know if you enjoyed learning a new feature today, and I hope you will play around with this feature in your day-to-day work with Excel files.

Reference: Pinal Dave (http://blog.sqlauthority.com)

Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology. Tagged: Excel, Personal Technology
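[Editor's note] If you are not on Excel 2013, or you want to automate the same check, a cell-by-cell comparison can be approximated in code. Below is a minimal, hedged sketch in C# using the open source ClosedXML library (an assumption on our part; it is not part of the Excel tooling discussed above). The file names and single-sheet scope are hypothetical, and it reports only differing cell values, not the formatting changes Spreadsheet Compare also detects.

    using System;
    using ClosedXML.Excel;

    class WorkbookDiff
    {
        static void Main()
        {
            // Hypothetical file names; adjust to your own copies.
            using (var original = new XLWorkbook("Report_v1.xlsx"))
            using (var reviewed = new XLWorkbook("Report_v2.xlsx"))
            {
                // Compare the first worksheet of each file (assumes both are non-empty).
                var ws1 = original.Worksheet(1);
                var ws2 = reviewed.Worksheet(1);

                // Scan the larger of the two used ranges so added rows/columns are caught.
                int rows = Math.Max(ws1.LastRowUsed().RowNumber(), ws2.LastRowUsed().RowNumber());
                int cols = Math.Max(ws1.LastColumnUsed().ColumnNumber(), ws2.LastColumnUsed().ColumnNumber());

                for (int r = 1; r <= rows; r++)
                {
                    for (int c = 1; c <= cols; c++)
                    {
                        string v1 = ws1.Cell(r, c).GetString();
                        string v2 = ws2.Cell(r, c).GetString();
                        if (v1 != v2)
                            Console.WriteLine($"{ws1.Cell(r, c).Address}: '{v1}' -> '{v2}'");
                    }
                }
            }
        }
    }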

    Read the article

  • JustMock is here !!

    - by mehfuzh
    As announced earlier by Hristo Kosev on the Telerik blogs, we have started giving out JustMock builds today. This is the first of the early builds before the official Q2 release, and we are pretty excited to get your feedback. It’s pretty early to say much about it; it really depends on your feedback.

To add a few words: with JustMock we have tried to build a mocking tool with as simple and intuitive a syntax as possible, excluding more and more noise and avoiding anything that adds smell to your code [we are still trying, every day], and we want to make the tool even better with your help. JustMock can be used to mock virtually anything. Moreover, we have left an option open so that the feature set can be reduced or elevated with a single click. We tried to build a strong API and make things as fluent and guided as possible, so that you never get derailed. Our syntax is AAA (Arrange – Act – Assert); we don’t believe in the Record – Replay model, which some of the smarter mocking tools are planning to remove from their coming releases, or don’t have at all [it’s always fun to learn from each other]. Overall, more signals equals more complexity; reminds me of 37signals :-).

Currently, here are the things you can do with JustMock (we will cover these in more depth in the coming days):

Proxied mode
- Mock interfaces and classes with virtuals
- Mock properties, including indexers
- Raise events for specific calls
- Use matchers to control mock arguments
- Assert specific occurrences of mocked calls
- Assert using matchers
- Do recursive mocks
- Do sequential mocking (the same method with the same arguments returns different values or performs different tasks)
- Do strict mocking (the default is loose, which I prefer, so that mocks can be used as stubs)

Elevated mode
- Mock static calls
- Mock final classes
- Mock sealed classes
- Mock extension methods
- Partially mock a class member directly using Mock.Arrange
- Mock mscorlib (we will support more and more members in the coming days; currently we support FileInfo, File, and DateTime)

These are just a few; take a look at the test project that is provided with the build [along with the document] to find more. Also, one feature that I will be using for my next OSS projects is the ability to run JustMock separately in proxied mode, which makes it easy to redistribute and do personal development in a more DI-oriented model, with the option to elevate as I go.

I’ve surely forgotten tons of other features to mention that I will cover in time, but don’t forget the URL: www.telerik.com/justmock

Finally, a little mock code:

    var lvMock = Mock.Create<ILoveJustMock>();

    // set your goal: echo back whatever string is passed in
    Mock.Arrange(() => lvMock.Echo(Arg.Any<string>())).Returns((string result) => result);

    // perform
    string ret = lvMock.Echo("Yes");

    Assert.Equal("Yes", ret);

    // make sure everything is fine
    Mock.Assert(() => lvMock.Echo("Yes"), Occurs.Once());

Hope that helps to get started; I will cover more if not :-).
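[Editor's note] Since elevated mode is the headline feature, here is one more small example of what it could look like in practice: freezing DateTime.Now, one of the mscorlib members listed above. This is a hedged sketch against this early build; Mock.SetupStatic and the exact arrangement syntax are assumptions, so check the test project shipped with the build for the authoritative form.

    // Hedged sketch: elevated mode, mocking a static member of mscorlib.
    // Mock.SetupStatic prepares the type so its statics can be arranged;
    // treat the exact call shape as an assumption against this early build.
    Mock.SetupStatic(typeof(DateTime));

    var frozen = new DateTime(2010, 4, 1);
    Mock.Arrange(() => DateTime.Now).Returns(frozen);

    // Any code under test that reads DateTime.Now now sees the frozen value.
    Assert.Equal(frozen, DateTime.Now);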

    Read the article

  • T-SQL Tuesday #005 : SSRS Parameters and MDX Data Sets

    - by blakmk
    Well, this week’s T-SQL Tuesday #005 topic seems quite fitting. Having spent the past few weeks creating reports and dashboards in SSRS and SSAS 2008, I was frustrated by how difficult it is to use custom datasets to generate parameter drill-downs. Reporting Services can also be quite unforgiving when it comes to renaming things like datasets, so I want to share a couple of techniques that I have found useful.

One of the things I regularly do is add parameters to the queries. Doing this causes Reporting Services to generate a hidden dataset and parameter name for you. I then like to tweak these hidden datasets, removing the ‘ALL’ level, which is a tip I picked up from Devin Knight’s blog.

There are some rules I’ve developed for myself since working with SSRS and MDX. They may not be the best or only way, but they work for me.

Rule 1 – Never trust the automatically generated hidden datasets

Or even ANY automatically generated MDX query, for that matter. I’ve previously blogged about this here. If you examine the MDX generated in the hidden dataset, you will see that it generates the MDX in the context of the originating query by building a subcube. This means it may NOT be appropriate to use it in a subsequent query that has a different context. Make sure you always understand what is going on.

Often when I’m developing a dashboard or a report, there are several parameter-oriented datasets that I like to create manually. It can be that I have different datasets using the same dimension but in a different context. One example of this is that I often use a dataset for the last month and a dataset for the last 6 months; both use the same date hierarchy. However, Reporting Services seems not to be too smart when it comes to generating unique datasets while you are renaming parameters and datasets. Very often I have come across this error when refactoring parameter names and default datasets:

"an item with the same key has already been added"

The only way I’ve found to reliably avoid this is to obey rule 2.

Rule 2 – Follow this sequence when working with parameters and datasets

1. Create lookup and default datasets in advance.
2. Create parameters (set the datasets for available and default values).
3. Go into the query and tick the parameter check box.
4. On the dataset properties screen, select the parameter defined earlier as the parameter value.

Rule 3 – Don’t tear your hair out when you have just renamed objects and your report doesn’t build

Just use XML Notepad on the original report file. I found I gained a good understanding of the structure of the underlying XML document just by using XML Notepad. From there you can search for references to the missing object, or do a wholesale search and replace (after taking a backup copy, of course ;-).

So I hope the above helps to save the sanity of anyone who regularly works with SSRS and MDX.

@Blakmk
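[Editor's note] To make the "remove the ALL level" tweak concrete, the query can be sanity-checked outside of Report Designer through ADOMD.NET. The sketch below is hedged: the connection string, cube, and [Date].[Calendar Year] hierarchy are hypothetical (Adventure Works-style), and the MDX mirrors the general shape of the auto-generated hidden dataset, with the level-qualified ALLMEMBERS used instead of the hierarchy's ALLMEMBERS to drop the 'All' member.

    using System;
    using Microsoft.AnalysisServices.AdomdClient;

    class ParameterDatasetCheck
    {
        static void Main()
        {
            // Hypothetical connection string and cube/hierarchy names.
            using (var conn = new AdomdConnection(
                "Data Source=localhost;Catalog=Adventure Works DW"))
            {
                conn.Open();

                // The generated dataset uses [Date].[Calendar Year].ALLMEMBERS,
                // which includes the 'All' member. Qualifying it with the level,
                // [Date].[Calendar Year].[Calendar Year].ALLMEMBERS, drops 'All'.
                string mdx = @"
                    WITH MEMBER [Measures].[ParameterCaption] AS
                            [Date].[Calendar Year].CURRENTMEMBER.MEMBER_CAPTION
                         MEMBER [Measures].[ParameterValue] AS
                            [Date].[Calendar Year].CURRENTMEMBER.UNIQUENAME
                    SELECT { [Measures].[ParameterCaption],
                             [Measures].[ParameterValue] } ON COLUMNS,
                           [Date].[Calendar Year].[Calendar Year].ALLMEMBERS ON ROWS
                    FROM [Adventure Works]";

                using (var cmd = new AdomdCommand(mdx, conn))
                using (var reader = cmd.ExecuteReader())
                {
                    // The first column of the flattened rowset is the member caption.
                    while (reader.Read())
                        Console.WriteLine(reader.GetString(0));
                }
            }
        }
    }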

    Read the article

  • Part 9: EBS Customizations, how to track

    - by volker.eckardt(at)oracle.com
    In the previous blogs we concentrated on the preparation tasks: we have defined standards, we know about the tools and techniques we will start with, and we have defined the modification strategy and how best to handle such topics. Now we are ready to take the requirements!

Such requirements come in as spreadsheets, Word files (like GAP documents), or in any other format. As we have to assign some attributes, we start by numbering them all and assigning a short name to each requirement (the CEMLI reference). We may also already have a functional person assigned, we might involve someone from the tech team to estimate, and we want to assign a status such as 'planned', 'estimated', etc.

All these data are usually kept in spreadsheets, but I would put them into a database (yes, I am from Oracle :). If you don't already have a good-looking, centralized application, please give Oracle APEX a try. It should be up and running in a day, and the imported sheets are then manageable concurrently!

For one of my clients I created this CEMLI-DB; it has since been enriched with a lot of additional functionality, but initially it was just a simple, centralized CEMLI tracking application. Why do I point out again the centralized way to manage such data? Well, your data quality will increase dramatically if you let your project members see (and also review and update) "your" data. APEX allows you to filter, sort, print, and also export. And if you can spend some time defining proper value lists, everyone will gain from it.

APEX also allows you to work in 'agile' mode, meaning you can improve your application step by step. Let's say you would like to reference a document, or even upload it: you can do that. Or you need to classify the CEMLIs by release: just add a release field; the same goes for business area or CEMLI type. One CEMLI record may then look like this:

Prepare one or two (online) reports, to be ready to present your "workload" to project management. Use such extracts also when you work offline (to prioritize, etc.), but as soon as you are connected again, feed the data back into the central application.

Note: I have combined this application with an additional issue tracker. Here the most important element is the CEMLI reference, which acts as a link to any other application (if you are not using APEX as the issue tracker as well :). Please spend a minute to define such a reference (see the blog post Part 8: How to name Customizations).

Summary: Building the bridge from gap analysis to development has to be done in a controlled way. Usually the information is provided in different formats, but it is suggested to collect all requirements centrally. Oracle APEX is a great solution for entering and maintaining such information in a structured but flexible way. APEX helped me a lot to work with distributed development teams during the complete development cycle.

    Read the article

  • links for 2010-12-08

    - by Bob Rhubart
    What's a data architect? A comic dialog by one who knows: Oracle ACE Director Lewis Cunningham. Webcast: Oracle Business Intelligence Forum - December 15, 2010 at 9:00 am PT "The Oracle Business Intelligence Online Forum is a half-day virtual event that offers you a unique opportunity to see, in one place, the full portfolio of Oracle’s Business Intelligence (BI) offerings, and to learn what sets Oracle apart from the rest. Hear Oracle executives and industry analyst, Howard Dresner, present the current state of Business Intelligence, along with a series of customers who will share their case studies of putting analytics in action." Oracle Rolls Out Private Cloud Architecture And World-Record Transaction Performance | Forrester Blogs "Exadata has been dealt with extensively in other venues, both inside Forrester and externally, and appears to deliver the goods for I&O groups who require efficient consolidation and maximum performance from an Oracle database environment." -- Richard Fichera, Forrester Seven ways to get things started: Java EE Startup Classes with GlassFish and WebLogic "This is a blog about a topic that I realy don't like. But it comes across my ways over and over again and it's no doubt that you need it from time to time. Enough reasons for me to collect some information about it and publish it for your reference. I am talking about Startup-/Shutdown classes with Java EE applications or servers." -- Oracle ACE Director Markus "@myfear" Eisele." Monitoring Undelivered Messages in BPEL in SOA 10g (Antony Reynolds' Blog) "I am currently working with a client that wants to know how many undelivered messages they have, and if it reaches a certain threshold then they wants to alert the operator. To do this they plan on using the Enterprise Manager alert functions, but first they needs to know how many undelivered instances are out there." SOA author Antony Reynolds VirtualBox Appliances for Developers "Developers can simply download a few files, assemble them with a script , and then import and run the resulting pre-built VM in VirtualBox. This makes starting with these technologies even easier. Each appliance contains some Hands-On-Labs to start learning." -- Peter Paul van de Beek Oracle UCM 11g Remote Intradoc Client (RIDC) Integration with Oracle ADF 11g "It's great we have out of the box WebCenter ADF task flows for document management in UCM. However, for complete business scenario implementations, usually it's not enough and we need to manage Content Repository programmatically. This can be achieved through Remote Intradoc Client (RIDC) API. It's quite hard to find any practical information about this API, but I managed to get code for UCM folder creation/removal and folder information." -- Oracle ACE Director Andrejus Baranovskis Interview with Java Champion Matjaz B. Juric on Cloud Computing, SOA, and Java EE 6 "Matjaz Juric of Slovenia, head of the Cloud Computing and SOA Competence Centre at the University of Maribor, and professor at the University of Ljubljana, shares insights about cloud computing, SOA and Java EE 6." White Paper: Oracle Complex Event Processing High Availability "This whitepaper describes the high availability (HA) solutions available in Oracle CEP 11g Release 1 Patch Set 2 and  presents the results of a benchmark study demonstrating the performance of the Oracle CEP HA solutions."

    Read the article

  • Customize SharePoint list using InfoPath2010 form Part4

    - by ybbest
    Customize SharePoint list using InfoPath2010 form Part1
Customize SharePoint list using InfoPath2010 form Part2
Customize SharePoint list using InfoPath2010 form Part3

In this post, I’d like to show you how to create print functionality in InfoPath for a SharePoint list. Print functionality is provided out of the box in an InfoPath form library; however, it is not available for a SharePoint list. Here are the steps to create it. You can download the new form here.

1. Create a print page in the list by copying displayifs.aspx and renaming the copy to Printifs.aspx.

2. Open the page in SharePoint Designer and copy the following JavaScript into the PlaceHolderTitleAreaClass ContentPlaceHolder:

<script type="text/javascript">
$(document).ready(function(){
    // Hide the SharePoint chrome (ribbon, title rows, left navigation)
    // so that only the form body is printed.
    $("[id^='Ribbon']").hide();
    $(".s4-title").hide();
    $("[id='s4-leftpanel']").hide();
    $("[id='s4-ribbonrow']").hide();
    $("[id='s4-titlerow']").hide();
    $("[id='s4-titlerow']").css("height", "0px");
    // Clean up the page background and scale the form for printing.
    $("body").css("background-color", "white");
    $("body").css("zoom", "135%");
    $("[id='MSO_ContentTable']").css("margin-left", "0px");
    $("[id='MT-BodyContent']").css("width", "900px");
    $(".MT-BodyArea").css("width", "900px");
    $("[id='MT-Layout']").css("width", "900px");
    $(".ms-bodyareacell").css("width", "900px");
    $(".s4-wpTopTable").css("border", "none");
    $("[id$='XmlFormView']").css("margin-left", "-80px");
    $("body").css("margin-top", "-30px");
    // Highlight any element containing 'CAPEX' (specific to this form).
    $(":contains('CAPEX')").css("border", "5px solid #FFCC00");
    // Finally, bring up the browser's print dialog.
    window.print();
});
</script>

3. Open the InfoPath form for the list and create a field called PrintLink.

4. Set the default value of PrintLink so that it points to the print page created above, with the item’s ID in the query string. You can download the formula for the default value here.

5. Add a new image that looks like a Print button to the display view, so that its navigate URL can be set from the PrintLink field. (The reason I did not use a button is that you cannot set the navigate URL for a button.)

6. Set the URL of the image to the PrintLink field.

7. Next, create the print view.

8. Copy the contents of the display view to the print view.

9. Finally, go to Printifs.aspx and edit the InfoPath web part to set the view to PrintView.

10. Republish your form, and you will see the form as shown below.

11. If you click the Print button, you will see the print page and the print dialog. You can also add a company logo to the print page using CSS.

12. To deploy the customization, you can use the backup and restore content database approach; you can get more details from my previous blog post here.

    Read the article

  • Issues printing through ssh tunnel and port forwarding

    - by simogasp
    I'm having some problems trying to print through an ssh tunnel. I'd like to print from my laptop to a network printer (a Toshiba es453, for what it's worth) which is on a local network. I can reach the local network through a gateway. So far I did the following:

ssh -N -L19100:<Printer_IP>:9100 <username>@<ssh_gateway>

Basically, I just mapped port 19100 of my laptop directly to the input port of the printer, passing through the gateway. So far, so good.

Then I installed a new printer on my laptop with the GUI config tool of Ubuntu, so that the new printer is on localhost at port 19100 (as AppSocket/HP JetDirect), and I provided the proper driver for the printer. In theory, once the tunnel is open I should be able to print from any program just by selecting this printer. Of course, it does not work. :-) The document hangs in the queue with status Processing, while in the shell where I set up the tunnel I get these errors about failing to open channels:

debug1: Local forwarding listening on ::1 port 19100.
debug1: channel 0: new [port listener]
debug1: Local forwarding listening on 127.0.0.1 port 19100.
debug1: channel 1: new [port listener]
debug1: Requesting no-more-sessions@openssh.com
debug1: Entering interactive session.
debug1: Connection to port 19100 forwarding to 195.220.21.227 port 9100 requested.
debug1: channel 2: new [direct-tcpip]
debug1: Connection to port 19100 forwarding to 195.220.21.227 port 9100 requested.
debug1: channel 3: new [direct-tcpip]
channel 2: open failed: connect failed: Connection timed out
debug1: channel 2: free: direct-tcpip: listening port 19100 for 195.220.21.227 port 9100, connect from ::1 port 44434, nchannels 4
debug1: Connection to port 19100 forwarding to 195.220.21.227 port 9100 requested.
debug1: channel 2: new [direct-tcpip]
channel 3: open failed: connect failed: Connection timed out
debug1: channel 3: free: direct-tcpip: listening port 19100 for 195.220.21.227 port 9100, connect from ::1 port 44443, nchannels 4
channel 2: open failed: connect failed: Connection timed out
debug1: channel 2: free: direct-tcpip: listening port 19100 for 195.220.21.227 port 9100, connect from ::1 port 44493, nchannels 3
debug1: Connection to port 19100 forwarding to 195.220.21.227 port 9100 requested.
debug1: channel 2: new [direct-tcpip]

As a further debugging test, I tried the following: from a machine inside the local network I did telnet <IP_printer> 9100, got access, wrote some random text, closed the connection, and correctly got a printout of what I had written. So the port and the IP of the printer should be correct. I tried the same from my laptop with the tunnel open; the telnet connection succeeded but, again, the printer didn't print anything, and I got the usual "channel x: open failed:" errors.

I'm not a great expert on the matter. I just thought that in theory it was possible to do something like this, but maybe there is something I didn't consider or did wrong. Any clue? Thanks!

Simone

[update] As a further debugging test, I tried to replicate the procedure from a machine in the local network. From that machine, I did

ssh -N -L19100:<IP_printer>:9100 <username>@<ssh_gateway>

(note that now the machine, the gateway, and the printer are all in the same local network), then I tried the telnet test again with telnet localhost 19100. I got access and everything, but I still didn't get the printout, just the usual error:

channel 2: open failed: connect failed: Connection timed out

Maybe I am missing some other connection that needs to be forwarded, or maybe this is not allowed by the administrators.
Of course, if I connect via ssh to a machine on the local network from my laptop through the gateway, I can successfully print using the lpr command (from that local machine). But this is what I would like to avoid (yes, I'm lazy... :-); I would like a more 'elegant' and transparent way to do it.
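[Editor's note] The manual telnet check above can also be scripted if it needs repeating. Below is a minimal, hedged sketch in C# that plays the same role: the host and port are the local end of the tunnel as configured above, and the test string is arbitrary. It only proves that the forwarded port accepts a connection and raw bytes, exactly like the telnet test; it does not diagnose the channel-open timeouts.

    using System;
    using System.Net.Sockets;
    using System.Text;

    class TunnelProbe
    {
        static void Main()
        {
            // Connect to the local end of the ssh tunnel (laptop side).
            using (var client = new TcpClient("127.0.0.1", 19100))
            using (var stream = client.GetStream())
            {
                // Port 9100 printers generally accept raw text, so a plain
                // line followed by a form feed should produce a test page.
                byte[] payload = Encoding.ASCII.GetBytes("Tunnel test\f");
                stream.Write(payload, 0, payload.Length);
            }
            Console.WriteLine("Bytes sent; check whether the printer produced output.");
        }
    }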

    Read the article

  • InfoPath 2010 Form Design and Web Part Deployment

    - by JKenderdine
    In January I had the pleasure of speaking at SharePoint Saturday Virginia Beach. I presented a session on InfoPath 2010 form design, which included some of the basics of form design, a description of some of the new options in InfoPath 2010 and SharePoint 2010, and other integration possibilities. Included below is the information presented, as well as the solution used to create the demo.

The first thing you need to understand is the difference between an InfoPath list form and a form library form.

SharePoint list forms store data directly in a SharePoint list. Each control (e.g., a text box) in the form is bound to a column in the list. SharePoint list forms are directly connected to the list, which means that you don’t have to worry about setting up the publish and submit locations. You also do not have the option of back-end code.

Form library forms store data in XML files in a SharePoint form library. This means they are more flexible and you can do more with them. For example, they can be configured to save drafts and submit to different locations. However, they are more complex to work with and require more decisions to be made during configuration. You do have the option of back-end code with this type of form.

Next steps. You need to create your file architecture plan:
- Plan the location for the saved template, both test and production. (This is pretty much a given, but just in case: always make sure to have a test environment.)
- Plan the location for the published template.

Then you need to document your form template design plan. Some questions to ask when gathering your requirements:
- What will the form be designed to do? Will it gather user information? Will it display data from a data source?
- Do we need to show different views to different users? What do we base this on?
- How will it be implemented for the users? As a browser-based or client-based form?
- As a site collection content type, published through Central Admin, or as a form library, published directly to the library?

So what were the requirements for this template? The Business Card Request form template design plan:
- Gather user information and requirements for the card.
- Pull in as much user information as possible, using data from the user profile web services as a data source.
- Show and hide fields as necessary per the requirements.
- Create multiple views: one for those submitting the form and another for the executive assistants placing the orders.
- Browser-based form integrated into a SharePoint team site.
- Published directly to a form library.

The form was published through Central Administration and incorporated into the site as a content type. Utilizing the new InfoPath web part, the form is integrated into the page, and users can complete it directly from within that page.

For now, if you are interested in the final form XSN, contact me using the Contact link above. I will post soon with the details on how the form was created and how it integrates the requirements detailed above.
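[Editor's note] A side note on the "pull in user information" requirement: inside InfoPath this is configured as a data connection to the user profile web service, but the same endpoint can be called in code. Below is a hedged C# sketch against SharePoint's UserProfileService.asmx; the "UserProfileWs" proxy namespace is assumed to be generated from the service WSDL (Add Web Reference), and the site URL and account name are hypothetical.

    using System;

    class ProfileLookup
    {
        static void Main()
        {
            // "UserProfileWs" is a hypothetical web reference generated from
            // http://<site>/_vti_bin/UserProfileService.asmx?WSDL
            var svc = new UserProfileWs.UserProfileService
            {
                Url = "http://intranet/_vti_bin/UserProfileService.asmx",
                UseDefaultCredentials = true   // call as the current Windows user
            };

            // GetUserProfileByName returns the profile as name/value pairs.
            UserProfileWs.PropertyData[] props =
                svc.GetUserProfileByName(@"DOMAIN\jdoe");

            foreach (var prop in props)
            {
                if (prop.Values != null && prop.Values.Length > 0)
                    Console.WriteLine($"{prop.Name}: {prop.Values[0].Value}");
            }
        }
    }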

    Read the article

  • Siebel CRM: Alive and Jamming at OpenWorld

    - by Tony Berk
    Yes, a rock 'n roll reference in a CRM/Customer Experience blog entry! Sorry, but we are getting excited about OpenWorld and all of the great CRM and Customer Experience sessions we've been planning for the past 6 months (yes, we really do start planning in March!). I also heard that some band named Pearl Jam is making an appearance. Who's tried the Rock Band guitar solo for Alive? Way too difficult for an amateur like me.

Anyhow, we are supposed to be highlighting Siebel CRM at OpenWorld. Yes, Siebel will once again have a major presence at OpenWorld, and there are a lot of new things to tell you about. If you search the OpenWorld Content Catalog with the tag "siebel", you'll find over 75 sessions. That's over 75 hours of opportunity to hear from Siebel customers, product managers, and implementers. While I invite you to read through the descriptions of all 75+ sessions or check out the OpenWorld Focus On Siebel document, I'd like to help with some highlights.

The roadmap and strategy session was mentioned in my previous post, but it is important enough to mention again: Siebel CRM Overview, Strategy, and Roadmap (Session ID: CON9700), Oct 1, 12:15 PM. Come to this session to learn about the Siebel product roadmap and how Oracle is committed to accelerating the pace of innovation and value for its customers on this platform. Additionally, the session covers how Siebel customers can leverage many Oracle assets, such as Oracle WebCenter Sites; InQuira, RightNow, and ATG/Endeca applications; and Oracle Policy Automation, in conjunction with their current Siebel investments. This session was FULL last year, so I strongly suggest you pre-register via the OpenWorld Schedule Builder.

Every year, my favorites are the customer panels, where you get to hear 2, 3, or even 4 customers talk about their implementations and often share best practices and lessons learned:

- Customer Panel: Business Benefits of Deploying Siebel CRM (CON9717), Oct 1, 10:45 AM, featuring GlaxoSmithKline, PNC Bank, and Southwest Airlines.
- Maximizing User Adoption Rates for Siebel Sales and Siebel Partner Relationship Management (CON9690), Oct 1, 12:15 PM, featuring CSL Behring, Intuit, and McKesson.
- Best Practices for Upgrading Your Siebel CRM Implementations: Customer Successes (CON9715), Oct 1, 3:15 PM, featuring Citrix, Sun Life Financial, and Oracle experts.
- Driving Great Customer Experiences with Siebel Service Applications (CON9604), Oct 1, 4:45 PM, featuring Farmers Insurance, the US Department of Homeland Security, and Waste Management.

There are also a number of customer case study sessions, including Lowe's (CON9740), American Red Cross (CON6535), Ontario Lottery & Gaming's Siebel Marketing and Loyalty (CON4114), and LexisNexis (CON9551), plus an interesting session on optimizing Siebel on Oracle with ACCOR (CON4289).

Have you heard about the new Open UI for Siebel? If you haven't, you should! There are sessions focused on introducing the new functionality and how you can unleash the power of the new user interface: User Interface Innovations with the New Siebel "Open UI" (CON9703), Oct 2, 10:15 AM, and Unleash the Power of "Open UI" (CON9705), Oct 3, 11:45 AM.

Other Siebel-related topics you might want to check out:
- Knowledge Management: Increasing Return on Your CRM Investments with Knowledge (CON9779), Oct 1, 3:15 PM
- Mobile: Mobile Solutions for Siebel CRM (CON9697), Oct 2, 5:00 PM
- Siebel Loyalty: Best Practices for Maximizing the Success of Your Loyalty Program with Siebel Loyalty (CON9588), Oct 2, 5:00 PM
- Siebel Marketing: Next-Generation Cross-Channel Insight-Driven Customer Dialogue with Siebel Marketing (CON9600), Oct 3, 10:15 AM
- Integrating with Oracle Commerce: Administer Once and Deploy Everywhere: Integrating the Siebel, ATG, and Endeca Platforms (CON9761), Oct 2, 5:00 PM

Finally, don't forget the Oracle Applications User Group (OAUG) Special Interest Group for Siebel on Sunday, September 30, at 2:15 PM. And of course, the Demogrounds in Moscone West will be full of Oracle and partner demos and information on new solutions.

Wow! I told you there was a lot! Good luck finding the best sessions for you, and have a great time at OpenWorld. Don't forget to sing along with Pearl Jam!

    Read the article

  • First Day of Data Integration Track at Oracle OpenWorld 2012

    - by Irem Radzik
    OpenWorld started at full speed for us today with a great set of sessions in the Data Integration track. After the exciting keynote session on Oracle Database 12c in the morning, Brad Adelberg, VP of Development for Data Integration products, presented Oracle's data integration product strategy. His session highlighted the new requirements for data integration to achieve pervasive and continuous access to trusted data. The new requirements and product focus areas presented in this session are:

- Provide access to any data at any source, on premises or in the cloud
- Enable zero-downtime operations and maximum performance
- Leverage real-time data for accurate business insights
- Ensure high-quality data is used across the enterprise

During the session Brad walked through how Oracle's data integration products (Oracle Data Integrator, Oracle GoldenGate, Oracle Enterprise Data Quality, and Oracle Data Service Integrator) deliver on these requirements and how recent product releases build on this strategy.

Soon after Brad's session, we heard from a panel of Oracle GoldenGate customers (St. Jude Medical, Equifax, and Bank of America) about how they achieved zero-downtime operations using Oracle GoldenGate. The panel presented different use cases of GoldenGate, from active-active replication to offloading reporting. St. Jude Medical's implementation in particular, which involves the alert management system for patients who use their pacemakers, reminded me that in some cases downtime of mission-critical systems can be a matter of life or death. It is very comforting to hear that GoldenGate delivers highly reliable continuous availability for life-saving medical systems.

In the afternoon, Nick Wagner from the product management team and I followed the customer panel with a review of Oracle GoldenGate 11gR2's new features. Many of the questions we received from the audience were about GoldenGate's new Integrated Capture for Oracle Database and the enhanced conflict management features, as well as how GoldenGate compares to Oracle Streams. In addition to giving details on GoldenGate's unique ability to capture changed data through a direct integration with the Oracle DBMS engine, we reminded the audience that enhancements to Oracle GoldenGate will continue, while Streams will be primarily maintained.

Last but not least, Tim Garrod and Ryan Fonnett from Raymond James presented a unified real-time data integration solution using Oracle Data Integrator and GoldenGate for their operational data store (ODS). The ODS supports application services across the enterprise, and providing timely data is a critical requirement. In this solution, Oracle GoldenGate performs the log-based change data capture for Oracle Data Integrator's near-real-time data integration between heterogeneous systems. As Raymond James' ODS supports mission-critical services for their advisors, the project team had to set up this integration environment to be highly available. During the session, Ryan and Tim explained how they use ODI to enable automated process execution and "always-on" integration processes. Their presentation included two demonstrations focused on CDC patterns deployed with ODI and on automated multi-instance execution and monitoring. We are very grateful to Tim and Ryan for their well-prepared presentation at OpenWorld this year.

Day 2 (Tuesday) will also be a busy day in our track. In addition to the Fusion Middleware Innovation Awards ceremony at 11:45 am at Moscone West 3001, we have the following DI sessions:

- Real-World Operational Reporting Customer Panel, 11:45 am, Moscone West 3005
- Oracle Data Integrator Product Update and Future Strategy, 1:15 pm, Moscone West 3005
- High-Volume OLTP with Oracle GoldenGate: Best Practices from Comcast, 1:15 pm, Moscone West 3005
- Everything You Need to Know About Monitoring Oracle GoldenGate, 5:00 pm, Moscone West 3005

If you are at OpenWorld, please join us in these sessions. For a full review of the data integration track at OpenWorld, please see our Focus On document.

    Read the article
