Search Results

Search found 2655 results on 107 pages for 'rich snippets'.


  • PHP_AUTH_USER only known in certain frames

    - by Rob
    Getting very confused by PHP_AUTH_USER. Within my web pages I have .htaccess files in every directory, controlling who can (and can't) see certain folders. In order to further customise the pages I was hoping to use PHP_AUTH_USER within the PHP code, i.e. tailor page contents based on the user. This only seems to work partially. The code snippets below hopefully demonstrate my problem. The main index.php creates a framed page with a menu structure in the top left-hand corner, some irrelevant stuff in the top right, and then the tailor-made contents in the bottom frame. In the top left the user is correctly shown, but in the bottom frame PHP_AUTH_USER doesn't seem to be set any more (it returns empty, and when printing all $HTTP_SERVER_VARS it's not listed). Script.php is in a different path, but all the directories have .htaccess files in them and all other contents are displayed correctly. Why does it not know about PHP_AUTH_USER there? Running PHP version 5.2.12, viewed in Chrome.

    index.php

    <FRAMESET ROWS="35%, *">
      <FRAMESET COLS="25%, *">
        <FRAME SRC="Menu.php">
        <FRAME SRC="Something.php">
      </FRAMESET>
      <FRAME SRC="../OtherPath/Script.php?large=1" name="outputlisting">
    </FRAMESET>

    Menu.php

    <ul>
      <li>Reporting
        <ul>
          <li>Link1
            <a href="../OtherPath/Script.php" target="outputlisting">All</a>,
            <a href="../OtherPath/Script.php?large=1" target="outputlisting">Big</a>
        </ul>
    </ul>
    <?php
    echo 'IP Address: ' . $_SERVER['REMOTE_ADDR'] . '<br />';
    echo 'User: ' . $_SERVER['PHP_AUTH_USER'];
    ?>

    Script.php

    <?php
    echo 'User: ' . $_SERVER['PHP_AUTH_USER'];
    ?>

    Read the article

  • What is wrong in this c++ code?

    - by narayanpatra
    Why does this code not show an error?

    #include <iostream>

    int main()
    {
        using namespace std;
        unsigned short int myInt = 99;
        unsigned short int * pMark = 0;
        cout << myInt << endl;
        pMark = &myInt;
        *pMark = 11;
        cout << "*pMark:\t" << *pMark << "\nmyInt:\t" << myInt << endl;
        return 0;
    }

    But this one does:

    #include <iostream>
    using namespace std;

    int addnumber(int *p, int *q)
    {
        cout << *p = 12 << endl;
        cout << *q = 14 << endl;
    }

    int main()
    {
        int i, j;
        cout << "enter the value of first number";
        cin >> i;
        cout << "enter the value of second number";
        cin >> j;
        addnumber(&i, &j);
        cout << i << endl;
        cout << j << endl;
    }

    In both code snippets I am assigning *pointer = somevalue. The first one does not show any error, but the second one shows an error on the lines cout << *p = 12 << endl; and cout << *q = 14 << endl;. What mistake am I making?

    Read the article

  • ASP.NET MVC 3 Hosting :: New Features in ASP.NET MVC 3

    - by mbridge
    Razor View Engine The Razor view engine is a new view engine option for ASP.NET MVC that supports the Razor templating syntax. The Razor syntax is a streamlined approach to HTML templating designed with the goal of being a code driven minimalist templating approach that builds on existing C#, VB.NET and HTML knowledge. The result of this approach is that Razor views are very lean and do not contain unnecessary constructs that get in the way of you and your code. ASP.NET MVC 3 Preview 1 only supports C# Razor views which use the .cshtml file extension. VB.NET support will be enabled in later releases of ASP.NET MVC 3. For more information and examples, see Introducing “Razor” – a new view engine for ASP.NET on Scott Guthrie’s blog. Dynamic View and ViewModel Properties A new dynamic View property is available in views, which provides access to the ViewData object using a simpler syntax. For example, imagine two items are added to the ViewData dictionary in the Index controller action using code like the following: public ActionResult Index() {          ViewData["Title"] = "The Title";          ViewData["Message"] = "Hello World!"; } Those properties can be accessed in the Index view using code like this: <h2>View.Title</h2> <p>View.Message</p> There is also a new dynamic ViewModel property in the Controller class that lets you add items to the ViewData dictionary using a simpler syntax. Using the previous controller example, the two values added to the ViewData dictionary can be rewritten using the following code: public ActionResult Index() {     ViewModel.Title = "The Title";     ViewModel.Message = "Hello World!"; } “Add View” Dialog Box Supports Multiple View Engines The Add View dialog box in Visual Studio includes extensibility hooks that allow it to support multiple view engines, as shown in the following figure: Service Location and Dependency Injection Support ASP.NET MVC 3 introduces improved support for applying Dependency Injection (DI) via Inversion of Control (IoC) containers. ASP.NET MVC 3 Preview 1 provides the following hooks for locating services and injecting dependencies: - Creating controller factories. - Creating controllers and setting dependencies. - Setting dependencies on view pages for both the Web Form view engine and the Razor view engine (for types that derive from ViewPage, ViewUserControl, ViewMasterPage, WebViewPage). - Setting dependencies on action filters. Using a Dependency Injection container is not required in order for ASP.NET MVC 3 to function properly. Global Filters ASP.NET MVC 3 allows you to register filters that apply globally to all controller action methods. Adding a filter to the global filters collection ensures that the filter runs for all controller requests. To register an action filter globally, you can make the following call in the Application_Start method in the Global.asax file: GlobalFilters.Filters.Add(new MyActionFilter()); The source of global action filters is abstracted by the new IFilterProvider interface, which can be registered manually or by using Dependency Injection. This allows you to provide your own source of action filters and choose at run time whether to apply a filter to an action in a particular request. New JsonValueProviderFactory Class The new JsonValueProviderFactory class allows action methods to receive JSON-encoded data and model-bind it to an action-method parameter. This is useful in scenarios such as client templating. 
Client templates enable you to format and display a single data item or set of data items by using a fragment of HTML. ASP.NET MVC 3 lets you connect client templates easily with an action method that both returns and receives JSON data. Support for .NET Framework 4 Validation Attributes and IvalidatableObject The ValidationAttribute class was improved in the .NET Framework 4 to enable richer support for validation. When you write a custom validation attribute, you can use a new IsValid overload that provides a ValidationContext instance. This instance provides information about the current validation context, such as what object is being validated. This change enables scenarios such as validating the current value based on another property of the model. The following example shows a sample custom attribute that ensures that the value of PropertyOne is always larger than the value of PropertyTwo: public class CompareValidationAttribute : ValidationAttribute {     protected override ValidationResult IsValid(object value,              ValidationContext validationContext) {         var model = validationContext.ObjectInstance as SomeModel;         if (model.PropertyOne > model.PropertyTwo) {            return ValidationResult.Success;         }         return new ValidationResult("PropertyOne must be larger than PropertyTwo");     } } Validation in ASP.NET MVC also supports the .NET Framework 4 IValidatableObject interface. This interface allows your model to perform model-level validation, as in the following example: public class SomeModel : IValidatableObject {     public int PropertyOne { get; set; }     public int PropertyTwo { get; set; }     public IEnumerable<ValidationResult> Validate(ValidationContext validationContext) {         if (PropertyOne <= PropertyTwo) {            yield return new ValidationResult(                "PropertyOne must be larger than PropertyTwo");         }     } } New IClientValidatable Interface The new IClientValidatable interface allows the validation framework to discover at run time whether a validator has support for client validation. This interface is designed to be independent of the underlying implementation; therefore, where you implement the interface depends on the validation framework in use. For example, for the default data annotations-based validator, the interface would be applied on the validation attribute. Support for .NET Framework 4 Metadata Attributes ASP.NET MVC 3 now supports .NET Framework 4 metadata attributes such as DisplayAttribute. New IMetadataAware Interface The new IMetadataAware interface allows you to write attributes that simplify how you can contribute to the ModelMetadata creation process. Before this interface was available, you needed to write a custom metadata provider in order to have an attribute provide extra metadata. This interface is consumed by the AssociatedMetadataProvider class, so support for the IMetadataAware interface is automatically inherited by all classes that derive from that class (notably, the DataAnnotationsModelMetadataProvider class). New Action Result Types In ASP.NET MVC 3, the Controller class includes two new action result types and corresponding helper methods. HttpNotFoundResult Action The new HttpNotFoundResult action result is used to indicate that a resource requested by the current URL was not found. The status code is 404. This class derives from HttpStatusCodeResult. 
The Controller class includes an HttpNotFound method that returns an instance of this action result type, as shown in the following example: public ActionResult List(int id) { if (id < 0) { return HttpNotFound(); } return View(); } HttpStatusCodeResult Action The new HttpStatusCodeResult action result is used to set the response status code and description. Permanent Redirect The HttpRedirectResult class has a new Boolean Permanent property that is used to indicate whether a permanent redirect should occur. A permanent redirect uses the HTTP 301 status code. Corresponding to this change, the Controller class now has several methods for performing permanent redirects: - RedirectPermanent - RedirectToRoutePermanent - RedirectToActionPermanent These methods return an instance of HttpRedirectResult with the Permanent property set to true. Breaking Changes The order of execution for exception filters has changed for exception filters that have the same Order value. In ASP.NET MVC 2 and earlier, exception filters on the controller with the same Order as those on an action method were executed before the exception filters on the action method. This would typically be the case when exception filters were applied without a specified Order value. In MVC 3, this order has been reversed in order to allow the most specific exception handler to execute first. As in earlier versions, if the Order property is explicitly specified, the filters are run in the specified order. Known Issues When you are editing a Razor view (CSHTML file), the Go To Controller menu item in Visual Studio will not be available, and there are no code snippets.
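    The article names the permanent-redirect helpers and HttpStatusCodeResult but does not show them in use, so here is a minimal, illustrative controller sketch; the CatalogController, LegacyProduct and Product names are invented, and the exact overloads should be verified against the MVC 3 assemblies:

        using System.Web.Mvc;

        public class CatalogController : Controller
        {
            // Old URL: issue a permanent (HTTP 301) redirect so clients and
            // search engines update their links to the new action.
            public ActionResult LegacyProduct(int id)
            {
                return RedirectToActionPermanent("Product", new { id });
            }

            public ActionResult Product(int id)
            {
                if (id < 0)
                {
                    // HttpStatusCodeResult sets an arbitrary status code and description.
                    return new HttpStatusCodeResult(400, "Product id must not be negative");
                }
                return View();
            }
        }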

    Read the article

  • Professional WordPress Business Themes

    - by Matt
    Every now and then JustSkins.com receives quote requests for WordPress design for business websites. Most companies now keep up to date with a blog on their corporate website, that showcases their day to day activities & progresses.  Getting such professional wordpress driven website designed from the scratch costs you a lot. If you have decided to make WordPress the CMS for your business website, there are some Professional WordPress themes you can take a look at. We have created this list to help you save some time to do all the trying and the testing. Optimize by WooThemes Last year one of the most popular Business theme by WooThemes was the Coffee Break theme, Optimize is further adaptation of the same. It is simple, sleek design with great functionality. The customizable front page lets you showcase your work or product etc. Demo | Price: $70, Developer Price: $150 | DOWNLOAD WooThemes is also offering their whole Business theme pack for a very very reasonable fee, If you like multiple designs from them you can get this big deal for only $125 Onyx , Impacto by Simple Themes Simple Themes has been making very crisp & beautiful WordPress Themes & are also very reasonably priced. If their themes solve your purpose $39 membership for 3 months is a good deal.  If you are looking to create quick website, landing page or micro site their templates are best. Demo | Price: $39 for 3 Months Membership Rejuvenate by Templatic One of the most beautiful Premium WordPress Theme, Available in 4 elegant color schemes. This theme can be used for your Beauty, Spa and Studio Business. Demo | Price: $65  | DOWNLOAD Templatic has created great professional business templates, such as Gourmet, Real Estate, Job Board, Automobile & lots More. You can also get a Best Value Offer in $299 for all of Templatic Themes. TheProfessional by ElegantThemes Elegant Themes is known to provide very beautiful & straightforward designs. The professional wordpress theme is a simple, crisp & concise Theme you can use to create a business website. The 3 short blurbs on the homepage are simple, which can be used to point them to your major offerings and the prominent slider indicates a clear call to action. There are 52 themes to choose from & Elegant Themes is giving a great offer at such a small yearly fee. Demo | Price: $39 Yearly Membership  | DOWNLOAD Elegant Themes has a cluster of 52 magnificent themes, and all you have to do is pay $39 to win access to all of them. Join today! Some of the Professional designs that I like for a business website are SimplePress and Corporation. Extatic by Chimera Themes The theme includes plenty of great features including custom feature tour pages, portfolio sections, static feature areas, pricing table page, 20+ shortcodes, multiple page/post options, unlimited custom sidebars which can be assigned to posts/pages, advanced theme style editor and options page and much more. Its a must buy Demo | Price: $37 | DOWNLOAD Corporate by Clover Themes Simple Theme for a small business. Corporate is an clean, powerful and feature-rich corporate theme with dynamic and energy design. Demo | Price: $69.95 | DOWNLOAD Bizco by Themify Bizco is a very professional template for wordpress targeted at corporate and product based businesses. This theme is simple yet highly functional and is suitable for showcasing features of your service or product. With the custom page template you can change the display of your pages and posts easily with our visual custom panel. 
Demo | Price: $70  |DOWNLOAD Devision by Themetrust Devision is a small business wordpress theme that can be used to make a business website within a few minutes. It makes it very easy to showcase and highlight your services or product on the homepage. Demo | Price: Euro 39 | DOWNLOAD BizPress by WPZoom A professional business WordPress theme from WPZoom suitable for companies, organizations, product showcases or other business websites. The theme comes with 4 colour options, featured products / services slider on the homepage, drop down menus, theme options page etc. Demo | Price: $ 69 | DOWNLOAD Clean Classy Corporate by ThemeFuse A very impressive WordPress business theme, that can be used in multiple ways. It is suitable for many kinds, like web products, services, hosting etc etc. Clean Classy Corporate WordPress Theme has a clean crisp look and is professional in appeal. Demo | Price: $49  | DOWNLOAD Insdustry by ThemeJam A powerful Business WordPress Template along with lots of options, colors, and customizable features. This is one for almost any kind of blogger, corporate, or organization. Lots of features, gives it the kind of scalability you might need to create any kind of website. Demo | Price: $ 59 | DOWNLOAD AppPress by ChimeraThemes This professional business WordPress theme includes 5 different colour schemes, advanced theme options page, multiple homepage sliders, custom widgets and page templates. The theme also includes a range of other unique features such as custom title, live style editor to modify colours, font styles, sizes etc, and 20+ shortcodes for creating pricing tables, content columns, boxes, buttons and others. Demo | Price: $ 37 | DOWNLOAD Why WordPress Professional Template? You can modify them, these usually come with a lot of fancy features that enable you to create the website as per your usability & choice. In some cases the  Premium WordPress business themes can be accessed through a subscription service. Premium Vs Free WordPress Themes There are very good Free WordPress themes out there that you can use to modify and code further or create what you want, but this possible when you are technically able. On the contrary Premium WordPress business themes offers great features & can save you a lot of time and money. It varies from business to business, some like to keep their website simple while most want to keep cool nifty features and abilities to scale it differently for various sections, products or categories. All this & more is possible with a Professional Business theme that is suitable/close to your needs.

    Read the article

  • New MySQL Cluster 7.3 Previews: Foreign Keys, NoSQL Node.js API and Auto-Tuned Clusters

    - by Mat Keep
    At this weeks MySQL Connect conference, Oracle previewed an exciting new wave of developments for MySQL Cluster, further extending its simplicity and flexibility by expanding the range of use-cases, adding new NoSQL options, and automating configuration. What’s new: Development Release 1: MySQL Cluster 7.3 with Foreign Keys Early Access “Labs” Preview: MySQL Cluster NoSQL API for Node.js Early Access “Labs” Preview: MySQL Cluster GUI-Based Auto-Installer In this blog, I'll introduce you to the features being previewed. Review the blogs listed below for more detail on each of the specific features discussed. Save the date!: A live webinar is scheduled for Thursday 25th October at 0900 Pacific Time / 1600UTC where we will discuss each of these enhancements in more detail. Registration will be open soon and published to the MySQL webinars page MySQL Cluster 7.3: Development Release 1 The first MySQL Cluster 7.3 Development Milestone Release (DMR) previews Foreign Keys, bringing powerful new functionality to MySQL Cluster while eliminating development complexity. Foreign Key support has been one of the most requested enhancements to MySQL Cluster – enabling users to simplify their data models and application logic – while extending the range of use-cases for both custom projects requiring referential integrity and packaged applications, such as eCommerce, CRM, CMS, etc. Implementation The Foreign Key functionality is implemented directly within the MySQL Cluster data nodes, allowing any client API accessing the cluster to benefit from them – whether they are SQL or one of the NoSQL interfaces (Memcached, C++, Java, JPA, HTTP/REST or the new Node.js API - discussed later.) The core referential actions defined in the SQL:2003 standard are implemented: CASCADE RESTRICT NO ACTION SET NULL In addition, the MySQL Cluster implementation supports the online adding and dropping of Foreign Keys, ensuring the Cluster continues to serve both read and write requests during the operation.  This represents a further enhancement to MySQL Cluster's support for on0line schema changes, ie adding and dropping indexes, adding columns, etc.  Read this blog for a demonstration of using Foreign Keys with MySQL Cluster.  Getting Started with MySQL Cluster 7.3 DMR1: Users can download either the source or binary and evaluate the MySQL Cluster 7.3 DMR with Foreign Keys now! (Select the Development Release tab). MySQL Cluster NoSQL API for Node.js Node.js is hot! In a little over 3 years, it has become one of the most popular environments for developing next generation web, cloud, mobile and social applications. Bringing JavaScript from the browser to the server, the design goal of Node.js is to build new real-time applications supporting millions of client connections, serviced by a single CPU core. Making it simple to further extend the flexibility and power of Node.js to the database layer, we are previewing the Node.js Javascript API for MySQL Cluster as an Early Access release, available for download now from http://labs.mysql.com/. Select the following build: MySQL-Cluster-NoSQL-Connector-for-Node-js Alternatively, you can clone the project at the MySQL GitHub page.  Implemented as a module for the V8 engine, the new API provides Node.js with a native, asynchronous JavaScript interface that can be used to both query and receive results sets directly from MySQL Cluster, without transformations to SQL. 
Figure 1: MySQL Cluster NoSQL API for Node.js enables end-to-end JavaScript development Rather than just presenting a simple interface to the database, the Node.js module integrates the MySQL Cluster native API library directly within the web application itself, enabling developers to seamlessly couple their high performance, distributed applications with a high performance, distributed, persistence layer delivering 99.999% availability. The new Node.js API joins a rich array of NoSQL interfaces available for MySQL Cluster. Whichever API is chosen for an application, SQL and NoSQL can be used concurrently across the same data set, providing the ultimate in developer flexibility.  Get started with MySQL Cluster NoSQL API for Node.js tutorial MySQL Cluster GUI-Based Auto-Installer Compatible with both MySQL Cluster 7.2 and 7.3, the Auto-Installer makes it simple for DevOps teams to quickly configure and provision highly optimized MySQL Cluster deployments – whether on-premise or in the cloud. Implemented with a standard HTML GUI and Python-based web server back-end, the Auto-Installer intelligently configures MySQL Cluster based on application requirements and auto-discovered hardware resources Figure 2: Automated Tuning and Configuration of MySQL Cluster Developed by the same engineering team responsible for the MySQL Cluster database, the installer provides standardized configurations that make it simple, quick and easy to build stable and high performance clustered environments. The auto-installer is previewed as an Early Access release, available for download now from http://labs.mysql.com/, by selecting the MySQL-Cluster-Auto-Installer build. You can read more about getting started with the MySQL Cluster auto-installer here. Watch the YouTube video for a demonstration of using the MySQL Cluster auto-installer Getting Started with MySQL Cluster If you are new to MySQL Cluster, the Getting Started guide will walk you through installing an evaluation cluster on a singe host (these guides reflect MySQL Cluster 7.2, but apply equally well to 7.3 and the Early Access previews). Or use the new MySQL Cluster Auto-Installer! Download the Guide to Scaling Web Databases with MySQL Cluster (to learn more about its architecture, design and ideal use-cases). Post any questions to the MySQL Cluster forum where our Engineering team and the MySQL Cluster community will attempt to assist you. Post any bugs you find to the MySQL bug tracking system (select MySQL Cluster from the Category drop-down menu) And if you have any feedback, please post them to the Comments section here or in the blogs referenced in this article. Summary MySQL Cluster 7.2 is the GA, production-ready release of MySQL Cluster. The first Development Release of MySQL Cluster 7.3 and the Early Access previews give you the opportunity to preview and evaluate future developments in the MySQL Cluster database, and we are very excited to be able to share that with you. Let us know how you get along with MySQL Cluster 7.3, and other features that you want to see in future releases, by using the comments of this blog.

    Read the article

  • UCM 11g is 4 days old!

    - by kyle.hatlestad
    Ok...so I missed posting a blog entry when UCM 11g and the entire ECM suite released on Tuesday. Hopefully you've already seen the announcements on any number of the Oracle ECM blogs out there such as ECM Alerts, Fusion ECM, bex huff, or C4. So I won't bore you with the same talking points like 179 million check-ins per day or 124 web site page hits per second. Instead, I thought I'd show some screenshots of the new features in UCM and URM 11g. WebLogic Server and Enterprise Manager So probably the biggest change in 11g is UCM and URM now run on top of the WebLogic Server application server. This is a huge step as ECM is now on a standard platform with the rest of Oracle Fusion Middleware which makes installation, configuration, and integration consistent among all the products. From a feature perspective, it's also beneficial because it's now integrated with Oracle Enterprise Manager. Enterprise Manager provides a lot of provisioning control over servers as well as performance monitoring and access to logs and debugging information. Desktop Integration Suite Desktop Integration Suite got a complete overhaul for 11g. It exposes a lot more features within Windows Explorer such as saved searches, workflow queue, and checked-out items. It also now support metadata pop-up screens to let users fill in additional metadata when they drag-n-drop files in! And the integration within Office applications has changed significantly by introducing a dedicated UCM menu to do open, save, compare, etc. Site Studio for External Applications In UCM Site Studio 10gR4, a major architectural shift was introduced which brought several new objects such as elements, region definitions, region templates, and placeholder definitions. This truly separated the content from the display and from the definition. It also allowed separation of the content from needing to be rendered on a complete Site Studio page. Well, the new Site Studio for External Applications takes advantage of that architecture and introduces pre-built tags and plug-ins to JDeveloper to allow to go from simply adding a content area to your web application page to building an entire web site, just like you would have done in Site Studio Designer. In addition to these changes, enhancements to the core Site Studio have been added as well. One of the big ones is called Designer Mode which allows power-users to bypass the standard rules defined by the placeholder definition or template and perform any number of additional actions. This reduces the need to go back to Site Studio Designer or JDeveloper to make more advanced changes to the site. Dashboards As part of the updated records management functionality in both UCM and URM, users can now set a dashboard view on their home page to surface common functions in a single view. It has pre-built "portlets" users can choose from to display and organize they way they want. Behind the scenes, these dashboards are stored as Content Folios. So the dashboards themselves are content items that can be revisioned and shared between users. And new dashboard portlets can be easily added (like the User Profile one in the screenshots) by getting a copy of an existing one, modifying the display, and then checking it in as a new one to select from. URM Interface Enhancements URM includes several new UI and usability enhancements in 11g. There is a new view for physical records, a place to configure "favorite" items to quickly get to, and new placement of the records management menu. 
BI Publisher Reports Records management in UCM and URM now offer reports generated through embedded BI Publisher. Templates are controlled by rich text files checked directly into the repository, so they can be easily modified. Other Features A new Inbound Refinery conversion option is available that does native Microsoft Office HTML conversion. If your IBR is on Windows and you have the native applications loaded, the IBR can use them to produce HTML. A new GUI template editor for Dynamic Converter is available. It's written in Java so is available through all the supported browsers and platforms. The original ActiveX based editor is also still available. The Component Manager interface has changed to help provide an easier and more descriptive way to enable core components that are installed along with UCM. All of the supported components are immediately available to turn on and do not have to be installed separately as in previous versions. My Downloads is located in the My Content Server menu and provides for easy download of client installs including Desktop Integration Suite and Site Studio Designer. Well, hopefully that gives you a taste for some of the new things in 11g. We're all pretty excited here at Oracle about all the new changes and enhancements. Over the next few months I hope to highlight some of these features more in-depth, so keep your eye out for those posts.

    Read the article

  • 17 new features in Visual Studio 2010

    - by vik20000in
    Visual studio 2010 has been released to RTM a few days back. This release of Visual studio 2010 comes with a big number of improvements on many fronts. In this post I will try and point out some of the major improvements in Visual Studio 2010. 1)      Visual studio IDE Improvement. Visual studio IDE has been rewritten in WPF. The look and feel of the studio has been improved for improved readability. Start page has been redesigned and template so that anyone can change the start page as they wish. 2)      Multiple Monitor - Support for Multiple Monitor was already there in Visual studio. But in this edition it has been improved as much that we can now place the document, design and code window outside the IDE in another monitor. 3)      ZOOM in Code Editor – Making the editors in WPF has made significant improvement for them. The best one that I like is the ZOOM feature. We can now zoom in the code editor with the help of the ctrl + Mouse scroll. The zoom feature does not work on the Design surface or windows with icon like solution view and toolbox. 4)      Box Selection - Another Important improvement in the Visual studio 2010 is the box selection. We can select a rectangular by holding down the Alt Key and selecting with mouse.  Now in the rectangular selection we can insert text, Paste same code in different line etc. This is helpful if you want to convert a number of variables from public to private etc… 5)      New Improved Search – One of the best productivity improvements in Visual studio 2010 is its new search as you type support. This has been done in the Navigate To window which can be brought up by pressing (Ctrl + ,). The navigate To windows also take help of the Camel casing and will be able to search with the help of camel casing when character is entered in upper case. For example we can search AOH for AddOrederHeader. 6)      Call Hierarchy – This feature is only available to the Visual C# and Visual C++ editor. The call hierarchy windows displays the calls made to and from (yes both to and from) a selected method property or a constructor. The call hierarchy also shows the implementation of interface and the overrides of virtual or abstract methods. This window is very helpful in understanding the code flow, and evaluating the effect of making changes. The best part is it is available at design time and not at runtime only like a debugger. 7)      Highlighting references – One of the very cool stuff in Visual Studio 2010 is the fact if you select a variable then all the use of that variable will be highlighted alongside. This should work for all the result of symbols returned by Find all reference. This also works for Name of class, objects variable, properties and methods. We can also use the Ctrl + Shift + Down Arrow or Up Arror to move through them. 8)      Generate from usage - The Generate from usage feature lets you use classes and members before you define them. You can generate a stub for any undefined class, constructor, method, property, field, or enum that you want to use but have not yet defined. You can generate new types and members without leaving your current location in code, This minimizes interruption to your workflow.9)      IntelliSense Suggestion Mode - IntelliSense now provides two alternatives for IntelliSense statement completion, completion mode and suggestion mode. Use suggestion mode for situations where classes and members are used before they are defined. 
In suggestion mode, when you type in the editor and then commit the entry, the text you typed is inserted into the code. When you commit an entry in completion mode, the editor inserts the entry that is highlighted in the members list. When an IntelliSense window is open, you can press CTRL+ALT+SPACEBAR to toggle between completion mode and suggestion mode. 10)   Application Lifecycle Management – A client application for application lifecycle management (version control, work item tracking, build automation, team portal etc.) is available for free (this is not available for the Express edition). 11)   Start Page – The start page has been redesigned with WPF for new functionality and a new look. Tabbed areas are provided for content from different sources, including MSDN. Once you open a project, the start page closes automatically. The list of recent projects also lets you remove projects from the list. And above all, the start page is customizable enough to be changed per individual requirements. 12)   Extension Manager – Visual Studio 2010 provides good ways to be extended. We can also use MEF to extend most of the features of Visual Studio. The new Extension Manager can now go to the Visual Studio Gallery and install extensions without even leaving the IDE. 13)   Code snippets – Visual Studio 2010 now also provides code snippets for HTML, JScript and ASP.NET. 14)   IntelliSense for JavaScript – IntelliSense for JavaScript has been vastly improved (by around 2-5 times). IntelliSense now also shows XML documentation comments as you type. 15)   Web Deployment – Web deployment has been vastly improved. We can package and publish a web application in one click. The three major supported deployment scenarios are web packages, one-click deployment and web configuration transformation. 16)   SharePoint – Visual Studio 2010 also brings a vastly improved development experience for SharePoint. We can create, edit, debug, package, deploy and activate SharePoint projects from within Visual Studio. Deployment of a site is as easy as hitting F5. 17)   Azure – Visual Studio 2010 also comes with handy improvements for developing for the Windows Azure environment. Vikram
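    As a concrete illustration of the "Generate from usage" feature described above, the fragment below uses a class at the call site first; Visual Studio 2010 can then generate a stub like the OrderHeader class shown underneath (the type and its members are invented for this example):

        using System;

        class Program
        {
            static void Main()
            {
                // With Generate From Usage you can write this line before OrderHeader
                // exists and let the IDE create the stub class and its members.
                var header = new OrderHeader { Id = 1, Customer = "Contoso" };
                Console.WriteLine(header.Customer);
            }
        }

        // Roughly the kind of stub Visual Studio 2010 generates from the usage above;
        // you then fill in the real implementation.
        class OrderHeader
        {
            public int Id { get; set; }
            public string Customer { get; set; }
        }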

    Read the article

  • Silverlight 5 Hosting :: Features in Silverlight 5 and Release Date

    - by mbridge
    Silverlight 5 is finally announced in the Silverlight FireStarter Event on the 2nd December, 2010. This new version of Silverlight which was earlier labeled as 'Future of Microsoft Silverlight' has now come much closer to go live as the first Silverlight 5 Beta version is expected to be shipped during the early months of 2011. However for the full fledged and the final release of Silverlight 5, we have to wait many more months as the same is likely to be made available within the Q3 2011. As would have been usually expected, this latest edition would feature many new capabilities thereby extending the developer productivity to a whole new dimension of premium media experience and feature-rich business applications. It comes along with many new feature updates as well as the inclusion of new technologies to improve the standard of the Silverlight applications which are now fine-tuned to produce next generation business and media solutions that is capable to meet the requirements of the advanced web-based app development. The Silverlight 5 is all set to replace the previous fourth version which now includes more than forty new features while also dropping various deprecated elements that was prevalent earlier. It has brought around some major performance enhancements and also included better support for various other tools and technologies. Following are some of the changes that are registered to be available under the Silverlight 5 Beta edition which is scheduled to be launched during the Q1 2011. Silverlight 5 : Premium Media Experiences The media features of Silverlight 5 has seen some major enhancements with a lot of optimizations being made to deliver richer solutions. It's capability has now been extended to make things easier, faster and capable of performing the desired tasks in the most efficient manner. The Silverlight media solutions has already been a part of many companies in the recent days where various on-demand Silverlight services were featured but with the arrival of the next generation premium media solution of Silverlight 5, it is expected to register new heights of success and global user acclamation for using it with many esteemed web-based projects and media solutions. - The most happening element in the new Silverlight 5 will be its support for utilizing the GPU based hardware acceleration which is intended to lower down the CPU load to a significant extent and thereby allowing faster rendering of media contents without consuming much resources. This feature is believed to be particularly helpful for low configured machines to run full HD media content without any lagging caused due to processor load. It will hence be one great feature to revolutionize the new generation high quality media contents to be available within the web in a more efficient manner with its hardware decoded video playback capabilities. - With the inclusion of hardware video decoding to minimize the processor load, the Silverlight 5 also comes with another optimization enhancement to also reduce the power consumption level by making new methods to deal with the power-saver settings. With this optimization in effect, the computer would be automatically allowed to switch to sleep mode while no video playback is in progress and also to prevent any screensavers to popup and cause annoyances during any video playback. There would also be other power saver options which will be made available to best suit the users requirements and purpose. 
- The Silverlight trickplay feature is another great way to tweak any silverlight powered media content as is used for many video tutorial sites or for dealing with any sort of presentations. This feature enables the user to modify the playback speed to either slowdown or speedup during the playback durations based on the requirements without compromising on the quality of output. Normally such manipulations always makes the content's audio to go off-pitch, but the same will not be the case with TrickPlay and the audio would seamlessly progress with the video without skipping any of its part. - In addition to all of the above, the new Silverlight 5 will be featuring wireless control of all the media contents by making use of remote controllers. With the use of such remote devices, it will be easier to handle the various media playback controls thereby providing more freedom while experiencing the premium media services. Silverlight 5 : Business Application Development The application development standard has been extended with more possibilities by bringing forth new and useful technologies and also reviving the existing methods to work better than what it was used to. From the UI improvements to advanced technical aspects, the Silverlight 5 scores high on all grounds to produce great next generation business delivered applications by putting in more creativity and resourceful touch to all the apps being produced with it. - The WPF feature of Silverlight is made more effective by introducing new standards of Databinding which is intended to improve the productivity standards of the Silverlight application developer. It brings in a lot of convenience in debugging the databinding components or expressions and hence making things work in a flawless manner. Some additional features related to databinding includes that of Ancestor RelativeSource, Implicit DataTemplates and Model View ViewModel (MVVM) support with DataContextChanged event and many other new features relating it. - It now comes with a refined text and printing service which facilitates better clarity of the text rendering and also many positive changes which are being applied to the layout pattern. New supports has been added to include OpenType font, multi-column text, linked-text containers and character leading support to name a few among the available features.This also includes some important printing aspects like that of Postscript Vector Printing API which allows to program our printing tasks in a user defined way and Pivot functionality for visualization concerns of informations. - The Graphics support is the key improvements being incorporated which now enables to utilize three dimensional graphics pattern using GPU acceleration. It can manage to provide some really cool visualizations being curved to provide media contents within the business apps with also the support for full HD contents at 1080p quality. - Silverlight 5 includes the support for 64-bit operating systems and relevant browsers and is also optimized to provide better performance. It can support the background thread for the networking which can reduce the latency of the network to a considerable extent. The Out-of-Browser functionality adds the support for utilizing various libraries and also the Win32 API. It also comes with testing support with VS 2010 which is mostly an automated procedure and has also enabled increased security aspects of all the Silverlight 5 developed applications by using the improved version of group policy support.
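    To make the MVVM-related DataContextChanged support mentioned above more concrete, here is a small code-behind sketch; it assumes the Silverlight 5 DataContextChanged event uses the same DependencyPropertyChangedEventHandler signature as WPF, and OrderView/OrderViewModel are invented names:

        using System.Windows;
        using System.Windows.Controls;

        public class OrderView : UserControl
        {
            public OrderView()
            {
                // DataContextChanged fires whenever the inherited DataContext changes,
                // for example when a view model is assigned to this view.
                DataContextChanged += OnDataContextChanged;
            }

            private void OnDataContextChanged(object sender, DependencyPropertyChangedEventArgs e)
            {
                var viewModel = e.NewValue as OrderViewModel;
                if (viewModel != null)
                {
                    viewModel.Load(); // e.g. start loading data for the new context
                }
            }
        }

        public class OrderViewModel
        {
            public void Load() { /* fetch or refresh data here */ }
        }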

    Read the article

  • Surface development: it&rsquo;s just like software development

    - by Dennis Vroegop
    Surface is magic. Everyone using it seems to think that way. And I have to be honest, after working for almost 2 years with the platform I still get that special feeling the moment I turn on the unit to do some more work. The whole user experience, the rich environment of the SDK, the touch, even the look and feel of the Surface environment is so much different from the stuff I’ve been working on all my career that I am still bewildered by it. But… and this is a big but.. in the end we’re still talking about a computer and that needs software to become useful. Deep down the magic of the Surface unit there is a PC somewhere, running Windows Vista and the .net framework 3.5. When you write that magic software that makes the platform come alive you’re still working with .net, WPF/XNA, C#, VB.Net and all those other tools and technologies you know so well. Sure, the whole user experience is different from what you’ve known. And the way of thinking about users, their interaction and the positioning of screen elements requires a whole new paradigm. And that takes time. It took me about half a year before I had the feeling I got it nailed down. But when that moment came (about 18 months ago…) I realized that everything I had learned so far on software development still is true when it comes to Surface. The last 6 months I have been working with some people with a different background to start a new company. The idea was that the new company would be focussing on Surface and Surface only. These people come from a marketing background and had some good ideas for some applications. And I have to admit: their ideas were good. Very good. Where it all fell down of course is that these ideas need to be implemented in a piece of software. And creating great software takes skilled developers and a lot of time and money. That’s where things went wrong: the marketing guys didn’t realize and didn’t want to realize that software development is a job that takes skill. You can’t just hire a bunch of developers and expect them to deliver the best sort of software, especially not when it comes to Surface. I tried to explain that yes, their User Interface in Photoshop looked great, but no: I couldn’t develop an application like that in a weeks time. Even worse: the while backend of the software (WCF for communications, SQL Server for the database, etc) would take a lot more time than the frontend. They didn’t understand. It took them a couple of days to drawn the UI in Photoshop so in Blend I should be able to build the software in about the same amount of time. Well, you and I know that it doesn’t work that way. Software is hard to write, and even harder to write well, and it takes skill and dedication. It’s not something you can do as fast as you can draw a mock up for a Surface application in Photohop. The same holds true for web applications of course. A lot of designers there fail to appreciate the hard work that goes into writing the plumbing for a good web app that can handle thousands of users. Although the UI is very important, it’s not all there is to it. And in Surface development this is the same. The UI should create the feeling of magic, but the software behind it is what makes it come alive. And that takes time. A lot of time. So brush of you skills and don’t throw them away if you start developing for Surface. Because projects (and colaborations) can fail there as hard as they can in any other area of software development. 
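    To underline the point that Surface development is still ordinary .NET work, here is a plain WPF/C# sketch of the kind of code that sits behind any screen, Surface or not; the window and its contents are invented for illustration:

        using System.Windows;
        using System.Windows.Controls;

        // Ordinary WPF: the same skills (layout, events, the WCF/SQL plumbing behind it)
        // carry straight over to a Surface application.
        public class DemoWindow : Window
        {
            public DemoWindow()
            {
                var button = new Button { Content = "Touch or click me", FontSize = 24 };
                button.Click += (sender, e) => button.Content = "Handled like any other WPF event";
                Content = button;
            }

            [System.STAThread]
            static void Main()
            {
                new Application().Run(new DemoWindow());
            }
        }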
On a side note: we decided to stop the collaboration (something the other parties involved didn't appreciate and were very angry about) and to hire a designer for the Surface projects. The focus is back where it belongs: on the software development we know so well and have been doing very well for 13 years. UI is just a part of the whole project and not the end product. So my company Detrio is still going strong when it comes to delivering Surface solutions, but once again from a technological background, not a marketing background.

    Read the article

  • Social Network Updates: While You Were Busy Marketing 2

    - by Mike Stiles
    Since social moves at the speed of data, it’s already time for another update, as we did back in April, on the changes the various social networks have made or gone through while you were busy marketing. Facebook There’s a lot of talk Facebook’s developing a mobile product to act like Flipboard and surface news, from both users and media outlets. The biggest news was Facebook/Instagram’s introduction of 15-second videos, enhanced with with filters, to take some of Vine’s candy. You can also delete parts of videos and rerecord them, and there’s image stabilization. Facebook’s ad revenue is coming along just fine, thank you very much. 35% quarter-to-quarter growth in Q2. And it looks like new formats like Mobile App Install Ads and Unpublished Page Posts are adding to the mix. If you don’t already, you’ll soon see a little camera in comment boxes letting you insert photos right into the comments you make. The drive toward “more visual” continues. The other big news is Facebook’s adoption of our Twitter friend, the hashtag. Adding # sets apart the post topic so it can be easily found or discovered. It’s also being added to Google Plus, Tumblr, and Pinterest. Twitter Want to send someone a promoted tweet when they’re in range of your store? That could be happening by the end of this year. Some users have been seeing automatic in-stream previews of images on Twitter.com. Right now it’s images in your own tweets, but we can assume all tweets are next. Get your followers organized! Twitter raised the limit on the number of lists you can create from 20 to 1,000. They also raised the number of accounts you can have in a list from 500 to 5,000. Twitter started notifying you when someone favorites a tweet you’re mentioned in or re-tweets a tweet you re-tweeted. Anyway, it’s the first time Twitter’s notified you about indirect interactions like that. Who’s afraid of Instagram? A study shows 6-second Vine videos are being posted to Twitter at the rate of 9/second, up from 5/second 2 months ago. Vine has over 13 million users and branded Vines are 4x more likely to be shared than video ads. Google Plus Now featuring a 3-column redesigned stream, and images that take up a whole column. And photo filters Auto Highlight and Auto Awesome work to turn your photos into a real show. Google Hangouts is the workhorse for all Google messaging now, it’s not just an online chat with 9 people anymore. Google Plus Dashboard improves the connection between your company’s Google Plus business page and your Google Plus Local. Updates go out across all Google properties and you can do your managing from the dashboard. With Google Plus’ authorship system, you can build “Author Rank” based on what you write and put on the web. If your stuff is +1’ed and shared a lot, you’re the real deal and there are search result benefits. LinkedIn "Who's Viewed Your Updates" shows you what you’ve shared recently, who saw it and what they did about it in real-time. “Influencers” is, well, influential. Traffic to all LI news products has gone up 8x since it was introduced. LinkedIn is quickly figuring out how to get users to stick around awhile. You and your brand can post images and documents in status updates now. In fact, that whole “document posting” thing is making some analysts wonder if LinkedIn will drift on over to the Dropboxes and YouSendIts of the world. C’mon, admit it. Your favorite part of LinkedIn is being able to see who’s viewed your profile. Now you’ve got even more info and can see what/who you have in common. 
Premium users get even deeper insights about how people are finding them. If you’re a big fan of security, you’ll love that LinkedIn started offering two-factor authentication (2FA). It’s optional, but step 2 is a one-time code texted to your registered mobile. Pinterest A study showed pins have a looong shelf life compared to other social net posts. “Clicks kept coming for 30 days and beyond.” Most pins are timeless, and the infinite scroll causes people to see older pins. Is it a keeper? Pinterest jumped 82% to 54 million users in the past year. It’s valued at $2.5 billion and is one of the biggest sources of referral traffic there is. That said, CEO Ben Silbermann adds, "Right now, we don't make money." A new search feature stops you from having to endlessly scroll through your own pins looking for that waterfall picture you posted. Simply select “just my pins” in the search bar. New "Rich Pins" lets brands add info like price and availability to pins that can be updated daily via a data feed from your merchant site. Not so fast, you have to apply to Pinterest for it first. Like other social nets, Pinterest does not allow sexual content, nudity, or even partial nudity. However…some art contains nudity, and Pinterest wants to allow art. What constitutes “art” will be judged by…what we have to assume are Pinterest employees who love their job. @mikestilesPhoto: stock.xchng, Tim Marmon

    Read the article

  • C# in Depth, Third Edition by Jon Skeet, Manning Publications Co. Book Review

    - by Compudicted
    Originally posted on: http://geekswithblogs.net/Compudicted/archive/2013/10/24/c-in-depth-third-edition-by-jon-skeet-manning-publications.aspx I started reading this ebook on September 28, 2013, the same day it was sent my way by Manning Publications Co. for review while it still being fresh off the press. So 1st thing – thanks to Manning for this opportunity and a free copy of this must have on every C# developer’s desk book! Several hours ago I finished reading this book (well, except a for a large portion of its quite lengthy appendix). I jumped writing this review right away while still being full of emotions and impressions from reading it thoroughly and running code examples. Before I go any further I would like say that I used to program on various platforms using various languages starting with the Mainframe and ending on Windows, and I gradually shifted toward dealing with databases more than anything, however it happened with me to program in C# 1 a lot when it was first released and then some C# 2 with a big leap in between to C# 5. So my perception and experience reading this book may differ from yours. Also what I want to tell is somewhat funny that back then, knowing some Java and seeing C# 1 released, initially made me drawing a parallel that it is a copycat language, how wrong was I… Interestingly, Jon programs in Java full time, but how little it was mentioned in the book! So more on the book: Be informed, this is not a typical “Recipes”, “Cookbook” or any set of ready solutions, it is rather targeting mature, advanced developers who do not only know how to use a number of features, but are willing to understand how the language is operating “under the hood”. I must state immediately, at the same time I am glad the author did not go into the murky depths of the MSIL, so this is a very welcome decision on covering a modern language as C# for me, thank you Jon! Frankly, not all was that rosy regarding the tone and structure of the book, especially the the first half or so filled me with several negative and positive emotions overpowering each other. To expand more on that, some statements in the book appeared to be bias to me, or filled with pre-justice, it started to look like it had some PR-sole in it, but thankfully this was all gone toward the end of the 1st third of the book. Specifically, the mention on the C# language popularity, Java is the #1 language as per https://sites.google.com/site/pydatalog/pypl/PyPL-PopularitY-of-Programming-Language (many other sources put C at the top which I highly doubt), also many interesting functional languages as Clojure and Groovy appeared and gained huge traction which run on top of Java/JVM whereas C# does not enjoy such a situation. If we want to discuss the popularity in general and say how fast a developer can find a new job that pays well it would be indeed the very Java, C++ or PHP, never C#. Or that phrase on language preference as a personal issue? We choose where to work or we are chosen because of a technology used at a given software shop, not vice versa. The book though it technically very accurate with valid code, concise examples, but I wish the author would give more concrete, real-life examples on where each feature should be used, not how. 
Another point to realize before you get the book is that it is almost a live book which started to be written when even C# 3 wasn’t around so a lot of ground is covered (nearly half of the book) on the pre-C# 3 feature releases so if you already have a solid background in the previous releases and do not plan to upgrade, perhaps half of the book can be skipped, otherwise this book is surely highly recommended. Alas, for me it was a hard read, most of it. It was not boring (well, only may be two times), it was just hard to grasp some concepts, but do not get me wrong, it did made me pause, on several occasions, and made me read and re-read a page or two. At times I even wondered if I have any IQ at all (LOL). Be prepared to read A LOT on generics, not that they are widely used in the field (I happen to work as a consultant and went thru a lot of code at many places) I can tell my impression is the developers today in best case program using examples found at OpenStack.com. Also unlike the Java world where having the most recent version is nearly mandated by the OSS most companies on the Microsoft platform almost never tempted to upgrade the .Net version very soon and very often. As a side note, I was glad to see code recently that included a nullable variable (myvariable? notation) and this made me smile, besides, I recommended that person this book to expand her knowledge. The good things about this book is that Jon maintains an active forum, prepared code snippets and even a small program (Snippy) that is happy to run the sample code saving you from writing any plumbing code. A tad now on the C# language itself – it sure enjoyed a wonderful road toward perfection and a very high adoption, especially for ASP development. But to me all the recent features that made this statically typed language more dynamic look strange. Don’t we have F#? Which supposed to be the dynamic language? Why do we need to have a hybrid language? Now the developers live their lives in dualism of the static and dynamic variables! And LINQ to SQL, it is covered in depth, but wasn’t it supposed to be dropped? Also it seems that very little is being added, and at a slower pace, e.g. Roslyn will come in late 2014 perhaps, and will be probably the only main feature. Again, it is quite hard to read this book as various chapters, C# versions mentioned every so often only if I only could remember what was covered exactly where! So the fact it has so many jumps/links back and forth I recommend the ebook format to make the navigations easier to perform and I do recommend using software that allows bookmarking, also make sure you have access to plenty of coffee and pizza (hey, you probably know this joke – who a programmer is) ! In terms of closing, if you stuck at C# 1 or 2 level, it is time to embrace the power of C# 5! Finally, to compliment Manning, this book unlike from any other publisher so far, was the only one as well readable (put it formatted) on my tablet as in Adobe Reader on a laptop.
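    Since the review calls out the myvariable? nullable notation and generics as features worth brushing up on, here is a tiny refresher, purely illustrative and not taken from the book:

        using System;
        using System.Collections.Generic;

        class NullableDemo
        {
            static void Main()
            {
                // int? is shorthand for Nullable<int>: a value type that can also be null.
                int? reviewScore = null;
                Console.WriteLine(reviewScore.HasValue);   // False
                Console.WriteLine(reviewScore ?? -1);      // -1 (null-coalescing fallback)

                reviewScore = 9;
                Console.WriteLine(reviewScore.Value);      // 9

                // A generic helper works the same way for any value type T.
                Console.WriteLine(FirstOrNull(new List<double> { 1.5, 2.5 }));   // 1.5
                Console.WriteLine(FirstOrNull(new List<double>()) ?? -1.0);      // -1
            }

            static T? FirstOrNull<T>(IList<T> items) where T : struct
            {
                return items.Count > 0 ? items[0] : (T?)null;
            }
        }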

    Read the article

  • Automating Form Login

    - by Greg_Gutkin
    Introduction

A common task in configuring a web application for proxying in Pagelet Producer is setting up form autologin. PP provides a wizard-like tool for detecting the login form fields, but this is usually only the first step in configuring this feature. If the generated configuration doesn't seem to work, some additional manual modifications will be needed to complete the setup. This article will try to guide you through this process while steering you away from common pitfalls. For the purposes of this article, let's assume the following characteristics about your environment:

Web Application Base URL: http://host/app (configured as Resource Source URL in PP)
Pagelet Producer Base URL: http://pp/pagelets

Form Field Auto-Detection

Form Autologin is configured in the PP Admin UI under resource_name/Autologin/Form Login. First, you'll enter the URL to the login form under "Login Form Identification". This will enable the admin wizard to connect to and display the login page.

Caution: Redirects
Make sure the entered URL matches what you see in the browser's address bar when the application login page is displayed. For example, even though you may be able to reach the login page by simply typing http://host/app, the URL you end up on may change to http://host/app/login via browser redirect(s). The second URL is the one you will want to use.

Caution: External Login Servers
The login page may actually come from a different server than the application you are trying to proxy. For example, you may notice that the login page URL changes to http://hostB/appB. This is common when external SSO products are involved. There are two ways of dealing with this situation. One is to configure Pagelet Producer to participate in SSO. This approach is out of scope of this article and is discussed in a separate whitepaper (TODO add link). The second approach is to use the autologin feature to provide stored credentials to the SSO login form. Since the login form URL is not an extension of the application base URL (PP resource URL), you will need to add a new PP resource for the SSO server and configure the login form on that resource instead of the original application resource. One side benefit of this additional resource is that it can be reused for other applications relying on the same SSO server for login.

After entering the login page URL (make sure the dropdown says "URL"), click "Automatically Detect Form Fields". This will bring up the web app's login page in a new browser window. Fill it out and submit it as you would normally. If everything goes right, Pagelet Producer will intercept the submitted values and fill out all the needed configuration data in the Admin UI. If the login form window doesn't close or the configuration data doesn't get filled in, you may not have entered the login page URL correctly. Review the two cautionary notes above and make any necessary changes. If the form fields got filled in automatically, it's time to save the configuration and test it out. If you can access a protected area of the backend application via a proxied PP URL without filling out its login form, then you are pretty much done with login form configuration. The only other step you will need to complete before declaring this aspect of the configuration production ready is configuring the form field source. You may skip to that section below.

Manual Login Form Identification

Let's take a closer look at Login Form Identification. This determines how Pagelet Producer recognizes login forms as such. 
URL

The most efficient way of detecting login forms is by looking at the page URL. This method can only be used under the following conditions:

- The login page URL must be different from the post-login application URLs.
- The login page URL must stay constant regardless of the path taken to reach the page. For example, reaching the login page by going to the application base URL or to a specific protected URL must result in a redirect to the same login page URL (query string excluded). If only the query string parameters change, just leave the query string out of the configured login page URL.

If either of these conditions is not fulfilled, you must switch to the RegEx approach below.

RegEx

If the login page URL is not uniform enough across all scenarios, or is indistinguishable from other page locations, PP can be configured to recognize the login page by looking at the page markup itself. This is accomplished by changing the dropdown to "RegEx". If regular expressions scare you, take comfort from the fact that in most cases you won't need to enter any special regex characters. Let's look at an example. Say you have a login form that looks like

<form id='loginForm' action='login?from=pageA' >
  <input id='user'>
  <input id='pass'>
</form>

Since this form has an id attribute, you can be reasonably sure that this login form can be uniquely identified across the web application by this snippet: "id='loginForm'" (unless, of course, your backend web application contains login forms to other apps). Since no wildcards are needed to find this snippet, you can just enter it as is into the RegEx field – no special regular expression characters needed! If the web developer who created the form wasn't kind enough to provide a unique id, you will need to look for other snippets of the page to uniquely identify it. It could be the action URL, an input field id, or some other markup fragment. You should abstain from using UI text as an identifier because it may change in translated versions of the page and prevent the login page logic from working for international users. You may need to turn to regular expression wildcard syntax if no simple matches work. For more information on regular expressions, refer to the Resources section.

Form Submit Location

Now we'll look at the form submit location. If the captured URL contains query string parameters that will likely change from one form submission to the next, you will need to change its type to RegEx. This type tells Pagelet Producer to parse the login page for the action URL and submit to the value found. The regular expression needs to point at the actual action URL with its first grouping expression. Taking the example form definition above, the form submit location regex would be:

action='(.*?)'

The parentheses are used to identify the actual action URL, while the rest of the expression provides the context for finding it. The expression .*? is a so-called reluctant wildcard that matches any character excluding the single quote that follows. See the Resources section below for further information on regular expressions.

Manual Form Field Detection

If the Admin UI form field detection wizard fails to populate the login form configuration page, you will have to enter the fields by hand. Use a built-in browser developer tool or addon (e.g. Firebug) to inspect the form element and its children input elements. For each input element (including hidden elements), create an entry under Form Fields. Change its Source according to the next section. 
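Before moving on to the form field source, a quick way to sanity-check these expressions outside of Pagelet Producer is to run them through any regex engine against a saved copy of the login page. The short C# snippet below is only an illustration of that check (PP evaluates the expression for you at runtime); it applies the form submit location regex from the example above to the sample form markup and prints the captured action URL.

using System;
using System.Text.RegularExpressions;

class FormSubmitLocationCheck
{
    static void Main()
    {
        // Sample login form markup, as in the example above
        string loginPage =
            "<form id='loginForm' action='login?from=pageA' >" +
            "<input id='user'> <input id='pass'> </form>";

        // The form submit location expression; group 1 captures the action URL
        var submitLocation = new Regex(@"action='(.*?)'");

        Match m = submitLocation.Match(loginPage);
        if (m.Success)
        {
            Console.WriteLine(m.Groups[1].Value);   // prints: login?from=pageA
        }
        else
        {
            Console.WriteLine("No action URL found - adjust the expression.");
        }
    }
}

If the group does not capture the URL you expect here, it will not work inside PP either, so this is a cheap way to iterate on the expression before saving the configuration.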
Form Field Source

Change the source of any of the fields not exposed to the users of the login form (i.e. hidden fields) to "Generated". This means Pagelet Producer will just use the values returned by the web app rather than supplying values it has stored. For fields that contain sensitive data or vary from user to user (e.g. username and password), change the source to User (Credential) Vault.

Logging Support

To help you troubleshoot your autologin configuration, PP provides some useful logging support. To turn on detailed logging for the autologin feature, navigate to Settings in the Admin UI. Under Logging, change the log level for AutoLogin to Finest.

Known Limitations

The autologin feature may not work as expected if login form fields (not just the values, but the DOM elements themselves) are generated dynamically by client-side JavaScript.

Resources

RegEx
RegEx Reference from Java
RegEx Test Tool

    Read the article

  • What’s New for Oracle Commerce? Executive QA with John Andrews, VP Product Management, Oracle Commerce

    - by Katrina Gosek
    Oracle Commerce was for the fifth time positioned as a leader by Gartner in the Magic Quadrant for E-Commerce. This inspired me to sit down with Oracle Commerce VP of Product Management, John Andrews, to get his perspective on what continues to make Oracle a leader in the industry and what's new for Oracle Commerce in 2013.

Q: Why do you believe Oracle Commerce continues to be a leader in the industry?
John: Oracle has a great acquisition strategy – it brings best-of-breed technologies into the product fold and then continues to grow and innovate them. This is particularly true with products unified into the Oracle Commerce brand. Oracle acquired ATG in late 2010 – and then Endeca in late 2011. This means that under the hood of Oracle Commerce you have market-leading technologies for cross-channel commerce and customer experience, both designed and developed in direct response to the unique challenges online businesses face. And we continue to innovate on capabilities core to what our customers need to be successful – contextual and personalized experience delivery, merchant-inspired tools, and architecture for performance and scalability.

Q: It's not a slow-moving industry. What are you doing to keep the pace of innovation at Oracle Commerce?
John: Oracle owes our customers the most innovative commerce capabilities. By unifying the core components of ATG and Endeca we are delivering on this promise. Oracle Commerce is continuing to innovate and redefine how commerce is done, in a way that drives business results and keeps customers coming back for experiences tailored just for them. Our January and May 2013 releases not only marked the seventh significant release for the solution since the acquisitions of ATG and Endeca, we also continue to demonstrate rapid and significant progress on the unification of the commerce and customer experience capabilities of the two commerce technologies.

Q: Can you tell us what was notable about these latest releases under the Oracle Commerce umbrella?
John: Specifically, our latest product innovations give businesses selling online the ability to get to market faster with more personalized commerce experiences in the following ways:

Mobile: the latest Commerce Reference Application in this release offers a wider range of examples for online businesses to leverage for iOS development, and specifically new iPad reference capabilities. This release marks the first release of the iOS Universal application that serves both the iPhone and iPad devices from a single download or binary. Business users can now drive page content management and layout of search results and category pages, as well as create additional storefront elements such as categories, facets / dimensions, and breadcrumbs through Experience Manager tools.

Cross-Channel Commerce: key commerce platform capabilities have been added to support cross-channel commerce, including an expanded inventory model to maintain inventory for stores, pickup in stores and Web-based returns. Online businesses with in-store operations can now offer advanced shipping options on the web and make returns and exchange logic easily available on the web.

Multi-Site Capabilities: significant enhancements to the Commerce Platform multi-site architecture allow business users to quickly launch and manage multiple sites on the same cluster and share data, carts, and other components. 
First introduced in 2010, with this latest release business users can now partition or share customer profiles, control users' site-based access, and manage personalization assets using site groups.

Internationalization: continued language support and enhancements for business user tools as well as search and navigation. Guided Search now supports 35 total languages, with 11 new languages (including Danish, Arabic, Norwegian, Serbian Cyrillic) added in this release. Commerce Platform tools now include localized support for 17 locales, with 4 new languages (Danish, Portuguese (European), Finnish, and Thai). No development or customization is required in order for business users to use the applications in any of these supported languages.

Business Tool Experience: valuable new Commerce Merchandising features include a new workflow for making emergency changes quickly and increased visibility into promotions rules and qualifications in preview mode. Oracle Commerce business tools continue to become more and more feature-rich, providing intuitive, easy-to-use (yet powerful) capabilities that allow business users to manage content and the shopping experience.

Commerce & Experience Unification: demonstrable unification of commerce and customer experience capabilities includes productized cartridges that provide supported integration between the Commerce Platform and Experience Management tools, cross-channel returns, Oracle Service Cloud integration, and an integrated iPad application.

The mission guiding our product development is to deliver differentiated, personalized user experiences across any device in a contextual manner – and to give the business the best tools to tune and optimize those user experiences to meet their business objectives. We also need to do this in a way that makes it operationally efficient for the business, keeping the overall total cost of ownership low – yet also allows the business to expand, whether it be to new business models, geographies or brands.

To learn more about the latest Oracle Commerce releases and mission, visit the links below:
• Hear more from John about the Oracle Commerce mission
• Hear from Oracle Commerce customers
• Documentation on the new releases
• Listen to the Oracle ATG Commerce 10.2 Webcast
• Listen to the Oracle Endeca Commerce 3.1.2 Webcast

    Read the article

  • 202 blog articles

    - by mprove
    All my blog articles under blogs.oracle.com since August 2005: 202 blog articles Apr 2012 blogs.oracle.com design patch Mar 2012 Interaction 12 - Critique Mar 2012 Typing. Clicking. Dancing. Feb 2012 Desktop Mobility in Hospitals with Oracle VDI /video Feb 2012 Interaction 12 in Dublin - Highlights of Day 3 Feb 2012 Interaction 12 in Dublin - Highlights of Day 2 Feb 2012 Interaction 12 in Dublin - Highlights of Day 1 Feb 2012 Shit Interaction Designers Say Feb 2012 Tips'n'Tricks for WebCenter #3: How to display custom page titles in Spaces Jan 2012 Tips'n'Tricks for WebCenter #2: How to create an Admin menu in Spaces and save a lot of time Jan 2012 Tips'n'Tricks for WebCenter #1: How to apply custom resources in Spaces Jan 2012 Merry XMas and a Happy 2012! Dec 2011 One Year Oracle SocialChat - The Movie Nov 2011 Frank Ludolph's Last Working Day Nov 2011 Hans Rosling at TED Oct 2011 200 Countries x 200 Years Oct 2011 Blog Aggregation for Desktop Virtualization Oct 2011 Oracle VDI at OOW 2011 Sep 2011 Design for Conversations & Conversations for Design Sep 2011 All Oracle UX Blogs Aug 2011 Farewell Loriot Aug 2011 Oracle VDI 3.3 Overview Aug 2011 Sutherland's Closing Remarks at HyperKult Aug 2011 Surface and Subface Aug 2011 Back to Childhood in UI Design Jul 2011 The Art of Engineering and The Engineering of Art Jul 2011 Oracle VDI Seminar - June-30 Jun 2011 SGD White Paper May 2011 TEDxHamburg Live Feed May 2011 Oracle VDI in 3 Minutes May 2011 Space Ship Earth 2011 May 2011 blog moving times Apr 2011 Frozen tag cloud Apr 2011 Oracle: Hardware Software Complete in 1953 Apr 2011 Interaction Design with Wireframes Apr 2011 A guide to closing down a project Feb 2011 Oracle VDI 3.2.2 Jan 2011 free VDI charts Jan 2011 Sun Founders Panel 2006 Dec 2010 Sutherland on Leadership Dec 2010 SocialChat: Efficiency of E20 Dec 2010 ALWAYS ON Desktop Virtualization Nov 2010 12,000 Desktops at JavaOne Nov 2010 SocialChat on Sharing Best Practices Oct 2010 Globe of Visitors Oct 2010 SocialChat about the Next Big Thing Oct 2010 Oracle VDI UX Story - Wireframes Oct 2010 What's a PC anyway? Oct 2010 SocialChat on Getting Things Done Oct 2010 SocialChat on Infoglut Oct 2010 IT Twenty Twenty Oct 2010 Desktop Virtualization Webcasts from OOW Oct 2010 Oracle VDI 3.2 Overview Sep 2010 Blog Usability Top 7 Sep 2010 100 and counting Aug 2010 Oracle'izing the VDI Blogs Aug 2010 SocialChat on Apple Aug 2010 SocialChat on Video Conferencing Aug 2010 Oracle VDI 3.2 - Features and Screenshots Aug 2010 SocialChat: Don't stop making waves Aug 2010 SocialChat: Giving Back to the Community Aug 2010 SocialChat on Learning in Meetings Aug 2010 iPAD's Natural User Interface Jul 2010 Last day for Sun Microsystems GmbH Jun 2010 SirValUse Celebration Snippets Jun 2010 10 years SirValUse - Happy Birthday! Jun 2010 Wim on Virtualization May 2010 New Home for Oracle VDI Apr 2010 Renaissance Slide Sorter Comments Apr 2010 Unboxing Sun Ray 3 Plus Apr 2010 Desktop Virtualisierung mit Sun VDI 3.1 Apr 2010 Blog Relaunch Mar 2010 Social Messaging Slides from CeBIT Mar 2010 Social Messaging Talk at CeBIT Feb 2010 Welcome Oracle Jan 2010 My last presentation at Sun Jan 2010 Ivan Sutherland on Leadership Jan 2010 Learning French with Sun VDI Jan 2010 Learning Danish with Sun Ray Jan 2010 VDI workshop in Nieuwegein Jan 2010 Happy New Year 2010 Jan 2010 On Creating Slides Dec 2009 Best VDI Ever Nov 2009 How to store the Big Bang Nov 2009 Social Enterprise Tools. Beipiel Sun. 
Nov 2009 Nov-19 Nov 2009 PDF and ODF links on your blog Nov 2009 Q&A on VDI and MySQL Cluster Nov 2009 Zürich next week: Swiss Intranet Summit 09 Nov 2009 Designing for a Sustainable World - World Usabiltiy Day, Nov-12 Nov 2009 How to export a desktop from VDI 3 Nov 2009 Virtualisation Roadshow in the UK Nov 2009 Project Wonderland at EDUCAUSE 09 Nov 2009 VDI Roadshow in Dublin, Nov-26, 2009 Nov 2009 Sun VDI at EDUCAUSE 09 Nov 2009 Sun VDI 3.1 Architecture and New Features Oct 2009 VDI 3.1 is Early-Access Sep 2009 Virtualization for MySQL on VMware Sep 2009 Silpion & 13. Stock Sommerparty Sep 2009 Sun Ray and VMware View 3.1.1 2009-08-31 New Set of Sun Ray Status Icons 2009-08-25 Virtualizing the VDI Core? 2009-08-23 World Usability Day Hamburg 2009 - CfP 2009-07-16 Rising Sun 2009-07-15 featuring twittermeme 2009-06-19 ISC09 Student Party on June-20 /Hamburg 2009-06-18 Before and behind the curtain of JavaOne 2009-06-09 20k desktops at JavaOne 2009-06-01 sweet microblogging 2009-05-25 VDI 3 - Why you need 3 VDI hosts and what you can do about that? 2009-05-21 IA Konferenz 2009 2009-05-20 Sun VDI 3 UX Story - Power of the Web 2009-05-06 Planet of Sun and Oracle User Experience Design 2009-04-22 Sun VDI 3 UX Story - User Research 2009-04-08 Sun VDI 3 UX Story - Concept Workshops 2009-04-06 Localized documentation for Sun Ray Connector for VMware View Manager 1.1 2009-04-03 Sun VDI 3 Press Release 2009-03-25 Sun VDI 3 launches today! 2009-03-25 Sun Ray Connector for VMware View Manager 1.1 Update 2009-03-11 desktop virtualization wiki relaunch 2009-03-06 VDI 3 at CeBIT hall 6, booth E36 2009-03-02 Keyboard layout problems with Sun Ray Connector for VMware VDM 2009-02-23 wikis.sun.com tips & tricks 2009-02-23 Sun VDI 3 is in Early Access 2009-02-09 VirtualCenter unable to decrypt passwords 2009-02-02 Sun & VMware Desktop Training 2009-01-30 VDI at next09? 2009-01-16 Sun VDI: How to use virtual machines with multiple network adapters 2009-01-07 Sun Ray and VMware View 2009-01-07 Hamburg World Usability Day 2008 - Webcasts 2009-01-06 Sun Ray Connector for VMware VDM slides 2008-12-15 mother of all demos 2008-12-08 Build your own Thumper 2008-12-03 Troubleshooting Sun Ray Connector for VMware VDM 2008-12-02 My Roller Tag Cloud 2008-11-28 Sun Ray Connector: SSL connection to VDM 2008-11-25 Setting up SSL and Sun Ray Connector for VMware VDM 2008-11-13 Inspiration for Today and Tomorrow 2008-10-23 Sun Ray Connector for VMware VDM released 2008-10-14 From Sketchpad to ILoveSketch 2008-10-09 Desktop Virtualization on Xing 2008-10-06 User Experience Forum on Xing 2008-10-06 Sun Ray Connector for VMware VDM certified 2008-09-17 Virtual Clouds over Las Vegas 2008-09-14 Bill Verplank sketches metaphors 2008-09-04 End of Early Access - Sun Ray Connector for VMware 2008-08-27 Early Access: Sun Ray Connector for VMware Virtual Desktop Manager 2008-08-12 Sun Virtual Desktop Connector - Insides on Recycling Part 2 2008-07-20 Sun Virtual Desktop Connector - Insides on Recycling Part 3 2008-07-20 Sun Virtual Desktop Connector - Insides on Recycling 2008-07-20 lost in wiki space 2008-07-07 Evolution of the Desktop 2008-06-17 Virtual Desktop Webcast 2008-06-16 Woodstock 2008-06-16 What's a Desktop PC anyway? 2008-06-09 Virtual-T-Box 2008-06-05 Virtualization Glossary 2008-05-06 Five User Experience Principles 2008-04-25 Virtualization News Feed 2008-04-21 Acetylcholinesterase - Second Season 2008-04-18 Acetylcholinesterase - End of Signal 2007-12-31 Produkt-Management ist... 
2007-10-22 Usability Verbände, Verteiler und Netzwerke. 2007-10-02 The Meaning is the Message 2007-09-28 Visualization Methods 2007-09-10 Inhouse und Open Source Projekte – Usability verankern und Synergien nutzen 2007-09-03 Der Schwabe Darth Vader entdeckt das Virale Marketing 2007-08-29 Dick Hardt 3.0 on Identity 2.0 2007-08-27 quality of written text depends on the tool 2007-07-27 podcasts for reboot9 2007-06-04 It is the user's itch that need to be scratched 2007-05-25 A duel at reboot9 2007-05-14 Taxonomien und Folksonomien - Tagging als neues HCI-Element 2007-05-10 Dueling Interaction Models of Personal-Computing and Web-Computing 2007-03-01 22.März: Weizenbaum. Rebel at Work. /Filmpremiere Hamburg 2007-02-25 Bruce Sterling at UbiComp 2006 /webcast 2006-11-12 FSOSS 2006 /webcasts 2006-11-10 Highway 101 2006-11-09 User Experience Roundtable Hamburg: EuroGEL 2006 2006-11-08 Douglas Adams' Hyperland (BBC 1990) 2006-10-08 Taxonomien und Folksonomien – Tagging als neues HCI-Element 2006-09-13 Usability im Unternehmen 2006-09-13 Doug does HyperScope 2006-08-26 TED Talks and TechTalks 2006-08-21 Kai Krause über seine Freundschaft zu Douglas Adams 2006-07-20 Rebel At Work: Film Portrait on Weizenbaum 2006-07-04 Gabriele Fischer, mp3 2006-06-07 Dick Hardt at ETech 06 2006-06-05 Weinberger: From Control to Conversation 2006-04-16 Eye Tracking at User Experience Roundtable Hamburg 2006-04-14 dropping knowledge 2006-04-09 GEL 2005 2006-03-13 slide photos of reboot7 2006-03-04 Dick Hardt on Identity 2.0 2006-02-28 User Experience Newsletter #13: Versioning 2006-02-03 Ester Dyson on Choice and Happyness 2006-02-02 Requirements-Engineering im Spannungsfeld von Individual- und Produktsoftware 2006-01-15 User Experience Newsletter #12: Intuition Quiz 2005-11-30 User Experience und Requirements-Engineering für Software-Projekte 2005-10-31 Ivan Sutherland on "Research and Fun" 2005-10-18 Ars Electronica / Mensch und Computer 2005 2005-09-14 60 Jahre nach Memex: Über die Unvereinbarkeit von Desktop- und Web-Paradigma 2005-08-31 reboot 7 2005-06-30

    Read the article

  • Oracle Unveils Industry’s Broadest Cloud Strategy

    - by kellsey.ruppel
    Oracle Unveils Industry's Broadest Cloud Strategy
Adds Social Cloud and Showcases Early Customers
Redwood Shores, Calif. – June 6, 2012

"Almost seven years of relentless engineering and innovation plus key strategic acquisitions. An investment of billions. We are now announcing the most comprehensive Cloud on the planet Earth," said Oracle CEO, Larry Ellison. "Most cloud vendors only have niche assets. They don't have platforms to extend. Oracle is the only vendor that offers a complete suite of modern, socially-enabled applications, all based on a standards-based platform."

News Facts

In a major strategy update today, Larry Ellison announced the industry's broadest and most advanced Cloud strategy and introduced Oracle Cloud Social Services, a broad Enterprise Social Platform offering. Oracle Cloud delivers a broad set of industry-standards-based, integrated services that provide customers with subscription-based access to Oracle Platform Services, Application Services, and Social Services, all completely managed, hosted and supported by Oracle. Offering a wide range of business applications and platform services, the Oracle Cloud is the only cloud to enable customers to avoid the data and business process fragmentation that occurs when using multiple, siloed public clouds. Oracle Cloud is powered by leading enterprise-grade infrastructure, including Oracle Exadata and Oracle Exalogic, providing customers and partners with a high-performance, reliable, and secure infrastructure for running critical business applications. Oracle Cloud enables easy self-service for both business users and developers. Business users can order, configure, extend, and monitor their applications. Developers and administrators can easily develop, deploy, monitor and manage their applications. As part of the event, Oracle also showcased several early Oracle Cloud customers and partners, including system integrators and independent software vendors.

Oracle Cloud Platform Services

Built on a common, complete, standards-based and enterprise-grade set of infrastructure components, Oracle Cloud Platform Services enable customers to speed time to market and lower costs by quickly building, deploying and managing bespoke applications. Oracle Cloud Platform Services will include:

- Database Services to manage data and build database applications with the Oracle Database.
- Java Services to develop, deploy and manage Java applications with Oracle WebLogic.
- Developer Services to allow application developers to collaboratively build applications.
- Web Services to build Web applications rapidly using PHP, Ruby, and Python.
- Mobile Services to allow developers to build cross-platform native and HTML5 mobile applications for leading smartphones and tablets.
- Documents Services to allow project teams to collaborate and share documents through online workspaces and portals.
- Sites Services to allow business users to develop and maintain visually engaging .com sites.
- Analytics Services to allow business users to quickly build and share analytic dashboards and reports through the Cloud.

Oracle Cloud Application Services

Oracle Cloud Application Services provides customers access to the industry's broadest range of enterprise applications available in the cloud today, with built-in business intelligence, social and mobile capabilities. 
Easy to set up, configure, extend, use and administer, Oracle Cloud Application Services will include:

- ERP Services: A complete set of Financial Accounting, Project Management, Procurement, Sourcing, and Governance, Risk & Compliance solutions.
- HCM Services: A complete Human Capital Management solution including Global HR, Workforce Lifecycle Management, Compensation, Benefits, Payroll and other solutions.
- Talent Management Services: A complete Talent Management solution including Recruiting, Sourcing, Performance Management, and Learning.
- Sales and Marketing Services: A complete Sales and Marketing solution including Sales Planning, Territory Management, Leads & Opportunity Management, and Forecasting.
- Customer Experience Services: A complete Customer Service solution including Web Self-Service, Contact Centers, Knowledge Management, Chat, and e-mail Management.

Oracle Cloud Social Services

Oracle Cloud Social Services provides the broadest and most complete enterprise social platform available in the cloud today. With Oracle Cloud Social Services, enterprises can engage with their customers on a range of social media properties in a comprehensive and meaningful fashion, including social marketing, commerce, service and listening. The platform also provides enterprises with a rich social networking solution for their employees to collaborate effectively inside the enterprise. Oracle's integrated social platform will include:

- Oracle Social Network to enable secure enterprise collaboration and purposeful social networking for business.
- Oracle Social Data Services to aggregate data from social networks and enterprise data sources to enrich business applications.
- Oracle Social Marketing and Engagement Services to enable marketers to centrally create, publish, moderate, manage, measure and report on their social marketing campaigns.
- Oracle Social Intelligence Services to enable marketers to analyze social media interactions and to enable customer service and sales teams to engage with customers and prospects effectively.

Supporting Resources

- Oracle Cloud – learn more
- cloud.oracle.com – sign up now
- Webcast – watch the replay

About Oracle

Oracle engineers hardware and software to work together in the cloud and in your data center. For more information about Oracle (NASDAQ:ORCL), visit www.oracle.com.

Trademarks

Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners.

    Read the article

  • CQRS &ndash; Questions and Concerns

    - by Dylan Smith
    I've been doing a lot of learning on CQRS and Event Sourcing over the last little while, and I have a number of questions that I haven't been able to answer.

1. What is the benefit of CQRS when compared to a typical DDD architecture that uses Event Sourcing and properly captures intent and behavior via verb-based commands (other than scalability)?
2. When using CQRS, what do you do with complex query-based logic?

I'm going to elaborate on #1 in this blog post and I'll do a follow-up post on #2. I watched Greg Young's video on the business benefits of CQRS + Event Sourcing, and first let me say that I thought it was an excellent presentation that really drives home a lot of the benefits of this approach to architecture (I watched it twice in a row, I enjoyed it so much!). But it didn't answer some of my questions fully (I wish I had been there to ask these of Greg in person!). So let me pick apart some of the points he makes and how they relate to my first question above. I'm completely sold on the idea of event sourcing and have a clear understanding of the benefits that it brings to the table, so I'm not going to question that. But you can use event sourcing without going to a CQRS architecture, so my main question is around the benefits of CQRS + Event Sourcing vs. Event Sourcing + a typical DDD architecture.

Architecture with Event Sourcing + Commands on Left, CQRS on Right

Greg talks about how the stereotypical architecture doesn't support DDD, but is that only because his diagram shows DTOs coming up from the client? If we use the same diagram but allow the client to send commands, doesn't that remove a lot of the arguments that Greg makes against the stereotypical architecture?

- We can now introduce verbs into the system.
- We can capture intent now (storing it still requires event sourcing, but you can implement event sourcing without doing CQRS).
- We can create a rich domain model (as opposed to an anemic domain model).

Scalability is obviously a benefit that CQRS brings to the table, but like Greg says, very few of the systems we create truly need significant scalability. Greg also talks about the ability to scale your development efforts. He says CQRS allows you to split the system into 3 parts (Client, Domain/Commands, Reads) and assign 3 teams of developers to work on them in parallel, letting you scale your development efforts by 3x with nearly linear gains. But in the stereotypical architecture don't you already have 2 separate modules that you can split your dev efforts between: the client that sends commands/queries and receives DTOs, and the domain which accepts commands/queries and generates events/DTOs? If this is true, it's not really a 3x scaling you achieve with CQRS but merely a 1.5x scaling, which, while great, doesn't sound nearly as dramatic ("I can do it with 10 devs in 12 months – let me hire 5 more and we can have it done in 8 months"). Making the query side "stupid simple" such that you can assign junior developers (or even outsource it) sounds like a valid benefit, but I have some concerns over what you do with complex query-based logic/behavior. I'm going to go into more detail on this in a follow-up blog post shortly. He also seemed to focus on how "stupid-simple" it is doing queries against the de-normalized data store, but I imagine there is still significant complexity in the event handlers that interpret the events and apply them to the de-normalized tables. 
It also sounds like Greg suggests that doing CQRS is what allows us to apply Event Sourcing when we otherwise wouldn't be able to (~33:30 in the video). I don't believe this is true. I don't see why you wouldn't be able to apply Event Sourcing without separating out the Commands and Queries. The queries would just operate against the domain model instead of the database, but you'd still get the benefits of Event Sourcing. Without CQRS the queries would only be able to operate against the current state rather than the event history, but even in CQRS the domain behaviors can only operate against the current state, and I don't see that being a big limiting factor. If some query needs to operate against something that is not captured by the current state, you would just have to update the domain model to capture that information (no different than if that statement were made about a Command under CQRS). Some of the benefits I do see being applicable are that your domain model might end up being simpler/smaller, since it only needs to represent the state needed to process commands and not worry about the reads (like the Deactivate Inventory Item and associated comment example that Greg provides), and also commands that can be handled in a Transaction Script style manner by the command handler simply generating events and not touching the domain model. It also makes it easier for your senior developers to focus on the command behavior and ignore the queries, which is usually going to be a better use of their time. And of course scalability. If anybody out there has any thoughts on this and can help educate me further, please either leave a comment or feel free to get in touch with me via email:
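For readers who want something concrete to react to, here is a minimal sketch of what "verb-based commands plus event sourcing, without a separate read side" can look like. This is my own illustration to frame the discussion, not code from Greg's talk or from any particular framework: the command captures intent, the aggregate turns it into an event, and current state is just a fold over the event history.

using System;
using System.Collections.Generic;

// A verb-based command that captures intent, rather than a property-bag DTO.
public class DeactivateInventoryItem
{
    public Guid ItemId { get; set; }
    public string Reason { get; set; }
}

// The event that records what happened.
public class InventoryItemDeactivated
{
    public Guid ItemId { get; set; }
    public string Reason { get; set; }
    public DateTime OccurredAtUtc { get; set; }
}

// An event-sourced aggregate: state is rebuilt by replaying events.
public class InventoryItem
{
    private readonly List<object> _uncommittedEvents = new List<object>();

    private InventoryItem() { IsActive = true; }

    public Guid Id { get; private set; }
    public bool IsActive { get; private set; }
    public IEnumerable<object> UncommittedEvents { get { return _uncommittedEvents; } }

    public static InventoryItem LoadFrom(Guid id, IEnumerable<object> history)
    {
        var item = new InventoryItem { Id = id };
        foreach (var e in history) item.Apply(e);   // fold over the event history
        return item;
    }

    // Command handler on the aggregate: validates intent, emits an event.
    public void Deactivate(DeactivateInventoryItem command)
    {
        if (!IsActive) throw new InvalidOperationException("Item is already deactivated.");
        Raise(new InventoryItemDeactivated
        {
            ItemId = Id,
            Reason = command.Reason,
            OccurredAtUtc = DateTime.UtcNow
        });
    }

    private void Raise(object e)
    {
        Apply(e);                   // update current state
        _uncommittedEvents.Add(e);  // remember it for the event store
    }

    private void Apply(object e)
    {
        if (e is InventoryItemDeactivated) IsActive = false;
    }
}

Queries in this shape would simply read InventoryItem (or a repository of them) directly; whether that is good enough without a de-normalized read side is exactly the trade-off this post is weighing.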

    Read the article

  • Oracle Enterprise Data Quality: Ever Integration-ready

    - by Mala Narasimharajan
    It is closing in on a year now since Oracle's acquisition of Datanomic and the addition of Oracle Enterprise Data Quality (EDQ) to the Oracle software family. The big move has caused some big shifts in emphasis and some very encouraging excitement from the field. To give an illustration, combined with a shameless promotion of how EDQ can help to give quick insights into your data, I did a quick Phrase Profile of the subject field of emails to the Global EDQ mailing list since it was set up last September. The results revealed a very clear theme: Integration, Integration, Integration! As well as the important Siebel and Oracle Data Integrator (ODI) integrations, we have been asked about integration with a huge variety of Oracle applications, including EBS, Peoplesoft, CRM on Demand, Fusion, DRM, Endeca, RightNow, and more – and we have not stood still! While it would not have been possible to develop specific pre-integrations with all of the above within a year, we have developed a package of feature-rich out-of-the-box web services and batch processes that can be plugged into any application or middleware technology with ease. And with Siebel, they work out of the box.

Oracle Enterprise Data Quality version 9.0.4 includes the Customer Data Services (CDS) pack – a ready set of standard processes with standard interfaces, providing integrated:

- Address verification and cleansing
- Individual matching
- Organization matching

The services are suitable for either Batch or Real-Time processing, and are enabled for international data, with simple configuration options driving the set of locale-specific dictionaries that are used. For example, large dictionaries are provided to support international name transcription and variant matching, including highly specialized handling for Arabic, Japanese, Chinese and Korean data. In total, across all locales, CDS includes well over a million dictionary entries.

Excerpt from EDQ's CDS Individual Name Standardization Dictionary

CDS has been developed to replace the OEM of Informatica Identity Resolution (IIR) for attached Data Quality on the Oracle price list, but does this in a way that creates a 'best of both worlds' situation for customers, who can harness not only the out-of-the-box functionality of pre-packaged matching and standardization services, but also the flexibility of OEDQ if they want to customize the interfaces or the process logic, without having to learn more than one product. From a competitive point of view, we believe this stands us in good stead against our key competitors, including Informatica, who have separate 'Identity Resolution' and general DQ products, and IBM, who provide limited out-of-the-box capabilities (with a steep learning curve) in both their QualityStage data quality and Initiate matching products. Here is a brief guide to the main services provided in the pack:

Address Verification and Standardization

EDQ's CDS Address Cleaning Process

The Address Verification and Standardization service uses EDQ Address Verification (an OEM of Loqate software) to verify and clean addresses in either real-time or batch. 
The Address Verification processor is wrapped in an EDQ process – this adds significant capabilities over calling the underlying Address Verification API directly, specifically:

- Country-specific thresholds to determine when to accept the verification result (and therefore to change the input address) based on the confidence level returned by the API
- Optimization of address verification by pre-standardizing data where required
- Formatting of output addresses into the input address fields normally used by applications
- Adding descriptions of the address verification and geocoding return codes

The process can then be used to provide real-time and batch address cleansing in any application, such as a simple web page calling address cleaning and geocoding as part of a check on individual data.

Duplicate Prevention

Unlike Informatica Identity Resolution (IIR), EDQ uses stateless services for duplicate prevention to avoid issues caused by complex replication and synchronization of large-volume customer data. When a record is added or updated in an application, the EDQ Cluster Key Generation service is called and returns a number of key values. These are used to select other records ('candidates') that may match in the application data (which has been pre-seeded with keys using the same service). The 'driving record' (the new or updated record) is then presented along with all selected candidates to the EDQ Matching Service, which decides which of the candidates are a good match with the driving record and scores them according to the strength of match. In this model, complex multi-locale EDQ techniques can be used to generate the keys and ensure that the right balance between performance and matching effectiveness is maintained, while ensuring that the application retains control of data integrity and transactional commits. (A sketch of this call sequence from the application's point of view follows the Batch Matching section below.) The process is explained below:

EDQ Duplicate Prevention Architecture

Note that where the integration is with a hub, there may be an additional call to the Cluster Key Generation service if the master record has changed due to merges with other records (and therefore needs to have new key values generated before commit).

Batch Matching

In order to allow customers to use different match rules in batch than in real-time, separate matching templates are provided for batch matching. For example, some customers want to minimize intervention in key user flows (such as adding new customers) in front-end applications, but conduct a more exhaustive match on a regular basis in the back office. The batch matching jobs are also used when migrating data between systems, and in this case normally a more precise (and automated) type of matching is required, in order to minimize the review work performed by Data Stewards. In batch matching, data is captured into EDQ using its standard interfaces, and records are standardized, clustered and matched in an EDQ job before matches are written out. As with all EDQ jobs, batch matching may be called from Oracle Data Integrator (ODI) if required. When working with Siebel CRM (or master data in Siebel UCM), Siebel's Data Quality Manager is used to instigate batch jobs, and a shared staging database is used to write records for matching and to consume match results. The CDS batch matching processes automatically adjust to Siebel's 'Full Match' (match all records against each other) and 'Incremental Match' (match a subset of records against all of their selected candidates) modes. 
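As promised above, here is a small C# sketch of how an application might orchestrate the duplicate prevention call sequence (generate keys, select candidates, score matches). The wrapper interface names and signatures below (IClusterKeyService, ICandidateRepository, IMatchingService and the record classes) are purely hypothetical illustrations of the flow described in this post; they are not EDQ's actual web service API, which you would call through the generated client for your platform.

using System.Collections.Generic;
using System.Linq;

// Illustrative data shapes only; real records would mirror your application schema.
public class CustomerRecord { public string Name; public string Address; }
public class MatchResult { public CustomerRecord Candidate; public double Score; }

// Hypothetical wrappers around the CDS services described above.
public interface IClusterKeyService
{
    IList<string> GenerateKeys(CustomerRecord record);              // step 1
}
public interface ICandidateRepository
{
    IList<CustomerRecord> FindByClusterKeys(IList<string> keys);    // step 2 (application data, pre-seeded with keys)
}
public interface IMatchingService
{
    IList<MatchResult> Match(CustomerRecord driving, IList<CustomerRecord> candidates); // step 3
}

public class DuplicatePreventionCheck
{
    private readonly IClusterKeyService _keys;
    private readonly ICandidateRepository _candidates;
    private readonly IMatchingService _matching;

    public DuplicatePreventionCheck(IClusterKeyService keys,
                                    ICandidateRepository candidates,
                                    IMatchingService matching)
    {
        _keys = keys;
        _candidates = candidates;
        _matching = matching;
    }

    // Returns possible duplicates, strongest match first; the application
    // still owns data integrity and the transactional commit.
    public IList<MatchResult> FindPossibleDuplicates(CustomerRecord drivingRecord)
    {
        IList<string> keys = _keys.GenerateKeys(drivingRecord);
        IList<CustomerRecord> candidates = _candidates.FindByClusterKeys(keys);
        return _matching.Match(drivingRecord, candidates)
                        .OrderByDescending(r => r.Score)
                        .ToList();
    }
}

The point of the sketch is the division of labour: EDQ does the locale-aware key generation and match scoring, while the application decides what to do with the scored candidates before committing.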
The Future

The Customer Data Services Pack is an important part of the Oracle strategy for EDQ, offering a clear path to making Data Quality Assurance an integral part of enterprise applications, and providing a strong value proposition for adopting EDQ. We are planning various additions and improvements, including:

- An out-of-the-box Data Quality Dashboard
- Even more comprehensive international data handling
- Address search (suggesting multiple results)
- Integrated address matching

The EDQ Customer Data Services Pack is part of the Enterprise Data Quality Media Pack, available for download at http://www.oracle.com/technetwork/middleware/oedq/downloads/index.html.

    Read the article

  • Announcing the June 2012 Release of the Ajax Control Toolkit

    - by Stephen.Walther
    I'm excited to announce the June 2012 release of the Ajax Control Toolkit. You can download the new release by visiting http://AjaxControlToolkit.CodePlex.com or (better) download the new release with NuGet:

Install-Package AjaxControlToolkit

The Ajax Control Toolkit continues to be super popular. The previous release (May 2012) had over 87,000 downloads from CodePlex.com and over 16,000 downloads from NuGet. That's over 100,000 downloads in less than 2 months.

Security Improvements for the HtmlEditorExtender

Unfortunately, in the previous release, we made the HtmlEditorExtender too secure! We upgraded the version of the Microsoft Anti-Cross Site Scripting Library included in the Ajax Control Toolkit to the latest version (version 4.2.1), and the latest version turned out to be way too aggressive about stripping HTML. It not only strips dangerous tags such as <script> tags, it also strips innocent tags such as <b> tags. When the latest version of the Microsoft Anti-Cross Site Scripting Library is used with the HtmlEditorExtender, the library strips all rich content from the HtmlEditorExtender control, which defeats the purpose of using the control. Therefore, we had to find a replacement for the Microsoft Anti-Cross Site Scripting Library. In this release, we've created a new HTML sanitizer built on the HTML Agility Pack. If you were using the AntiXssSanitizerProvider, then you will need to substitute the HtmlAgilityPackSanitizerProvider. In particular, you need to modify the sanitizer sections in your Web.config file like this:

<configuration>
  <configSections>
    <sectionGroup name="system.web">
      <section name="sanitizer" requirePermission="false"
               type="AjaxControlToolkit.Sanitizer.ProviderSanitizerSection, AjaxControlToolkit" />
    </sectionGroup>
  </configSections>
  <system.web>
    <sanitizer defaultProvider="HtmlAgilityPackSanitizerProvider">
      <providers>
        <add name="HtmlAgilityPackSanitizerProvider"
             type="AjaxControlToolkit.Sanitizer.HtmlAgilityPackSanitizerProvider"></add>
      </providers>
    </sanitizer>
  </system.web>
</configuration>

We made one other backwards-breaking change to improve the security of the HtmlEditorExtender. We want to make sure that users don't accidentally use the HtmlEditorExtender without an HTML sanitizer. Therefore, if you don't configure an HTML sanitizer provider in the web.config file then you'll get the following error:

If you really want to use the HtmlEditorExtender without using an HTML sanitizer – for example, you are using the HtmlEditorExtender for an intranet application and you trust all of your fellow employees – then you can explicitly indicate that you don't want to enable HTML sanitization by setting the EnableSanitization property to false like this:

<ajaxToolkit:HtmlEditorExtender TargetControlID="txtComments" EnableSanitization="false" runat="server" />

Please don't ever set the EnableSanitization property to false for a public website. If you disable HTML sanitization then you are making your website an easy target for Cross-Site Scripting attacks.

Lots of Fixes for the ComboBox Control

In the latest release, we also made several important bug fixes and feature enhancements to the ComboBox control. 
Here's the list of issues that we fixed:

- 22930 — ComboBox doesn't close its drop down list when losing input focus to another ComboBox control
- 23140 — ComboBox Issues – Delete, Backspace, Period
- 23142 — ComboBox SelectedIndex = -1 does not clear text
- 24440 — ComboBox postback on enter
- 25295 — ComboBox problems when container is hidden at page load
- 25469 — ComboBox – MaxLength ignored
- 26686 — Backspace and Delete exception when optionList is null
- 27148 — ComboBox breaks if ClientIDMode is static

Fixes to Other Controls

In this release, we also made bug fixes and enhancements to the UpdatePanelAnimation, Tabs, and Seadragon controls:

- 21310 — OnUpdated animation starts before OnUpdating has finished
- 26690 — Seadragon Control's openTileSource() method doesn't work (with fix)

We also fixed an issue with the Tabs control which would result in an InvalidOperation exception.

Summary

I want to thank the Superexpert team for the hard work that they put into this release. In particular, I want to thank them for their effort in researching, building, and writing unit tests for the HtmlAgilityPack HTML sanitizer.

    Read the article

  • JavaScript Intellisense with Telerik in ASP.NET Master Page Project with VS 2010

    - by Otto Neff
    Today I was looking for a solution to finally get the JScript/JavaScript/jQuery IntelliSense feature working with my ASP.NET WebForms project. I found some good articles:

- JScript IntelliSense Overview
- JScript IntelliSense: A Reference for the "Reference" Tag
- Enabling JavaScript intellisense in VS.NET 2010 to work with SharePoint 2010
- Rich IntelliSense for jQuery

BUT, none of the suggested solutions worked right with my Master Page based Visual Studio 2010 solution. Working only with physical JavaScript files (Telerik includes certain JavaScript files like jQuery as a resource), or always configuring a new ASP.NET ScriptManager / RadScriptManager on every page derived from the Master Page, wasn't exactly what I was looking for. So I came up with the following simple solution to trick VS2010 and still have the project running with multiple runat="server" ScriptManagers.

In short: a new ASP.NET control derived from ScriptManager with an empty overridden OnInit() (and the other lifecycle/render methods), to use it as an empty wrapper for VS2010.

In detail:

New RadScriptManager Class

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using Telerik.Web.UI;

namespace IntellisenseJavascript.Controls
{
    public class IntelliJS : RadScriptManager
    {
        protected override void OnInit(EventArgs e)
        {
        }

        protected override void OnPreRender(EventArgs e)
        {
        }

        protected override void OnLoad(EventArgs e)
        {
        }

        protected override void Render(System.Web.UI.HtmlTextWriter writer)
        {
        }

        public override void RenderControl(System.Web.UI.HtmlTextWriter writer)
        {
        }
    }
}

web.config

<configuration>
  ...
  <system.web>
    ...
    <pages>
      <controls>
        <add tagPrefix="telerik" namespace="Telerik.Web.UI" assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4"/>
        <add tagPrefix="VSFix" namespace="IntellisenseJavascript.Controls" assembly="IntellisenseJavascript"/>
      </controls>
    </pages>
    ... 
Master Page

<%@ Master Language="C#" AutoEventWireup="true" CodeBehind="Site.master.cs" Inherits="IntellisenseJavascript.Site" %>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html>
<head id="head" runat="server">
    <title></title>
    <telerik:RadStyleSheetManager ID="radStyleSheetManager" runat="server" />
</head>
<body>
    <form id="form" runat="server">
    <telerik:RadScriptManager ID="radScriptManager" runat="server">
        <Scripts>
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.Core.js" />
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQuery.js" />
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQueryInclude.js" />
        </Scripts>
    </telerik:RadScriptManager>
    <telerik:RadAjaxManager ID="radAjaxManager" runat="server">
    </telerik:RadAjaxManager>
    <div>
        #MASTER CONTENT#
        <asp:ContentPlaceHolder ID="contentPlaceHolder" runat="server">
        </asp:ContentPlaceHolder>
    </div>
    </form>
    <script type="text/javascript">
        $(function () {
            // Masterpage ready
            $('body').css('margin', '50px');
        });
    </script>
</body>
</html>

ASPX Page

<%@ Page Title="" Language="C#" MasterPageFile="~/Site.Master" AutoEventWireup="true" CodeBehind="Default.aspx.cs" Inherits="IntellisenseJavascript.Default" %>
<asp:Content ID="Content1" ContentPlaceHolderID="contentPlaceHolder" runat="server">
    <VSFix:IntelliJS runat="server" ID="intelliJS">
        <Scripts>
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.Core.js" />
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQuery.js" />
            <asp:ScriptReference Assembly="Telerik.Web.UI, Version=2011.3.1115.40, Culture=neutral, PublicKeyToken=121fae78165ba3d4" Name="Telerik.Web.UI.Common.jQueryInclude.js" />
        </Scripts>
    </VSFix:IntelliJS>
    <div style="border: 5px solid #FF9900;">
        #PAGE CONTENT#
    </div>
    <script type="text/javascript">
        $(function () {
            // Page ready
            $('body').css('border', '5px solid #888');
        });
    </script>
</asp:Content>

The Result

I know this is not the way it was meant to be... but now at least you can have a main ScriptManager for all common scripts and settings, inject page-specific JavaScript in the PageLoad event in normal ASPX files, and have JavaScript IntelliSense for defined scripts from JS files or assembly resources in your Content. Maybe vNext will fix this.

    Read the article

  • Algorithmia Source Code released on CodePlex

    - by FransBouma
    Following the release of our BCL Extensions Library on CodePlex, we have now released the source-code of Algorithmia on CodePlex! Algorithmia is an algorithm and data-structures library for .NET 3.5 or higher and is one of the pillars LLBLGen Pro v3's designer is built on. The library contains many data-structures and algorithms, and the source-code is well documented and commented, often with links to official descriptions and papers of the algorithms and data-structures implemented. The source-code is shared using Mercurial on CodePlex and is licensed under the friendly BSD2 license. User documentation is not available at the moment but will be added soon. One of the main design goals of Algorithmia was to create a library which contains implementations of well-known algorithms which weren't already implemented in .NET itself. This way, more developers out there can enjoy the results of many years of what the field of Computer Science research has delivered. Some algorithms and data-structures are known in .NET but are re-implemented because the implementation in .NET isn't efficient for many situations or lacks features. An example is the linked list in .NET: it doesn't have an O(1) concat operation, as every node refers to the containing LinkedList object it's stored in. This is bad for algorithms which rely on O(1) concat operations, like the Fibonacci heap implementation in Algorithmia. Algorithmia therefore contains a linked list with an O(1) concat feature. The following functionality is available in Algorithmia:

- Command, Command management. This system can be used to build a fully undo/redo aware application by building your object graph using command-aware classes. The Command pattern is implemented using a system which allows transparent undo-redo and command grouping, so you can make a class undo/redo aware and still set properties and use its contents without dealing with commands at all (a minimal sketch of the general command/undo-redo idea appears at the end of this post). The Commands namespace is the place to start. Classes you'd want to look at are CommandifiedMember, CommandifiedList and KeyedCommandifiedList. See the CommandQueueTests in the test project for examples.

- Graphs, Graph algorithms. Algorithmia contains a sophisticated graph class hierarchy and algorithms implemented onto them: non-directed and directed graphs, as well as a subgraph view class, which can be used to create a view onto an existing graph class and which can be self-maintaining. Algorithms include transitive closure, topological sorting and others. A feature-rich depth-first search (DFS) crawler is available so DFS-based algorithms can be implemented quickly. All graph classes are undo/redo aware, as they can be set to be 'commandified'. When a graph is 'commandified' it will do its housekeeping through commands, which makes it fully undo-redo aware, so you can remove, add and manipulate the graph and undo/redo the activity automatically without any extra code. If you define the properties of the class you use as the vertex type using CommandifiedMember, you can manipulate the properties of vertices and the graph contents with full undo/redo functionality without any extra code.

- Heaps. Heaps are data-structures which always have the largest or smallest item stored in them as the 'root'. Extracting the root from the heap makes the heap determine the next in line to be the 'maximum' or 'minimum' (max-heap vs. min-heap; all heaps in Algorithmia can do both). 
Algorithmia contains various heaps, among them an implementation of the Fibonacci heap, one of the most efficient heap data-structures known today, especially when you want to merge different instances into one.

- Priority queues. Priority queues are specializations of heaps. Algorithmia contains a couple of them.

- Sorting. What's an algorithm library without sort algorithms? Algorithmia implements a couple of sort algorithms which sort the data in-place. This aspect is important in situations where you want to sort the elements in a buffer/list/ICollection in-place, so all data stays in the data-structure it is already stored in.

- PropertyBag. It re-implements Tony Allowatt's original idea in .NET 3.5 specific syntax, which is to have a generic property bag and to be able to build an object in code at runtime which can be bound to a property grid for editing. This is handy for when you have data / settings stored in XML or another format, and want to create an editable form of it without creating many editors.

- IEditableObject/IDataErrorInfo implementations. It contains default implementations for IEditableObject and IDataErrorInfo (EditableObjectDataContainer for IEditableObject and ErrorContainer for IDataErrorInfo), which make it very easy to implement these interfaces (just a few lines of code) without having to worry about bookkeeping during databinding. They work seamlessly with CommandifiedMember as well, so your undo/redo aware code can use them out of the box.

- EventThrottler. It contains an event throttler, which can be used to filter out duplicate events in an event stream coming into an observer from an event. This can greatly enhance performance in your UI without needing to do anything other than hooking it up so it's placed between the event source and your real handler. If your UI is flooded with events from data-structures observed by your UI or a middle tier, you can use this class to filter out duplicates to avoid redundant updates to UI elements or to avoid having observers choke on many redundant events.

- Small, handy stuff. A MultiValueDictionary, which can store multiple unique values per key instead of one, as with the default Dictionary, and is also merge-aware so you can merge two into one. A Pair class, to quickly group two elements together. Multiple interfaces for helping with building a de-coupled, observer based system, and some utility extension methods for the defined data-structures.

We regularly update the library with new code. If you have ideas for new algorithms or want to share your contribution, feel free to discuss it on the project's Discussions page or send us a pull request. Enjoy!
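As promised above, here is a minimal sketch of the general command/undo-redo idea that the Commands namespace builds on. This is my own simplified illustration of the pattern, not Algorithmia's actual API; for the real classes, see CommandifiedMember, CommandifiedList and the CommandQueueTests mentioned earlier.

using System;
using System.Collections.Generic;

// A command captures both how to do and how to undo a single change.
public interface IUndoableCommand
{
    void Do();
    void Undo();
}

// Sets a value to a new one and remembers the old one for undo.
public class SetValueCommand<T> : IUndoableCommand
{
    private readonly Action<T> _setter;
    private readonly T _oldValue;
    private readonly T _newValue;

    public SetValueCommand(Func<T> getter, Action<T> setter, T newValue)
    {
        _setter = setter;
        _oldValue = getter();
        _newValue = newValue;
    }

    public void Do() { _setter(_newValue); }
    public void Undo() { _setter(_oldValue); }
}

// A very small command manager: executed commands go onto an undo stack.
public class CommandManager
{
    private readonly Stack<IUndoableCommand> _undoStack = new Stack<IUndoableCommand>();

    public void Execute(IUndoableCommand command)
    {
        command.Do();
        _undoStack.Push(command);
    }

    public void Undo()
    {
        if (_undoStack.Count > 0)
            _undoStack.Pop().Undo();
    }
}

// Usage: a 'commandified' value change that can be undone transparently.
public static class Demo
{
    public static void Main()
    {
        string name = "initial";
        var manager = new CommandManager();

        manager.Execute(new SetValueCommand<string>(() => name, v => name = v, "changed"));
        Console.WriteLine(name);   // changed

        manager.Undo();
        Console.WriteLine(name);   // initial
    }
}

The library's value is that this bookkeeping (plus grouping, queues and databinding awareness) is done for you once you build your object graph from the command-aware classes.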

    Read the article

  • NoSQL Memcached API for MySQL: Latest Updates

    - by Mat Keep
    With data volumes exploding, it is vital to be able to ingest and query data at high speed. For this reason, MySQL has implemented NoSQL interfaces directly to the InnoDB and MySQL Cluster (NDB) storage engines, which bypass the SQL layer completely. Without SQL parsing and optimization, key-value data can be written directly to MySQL tables up to 9x faster, while maintaining ACID guarantees. In addition, users can continue to run complex queries with SQL across the same data set, providing real-time analytics to the business or anonymizing sensitive data before loading it into big data platforms such as Hadoop, while still maintaining all of the advantages of their existing relational database infrastructure. This and more is discussed in the latest Guide to MySQL and NoSQL, where you can learn more about using the APIs to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database.
    The native Memcached API is part of the MySQL 5.6 Release Candidate, and is already available in the GA release of MySQL Cluster. By using the ubiquitous Memcached API for writing and reading data, developers can preserve their investments in Memcached infrastructure by re-using existing Memcached clients, while also eliminating the need for application changes.
    Speed, when combined with flexibility, is essential in the world of growing data volumes and variability. Complementing NoSQL access, support for online DDL (Data Definition Language) operations in MySQL 5.6 and MySQL Cluster enables DevOps teams to dynamically update their database schema to accommodate rapidly changing requirements, such as the need to capture additional data generated by their applications. These changes can be made without database downtime. Using the Memcached interface, developers do not need to define a schema at all when using MySQL Cluster.
    Let's look a little more closely at the Memcached implementations for both InnoDB and MySQL Cluster.
    Memcached Implementation for InnoDB
    The Memcached API for InnoDB is previewed as part of the MySQL 5.6 Release Candidate. As illustrated in the following figure, Memcached for InnoDB is implemented via a Memcached daemon plug-in to the mysqld process, with the Memcached protocol mapped to the native InnoDB API.
    Figure 1: Memcached API Implementation for InnoDB
    With the Memcached daemon running in the same process space, users get very low latency access to their data while also leveraging the scalability enhancements delivered with InnoDB and a simple deployment and management model. Multiple web / application servers can remotely access the Memcached / InnoDB server to get direct access to a shared data set. With simultaneous SQL access, users can maintain all the advanced functionality offered by InnoDB, including support for Foreign Keys, XA transactions and complex JOIN operations.
    Benchmarks demonstrate that the NoSQL Memcached API for InnoDB delivers up to 9x higher performance than the SQL interface when inserting new key/value pairs, with a single low-end commodity server supporting nearly 70,000 transactions per second.
    Figure 2: Over 9x Faster INSERT Operations
    The delivered performance demonstrates that MySQL with the native Memcached NoSQL interface is well suited for high-speed inserts, with the added assurance of transactional guarantees.
    You can check out the latest Memcached / InnoDB developments and benchmarks here. You can learn how to configure the Memcached API for InnoDB here.
    Memcached Implementation for MySQL Cluster
    Memcached API support for MySQL Cluster was introduced with the General Availability (GA) of the 7.2 release, and joins an extensive range of NoSQL interfaces that are already available for MySQL Cluster.
    Like Memcached, MySQL Cluster provides a distributed hash table with in-memory performance. MySQL Cluster extends Memcached functionality by adding support for write-intensive workloads, a full relational model with ACID compliance (including persistence), rich query support, auto-sharding and 99.999% availability, with extensive management and monitoring capabilities. All writes are committed directly to MySQL Cluster, eliminating cache invalidation and the overhead of data consistency checking, to ensure complete synchronization between the database and cache.
    Figure 3: Memcached API Implementation with MySQL Cluster
    Implementation is simple:
    1. The application sends reads and writes to the Memcached process (using the standard Memcached API).
    2. This invokes the Memcached Driver for NDB (which is part of the same process).
    3. The NDB API is called, providing very quick access to the data held in MySQL Cluster's data nodes.
    The solution has been designed to be very flexible, allowing the application architect to find a configuration that best fits their needs. It is possible to co-locate the Memcached API in either the data nodes or application nodes, or alternatively within a dedicated Memcached layer. The benefit of this flexible approach to deployment is that users can configure behavior on a per-key-prefix basis (through tables in MySQL Cluster) and the application doesn't have to care – it just uses the Memcached API and relies on the software to store data in the right place(s) and to keep everything synchronized. A client-side sketch is shown after this article.
    Using Memcached for Schema-less Data
    By default, every key/value is written to the same table, with each key/value pair stored in a single row – thus allowing schema-less data storage. Alternatively, the developer can define a key-prefix so that each value is linked to a pre-defined column in a specific table. Of course, if the application needs to access the same data through SQL, developers can map key prefixes to existing table columns, enabling Memcached access to schema-structured data already stored in MySQL Cluster.
    Conclusion
    Download the Guide to MySQL and NoSQL to learn more about NoSQL APIs and how you can use them to scale new generations of web, cloud, mobile and social applications on the world's most widely deployed open source database. See how to build a social app with MySQL Cluster and the Memcached API from our on-demand webinar, or take a look at the docs. Don't hesitate to use the comments section below for any questions you may have.
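    Because the server-side plug-in speaks the standard Memcached protocol, any off-the-shelf Memcached client can be used unchanged. The following sketch uses the open-source Enyim Memcached client for .NET, following its commonly documented configuration pattern; the host, port and key names are assumptions for illustration, and in a real deployment you would point the client at the server (or dedicated Memcached layer) running the InnoDB/NDB Memcached plug-in.

    using System;
    using Enyim.Caching;
    using Enyim.Caching.Configuration;
    using Enyim.Caching.Memcached;

    public static class MySqlMemcachedDemo
    {
        public static void Main()
        {
            // Point the client at the MySQL node running the Memcached plug-in.
            // "127.0.0.1" / 11211 are assumptions; 11211 is the usual Memcached port.
            var config = new MemcachedClientConfiguration();
            config.AddServer("127.0.0.1", 11211);

            using (var client = new MemcachedClient(config))
            {
                // The write goes straight to the storage engine - no SQL parsing.
                client.Store(StoreMode.Set, "user:42:name", "Alice");

                // The same key/value pair lands in a regular table,
                // so it can also be read back over SQL.
                string name = client.Get<string>("user:42:name");
                Console.WriteLine(name);
            }
        }
    }

    With key-prefix mappings configured on the MySQL side, the same Store and Get calls can read and write columns of an existing schema-structured table instead of the default generic key/value table.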

    Read the article

  • The Complementary Roles of PLM and PIM

    - by Ulf Köster
    Oracle Product Value Chain Solutions (aka Enterprise PLM Solutions) are a comprehensive set of product management solutions that work together to provide Oracle customers with a broad array of capabilities to manage all aspects of product life: innovation, design, launch, and supply chain / commercialization processes beyond the capabilities and boundaries of traditional engineering-focused Product Lifecycle Management applications. They support companies with an integrated managed view across the product value chain: From Lab to Launch, From Farm to Fork, From Concept to Product to Customer, From Product Innovation to Product Design and Product Commercialization. Product Lifecycle Management (PLM) represents a broad suite of software solutions to improve product-oriented business processes and data. PLM success stories prove that PLM helps companies improve time to market, increase product-related revenue, reduce product costs, reduce internal costs and improve product quality. As a maturing suite of enterprise solutions, PLM is still evolving to realize the promise it can provide across all facets of a business and all phases of the product lifecycle. The vision for PLM includes everything from gathering early requirements for a product through multiple stages of the product lifecycle from product design, through commercialization and eventual product retirement or replacement. In discrete or process industries, PLM is typically more focused on Product Definition as items with respect to the technical view of a material or part, including specifications, bills of material and manufacturing data. With Agile PLM, this is specifically related to capabilities addressing Product Collaboration, Governance and Compliance, Product Quality Management, Product Cost Management and Engineering Collaboration. PLM today is mainly addressing key requirements in the early product lifecycle, in engineering changes or in the “innovation cycle”, and primarily adds value related to product design, development, launch and engineering change process. In short, PLM is the master for Product Definition, wherever manufacturing takes place. Product Information Management (PIM) is a product suite that has evolved in parallel to PLM. Product Information Management (PIM) can extend the value of PLM implementations by providing complementary tools and capabilities. More relevant in the area of Product Commercialization, the vision for PIM is to manage product information throughout an enterprise and supply chain to improve product-related knowledge management, information sharing and synchronization from multiple data sources. PIM success stories have shown the ability to provide multiple benefits, with particular emphasis on reducing information complexity and information management costs. Product Information in PIM is typically treated as the commercial view of a material or part, including sales and marketing information and categorization. PIM collects information from multiple manufacturing sites and multiple suppliers into its repository, but also provides integration tools to push the information back out to the other systems, serving as an active central repository with the aim to provide a holistic view on any product sold by a company (hence the name “Product Hub”). In short, PIM is the master of commercial Product Information. 
So PIM is quickly becoming mandatory because of its value in optimizing multichannel selling processes and relationships with customers, as you can see from the following comparison:
    Product Lifecycle
    - PLM current state: Primarily R&D; front-end innovation cycle; change process
    - PIM: Primarily the commercial / transactional state of the lifecycle
    - Key benefit PIM adds to PLM: Provides a seamless information flow from design and manufacturing through the ultimate selling and servicing of products
    Data
    - PLM current state: Primarily focused on “item” vs. “product” data; product structures; specifications; technical information
    - PIM: Repository for all product information; reaches out to the entire enterprise and its various silos of product information and descriptions
    - Key benefit PIM adds to PLM: Provides a “trusted source” of accurate product information to the internal organization and trading partners
    Data Lifecycle
    - PLM current state: Repository for all design iterations; historical information
    - PIM: Released, current information, with version management and time stamping
    - Key benefit PIM adds to PLM: Provides a single location to track and audit historical product information
    Communication
    - PLM current state: PLM releases the finished product to ERP; PLM is the master for Product Definition
    - PIM: Captures information from disparate sources, including in-house data stores; recognizes the reality of today’s data “mess” across information silos
    - Key benefit PIM adds to PLM: Provides the ability to package product information for its audience in the desired, relevant format to meet their exacting business requirements
    Departmental
    - PLM current state: R&D, Manufacturing, Quality, Compliance, Procurement, Strategic Marketing
    - PIM: Focus on Marketing and Sales; gathering information from other departments, multiple sites, multiple suppliers
    - Key benefit PIM adds to PLM: A singular enterprise solution that leverages existing information silos and data stores
    Supply Chain
    - PLM current state: Multi-site internal collaboration; supplier collaboration; customer collaboration
    - PIM: Works with customers, exchanges / data pools, and trading partners to provide relevant product information packaged the way the customer desires
    - Key benefit PIM adds to PLM: Provides the ability to supply trading partners and internal customers with information in the manner they desire, continuously
    Tools
    - PLM current state: Data management; collaboration; innovation management
    - PIM: Cleansing; synchronization; hub functions
    - Key benefit PIM adds to PLM: Consistent, clean and complete commercial product information
    The goals of both PLM and PIM, put simply, are to help companies make more profit from their products. PLM and PIM solutions can easily be deployed alongside each other, as they share some of the same goals while coming from two different perspectives: the definition of the product and the commercialization of the product. Both can serve as a form of product “system of record”, but they take different approaches to delivering value. Oracle Product Value Chain solutions offer rich new strategies for executives to collectively leverage Agile PLM and Product Data Hub, together with Enterprise Data Quality for Products and other industry-leading Oracle applications such as Oracle Innovation Management, to achieve further incremental value. This is unique on the market today.

    Read the article

  • World Record Oracle Business Intelligence Benchmark on SPARC T4-4

    - by Brian
Oracle's SPARC T4-4 server, configured with four SPARC T4 3.0 GHz processors, delivered the first and best performance of 25,000 concurrent users on the Oracle Business Intelligence Enterprise Edition (BI EE) 11g benchmark, using Oracle Database 11g Release 2 running on Oracle Solaris 10. A SPARC T4-4 server running Oracle Business Intelligence Enterprise Edition 11g achieved 25,000 concurrent users with an average response time of 0.36 seconds with the Oracle BI server cache set to ON. The benchmark data clearly shows that the underlying hardware, the SPARC T4 server, and the Oracle BI EE 11g (11.1.1.6.0 64-bit) platform scale within a single system, supporting 25,000 concurrent users while executing 415 transactions/sec. The benchmark demonstrated the scalability of Oracle Business Intelligence Enterprise Edition 11g 11.1.1.6.0, which was deployed in a vertical scale-out fashion on a single SPARC T4-4 server. Oracle Internet Directory configured on the SPARC T4 server provided authentication for the 25,000 Oracle BI EE users with sub-second response time. A SPARC T4-4 with internal Solid State Drives (SSD) using the ZFS file system showed significant I/O performance improvement over traditional disk for the Web Catalog activity. In addition, ZFS helped get past the UFS limitation of 32767 sub-directories in a Web Catalog directory. The multi-threaded 64-bit Oracle Business Intelligence Enterprise Edition 11g and the SPARC T4-4 server proved to be a successful combination, providing sub-second response times for end-user transactions while consuming only half of the available CPU resources at 25,000 concurrent users, leaving plenty of headroom for increased load. The Oracle Business Intelligence on SPARC T4-4 server benchmark results demonstrate that comprehensive BI functionality built on a unified infrastructure with a unified business model yields best-in-class scalability, reliability and performance. Oracle BI EE 11g is a newer version of the Business Intelligence Suite with richer and superior functionality. Results produced with the Oracle BI EE 11g benchmark are not comparable to results with the Oracle BI EE 10g benchmark. Oracle BI EE 11g is a more difficult benchmark to run, exercising more features of Oracle BI.
    Performance Landscape
    Results for the Oracle BI EE 11g version of the benchmark. Results are not comparable to the Oracle BI EE 10g version of the benchmark.
    Oracle BI EE 11g Benchmark
    - System: 1 x SPARC T4-4 (4 x SPARC T4 3.0 GHz) – Number of Users: 25,000 – Response Time: 0.36 sec
    Results for the Oracle BI EE 10g version of the benchmark. Results are not comparable to the Oracle BI EE 11g version of the benchmark.
    Oracle BI EE 10g Benchmark
    - System: 2 x SPARC T5440 (4 x SPARC T2+ 1.6 GHz) – Number of Users: 50,000
    - System: 1 x SPARC T5440 (4 x SPARC T2+ 1.6 GHz) – Number of Users: 28,000
    Configuration Summary
    Hardware Configuration: SPARC T4-4 server, 4 x SPARC T4 processors, 3.0 GHz, 128 GB memory, 4 x 300 GB internal SSD
    Storage Configuration: Sun ZFS Storage 7120, 16 x 146 GB disks
    Software Configuration: Oracle Solaris 10 8/11, Oracle Solaris Studio 12.1, Oracle Business Intelligence Enterprise Edition 11g (11.1.1.6.0), Oracle WebLogic Server 10.3.5, Oracle Internet Directory 11.1.1.6.0, Oracle Database 11g Release 2
    Benchmark Description
    Oracle Business Intelligence Enterprise Edition (Oracle BI EE) delivers a robust set of reporting, ad-hoc query and analysis, OLAP, dashboard, and scorecard functionality with a rich end-user experience that includes visualization, collaboration, and more.
The Oracle BI EE benchmark test used five different business user roles - Marketing Executive, Sales Representative, Sales Manager, Sales Vice-President, and Service Manager. These roles included a maximum of 5 different pre-built dashboards. Each dashboard page had an average of 5 reports in the form of a mix of charts, tables and pivot tables, returning anywhere from 50 rows to approximately 500 rows of aggregated data. The test scenario also included drill-down into multiple levels from a table or chart within a dashboard.
    The benchmark test scenario uses a typical business user sequence of dashboard navigation, report viewing, and drill down. For example, a Service Manager logs into the system and navigates to his own set of dashboards using Service Manager. The BI user selects the Service Effectiveness dashboard, which shows him four distinct reports, Service Request Trend, First Time Fix Rate, Activity Problem Areas, and Cost Per Completed Service Call spanning 2002 to 2005. The user then proceeds to view the Customer Satisfaction dashboard, which also contains a set of 4 related reports, and drills down on some of the reports to see the detail data. The BI user continues to view more dashboards – Customer Satisfaction and Service Request Overview, for example. After navigating through those dashboards, the user logs out of the application. The benchmark test is executed against a full production version of the Oracle Business Intelligence 11g Applications with a fully populated underlying database schema. The business processes in the test scenario closely represent a real world customer scenario.
    See Also
    - SPARC T4-4 Server (oracle.com, OTN)
    - Oracle Business Intelligence (oracle.com, OTN)
    - Oracle Database 11g Release 2 Enterprise Edition (oracle.com, OTN)
    - WebLogic Suite (oracle.com, OTN)
    - Oracle Solaris (oracle.com, OTN)
    Disclosure Statement
    Copyright 2012, Oracle and/or its affiliates. All rights reserved. Oracle and Java are registered trademarks of Oracle and/or its affiliates. Other names may be trademarks of their respective owners. Results as of 30 September 2012.

    Read the article

  • HTML5 and CSS3 Editing in Windows Live Writer

    - by Rick Strahl
    Windows Live Writer is a wonderful tool for editing blog posts and getting them posted to your blog. What makes it nice is that it has a small set of useful features, plus a simple plug-in model that has spawned many useful add-ins. A small tool with a reasonably decent plug-in model to extend it equals a great solution to a simple problem. If you're running Windows, have a blog and aren’t using Live Writer, you’re probably doing it wrong…
    One of Live Writer’s nice features is that it can download your blog’s CSS for preview and edit displays. It lets you edit your content inside the context of that CSS using the WYSIWYG editor, so your content actually looks very close to what you’ll see on your blog while you’re editing your post. Unfortunately, Live Writer renders the HTML content in the Web Browser control’s default IE 7 rendering mode. Yeah, you read that right: IE 7 is the default for the Web Browser control, and most applications that use it are stuck in this mode unless the application explicitly overrides the default. The Web Browser control does not use the version of Internet Explorer installed on the system (IE 10 on my Win8 machine) but uses IE 7 mode for ‘compatibility’ with old applications. If you are importing your blog’s CSS, that may suck if you’re using rich HTML 5 and CSS 3 formatting.
    Hack the Registry to get Live Writer to render using IE 9 or 10
    In order to get Live Writer (or any other application that uses the Web Browser control, for that matter) to render with a newer engine, you can apply a registry hack that overrides the Web Browser control engine usage for a specific application. I wrote about this in detail in a previous blog post a couple of years back. Here’s how you can set up Windows Live Writer to render your CSS 3 by making a change in your registry. The setup below is for a 64 bit machine, where I configure Live Writer, which is a 32 bit application, to use IE 10 rendering. The keys to set are as follows:
    32 bit configuration on a 64 bit machine:
    Key:   HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION
    Value: WindowsLiveWriter.exe
    Data:  9000 or 10000 (IE 9 or 10 respectively) (DWORD value)
    On a 32 bit only machine:
    Key:   HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION
    Value: WindowsLiveWriter.exe
    Data:  9000 or 10000 (IE 9 or 10 respectively) (DWORD value)
    Use decimal values of 9000, 10000 or 11000 to specify specific versions of Internet Explorer.
    This is a minor tweak, but it’s nice to actually see my blog posts now with the proper CSS formatting intact. Notice the rounded borders and shadow on the code blocks, as well as the overflow-x and scrollbars that show up. In this particular case I can see what the code blocks actually look like at a specific resolution – much better than the old plain view, which just chopped things off at the end of the window frame. There are a few other elements that now show properly in the editor as well, including block quotes and note boxes that I occasionally use. It’s minor stuff, but it makes the editing experience better and closer to the final result, so there are fewer republish operations than I previously had. Sweet!
    Note that this approach of putting an IE version override into the registry works with most applications that use the Web Browser control.
    If you are using the Web Browser control in your own applications, it’s a good idea to switch the browser to a more recent version so you can take advantage of HTML 5 and CSS 3 in your browser-displayed content, by setting this flag in the registry automatically during setup, or as part of the application’s startup routine if no dedicated setup tool is used. At the very least you might set it to 9000 (IE 9), which supports most of the basic CSS3 features and is a decent baseline that works for most Windows 7 and 8 machines. If running pre-IE9, the browser will fall back to IE7 rendering and look bad, but at least more recent browsers will see an improved experience.
    I’m surprised that there aren’t more vendors and third party apps using this feature. You can see in my first screen shot that there are only very few entries in the registry key group on my machine – any other apps that use the Web Browser control are using IE7. Go figure. Certainly Windows Live Writer should be writing this key into the registry automatically as part of installation to support this functionality out of the box, but alas, since it does not, this registry hack lets you get your way anyway…
    Resources
    - .reg files to register Live Writer Browser Emulation (set for IE9)
    - Specifying Internet Explorer Version for Applications
    - SnagIt LiveWriter Plug-in
    - Download Windows Live Writer
    - Download Windows Live Writer with Chocolatey
    © Rick Strahl, West Wind Technologies, 2005-2013
    Posted in Live Writer  Windows
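    If you want your own installer or startup code to apply the same override, the sketch below shows one way to do it with the .NET registry API. The executable name MyApp.exe is a placeholder; writing under HKEY_LOCAL_MACHINE requires elevation, and the same value can alternatively be written under HKEY_CURRENT_USER\Software\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION to avoid that. Treat this as a starting point rather than a drop-in installer step.

    using Microsoft.Win32;

    public static class BrowserEmulationSetup
    {
        // 32 bit path on a 64 bit machine, as used above for Live Writer.
        private const string KeyPath =
            @"HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Internet Explorer\Main\FeatureControl\FEATURE_BROWSER_EMULATION";

        public static void Main()
        {
            // "MyApp.exe" is a placeholder - use your application's exe name.
            // 10000 = IE 10, 9000 = IE 9 (decimal DWORD values, as described above).
            // Registry.SetValue creates the key if it does not exist yet.
            Registry.SetValue(KeyPath, "MyApp.exe", 10000, RegistryValueKind.DWord);
        }
    }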

    Read the article

  • CodePlex Daily Summary for Saturday, April 03, 2010

    CodePlex Daily Summary for Saturday, April 03, 2010New ProjectsASP.NET MVC Demo: aspnetmvcdemoClasslessInterDomainRouting: ClasslessInterDomainRouting provides a class that is designed to detail with CIDR requests and ranges, it is developed within the C# Langauge and f...ClientSideRefactor: Plugin for Visual Studio.ColinTest: ColinTestePMS: An educational project to learn ASP.Net MVC, entity framework using vs 2010Extensible ASP.NET: Extensible Framework on top of ASP.NET - infrastructure level. Uses MEF for extensibility.Franchise Computing Model: Franchise Computing is a client-centric, contract-oriented, consumption-based computing model. Its framework allows service providers and consumers...GameEngine ReactorFX: Set of tools and code snippets for creation DirectX based games. Also provides a number of ideas, algorythms and problem-solutions.It's All Just Ones And Zeros: Utility code libraries for Vault API developers.Live Writer Picasa Plugin: Live Writer Picasa Plugin is a plugin for Windows Live Writer that allows you to embed photos from your Picasa Web Albums into your blog posts. Liv...Managed SDK for Meizu Cell Phone: The goal of this project is to deliver an open source managed SDK for Meizu cell phones, currently for M8. Media Player Field Type: Display a media player in a column of you document library. The library can contain movie files of diferent formats. The player will appear in the ...praca magisterska: This is my thesis: Algebraical aspects of modern cryptography,Pyx: An experimental programming language for statistics.SharpHydroLiDAR: A C# version of Lidar Hydrographic ExtractionSql Server Mds Destination: SSIS destination transform component for SQL Server Master Data ServicesStackOverflow.Net: A C# library for the StackOverflow API (currently in beta). Provides methods for every call currently in the StackOverflow API.TRX Merger Utility: People working on test projects that involve test management and execution from Visual Studio Team System 2008 and who do not have a TFS server for...UniPlanner: The UniPlanner project goal is to develop a web application able to visualize and schedule a university timetable.WikiNETParser: Wiki .NET Parser, Open Source project powered by ANTLR. Syntax defined in 3(4) files Lexer, Grammar, AST Parser.New ReleasesaaronERP builder - a framework to create customized ERP solutions: aaronERP_0.4.0.0: Changes (compared to version 0.3.0.0) : Businesslayer : - Caching of data-tables - ITranslatable Interface for mutli-language DAOs Web-Frontend: ...BatterySaver: Version 0.5: Add support for executing a power state event manually (Issue) Add support for battery percentage thresholds (Issue)ColinTest: asdfzxcv: asdfasdfComposer: V1.0.402.2001 Beta: Minor bug fixes Minor changes in interfaces Added documentation to the setup packageDynamic Configuration: Dynamic Configuration Release 2: Added ConfigurationChanged event fired whenever changes in .config file detected. Improved file watching filtering.Facebook Developer Toolkit: Version 3.1 BETA: Lots of bug fixes. Issues addressed: http://facebooktoolkit.codeplex.com/WorkItem/View.aspx?WorkItemId=14808 http://facebooktoolkit.codeplex.com/W...iExporter - iTunes playlist exporting: iExporter gui v2.5.0.0 - console v1.2.1.0: Paypal donate! New features and redesign for iExporter Gui You can now select/deselect all visible items with one click in the overview When yo...Line Counter: 1.5.5: The Line Counter is a tool to calculate lines of your code files. The tool was written in .NET 2.0. 
Line Counter 1.5.5 Fixed bugs in C# counter an...Live Writer Picasa Plugin: Live Writer Picasa Plugin 1.0.0: Changelog Since this is the first version there are no changes.Media Player Field Type: Media Player Field Type v1.0: Display a media player in a column of you document library. The library can contain movie files of diferent formats. The player will appear in the ...Numina Application/Security Framework: Numina.Framework Core 49601: Added .LESS library for CSS Updated default style and logo Added a few methods and method overloads to the .NET libraryOver Store: OverStore 1.16.0.0: Version 1.16.0.0 Runtime components uses PersistingRuntimeException instead of many exception types. PersistingRuntimeException message includes...patterns & practices Web Client Developer Guidance: Web Client Software Factory 2010 beta source code: The Web Client Software Factory 2010 provides an integrated set of guidance that assists architects and developers in creating web client applicati...SCSI Interface for Multimedia and Block Devices: Release 12 - View CD-DVD Drive Features: Changes in this version: - Added the ability to view the features of a CD/DVD device (e.g.: what discs it supports, whether it supports Mount Raini...SharePoint Labs: SPLab5006A-FRA-Level100: SPLab5006A-FRA-Level100 This SharePoint Lab will teach you how to create a Feature within Visual Studio, how to brand it, how to incorporate ressou...SharePoint Labs: SPLab5007A-FRA-Level300: SPLab5007A-FRA-Level300 This SharePoint Lab will teach you how to create a reusable and distributable project model for developping Features within...SharePoint Labs: SPLab5008A-FRA-Level100: SPLab5008A-FRA-Level100 This SharePoint Lab will teach you how to add an option in the ECB menu (Edit Control Block) only for specific file types w...SharePoint Labs: SPLab5009A-FRA-Level100: SPLab5009A-FRA-Level100 This SharePoint Lab will teach you the "Site Pages" model and the differences between customized/uncustomized pages (ghoste...SharePoint Labs: SPLab5010A-FRA-Level100: SPLab5010A-FRA-Level100 This SharePoint Lab will teach you the "Application Pages" model and the differences between "Site Pages" and "Application ...SharePoint Labs: SPLab5011A-FRA-Level100: SPLab5011A-FRA-Level100 This SharePoint Lab will teach you how to create a basic Application Page in the 12\TEMPLATE\LAYOUTS. Lab Language : French...sPATCH: sPatch v0.9b: + Fixed: an issue most webservers need leading slash to return filestreamsTASKedit: sTASKedit (pre-Alpha Release): This release is only for playing around, currently not useful Supported Files:Open 1.3.6 client tasks.data Export to 1.3.6 client tasks.data E...TRX Merger Utility: TRX Merger v1.0: First versionttgLib: ttgLib-0.01-beta1: In beta-version we've implemented basic functionality of ttgLib - now it can solve various problems using CPU+GPU bundle. 
Most important things: ...WikiNETParser: Wiki .NET Parser 2.5: Wiki .NET Parser 2.5 The documentation, binaries and source code could be downloaded from http://catarsa.com portal The latest release to downloa...WPF Zen Garden: Release 1.0: This is the first release.XNA 3D World Studio Content Pipeline: XNA 3DWS Content Pipeline - R2: This version adds terrains and brush based modelsMost Popular ProjectsRawrWBFS ManagerMicrosoft SQL Server Product Samples: DatabaseASP.NET Ajax LibrarySilverlight ToolkitAJAX Control ToolkitWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesDotNetNuke® Community EditionMost Active ProjectsGraffiti CMSRawrjQuery Library for SharePoint Web ServicesFacebook Developer ToolkitBlogEngine.NETN2 CMSBase Class LibrariesFarseer Physics EngineLINQ to TwitterMicrosoft Biology Foundation

    Read the article

< Previous Page | 87 88 89 90 91 92 93 94 95 96 97 98  | Next Page >