Search Results

Search found 12291 results on 492 pages for 'fluent api'.

  • MySQL and Hadoop Integration - Unlocking New Insight

    - by Mat Keep
    “Big Data” offers the potential for organizations to revolutionize their operations. With the volume of business data doubling every 1.2 years, analysts and business users are discovering very real benefits when integrating and analyzing data from multiple sources, enabling deeper insight into their customers, partners, and business processes. As the world’s most popular open source database, and the most deployed database in the web and cloud, MySQL is a key component of many big data platforms, with Hadoop vendors estimating 80% of deployments are integrated with MySQL. The new Guide to MySQL and Hadoop presents the tools enabling integration between the two data platforms, supporting the data lifecycle from acquisition and organisation to analysis and visualisation / decision, as shown in the figure below The Guide details each of these stages and the technologies supporting them: Acquire: Through new NoSQL APIs, MySQL is able to ingest high volume, high velocity data, without sacrificing ACID guarantees, thereby ensuring data quality. Real-time analytics can also be run against newly acquired data, enabling immediate business insight, before data is loaded into Hadoop. In addition, sensitive data can be pre-processed, for example healthcare or financial services records can be anonymized, before transfer to Hadoop. Organize: Data is transferred from MySQL tables to Hadoop using Apache Sqoop. With the MySQL Binlog (Binary Log) API, users can also invoke real-time change data capture processes to stream updates to HDFS. Analyze: Multi-structured data ingested from multiple sources is consolidated and processed within the Hadoop platform. Decide: The results of the analysis are loaded back to MySQL via Apache Sqoop where they inform real-time operational processes or provide source data for BI analytics tools. So how are companies taking advantage of this today? As an example, on-line retailers can use big data from their web properties to better understand site visitors’ activities, such as paths through the site, pages viewed, and comments posted. This knowledge can be combined with user profiles and purchasing history to gain a better understanding of customers, and the delivery of highly targeted offers. Of course, it is not just in the web that big data can make a difference. Every business activity can benefit, with other common use cases including: - Sentiment analysis; - Marketing campaign analysis; - Customer churn modeling; - Fraud detection; - Research and Development; - Risk Modeling; - And more. As the guide discusses, Big Data is promising a significant transformation of the way organizations leverage data to run their businesses. MySQL can be seamlessly integrated within a Big Data lifecycle, enabling the unification of multi-structured data into common data platforms, taking advantage of all new data sources and yielding more insight than was ever previously imaginable. Download the guide to MySQL and Hadoop integration to learn more. I'd also be interested in hearing about how you are integrating MySQL with Hadoop today, and your requirements for the future, so please use the comments on this blog to share your insights.
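
    To make the Acquire stage concrete, here is a minimal Java sketch of high-volume ingest into MySQL using plain JDBC batch inserts (the guide's NoSQL/memcached API would serve the same purpose at higher rates). The connection details and the sensor_events table are placeholder assumptions, and MySQL Connector/J is assumed to be on the classpath; in the Organize stage a scheduled Sqoop import, or change data capture from the binary log, would then move that table into HDFS.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.util.List;

        public class AcquireStage {
            /** One incoming reading; a stand-in for whatever the feed actually delivers. */
            public static class Event {
                final String deviceId; final long ts; final double value;
                Event(String deviceId, long ts, double value) {
                    this.deviceId = deviceId; this.ts = ts; this.value = value;
                }
            }

            /** Batch-insert a chunk of events into MySQL without giving up ACID guarantees. */
            public static void ingest(List<Event> events) throws Exception {
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:mysql://mysql-host:3306/ops", "app_user", "secret")) {
                    conn.setAutoCommit(false); // commit the whole batch atomically
                    try (PreparedStatement ps = conn.prepareStatement(
                            "INSERT INTO sensor_events (device_id, ts, reading) VALUES (?, ?, ?)")) {
                        for (Event e : events) {
                            ps.setString(1, e.deviceId);
                            ps.setLong(2, e.ts);
                            ps.setDouble(3, e.value);
                            ps.addBatch(); // batching keeps the insert rate high
                        }
                        ps.executeBatch();
                    }
                    conn.commit();
                }
            }
        }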

    Read the article

  • Oracle releases new Java Embedded products

    - by Henrik Stahl
    With less than one week to go to JavaOne 2012, we've spiced things up a little by releasing not one but two net new embedded Java products. This is an important step towards realizing the vision of Java as the standard platform for the Internet of Things that I outlined in a recent blog post. The two new products are: Java ME Embedded 3.2. Based on same code as the widely deployed Oracle Java Wireless Client for feature phones, this new product provides a Java ME implementation optimized for very small microcontroller-based devices and adds - among other things - a new Device Access API that enables interaction with peripherals common in edge devices such as various types of sensors. In addition to the new Java ME Embedded platform, we have also released an update of the Java ME SDK which adds support for the development of small embedded devices. Java Embedded Suite 7.0. This is an integrated middleware stack for embedded devices, incorporating Java SE Embedded and versions of JavaDB, GlassFish and a Web Services stack optimized for remote operation and small footprint. A typical Internet of Things (or M2M) infrastructure contains three types of compute nodes: The edge device which is typically a sensor or control point of some kind. These devices can be connected directly to a backend through a mobile network if they are installed in - for example - a remote vending machine; or, they can be part of a local short-range network and be connected to the backend through a more powerful gateway device. A gateway is the second type of compute node and acts as an aggregator and control point for a local network. A good example of this could be a generalized home Internet access point, or home gateway. Gateways are mostly using normal wall power and are used for multiple applications, deployed by multiple service providers. Finally, the last type of compute node is the normal enterprise or cloud backend. Java ME Embedded and Java Embedded Suite are perfect base software stacks for the edge devices and the gateway respectively, providing the Java promise of a platform independent runtime and a complete set of libraries as well as allowing a programmer to focus on the business logic rather than plumbing. We are very thrilled with these new releases that open up exciting opportunities for Java developers to extend services and enterprise applications in ways that will make organizations more efficient and touch our daily lives. To find out more, come to the JavaOne conference (for technical content) and to the Java Embedded @ JavaOne subconference (for business content). There will be plenty of cool demos showing complete end-to-end applications, provided by Oracle and our partners, as well as keynotes and numerous sessions where you can learn more about the technology and business opportunities.
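
    To illustrate the gateway tier described above, here is a hedged Java sketch of a gateway that batches readings already collected from edge devices and forwards them upstream over HTTP. It uses only core Java SE classes; the backend URL and the JSON shape are illustrative assumptions, and it deliberately does not use the new Device Access API.

        import java.io.OutputStream;
        import java.net.HttpURLConnection;
        import java.net.URL;
        import java.nio.charset.StandardCharsets;
        import java.util.List;

        public class GatewayForwarder {
            // Placeholder for whatever service the gateway reports to.
            private static final String BACKEND = "http://backend.example.com/telemetry";

            /** Aggregate a batch of readings and forward them as a single payload. */
            public static void forward(List<String> readings) throws Exception {
                String payload = "{\"readings\":[" + String.join(",", readings) + "]}";
                HttpURLConnection conn = (HttpURLConnection) new URL(BACKEND).openConnection();
                conn.setRequestMethod("POST");
                conn.setRequestProperty("Content-Type", "application/json");
                conn.setDoOutput(true);
                try (OutputStream out = conn.getOutputStream()) {
                    out.write(payload.getBytes(StandardCharsets.UTF_8));
                }
                if (conn.getResponseCode() >= 400) {
                    throw new IllegalStateException("Backend rejected batch: " + conn.getResponseCode());
                }
                conn.disconnect();
            }
        }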

    Read the article

  • What is a better abstraction layer for D3D9 and OpenGL vertex data management?

    - by Sam Hocevar
    My rendering code has always been OpenGL. I now need to support a platform that does not have OpenGL, so I have to add an abstraction layer that wraps OpenGL and Direct3D 9. I will support Direct3D 11 later. TL;DR: the differences between OpenGL and Direct3D cause redundancy for the programmer, and the data layout feels flaky. For now, my API works a bit like this. This is how a shader is created: Shader *shader = Shader::Create( " ... GLSL vertex shader ... ", " ... GLSL pixel shader ... ", " ... HLSL vertex shader ... ", " ... HLSL pixel shader ... "); ShaderAttrib a1 = shader->GetAttribLocation("Point", VertexUsage::Position, 0); ShaderAttrib a2 = shader->GetAttribLocation("TexCoord", VertexUsage::TexCoord, 0); ShaderAttrib a3 = shader->GetAttribLocation("Data", VertexUsage::TexCoord, 1); ShaderUniform u1 = shader->GetUniformLocation("WorldMatrix"); ShaderUniform u2 = shader->GetUniformLocation("Zoom"); There is already a problem here: once a Direct3D shader is compiled, there is no way to query an input attribute by its name; apparently only the semantics stay meaningful. This is why GetAttribLocation has these extra arguments, which get hidden in ShaderAttrib. Now this is how I create a vertex declaration and two vertex buffers: VertexDeclaration *decl = VertexDeclaration::Create( VertexStream<vec3,vec2>(VertexUsage::Position, 0, VertexUsage::TexCoord, 0), VertexStream<vec4>(VertexUsage::TexCoord, 1)); VertexBuffer *vb1 = new VertexBuffer(NUM * (sizeof(vec3) + sizeof(vec2))); VertexBuffer *vb2 = new VertexBuffer(NUM * sizeof(vec4)); Another problem: the information VertexUsage::Position, 0 is totally useless to the OpenGL/GLSL backend because it does not care about semantics. Once the vertex buffers have been filled with or pointed at data, this is the rendering code: shader->Bind(); shader->SetUniform(u1, GetWorldMatrix()); shader->SetUniform(u2, blah); decl->Bind(); decl->SetStream(vb1, a1, a2); decl->SetStream(vb2, a3); decl->DrawPrimitives(VertexPrimitive::Triangle, NUM / 3); decl->Unbind(); shader->Unbind(); You see that decl is a bit more than just a D3D-like vertex declaration; it kind of takes care of rendering as well. Does this make sense at all? What would be a cleaner design? Or a good source of inspiration?

    Read the article

  • CodePlex Daily Summary for Monday, January 31, 2011

    CodePlex Daily Summary for Monday, January 31, 2011Popular ReleasesMVC Controls Toolkit: Mvc Controls Toolkit 0.8: Fixed the following bugs: *Variable name error in the jvascript file that prevented the use of the deleted item template of the Datagrid *Now after the changes applied to an item of the DataGrid are cancelled all input fields are reset to the very initial value they had. *Other minor bugs. Added: *This version is available both for MVC2, and MVC 3. The MVC 3 version has a release number of 0.85. This way one can install both version. *Client Validation support has been added to all control...Office Web.UI: Beta preview (Source): This is the first Beta. it includes full source code and all available controls. Some designers are not ready, and some features are not finalized allready (missing properties, draft styles) ThanksASP.net Ribbon: Version 2.2: This release brings some new controls (part of Office Web.UI). A few bugs are fixed and it includes the "auto resize" feature as you resize the window. (It can cause an infinite loop when the window is too reduced, it's why this release is not marked as "stable"). I will release more versions 2.3, 2.4... until V3 which will be the official launch of Office Web.UI. Both products will evolve at the same speed. Thanks.Barcode Rendering Framework: 2.1.1.0: Final release for VS2008 Finally fixed bugs with code 128 symbology.HERB.IQ: HERB.IQ.UPGRADE.0.5.3.exe: HERB.IQ.UPGRADE.0.5.3.exexUnit.net - Unit Testing for .NET: xUnit.net 1.7: xUnit.net release 1.7Build #1540 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision) Ad...DoddleReport - Automatic HTML/Excel/PDF Reporting: DoddleReport 1.0: DoddleReport will add automatic tabular-based reporting (HTML/PDF/Excel/etc) for any LINQ Query, IEnumerable, DataTable or SharePoint List For SharePoint integration please click Here PDF Reporting has been placed into a separate assembly because it requies AbcPdf http://www.websupergoo.com/download.htmSpark View Engine: Spark v1.5: Release Notes There have been a lot of minor changes going on since version 1.1, but most important to note are the major changes which include: Support for HTML5 "section" tag. Spark has now renamed its own section tag to "segment" instead to avoid clashes. You can still use "section" in a Spark sense for legacy support by specifying ParseSectionAsSegment = true if needed while you transition Bindings - this is a massive feature that further simplifies your views by giving you a powerful ...Marr DataMapper: Marr DataMapper 1.0.0 beta: First release.WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.3: Version: 2.0.0.3 (Milestone 3): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. 
You can use ...Rawr: Rawr 4.0.17 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 and on the Version Notes page: http://rawr.codeplex.com/wikipage?title=VersionNotes As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you...Squiggle - A Free open source LAN Messenger: Squiggle 2.5 Beta: In this release following are the new features: Localization: Support for Arabic, French, German and Chinese (Simplified) Bridge: Connect two Squiggle nets across the WAN or different subnets Aliases: Special codes with special meaning can be embedded in message like (version),(datetime),(time),(date),(you),(me) Commands: cls, /exit, /offline, /online, /busy, /away, /main Sound notifications: Get audio alerts on contact online, message received, buzz Broadcast for group: You can ri...VivoSocial: VivoSocial 7.4.2: Version 7.4.2 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.4.2 release and see if they persist. If you have any questions about this release, please post them in our Support forums. If you are experiencing a bug or would like to request a new feature, please submit it to our issue tracker. Web Controls * Updated Business Objects and added a new SQL Data Provider File. Groups * Fixed a security issue whe...PHP Manager for IIS: PHP Manager 1.1.1 for IIS 7: This is a minor release of PHP Manager for IIS 7. It contains all the functionality available in 56962 plus several bug fixes (see change list for more details). Also, this release includes Russian language support. SHA1 codes for the downloads are: PHPManagerForIIS-1.1.0-x86.msi - 6570B4A8AC8B5B776171C2BA0572C190F0900DE2 PHPManagerForIIS-1.1.0-x64.msi - 12EDE004EFEE57282EF11A8BAD1DC1ADFD66A654mojoPortal: 2.3.6.1: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2361-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Parallel Programming with Microsoft Visual C++: Drop 6 - Chapters 4 and 5: This is Drop 6. It includes: Drafts of the Preface, Introduction, Chapters 2-7, Appendix B & C and the glossary Sample code for chapters 2-7 and Appendix A & B. The new material we'd like feedback on is: Chapter 4 - Parallel Aggregation Chapter 5 - Futures The source code requires Visual Studio 2010 in order to run. There is a known bug in the A-Dash sample when the user attempts to cancel a parallel calculation. We are working to fix this.NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel Template, version 1.0.1.160: The NodeXL Excel template displays a network graph using edge and vertex lists stored in an Excel 2007 or Excel 2010 workbook. What's NewThis release improves NodeXL's Twitter and Pajek features. See the Complete NodeXL Release History for details. Installation StepsFollow these steps to install and use the template: Download the Zip file. Unzip it into any folder. 
Use WinZip or a similar program, or just right-click the Zip file in Windows Explorer and select "Extract All." Close Ex...Kooboo CMS: Kooboo CMS 3.0 CTP: Files in this downloadkooboo_CMS.zip: The kooboo application files Content_DBProvider.zip: Additional content database implementation of MSSQL, RavenDB and SQLCE. Default is XML based database. To use them, copy the related dlls into web root bin folder and remove old content provider dlls. Content provider has the name like "Kooboo.CMS.Content.Persistence.SQLServer.dll" View_Engines.zip: Supports of Razor, webform and NVelocity view engine. Copy the dlls into web root bin folder to enable...UOB & ME: UOB ME 2.6: UOB ME 2.6????: ???? V1.0: ???? V1.0 ??New ProjectsAuto Complete Control for ASP.NET: Autocomplete Control is a fully functional ASP.NET control for word suggestions and autocomplete. We had been using Ajax Control Toolkit AutoComplete Extender in our projects before, but we have needed some extra features and functionalities.Cours ESIEE: MAJ des cours ESIEE depuis la plateforme Icampus et autres documentsEngineering World Expenses: Demo expenses application for Engineering World 2011Entity Framework / Linq to Sql Poco Code Generator: Poco Orm data access layer (Dto) code generator for Entity Framework and Linq to Sql. Customizable code generation via simple templating system. Utilizes Managed Extensibility Framework (MEF) in order for application parts to dynamically composed and plug-able.linqish.py: Python module for manipulating iterables. An implementation of the .Net Framework's Linq to Objects for Python.Machinekey setter: This code sample is Windows Azure SDK 1.3 custom plugin. This sample do working at set custom key to machinekey of web.config file in your WebRole.MapReduce.NET: MapReduce.NET intends to implement the original paper proposed by Google on MapReduce.Marr DataMapper: Marr DataMapper provides a fast and easy to use wrapper around ADO.NET that enables you to focus more on your data access queries without having to write plumbing code. Load one-to-one, one-to-many, and hierarchical entity models with ease. No special base class required.Orchard Silverlight: Orchard module enabling embedding Silverlight applications and creating Silverlight-based content.RouteMagic: Library of useful routing helpers and classes.Smart Skelta Utilites: Smart Skelta Utilies will provide utilties like Visual Studio 2008 Skelta Starter Kit(Project Templates and Project Item Templates),Code Snippets for Skelta Components,Skleta Attachment Extracter Web based Logger,Skelta Server utility and others for skelta based development.Solfix: Solfix is a programming language tbat is work-in-progress, but it has a lot of functionality! You can make applications for console to windows applications. The main point of Solfix is to make coding easier and less time than before.SQLite Manager: A minimal manage for sqlite databases.State Search: StateSearch provides state search algoritms such as A*, IDA*, BestFirst, etc to solve problems such as puzzles and/or path searchingTable Check Custom Field Type: SharePoint Custom Field Type for displaying a list of values with checkboxes and people editors.testsgb: testWindows Phone 7 Extension Framework: An extension method framework for Windows Phone 7 to make your code more fluent and adding a lot of common functions you don't need to reproduce.

    Read the article

  • Partner Blog: aurionPro SENA - Mobile Application Convenience, Flexibility & Innovation Delivered

    - by Darin Pendergraft
    About the Writer: Des Powley is Director of Product Management for aurionPro SENA inc. the leading global Oracle Identity and Access Management specialist delivery and product development partner. In October 2012 aurionPro SENA announced the release of the Mobile IDM application that delivers key Identity Management functions from any mobile device. The move towards an always on, globally interconnected world is shifting Business and Consumers alike away from traditional PC based Enterprise application access and more and more towards an ‘any device, same experience’ world. It is estimated that within five years in many developing regions of the world the PC will be obsolete, replaced entirely by cheaper mobile and tablet devices. This will give a vast amount of new entrants to the Internet their first experience of the online world, and it will only be via these newer, mobile access channels. Designed to address this shift in working and social environments and released in October of 2012 the aurionPro SENA Mobile IDM application directly addresses this emerging market and requirement by enhancing administrators, consumers and managers Identity Management (IDM) experience by delivering a mobile application that provides rapid access to frequently used IDM services from any Mobile device. Built on the aurionPro SENA Identity Service platform the mobile application uses Oracle’s Cloud, Mobile and Social capabilities and Oracle’s Identity Governance Suite for it’s core functions. The application has been developed using standards based API’s to ensure seamless integration with a client’s on premise IDM implementation or equally seamlessly with the aurionPro SENA Hosted Identity Service. The solution delivers multi platform support including iOS, Android and Blackberry and provides many key features including: • Providing easy to access view all of a users own access privileges • The ability for Managers to approve and track requests • Simply raising requests for new applications, roles and entitlements through the service catalogue This application has been designed and built with convenience and security in mind. We protect access to critical applications by enforcing PIN based authentication whilst also providing the user with mobile single sign on capability. This is just one of the many highly innovative products and services that aurionPro SENA is developing for our clients as we continually strive to enhance the value of their investment in Oracle’s class leading 11G R2 Identity and Access Management suite. The Mobile IDM application is a key component of our Identity Services Suite that also includes Managed, Hosted and Cloud Identity Services. The Identity Services Suite has been designed and built specifically to break the barriers to delivering Enterprise, Mobile and Social Identity Management services from the Cloud. aurionPro SENA - Building next generation Identity Services for modern enterprises. To view the app please visit http://youtu.be/btNgGtKxovc For more information please contact [email protected]

    Read the article

  • CodePlex Daily Summary for Thursday, December 23, 2010

    CodePlex Daily Summary for Thursday, December 23, 2010Popular ReleasesSSH.NET Library: 2010.12.23: This release includes some bug fixes and few new fetures. Fixes Allow to retrieve big directory structures ssh-dss algorithm is fixed Populate sftp file attributes New Features Support for passhrase when private key is used Support added for diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha256 and diffie-hellman-group-exchange-sha1 key exchange algorithms Allow to provide multiple key files for authentication Add support for "keyboard-interactive" authentication method...ASP.NET MVC SiteMap provider: MvcSiteMapProvider 2.3.0: Using NuGet?MvcSiteMapProvider is also listed in the NuGet feed. Learn more... Like the project? Consider a donation!Donate via PayPal via PayPal. Release notesThis will be the last release targeting ASP.NET MVC 2 and .NET 3.5. MvcSiteMapProvider 3.0.0 will be targeting ASP.NET MVC 3 and .NET 4 Web.config setting skipAssemblyScanOn has been deprecated in favor of excludeAssembliesForScan and includeAssembliesForScan ISiteMapNodeUrlResolver is now completely responsible for generating th...SuperSocket, an extensible socket application framework: SuperSocket 1.3 beta 2: Compared with SuperSocket 1.3 beta 1, the changes listed below have been done in SuperSocket 1.3 beta 2: added supports for .NET 3.5 replaced Logging Application Block of EntLib with Log4Net improved the code about logging fixed a bug in QuickStart sample project added IPv6 supportTibiaPinger: TibiaPinger v1.0: TibiaPinger v1.0Media Companion: Media Companion 3.400: Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. A manual is included to get you startedPackage that minifies and combines JavaScript and CSS files: Release 1.1: Bug fixes. The package now correctly handles inlined images and image urls in CSS files surrounded by quotes. CombineAndMinify can now be used in conjunction with Microsoft's Sprite and Image Optimization Framework. That framework combines several small images into one, reducing overall load times.Multicore Task Framework: MTF 1.0.1: Release 1.0.1 of Multicore Task Framework.SQL Monitor - tracking sql server activities: SQL Monitor 3.0 alpha 7: 1. added script save/load in user query window 2. fixed problem with connection dialog when choosing windows auth but still ask for user name 3. auto open user table when double click one table node 4. improved alert message, added log only methodOpen NFe: Open NFe 2.0 (Beta): Última versão antes da versão final a ser lançada nos próximos dias.EnhSim: EnhSim 2.2.6 ALPHA: 2.2.6 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Fixing up some r...ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4.3: Helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. 
These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: Improvements for confirm, popup, popup form RenderView controller extension the user experience for crud in live demo has been substantially improved + added search all the features are shown in the live demoGanttPlanner: GanttPlanner V1.0: GanttPlanner V1.0 include GanttPlanner.dll and also a Demo application.N2 CMS: 2.1 release candidate 3: * Web platform installer support available N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) A bunch of bugs were fixed File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the templ...TweetSharp: TweetSharp v2.0.0.0 - Preview 6: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 6 ChangesMaintenance release with user reported fixes Preview 5 ChangesMaintenance release with user reported fixes Preview 4 ChangesReintroduced fluent interface support via satellite assembly Added entities support, entity segmentation, and ITweetable/ITweeter interfaces for client development Numer...Team Foundation Server Administration Tool: 2.1: TFS Administration Tool 2.1, is the first version of the TFS Administration Tool which is built on top of the Team Foundation Server 2010 object model. TFS Administration Tool 2.1 can be installed on machines that are running either Team Explorer 2010, or Team Foundation Server 2010.SubtitleTools: SubtitleTools 1.3: - Added .srt FileAssociation & Win7 ShowRecentCategory feature. - Applied UnifiedYeKe to fix Persian search problems. - Reduced file size of Persian subtitles for uploading @OSDB.Facebook C# SDK: 4.1.0: - Lots of bug fixes - Removed Dynamic Runtime Language dependencies from non-dynamic platforms. - Samples included in release for ASP.NET, MVC, Silverlight, Windows Phone 7, WPF, WinForms, and one Visual Basic Sample - Changed internal serialization to use Json.net - BREAKING CHANGE: Canvas Session is no longer support. Use Signed Request instead. Canvas Session has been deprecated by Facebook. - BREAKING CHANGE: Some renames and changes with Authorizer, CanvasAuthorizer, and Authorization ac...NuGet: NuGet 1.0 build 11217.102: Note: this release is slightly newer than RC1, and fixes a couple issues relating to updating packages to newer versions. NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog. This new build targets the newer feed (h...WCF Community Site: WCF Web APIs 10.12.17: Welcome to the second release of WCF Web APIs on codeplex Here is what is new in this release. WCF Support for jQuery - create WCF web services that are easy to consume from JavaScript clients, in particular jQuery. 
Better support for using JsonValue as dynamic Support for JsonValue change notification events for databinding and other purposes Support for going between JsonValue and CLR types WCF HTTP - create HTTP / REST based web services. This is a minor release which contains fixe...Orchard Project: Orchard 0.9: Orchard Release Notes Build: 0.9.253 Published: 12/16/2010 How to Install OrchardTo install the Orchard tech preview using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard-Using-Web-PI.ashx Web PI will detect your hardware environment and install the application. --OR-- Alternatively, to install the release manually, download the Orchard.Web.0.9.253.zip file. The zip contents are pre-built and ready-to-run. Simply extract the contents of the Orch...New ProjectsArabic Silverlight IE8 Web Slices: Group of IE8 web slices built on SilverlightAssaultCube Special Edition: Making the classic better!Blazonisation: If have emblem of some state, but don't know anything about it, you can use this tool to resolve your problem. Content Maintenance Dashboard: Content Maintenance Dashboard is a package for Umbraco which lets you bulk publlish, unpublish, move and delete of content items.DFTBASP: Drop files on the window choose permissions or Just backup option DMNarrator Manager: A tool for RPG Narrators (and dungeon masters). First version is focused to old Wolrd of DarknessdnnGr2 - Free DotNetNuke Skin: A free DotNetNuke skin initially created for the DotNetNuke Skinning Presentation at the 2nd meeting of the DNN Greek Community.Duplicate File Explorer: File search utility, that also shows what files are duplicate by name. Supports searching up to 3 different folders at one time, excluding folders or extensions and multiple search patterns.Elmax: Yet another easy to use C++ XML parsing library on Windows! Tutorial coming soon!Firemap: Generates a html page which displays key performance statistics of chosen computers. Jumony Guide: Guide for Jumony Milestone 1Loc Tracker: Location tracking system for android and a php server for managing setup of triggers and events to be notified.MBDeveloper: This is my personal project.MECMS: Its an self-developed ecms.Media Companion: Media Companion is a windows program that allows people to catalogue and browse movie and tv episode collections. Folders are scanned and information is automatically gathered from various sources, including posters, fanart and episode screenshots. MOSS 2007 WebPart for SSO to BlackBoard: This project includes a MOSS 2007 WebPart for Single Sign On (SSO) to a Blackboard 9 Learning Management System. Existing projects and code samples do not include resources for SharePoint 2007 (MOSS) or C#. The goal of to provide these missing resources.MutitouchTest Flash AS3 on IOS (iPhone iPad): MutitouchTest Flash AS3 on IOS (iPhone iPad) www.fbair.net ??????Ipad ????? ??, ???????????????????, ???11?. iTouch4 ????5?? ??????? ? ? ??? ???????????. iPad??????????????Netropya XCogen: XCogen is an xml file config generator for Castle, Unity Spring and StructureMap containers. It's a simple console application for post-build action.Nota Algoritmica EUI: Programa que obtiene del moodle de la asignatura las notas y calcula la media final.Powershell CmdLets for Citrix NetScaler: Powershell CmdLets for the Citrix Netscaler product. 
This enables for querying & modifying the netscaler configuration through powershell.Scenario Framework - BPM Suite: Business Process Management SystemSchool Authorization System: School Authorization SystemSharePoint Delete Dead Users: This is a small WPF Application that presents all dead users it finds and allows the user to delete them from SharePoint. dead users are users that are not found inside active directory. There is also an LDAP Filter to speed up the query from active directorySugataTools: SugataTools are the helper classes that I usually use in my projects.The Phoenix Foundation - OSAL: Another OSAL for Windows (desktop/mobile) & Linux support.TibiaPinger: Tibia Pinger, Tools, Bot, ArkBot, Tibia, NeoBot, TibiaAuto, NG, ElfBottracer: Modernizes the System.Diagnostics trace infrastructure APIs to bring dynamism, testability and composability to .NET logging.uQR: uQR is an Umbraco package which adds the ability to generate dynamic QR Codes to an Umbraco website.User Account Picture Manager: This little app will try to fetch a users picture from active directory (thumbnailPhoto) and sets this picture on Windows Vista/2008/7/2008 R2 as User Account PictureVirtual Shell Service: Virtual Shell Service, is a service that runs on your server allowing trusted users to interact with the system via a sand boxed environment. The service has been designed to support mapping terminal commands in different ways via plugins and XML mappingWindows Live Writer Codehighlighting Plugin For Alex Gorbatchev's SyntaxHighligh: Windows Live Writer Codehighlighting Plugin For Alex Gorbatchev's SyntaxHighlighter. More information about the proect please visit http://tugberkugurlu.com/41

    Read the article

  • Elastic PaaS with WebLogic and OpenStack, part I

    - by Jernej Kaše
    In my previous blog I described the steps to get OpenStack on Solaris up and running. Now we'll explore how WebLogic and OpenStack can work together to deliver a truly elastic Middleware Platform as a Service. Middleware / Platform as a Service goals First, let's define what PaaS should be: PaaS offerings facilitate the deployment of applications without the complexity of managing the underlying hardware and software and provisioning hosting capabilities. To break it down: - PaaS provides a complete platform for hosting solutions (Java EE, SOA, BPM, ...) - Infrastructure provisioning (virtual machine, OS, platform) and management is hidden from the PaaS user [administrator or developer] - Additionally, PaaS could / should define target SLAs, and the platform should ensure the SLAs are met automatically. PaaS use case To make it more tangible, we have an IT Administrator who has the requirement to deploy a Java EE enterprise application. The application is used by external users who need to submit reports by the end of each month. As a result, the number of concurrent users will fluctuate, with expected huge spikes around the end of each month. The SLA agreed with management is that no more than 100 requests should be waiting to be processed at any given time. In addition, the IT admin has no more than 3 days to have the platform and the application operational. The Challenges Some of the challenges the IT Administrator is facing are: - how are we going to ensure the processing power? - how are we going to provision the (virtual) machines and the Java EE platform, and deploy the application? - how are we going to monitor the SLA? - how are we going to react to the SLA and increase capacity?  The Ideal Solution Ideally, the whole process should be automated, "set it and forget it", and require no human interaction: - The vendor packages the solution as deployable image(s) - The images are deployed to the IaaS - From there, automated processes take care of the SLA  Solution Architecture with WebLogic 12c, Dynamic Clusters, OpenStack & Solaris Oracle Solaris provides the OS and virtualisation through Solaris Zones OpenStack is a part of Solaris 11.2 and provides Cloud Management (console and API) WebLogic 12c with Dynamic Clusters provides the Platform Traffic Manager provides load balancing On top of that, we are going to implement a small control script - Cloud Manager - which is going to monitor the SLA through the WebLogic Diagnostic Framework. In case there are more than 100 pending requests, the script will: - provision a new virtual machine based on an image which is configured for the WebLogic domain - add the machine to the WebLogic domain - increase the number of servers in the dynamic cluster - start the newly provisioned server  Stay tuned for part II The whole solution with a working demo will be presented in one of our Partner WebCasts in June, exact date TBA. Jernej Kaše is a Fusion Middleware Specialist working closely with Oracle Partners in the ECEMEA region to grow their business by leveraging Oracle technology.
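
    As a rough illustration of what such a Cloud Manager script could look like, here is a hedged Java/JMX sketch that reads PendingUserRequestCount from a managed server's ThreadPoolRuntime MBean and triggers a stubbed provisioning step when the threshold is crossed. The service URL, credentials and MBean object name are indicative only (confirm them with WLST for your domain), the t3 protocol requires the WebLogic client libraries, and the provisioning call is left as a stub where the OpenStack and dynamic-cluster calls would go.

        import java.util.Hashtable;
        import javax.management.MBeanServerConnection;
        import javax.management.ObjectName;
        import javax.management.remote.JMXConnector;
        import javax.management.remote.JMXConnectorFactory;
        import javax.management.remote.JMXServiceURL;
        import javax.naming.Context;

        public class CloudManager {
            public static void main(String[] args) throws Exception {
                // Indicative values only - adjust host, port, credentials and server name.
                JMXServiceURL url = new JMXServiceURL("t3", "admin-host", 7001,
                        "/jndi/weblogic.management.mbeanservers.domainruntime");
                Hashtable<String, String> env = new Hashtable<String, String>();
                env.put(Context.SECURITY_PRINCIPAL, "weblogic");
                env.put(Context.SECURITY_CREDENTIALS, "welcome1");
                env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote");

                JMXConnector connector = JMXConnectorFactory.connect(url, env);
                try {
                    MBeanServerConnection mbs = connector.getMBeanServerConnection();
                    // Object name pattern is indicative; check the exact name in your domain.
                    ObjectName threadPool = new ObjectName(
                            "com.bea:Name=ThreadPoolRuntime,ServerRuntime=dyn_server_1,Type=ThreadPoolRuntime");
                    int pending = ((Number) mbs.getAttribute(threadPool, "PendingUserRequestCount")).intValue();

                    if (pending > 100) {
                        provisionAdditionalServer(); // SLA breached - grow the platform
                    }
                } finally {
                    connector.close();
                }
            }

            private static void provisionAdditionalServer() {
                // Stub: the real script would boot the prepared image through the OpenStack API,
                // add the machine to the WebLogic domain and start another dynamic cluster server.
                System.out.println("Provisioning an additional WebLogic server instance...");
            }
        }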

    Read the article

  • Oracle Utilities Application Framework V4.2.0.0.0 Released

    - by ACShorten
    The Oracle Utilities Application Framework V4.2.0.0.0 has been released with Oracle Utilities Customer Care And Billing V2.4. This release includes new functionality and updates to existing functionality and will be progressively released across the Oracle Utilities applications. The release is quite substantial with lots of new and exciting changes. The release notes shipped with the product includes a summary of the changes implemented in V4.2.0.0.0. They include the following: Configuration Migration Assistant (CMA) - A new data management capability to allow you to export and import Configuration Data from one environment to another with support for Approval/Rejection of individual changes. Database Connection Tagging - Additional tags have been added to the database connection to allow database administrators, Oracle Enterprise Manager and other Oracle technology the ability to monitor and use individual database connection information. Native Support for Oracle WebLogic - In the past the Oracle Utilities Application Framework used Oracle WebLogic in embedded mode, and now, to support advanced configuration and the ExaLogic platform, we are adding Native Support for Oracle WebLogic as configuration option. Native Web Services Support - In the past the Oracle Utilities Application Framework supplied a servlet to handle Web Services calls and now we offer an alternative to use the native Web Services capability of Oracle WebLogic. This allows for enhanced clustering, a greater level of Web Service standards support, enchanced security options and the ability to use the Web Services management capabilities in Oracle WebLogic to implement higher levels of management including defining additional security rules to control access to individual Web Services. XML Data Type Support - Oracle Utilities Application Framework now allows implementors to define XML Data types used in Oracle in the definition of custom objects to take advantage of XQuery and other XML features. Fuzzy Operator Support - Oracle Utilities Application Framework supports the use of the fuzzy operator in conjunction with Oracle Text to take advantage of the fuzzy searching capabilities within the database. Global Batch View - A new JMX based API has been implemented to allow JSR120 compliant consoles the ability to view batch execution across all threadpools in the Coherence based Named Cache Cluster. Portal Personalization - It is now possible to store the runtime customizations of query zones such as preferred sorting, field order and filters to reuse as personal preferences each time that zone is used. These are just the major changes and there are quite a few more that have been delivered (and more to come in the service packs!!). Over the next few weeks we will be publishing new whitepapers and new entries in this blog outlining new facilities that you want to take advantage of.
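
    As a quick illustration of the new fuzzy operator support, the hedged JDBC sketch below runs an Oracle Text CONTAINS query with the fuzzy() operator. The connection, table and column names are placeholders rather than actual framework objects, and an Oracle Text CONTEXT index is assumed to exist on the searched column.

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.PreparedStatement;
        import java.sql.ResultSet;

        public class FuzzySearchExample {
            public static void main(String[] args) throws Exception {
                // Placeholder connection and schema details; the Oracle JDBC driver is assumed
                // to be on the classpath and a CONTEXT index to exist on cust_name.
                try (Connection conn = DriverManager.getConnection(
                        "jdbc:oracle:thin:@//db-host:1521/UTILDB", "app_read", "secret");
                     PreparedStatement ps = conn.prepareStatement(
                             "SELECT cust_id, cust_name FROM customers WHERE CONTAINS(cust_name, ?, 1) > 0")) {
                    ps.setString(1, "fuzzy(Jonsen)"); // can match similarly spelled names such as 'Johnson'
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getString("cust_id") + "  " + rs.getString("cust_name"));
                        }
                    }
                }
            }
        }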

    Read the article

  • Script For Detecting Availability of XMLHttp in Internet Explorer

    - by Duncan Mills
    Having the XMLHttpRequest API available is key to any ADF Faces Rich Client application. Unfortunately, it is possible for users to switch off this option in Internet Explorer as a Security setting. Without XMLHttpRequest available, your ADF Faces application will simply not work correctly, but rather than giving the user a bad user experience wouldn't it be nicer to tell them that they need to make some changes in order to use the application?  Thanks to Blake Sullivan in the ADF Faces team we now have a little script that can do just this. The script is available from https://samplecode.oracle.com here - The attached file browserCheck.js is what you'll need to add to your project.The best way to use this script is to make changes to whatever template you are using for the entry points to your application. If you're not currently using template then you'll have to make the same change in each of your JSPX pages. Save the browserCheck.js file into a /js/ directory under your HTML root within your UI project (e.g. ViewController)In the template or page, select the <af:document> object in the Structure window. From the right mouse (context) menu choose Facet and select the metaContainer facet.Switch to the source code view and locate the metaContainer facet. Then insert the following lines (I've included the facet tag for clarity but you'll already have that):      <f:facet name="metaContainer">        <af:resource type="javascript"                      source="/js/browserCheck.js"/>        <af:resource type="javascript">           xmlhttpNativeCheck(                     "help/howToConfigureYourBrowser.html");        </af:resource>      </f:facet>Note that the argument to the xmlhttpNativeCheck function is a page that you want to show to the user if they need to change their browser configuration. So build this page in the appropriate place as well. You can also just call the function without any arguments e.g. xmlhttpNativeCheck(); in which case it will pop up default instructions for the user to follow, but not redirect to any other page.

    Read the article

  • OData Mix10 – Part Dos

    - by Jon Dalberg
    The other day I had a snazzy post on fetching all the video (WMV) files from Mix ‘10. A simple, console application that grabbed the urls from the OData feed and downloaded the videos. I wanted to change that app to fire the OData query asynchronously so here’s what resulted: 1: static void Main(string[] args) 2: { 3: var mix = new Mix.EventEntities(new Uri("http://api.visitmix.com/OData.svc")); 4:   5: var temp = mix.Files.Where(f => f.TypeName == "WMV"); 6: var query = temp as DataServiceQuery<Mix.File>; 7:   8: query.BeginExecute(OnFileQueryComplete, query); 9:   10: // waiting... 11: Console.ReadLine(); 12: } 13:   14: static void OnFileQueryComplete(IAsyncResult result) 15: { 16: var query = result.AsyncState as DataServiceQuery<Mix.File>; 17: var response = query.EndExecute(result); 18:   19: var web = new WebClient(); 20:   21: var myVideos = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyVideos), "Mix10"); 22:   23: Directory.CreateDirectory(myVideos); 24:   25: foreach (Mix.File f in response) 26: { 27: var fileName = new Uri(f.Url).Segments.Last(); 28: Console.WriteLine(f.Url); 29: web.DownloadFile(f.Url, Path.Combine(myVideos, fileName)); 30: } 31: } There are two important things here that are not explained well in the MSDN docs: See lines 5 and 6? That’s where I query for the WMV files and it returns an IQueryable<T>. You *have* to cast that to a DataServiceQuery<T> and then call BeginExecute. The documented example does not filter so it didn’t show that step. Line 16 shows the correct way to get the previously executed DataServiceQuery<T> from the async result. If you looked at the MSDN example docs it shows (incorrectly) just casting the result, like this: // wrong var query = result as DataServiceQuery<Mix.File>; Other than those items it is relatively straight forward and we’re all async-ified. Enjoy!

    Read the article

  • Clusterware 11gR2 – Setting up an Active/Passive failover configuration

    - by Gilles Haro
    Oracle provides many interesting ways to ensure High Availability. Dataguard configurations, RAC configurations or even both (as recommended for a Maximum Available Architecture - MAA) are the most frequently found. But when it comes to protecting a system with an Active/Passive architecture with failover capabilities, one often thinks to expensive third party cluster systems. Oracle Clusterware technology, which comes free with Oracle Database, is – in the knowing of most people - often linked to Oracle RAC and therefore, is rarely used to implement failover solutions. 11gR2 Clusterware – which is part of Oracle Grid Infrastructure - provides a comprehensive framework to setup automatic failover configurations. It is actually possible to make “failover-able'” and, therefore to protect, almost every kind of application (from xclock to the more complex Application Server) In the next couple of lines, I will try to present the different steps to achieve this goal : Have a fully operational 11gR2 database protected by automatic failover capabilities. I assume you are fluent in installing Oracle Database 11gR2, Oracle Grid Infrastructure 11gR2 on a Linux system and that ASM is not a problem for you (as I am using it as a shared storage). If not, please have a look at Oracle Documentation. As often, I made my tests using an Oracle VirtualBox environment. The scripts are tested and functional. Unfortunately, there can always be a typo or a mistake. This blog entry is not a course around the Clusterware Framework. I just hope it will let you see how powerful it is and that it will give you the whilst to go further with it…   Prerequisite 2 Linux boxes (OELCluster01 and OELCluster02) at the same OS level. I used OEL 5 Update 5 with Enterprise Kernel. Shared Storage (SAN). On my VirtualBox system, I used Openfiler to simulate the SAN Oracle 11gR2 Database (11.2.0.1) Oracle 11gR2 Grid Infrastructure (11.2.0.1)   Step 1 – Install the software Using asmlib, create 3 ASM disks (ASM_CRS, ASM_DTA and ASM_FRA) Install Grid Infrastructure for a cluster (OELCluster01 and OELCluster02 are the 2 nodes of the cluster) Use ASM_CRS to store Voting Disk and OCR. Use SCAN. Install Oracle Database Standalone binaries on both nodes. Use asmca to check/mount the disk groups on 2 nodes Use dbca to create and configure a database on the primary node Let’s name it DB11G. Copy the pfile, password file to the second node. Create adump directoty on the second node.   Step 2 - Setup the resource to be protected After its creation with dbca, the database is automatically protected by the Oracle Restart technology available with Grid Infrastructure. Consequently, it restarts automatically (if possible) after a crash (ex: kill –9 smon). A database resource has been created for that in the Cluster Registry. We can observe this with the command : crsctl status resource that shows and ora.dba11g.db entry. Let’s save the definition of this resource, for future use : mkdir –p /crs/11.2.0/HA_scripts chown oracle:oinstall /crs/11.2.0/HA_scripts crsctl status resource ora.db11g.db -p > /crs/11.2.0/HA_scripts/myResource.txt Although very interesting, Oracle Restart is not cluster aware and cannot restart the database on any other node of the cluster. So, let’s remove it from the OCR definitions, we don’t need it ! srvctl stop database -d DB11G srvctl remove database -d DB11G Instead of it, we need to create a new resource of a more general type : cluster_resource. 
Here are the steps to achieve this : Create an action script :  /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh #!/bin/bash export ORACLE_HOME=/oracle/product/11.2.0/dbhome_1 export ORACLE_SID=DB11G case $1 in 'start')   $ORACLE_HOME/bin/sqlplus /nolog <<EOF   connect / as sysdba   startup EOF   RET=0   ;; 'stop')   $ORACLE_HOME/bin/sqlplus /nolog <<EOF   connect / as sysdba   shutdown immediate EOF   RET=0   ;; 'check')    ok=`ps -ef | grep smon | grep $ORACLE_SID | wc -l`    if [ $ok = 0 ]; then      RET=1    else      RET=0    fi    ;; '*')      RET=0   ;; esac if [ $RET -eq 0 ]; then    exit 0 else    exit 1 fi   This script must provide, at least, methods to start, stop and check the database. It is self-explaining and contains nothing special. Just be aware that it is run as Oracle user (because of the ACL property – see later) and needs to know about the environment. It also needs to be present on every node of the cluster. chmod +x /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh scp  /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh   oracle@OELCluster02:/crs/11.2.0/HA_scripts Create a new resource file, based on the information we got from previous  myResource.txt . Name it myNewResource.txt. myResource.txt  is shown below. As we can see, it defines an ora.database.type resource, named ora.db11g.db. A lot of properties are related to this type of resource and do not need to be used for a cluster_resource. NAME=ora.db11g.db TYPE=ora.database.type ACL=owner:oracle:rwx,pgrp:oinstall:rwx,other::r-- ACTION_FAILURE_TEMPLATE= ACTION_SCRIPT= ACTIVE_PLACEMENT=1 AGENT_FILENAME=%CRS_HOME%/bin/oraagent%CRS_EXE_SUFFIX% AUTO_START=restore CARDINALITY=1 CHECK_INTERVAL=1 CHECK_TIMEOUT=600 CLUSTER_DATABASE=false DB_UNIQUE_NAME=DB11G DEFAULT_TEMPLATE=PROPERTY(RESOURCE_CLASS=database) PROPERTY(DB_UNIQUE_NAME= CONCAT(PARSE(%NAME%, ., 2), %USR_ORA_DOMAIN%, .)) ELEMENT(INSTANCE_NAME= %GEN_USR_ORA_INST_NAME%) DEGREE=1 DESCRIPTION=Oracle Database resource ENABLED=1 FAILOVER_DELAY=0 FAILURE_INTERVAL=60 FAILURE_THRESHOLD=1 GEN_AUDIT_FILE_DEST=/oracle/admin/DB11G/adump GEN_USR_ORA_INST_NAME= GEN_USR_ORA_INST_NAME@SERVERNAME(oelcluster01)=DB11G HOSTING_MEMBERS= INSTANCE_FAILOVER=0 LOAD=1 LOGGING_LEVEL=1 MANAGEMENT_POLICY=AUTOMATIC NLS_LANG= NOT_RESTARTING_TEMPLATE= OFFLINE_CHECK_INTERVAL=0 ORACLE_HOME=/oracle/product/11.2.0/dbhome_1 PLACEMENT=restricted PROFILE_CHANGE_TEMPLATE= RESTART_ATTEMPTS=2 ROLE=PRIMARY SCRIPT_TIMEOUT=60 SERVER_POOLS=ora.DB11G SPFILE=+DTA/DB11G/spfileDB11G.ora START_DEPENDENCIES=hard(ora.DTA.dg,ora.FRA.dg) weak(type:ora.listener.type,uniform:ora.ons,uniform:ora.eons) pullup(ora.DTA.dg,ora.FRA.dg) START_TIMEOUT=600 STATE_CHANGE_TEMPLATE= STOP_DEPENDENCIES=hard(intermediate:ora.asm,shutdown:ora.DTA.dg,shutdown:ora.FRA.dg) STOP_TIMEOUT=600 UPTIME_THRESHOLD=1h USR_ORA_DB_NAME=DB11G USR_ORA_DOMAIN=haroland USR_ORA_ENV= USR_ORA_FLAGS= USR_ORA_INST_NAME=DB11G USR_ORA_OPEN_MODE=open USR_ORA_OPI=false USR_ORA_STOP_MODE=immediate VERSION=11.2.0.1.0 I removed database type related entries from myResource.txt and modified some other to produce the following myNewResource.txt. Notice the NAME property that should not have the ora. prefix Notice the TYPE property that is not ora.database.type but cluster_resource. Notice the definition of ACTION_SCRIPT. Notice the HOSTING_MEMBERS that enumerates the members of the cluster (as returned by the olsnodes command). 
NAME=DB11G.db TYPE=cluster_resource DESCRIPTION=Oracle Database resource ACL=owner:oracle:rwx,pgrp:oinstall:rwx,other::r-- ACTION_SCRIPT=/crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh PLACEMENT=restricted ACTIVE_PLACEMENT=0 AUTO_START=restore CARDINALITY=1 CHECK_INTERVAL=10 DEGREE=1 ENABLED=1 HOSTING_MEMBERS=oelcluster01 oelcluster02 LOGGING_LEVEL=1 RESTART_ATTEMPTS=1 START_DEPENDENCIES=hard(ora.DTA.dg,ora.FRA.dg) weak(type:ora.listener.type,uniform:ora.ons,uniform:ora.eons) pullup(ora.DTA.dg,ora.FRA.dg) START_TIMEOUT=600 STOP_DEPENDENCIES=hard(intermediate:ora.asm,shutdown:ora.DTA.dg,shutdown:ora.FRA.dg) STOP_TIMEOUT=600 UPTIME_THRESHOLD=1h Register the resource. Take care of the resource type. It needs to be a cluster_resource and not a ora.database.type resource (Oracle recommendation) .   crsctl add resource DB11G.db  -type cluster_resource -file /crs/11.2.0/HA_scripts/myNewResource.txt Step 3 - Start the resource crsctl start resource DB11G.db This command launches the ACTION_SCRIPT with a start and a check parameter on the primary node of the cluster. Step 4 - Test this We will test the setup using 2 methods. crsctl relocate resource DB11G.db This command calls the ACTION_SCRIPT  (on the two nodes)  to stop the database on the active node and start it on the other node. Once done, we can revert back to the original node, but, this time we can use a more “MS$ like” method :Turn off the server on which the database is running. After short delay, you should observe that the database is relocated on node 1. Conclusion Once the software installed and the standalone database created (which is a rather common and usual task), the steps to reach the objective are quite easy : Create an executable action script on every node of the cluster. Create a resource file. Create/Register the resource with OCR using the resource file. Start the resource. This solution is a very interesting alternative to licensable third party solutions.   References Clusterware 11gR2 documentation Oracle Clusterware Resource Reference   Gilles Haro Technical Expert - Core Technology, Oracle Consulting   

    Read the article

  • A big flat text file or an HTML site for language documentation?

    - by Bad Sector
    A project of mine is a small embeddable Tcl-like scripting language, LIL. While i'm mostly making it for my own use, i think it is interesting enough for others to use, so i want it to have a nice (but not very "wordy") documentation. So far i'm using a single flat readme.txt file. It explains the language's syntax, features, standard functions, how to use the C API, etc. Also it is easy to scan and read in almost every environment out there, from basic text-only terminals to full-fledged high-end graphical desktop environments. However, while i tried to keep things nicely formatted (as much as this is possible in plain text), i still think that being a big (and growing) wall of text, it isn't as easy on the eyes as it could be. Also i feel that sometimes i'm not writing as much as i want in order to avoid expanding the text too much. So i thought i could use another project of mine, QuHelp, which is basically a help site generator for sites like this one with a sidebar that provides a tree of topics/subtopics and offline full text search. With this i can use HTML to format the documentation and if i use QuHelp for some other project that uses LIL, i can import LIL's documentation as part of the other project's documentation. However converting the existing documentation to QuHelp/HTML isn't a small task, especially when it comes to functions (i'll need to put more detail on them than what currently exists in the readme.txt file). Also it loses the wide range of availability that it currently has (even if QuHelp's generated code degrades gracefully down to console-only web browsers, plain text is readable from everywhere, including from popular editors such as Vim and Emacs - i had someone once telling me that he likes LIL's documentation because it is readable without leaving his editor). So, my question is simply this: should i keep the documentation as it is now in the form of a single readme.txt file or should i convert it to something like the site i mentioned above? There is also the option to do both, but i'm not sure if i'll be able to always keep them in sync or if it is worth the effort. After asking around in IRC i've got mixed answers: some liked the wide availability of the single text file, others said that it is looks as bad as a man page (personally i don't mind that - i can read man pages just fine - but other people might have issues reading them). What do you think?

    Read the article

  • Travelling MVP #2: Community event at Bucharest, Romania

    - by DigiMortal
    My second trip was to DevReach with two stops. My first stop was at Bucharest where I met with my friend Dimitar Georgiev who is one of authors of Gym Realm service. Romanian MVP Andrei Ignat was our host there and organized meeting with local community guys. With me – it was first time in my life – was one more guy from Estonia visiting DevReach and he made the whole trip with me. Bucharest We arrived to Bucharest 29.09 at night. We stayed at Hotel Michelangelo. It’s small hotel with nice rooms, free WiFi and very good service. Although my room was on the first floor there was no street noise. We visited one restaurant that offers national cuisine and it was really great. Next day we went out with local guys and had some beers in “old town”. Bucharest “old town” is nice and cozy. There are many bars open and I am sure everybody will find there some very okay place. After supper we visited one warm karaoke bar where we had beers with local guys. Andrei Ignat – karaoke star Agu Suur and Andrei Ion Rinea enjoying karaoke and tequila Community event Next day we had community event. I made my session about ASP.NET Web API and Dimitar told about how to port ASP.NET web applications to cloud environment. Sessions were held at study class of one local company. Dimitar Georgiev speaking about porting web apps to Windows Azure. As it was usual community evening and not some bigger event there were about 12 guys attending from Bucharest. There were both IT-PROs and developers and one nice thing about Bucharest community is that they are listening to you very well and they ask questions if something is unclear or if you slide over from topic they are interested in. Okay, we tried to keep up good tempo so people stay awake and I think we succeeded. After sessions we went all together to local Piranha pub that was near event venue. We had some beers with local guys and talked with them on different technology topics. It was another good and interesting evening at Bucharest. I want to go back there for sure. As it was my first trip to Bucharest and mostly I gathered experiences I think my next community trip there will be way stronger. I take it as a challenge. Plus – I have there some new friends and I want to meet them too – be it community event or not. :)

    Read the article

  • Windows Azure Recipe: High Performance Computing

    - by Clint Edmonson
    One of the most attractive ways to use a cloud platform is for parallel processing. Commonly known as high-performance computing (HPC), this approach relies on executing code on many machines at the same time. On Windows Azure, this means running many role instances simultaneously, all working in parallel to solve some problem. Doing this requires some way to schedule applications, which means distributing their work across these instances. To allow this, Windows Azure provides the HPC Scheduler. This service can work with HPC applications built to use the industry-standard Message Passing Interface (MPI). Software that does finite element analysis, such as car crash simulations, is one example of this type of application, and there are many others. The HPC Scheduler can also be used with so-called embarrassingly parallel applications, such as Monte Carlo simulations. Whatever problem is addressed, the value this component provides is the same: it handles the complex problem of scheduling parallel computing work across many Windows Azure worker role instances.

    Drivers
    - Elastic compute and storage resources
    - Cost avoidance

    Solution
    Here's a sketch of a solution using our Windows Azure HPC SDK:

    Ingredients
    - Web Role – hosts an HPC Scheduler web portal to allow web-based job submission and management. It also exposes an HTTP web service API so that other tools (including Visual Studio) can post jobs as well.
    - Worker Role – typically multiple worker roles are enlisted, including at least one head node that schedules jobs to be run among the remaining compute nodes.
    - Database – stores state information about the job queue and resource configuration for the solution.
    - Blobs, Tables, Queues, Caching (optional) – many parallel algorithms persist intermediate and/or permanent data as a result of their processing. These fast, highly reliable, parallelizable storage options are all available to all the jobs being processed.

    Training
    Here is a link to online Windows Azure training labs where you can learn more about the individual ingredients described above. (Note: the entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure HPC Scheduler (3 labs) – The Windows Azure HPC Scheduler includes modules and features that enable you to launch and manage high-performance computing (HPC) applications and other parallel workloads within a Windows Azure service. The scheduler supports parallel computational tasks such as parametric sweeps, Message Passing Interface (MPI) processes, and service-oriented architecture (SOA) requests across your computing resources in Windows Azure. With the Windows Azure HPC Scheduler SDK, developers can create Windows Azure deployments that support scalable, compute-intensive, parallel applications.

    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
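
    The "embarrassingly parallel" workloads mentioned above, such as Monte Carlo simulations, split into fully independent pieces of work, which is exactly the property that lets a scheduler fan a job out across many worker role instances. The sketch below is not the HPC Scheduler API itself, just a plain .NET illustration of that idea on a single machine; every name in it is invented for the example, and in an Azure deployment the equivalent per-sample logic would run inside the worker roles the scheduler dispatches to.

        using System;
        using System.Threading.Tasks;

        // A minimal, embarrassingly parallel Monte Carlo estimate of pi.
        // Each sample is independent, so the work can be split freely.
        class MonteCarloPiSketch
        {
            static void Main()
            {
                int workers = Environment.ProcessorCount;
                const int samplesPerWorker = 2000000;
                long totalInside = 0;
                object gate = new object();

                Parallel.For(0, workers,
                    () => 0L,                                  // per-thread hit counter
                    (workerIndex, loopState, localInside) =>
                    {
                        var rng = new Random(workerIndex * 997 + Environment.TickCount);
                        for (int i = 0; i < samplesPerWorker; i++)
                        {
                            double x = rng.NextDouble();
                            double y = rng.NextDouble();
                            if (x * x + y * y <= 1.0) localInside++;
                        }
                        return localInside;
                    },
                    localInside => { lock (gate) { totalInside += localInside; } });

                double pi = 4.0 * totalInside / ((long)workers * samplesPerWorker);
                Console.WriteLine("Estimated pi = {0}", pi);
            }
        }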

    Read the article

  • 25 Secrets for Faster ASP.NET: the Eagle has landed!

    - by Michaela Murray
    On Friday we launched our new free eBook, 25 Secrets for Faster ASP.NET Applications! Nearly 1,000 of you have picked it up already, but if you haven't got your copy yet, you can grab it from http://www.red-gate.com/25secrets. It's the follow-up to the wildly successful 50 Ways to Avoid, Find and Fix ASP.NET Performance Issues, which we released back in January this year (you can download it from www.red-gate.com/50ways). Once again, we collected tips from some of the smartest brains in the ASP.NET community, but this time around we've covered the latest stuff in the .NET framework – async/await, Web API, and more.

    Houston, we have a winner…

    In my original blogpost, I offered a Microsoft Surface as a prize for the best tip. Now, after some serious deliberation, our judges have settled on a winner. By a unanimous verdict, the prize goes to… (wait for it!) … Jeffrey Richter, for this cheeky number, Tip #1 in the new book:

        Want to build scalable websites and services? Work asynchronously.
        One of the secrets to producing scalable websites and services is to perform all your I/O operations asynchronously to avoid blocking threads. When your thread issues a synchronous I/O request, the Windows kernel blocks the thread. This causes the thread pool to create a new thread, which allocates a lot of memory and wastes precious CPU time. Calling an xxxAsync method and using C#'s async/await keywords allows your thread to return to the thread pool so it can be used for other things. This reduces the resource consumption of your app, allowing it to use more memory and improving response time to your clients.

    Congratulations Jeffrey! Of course, I also owe a massive thank you to everyone who's been involved in the book, especially all the authors. It's a real treat to work with a developer community that's so keen to collaborate and to share their hard-won nuggets of performance knowhow. If you haven't read it yet, I can't recommend it highly enough. You can get it for free at www.red-gate.com/25secrets

    The full backstory for both eBooks:
    https://www.simple-talk.com/blogs/2012/11/15/application-performance-the-best-of-the-web/
    https://www.simple-talk.com/blogs/2012/11/27/application-performance-episode-2-announcing-the-judges/
    https://www.simple-talk.com/blogs/2013/01/25/free-ebook-50-ways-to-avoid-find-and-fix-asp-net-performance-issues/
    https://www.simple-talk.com/blogs/2013/03/22/50-ways-to-avoid-find-and-fix-asp-net-performance-issues-the-next-generation/
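
    To make the winning tip concrete, here is a minimal, hypothetical ASP.NET Web API action that applies it. Everything in it (the controller name, the downstream URL) is invented for illustration rather than taken from the book.

        using System.Net.Http;
        using System.Threading.Tasks;
        using System.Web.Http;

        // Hypothetical controller that fetches data from a downstream service.
        public class QuotesController : ApiController
        {
            private static readonly HttpClient Client = new HttpClient();

            // While the downstream HTTP call is in flight, await hands the request
            // thread back to the thread pool instead of blocking it, so the same
            // server can keep serving other requests.
            public async Task<string> Get()
            {
                string payload = await Client.GetStringAsync("http://example.com/quotes");
                return payload;
            }
        }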

    Read the article

  • Export all SSIS packages from msdb using Powershell

    - by jamiet
    Have you ever wanted to dump all the SSIS packages stored in msdb out to files? Of course you have, who wouldn't? Right? Well, at least one person does, because this was the subject of a thread (save all ssis packages to file) on the SSIS forum earlier today. Some of you may have already figured out a way of doing this, but for those that haven't, here is a nifty little script that will do it for you, and it uses our favourite jack-of-all tools … Powershell!!

    Imagine I have the following package folder structure on my Integration Services server (i.e. in [msdb]): there are two packages in there called "20110111 Chaining Expression components" & "Package", and I want to export those two packages into a folder structure that mirrors the one in [msdb]. Here is the Powershell script that will do that:

        Param($SQLInstance = "localhost")

        #####Add all the SQL goodies (including Invoke-Sqlcmd)#####
        add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
        add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
        cls

        $Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query "WITH cte AS (
            SELECT cast(foldername as varchar(max)) as folderpath, folderid
            FROM msdb..sysssispackagefolders
            WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
            UNION ALL
            SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
            FROM msdb..sysssispackagefolders f
            INNER JOIN cte c ON c.folderid = f.parentfolderid
        )
        SELECT c.folderpath, p.name, CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
        FROM cte c
        INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
        WHERE c.folderpath NOT LIKE 'Data Collector%'"

        Foreach ($pkg in $Packages)
        {
            $pkgName = $Pkg.name
            $folderPath = $Pkg.folderpath
            $fullfolderPath = "c:\temp\$folderPath\"
            if(!(test-path -path $fullfolderPath))
            {
                mkdir $fullfolderPath | Out-Null
            }
            $pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
        }

    To run it, simply change the "localhost" parameter to the server you want to connect to, either by editing the script or by passing it in when the script is executed. It will create the folder structure in C:\Temp (which you can also easily change if you so wish – just edit the script accordingly). Here's the folder structure that it created for me: notice how it is a mirror of the folder structure in [msdb].

    Hope this is useful!

    @Jamiet

    UPDATE: This post prompted Chad Miller to write a post describing his Powershell add-in that utilises an SSIS API to do the exporting of packages. Go take a read here: http://sev17.com/2011/02/importing-and-exporting-ssis-packages-using-powershell/

    Read the article

  • Maven Integrated View for NetBeans IDE

    - by Geertjan
    Started working on an oft-heard request from Kirk Pepperdine for an integrated view for multimodule builds for Maven projects in NetBeans IDE, as explained here. I suddenly had some kind of brainwave and solved all the remaining problems I had, by delegating to the LogicalViewProvider's node, instead of the project's node, which means I inherit all the icons, actions, package nodes, and anything else that was originally defined within the original project, in this case for the open source JAnnocessor project:

    Above, you can see that the Maven submodules can either be edited in-line, i.e., within the parent project, or separately, by opening them in the traditional NetBeans way. Get the module here: http://plugins.netbeans.org/plugin/45180/?show=true

    Some people out there might be interested in how this is achieved. First, hide the original ModulesNodeFactory in the layer. Then create the following class, which creates what you see in the screenshot above:

        import java.util.ArrayList;
        import java.util.List;
        import javax.swing.event.ChangeListener;
        import org.netbeans.api.project.Project;
        import org.netbeans.spi.project.SubprojectProvider;
        import org.netbeans.spi.project.ui.LogicalViewProvider;
        import org.netbeans.spi.project.ui.support.NodeFactory;
        import org.netbeans.spi.project.ui.support.NodeList;
        import org.openide.nodes.FilterNode;
        import org.openide.nodes.Node;

        @NodeFactory.Registration(projectType = "org-netbeans-modules-maven", position = 400)
        public class ModulesNodeFactory2 implements NodeFactory {

            @Override
            public NodeList<?> createNodes(Project prjct) {
                return new MavenModulesNodeList(prjct);
            }

            private class MavenModulesNodeList implements NodeList<Project> {

                private final Project project;

                public MavenModulesNodeList(Project prjct) {
                    this.project = prjct;
                }

                @Override
                public List<Project> keys() {
                    return new ArrayList<Project>(
                            project.getLookup().
                            lookup(SubprojectProvider.class).getSubprojects());
                }

                @Override
                public Node node(final Project project) {
                    Node node = project.getLookup().lookup(LogicalViewProvider.class).createLogicalView();
                    return new FilterNode(node, new FilterNode.Children(node));
                }

                @Override
                public void addChangeListener(ChangeListener cl) {
                }

                @Override
                public void removeChangeListener(ChangeListener cl) {
                }

                @Override
                public void addNotify() {
                }

                @Override
                public void removeNotify() {
                }
            }
        }

    Considering that there's only about 5 actual statements above, it's pretty amazing how much can be achieved with so little code. The NetBeans APIs really are very cool. Hope you like it, Kirk!

    Read the article

  • Suggestions for connecting .NET WPF GUI with Java SE Server

    - by Sam Goldberg
    BACKGROUND

    We are building a Java (SE) trading application which will be monitoring market data and sending trade messages based on the market data, and also on user defined configuration parameters. We are planning to provide the user with a thin client, built in .NET (WPF) for managing the parameters, controlling the server behavior, and viewing the current state of the trading. The client doesn't need real-time updates; it will instead update the view once every few seconds (or whatever interval is configured by the user).

    The client has about 6 different operations it needs to perform with the server, for example:
    - CRUD with configuration parameters
    - query subset of the data
    - receive updates of current positions from server

    It is possible that most of the different operations (except for receiving data) are just different flavors of managing the configuration parameters, but it's too early in our analysis for us to be sure.

    To connect the client with the server, we have been considering using:
    - SOAP Web Service
    - RESTful service
    - building a custom TCP/IP based API (text or xml) (least preferred - but we use this approach with other applications we have)

    As best as I understand, pros and cons of the different web service flavors are:

    SOAP
    - pro: totally automated in .NET (and Java), modifying server side interface require no code changes in communication layer, just running refresh on Web Service reference to regenerate the classes.
    - con: more overhead in the communication layer sending more text, etc. We're not using J2EE container so maybe doesn't work so well with J2SE.

    REST
    - pro: lighter weight, less data. Has good .NET and Java support. (I don't have any real experience with this, so don't know what other benefits it has.)
    - con: client will not be automatically aware if there are any new operations or properties added (?), so communication layer needs to be updated by developer if server interface changes.
    - con: (both approaches) Server cannot really push updates to the client at regular intervals (?) (However, we won't mind if client polls the server to get updates.)

    QUESTION

    What are your opinions on the above options or suggestions for other ways to connect the 2 parts? (Ideally, we don't want to put much work into the communication layer, because it's not the significant part of the application so the more off-the-shelf and automated the better.)
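
    On the "client polls the server to get updates" point above, here is one way the WPF side could stay thin, assuming the Java server exposes a plain HTTP endpoint returning JSON. This is only a sketch under that assumption; the URL, port, interval and class names are all hypothetical.

        using System;
        using System.Net.Http;
        using System.Windows.Threading;

        // Hypothetical polling helper for the WPF client.
        public class PositionPoller
        {
            private static readonly HttpClient Client = new HttpClient();
            private readonly DispatcherTimer timer = new DispatcherTimer();

            public event Action<string> PositionsReceived;   // raw JSON; deserialization omitted

            public PositionPoller(TimeSpan interval)
            {
                timer.Interval = interval;                   // e.g. the user-configured refresh rate
                timer.Tick += async (sender, args) =>
                {
                    try
                    {
                        // Ask the server for the current positions.
                        string json = await Client.GetStringAsync("http://trading-server:8080/api/positions");
                        var handler = PositionsReceived;
                        if (handler != null) handler(json);
                    }
                    catch (HttpRequestException)
                    {
                        // Server unreachable; skip this tick and try again on the next one.
                    }
                };
            }

            public void Start() { timer.Start(); }
            public void Stop() { timer.Stop(); }
        }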

    Read the article

  • Is throwing an error in unpredictable subclass-specific circumstances a violation of LSP?

    - by Motti Strom
    Say I wanted to create a Java List<String> (see spec) implementation that uses a complex subsystem, such as a database or file system, for its store, so that it becomes a simple persistent collection rather than a basic in-memory one. (We're limiting it specifically to a List of Strings for the purposes of discussion, but it could be extended to automatically de-/serialise any object, with some help. We can also provide persistent Sets, Maps and so on in this way too.)

    So here's a skeleton implementation:

        class DbBackedList implements List<String> {
            private DbBackedList() {}

            /** Returns a list, possibly non-empty */
            public static List<String> getList() {
                return new DbBackedList();
            }

            public String get(int index) {
                return Db.getTable().getRow(index).asString(); // may throw DbExceptions!
            }

            // add(String), add(int, String), etc.
            ...
        }

    My problem lies with the fact that the underlying DB API may encounter connection errors that are not specified in the List interface that it should throw. My problem is whether this violates Liskov's Substitution Principle (LSP).

    Bob Martin actually gives an example of a PersistentSet in his paper on LSP that violates LSP. The difference is that his newly-specified Exception there is determined by the inserted value and so is strengthening the precondition. In my case the connection/read error is unpredictable and due to external factors and so is not technically a new precondition, merely an error of circumstance, perhaps like OutOfMemoryError which can occur even when unspecified. In normal circumstances, the new Error/Exception might never be thrown. (The caller could catch it if it is aware of the possibility, just as a memory-restricted Java program might specifically catch OOME.)

    Is this therefore a valid argument for throwing an extra error, and can I still claim to be a valid java.util.List (or pick your SDK/language/collection in general) and not in violation of LSP? If this does indeed violate LSP and is thus not practically usable, I have provided two less-palatable alternative solutions as answers that you can comment on, see below.

    Footnote: Use Cases

    In the simplest case, the goal is to provide a familiar interface for cases when (say) a database is just being used as a persistent list, and allow regular List operations such as search, subList and iteration.

    Another, more adventurous, use-case is as a slot-in replacement for libraries that work with basic Lists, e.g. if we have a third-party task queue that usually works with a plain List:

        new TaskWorkQueue(new ArrayList<String>()).start()

    which is susceptible to losing all its queue in the event of a crash. If we just replace this with:

        new TaskWorkQueue(new DbBackedList()).start()

    we get instant persistence and the ability to share the tasks amongst more than one machine.

    In either case, we could either handle connection/read exceptions that are thrown, perhaps retrying the connection/read first, or allow them to throw and crash the program (e.g. if we can't change the TaskWorkQueue code).

    Read the article

  • ALSA samples capture: cannot open device

    - by Randagio
    I'm quite new to Linux (Lubuntu 12.04, to be precise) and to ALSA programming in general. I'm trying to write a C program to capture audio from the internal PC microphone and process it. So as a first step I googled a bit and found this article on capturing audio samples, A tutorial on using the ALSA Audio API, but when I compile it and execute it with ./capture "default" or ./capture "hw:0,0" and all the possible variants on the theme, it always raises the error: cannot open device hw:0,0 (no such file or directory).

    So the issue is: what is the name of the mic audio device to pass as a parameter to record audio from the mic? The mic is working OK, because the Sound Recorder program records sounds perfectly and I can play them back.

    The output of aplay -l is the following:

        **** List of PLAYBACK Hardware Devices ****
        card 0: I82801DBICH4 [Intel 82801DB-ICH4], device 0: Intel ICH [Intel 82801DB-ICH4]
          Subdevices: 1/1
          Subdevice #0: subdevice #0
        card 0: I82801DBICH4 [Intel 82801DB-ICH4], device 4: Intel ICH - IEC958 [Intel 82801DB-ICH4 - IEC958]
          Subdevices: 1/1
          Subdevice #0: subdevice #0

    and this is the amixer output (cut):

        Simple mixer control 'Master',0
          Capabilities: pvolume pswitch penum
          Playback channels: Front Left - Front Right
          Limits: Playback 0 - 31
          Mono:
          Front Left: Playback 31 [100%] [0.00dB] [on]
          Front Right: Playback 31 [100%] [0.00dB] [on]
        Simple mixer control 'Master Mono',0
          Capabilities: pvolume pvolume-joined pswitch pswitch-joined penum
          Playback channels: Mono
          Limits: Playback 0 - 31
          Mono: Playback 4 [13%] [-40.50dB] [on]
        Simple mixer control 'PCM',0
          Capabilities: pvolume pswitch penum
          Playback channels: Front Left - Front Right
          Limits: Playback 0 - 31
          Mono:
          Front Left: Playback 31 [100%] [12.00dB] [on]
          Front Right: Playback 31 [100%] [12.00dB] [on]
        Simple mixer control 'CD',0
          Capabilities: pvolume pswitch cswitch cswitch-exclusive penum
          Capture exclusive group: 0
          Playback channels: Front Left - Front Right
          Capture channels: Front Left - Front Right
          Limits: Playback 0 - 31
          Front Left: Playback 0 [0%] [-34.50dB] [off] Capture [off]
          Front Right: Playback 0 [0%] [-34.50dB] [off] Capture [off]
        Simple mixer control 'Mic',0
          Capabilities: pvolume pvolume-joined pswitch pswitch-joined cswitch cswitch-exclusive penum
          Capture exclusive group: 0
          Playback channels: Mono
          Capture channels: Front Left - Front Right
          Limits: Playback 0 - 31
          Mono: Playback 22 [71%] [-1.50dB] [on]
          Front Left: Capture [on]
          Front Right: Capture [on]
        Simple mixer control 'Mic Boost (+20dB)',0
          Capabilities: pswitch pswitch-joined penum
          Playback channels: Mono
          Mono: Playback [off]
        Simple mixer control 'Mic Select',0
          Capabilities: enum
          Items: 'Mic1' 'Mic2'
          Item0: 'Mic1'
        Simple mixer control 'Stereo Mic',0
          Capabilities: pswitch pswitch-joined penum
          Playback channels: Mono
          Mono: Playback [off]

    So for aplay it seems I have no recording device, but for amixer I've got the mic, a mic boost and a stereo mic as well, with all those gorgeous things in their place!! If so, how could my Sound Recorder record the audio without any problem at all?!?! For sure I'm giving the wrong device name on the command line for capturing audio, but I'm losing hope of finding the correct one! Please help... before I tear my hair out!!!

    Read the article

  • Developer Day @ OOP 2011 with SOA Specialized Partners

    - by Jürgen Kress
    Oracle SOA Specialized Partners like Opitz Consulting participate in our key marketing events, so make sure that you start your journey to SOA Specialization!

    Oracle Developer Day at OOP: discover the possibilities and power of Java technology! Including live hacking with special guest, Java guru Adam Bien!

    Enterprise applications made easy! Accelerate your development with Java. Join Oracle's free full-day workshop at OOP and get to know what Java can do. Learn more about the Java strategy and product roadmap, the use cases that Java SE for Embedded opens up, and how a SOA and BPM solution can be built on the basis of Java. The many improvements in Java EE 6 make developers' lives considerably easier. Do you already know the potential of Java EE 6? Adam Bien will blow you away with a live hacking session. Torsten Winterberg, Oracle Fusion Middleware ACE Director, and Danilo Schmiedel will present how Java developers can integrate the Oracle SOA & BPM solutions. In the afternoon you can then try out the Java Persistence API, JavaBeans, CDI and other technologies in a hands-on session on your own laptop. In this free Oracle workshop you can exchange ideas with like-minded people, see the latest technology demonstrated directly by the Oracle experts, and take part in practical programming exercises.

    This event is the right one for you if you want to know more about the current status of the Java roadmap, learn more about Java technologies and solutions (Java SE, ME, etc.), try out the Java EE platform, understand the benefits of Java EE 6 for your work, want to scale up to an enterprise landscape, build front-ends with JavaServer Faces, or are planning new development projects or just getting started on them. Register now!

    ICM - Internationales Congress Center München, Am Messesee, Trudering-Riem, 81829 München
    27 January 2011, 9.00 - 16.30

    For more information on SOA Specialization and the SOA Partner Community please feel free to register at www.oracle.com/goto/emea/soa (OPN account required)

    Technorati Tags: OOP,Adam Bien,Torsten Winterberg,Opitz Consulting,Oracle,SOA,SOA Specialization,OPN

    Read the article

  • Additional new material for the WebLogic Community

    - by JuergenKress
    - Oracle Cloud Application Foundation 12c Helps Customers Deliver Next-Generation Applications on a Mission-Critical Cloud Platform – In a recent online event, Oracle and industry speakers introduced Oracle Cloud Application Foundation 12c, including Oracle WebLogic 12.1.2 and Oracle Coherence 12.1.2. Read More
    - Team Spotlight: Mike Lehmann, Vice President of Product Management – Meet the team behind Oracle Fusion Middleware. In this edition, we speak to Mike Lehmann, Oracle's vice president of product management for Oracle Cloud Application Foundation, Oracle WebLogic Server, Oracle Coherence, Java Cloud Services, and Java Platform, Enterprise Edition. Read More
    - New and Free: Learn Oracle Application Development Framework Mobile Online at Your Convenience – Are you ready to go mobile? Check out this new tutorial from Oracle's ADF Academy: Developing Applications with Oracle Application Development Framework Mobile.
    - New: Oracle JDeveloper 12c and Oracle Application Development Framework 12c – Announcing Oracle JDeveloper 12c and Oracle Application Development Framework 12c. New capabilities include HTML5, better Maven support, Git support, new Oracle ADF Faces components, improved REST support, Enterprise JavaBeans/Java Persistence API, and the latest support for Oracle WebLogic Server 12.1.2. Get more details and download.
    - New: Oracle Enterprise Pack for Eclipse 12c – The best Eclipse-based tools for Oracle WebLogic and Oracle Coherence continue to get better. Check out the latest Oracle WebLogic and Oracle Coherence support, improved Oracle Application Development Framework support, Maven, and more.
    - Register: Oracle WebLogic Devcast Series – Join us for the upcoming Oracle WebLogic Devcast webcast.
    - Oracle GlassFish Server 3.1.2 and 2.1.1 updates & An Overview of JSON-P & Comprehensive Free Java EE 6 Video Tutorial!
    - Java ME Embedded 3.3 and Java ME Software Development Kit (SDK) 3.3 Now Available – Optimized for microcontrollers and other resource-constrained devices, this release reduces "core plumbing" for an app, and includes more information about memory and network usage critical for low-power apps.
    - JDK 8 Early Access Releases now available
    - JDK 8 Early Access Developer Documentation – Get the latest documentation changes to the Java Developer Guides and the Java Tutorials – Blog
    - NetBeans IDE 7.4 Beta – This release extends HTML5 features to Java EE and PHP application development, introduces new support for Hybrid HTML5 development on Android and iOS platforms, and preview support for JDK 8.

    WebLogic Partner Community
    For regular information become a member in the WebLogic Partner Community; please visit: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center.

    Technorati Tags: WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • Exit Infragistics, Enter Telerik

    - by Anthony Trudeau
    Today I made the purchase of the Premium Collection of components from Telerik.  This follows an evaluation I've been doing to replace the Infragistics components we currently use for Windows Forms, ASP.NET MVC, and WPF.

    It was not a formal evaluation.  I had already decided to move the company away from Infragistics.  That decision was mostly born out of frustration with support over using the Infragistics components in my first production MVC application. One such issue was a simple scenario where you have a model that has a scalar property that can be one value out of a list.  The built-in combobox does this, but I was told by Infragistics support that they didn't support it – and it took them several emails and days of waiting between responses to determine that.  I implemented this in Telerik in a minute, not including the several minutes it took me to get a rudimentary understanding of the component and its API.

    Here's the code using the built-in combobox:

        @Html.DropDownListFor(x => x.VendorId, new SelectList(ViewBag.Vendors, "VendorId", "VendorName", Model.VendorId), "Select Id")

    Here's the code using the Telerik combobox:

        @(Html.Telerik().ComboBoxFor(model => model.VendorId)
            .AutoFill(true)
            .BindTo(new SelectList(ViewBag.Vendors, "VendorId", "VendorName", Model.VendorId))
        )

    I chose Telerik over other competitors based on the professional appearance of their website, and how easy it was to find information.  I'd like to say I had time to evaluate other Infragistics competitors.  Due to time constraints I had to make an initial decision based on superficial, but still important things. I picked Telerik with the plan to only look further at other companies if my evaluation didn't meet my expectations.  Luckily they did, because I didn't relish the thought of carving out more time to evaluate another set of components.

    Overall my experience with Telerik has been superior to Infragistics in every way.  The installation was easy using their control panel installer application.  Getting up to speed has been easy.  And the communication from Telerik has met my expectations.  And we'll continue to be good as long as I don't start getting email messages from a sales rep saying that they want to talk to me about training and consulting – I'm looking at you Infragistics.
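
    For context, both snippets above assume a view model with a scalar VendorId property and a ViewBag.Vendors collection supplied by the controller. A minimal, hypothetical version of that plumbing (the names mirror the snippets; everything else is invented for the example) might look like this:

        using System.Collections.Generic;
        using System.Web.Mvc;

        // Hypothetical model and controller backing the combobox snippets above.
        public class Vendor
        {
            public int VendorId { get; set; }
            public string VendorName { get; set; }
        }

        public class OrderEditModel
        {
            // The scalar property that must hold exactly one value from the vendor list.
            public int VendorId { get; set; }
        }

        public class OrdersController : Controller
        {
            public ActionResult Edit()
            {
                // In a real application the vendors would come from a repository or ORM.
                ViewBag.Vendors = new List<Vendor>
                {
                    new Vendor { VendorId = 1, VendorName = "Acme" },
                    new Vendor { VendorId = 2, VendorName = "Contoso" }
                };

                return View(new OrderEditModel { VendorId = 1 });
            }
        }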

    Read the article

  • Music Notation Editor - Refactoring view creation logic elsewhere

    - by Cyril Silverman
    Let me preface by saying that knowing some elementary music theory and music notation may be helpful in grasping the problem at hand.

    I'm currently building a Music Notation and Tablature Editor (in Javascript). I've come to a point where the core parts of the program are more or less there, and all functionality I plan to add at this point will really build off the foundation that I've created. As a result, I want to refactor to really solidify my code. I'm using an API called VexFlow to render notation. Basically I pass the parts of the editor's state to VexFlow to build the graphical representation of the score. Here is a rough and stripped-down UML diagram showing you the outline of my program:

    In essence, a Part has many Measures, which have many Notes, which have many NoteItems (yes, this is semantically weird, as a chord is represented as a Note with multiple NoteItems, individual pitches or fret positions). All of the relationships are bi-directional.

    There are a few problems with my design, because my Measure class contains the majority of the entire application view logic. The class holds the data about all VexFlow objects (the graphical representation of the score). It contains the graphical Staff object and the graphical notes. (Shouldn't these be placed somewhere else in the program?) While VexFlowFactory deals with the actual creation (and some processing) of most of the VexFlow objects, Measure still "directs" the creation of all the objects and what order they are supposed to be created in, for both the VexFlowStaff and the VexFlowNotes.

    I'm not looking for a specific answer, as you'd need a much deeper understanding of my code; just a general direction to go in. Here's a thought I had: create MeasureView/NoteView/PartView classes that contain the basic VexFlow objects for each class, in addition to any extraneous logic for their creation? But where would these views be contained? Do I create a ScoreView that is a parallel graphical representation of everything, so that ScoreView.render() would cascade down to each PartView and call render for it, and cascade down into each MeasureView, etc.? Again, I just have no idea what direction to go in. The more I think about it, the more ways to go seem to pop into my head.

    I tried to be as concise and simplistic as possible while still getting my problem across. Please feel free to ask me any questions if anything is unclear. It's quite a struggle trying to dumb down a complicated problem to its core parts.

    Read the article

< Previous Page | 418 419 420 421 422 423 424 425 426 427 428 429  | Next Page >