Search Results

Search found 16383 results on 656 pages for 'bi applications'.


  • Sharepoint 2010 web application development suitability evaluation/assessment

    - by Robert Koritnik
    I would like to know what kinds of applications are suitable to be developed on top of SharePoint 2010, and which should not be built on top of it - so, when to embrace or avoid SharePoint 2010 as a development platform for new web applications. Addendum: would you, as a SharePoint development specialist, choose it as the platform for your next enterprise application with these characteristics: processor intensive; lots of various screens for entering and managing data; many complex business processes; no need to change the UI (i.e. reposition parts); ERP integration; etc. I'm an ASP.NET MVC (former Web Forms) developer and would like to know whether usual multi-page, semi-complex web applications (intra/extranet) should be built on top of SharePoint 2010, and why (if yes or if no).

    Read the article

  • Class library modification / migration

    - by Clint
    I have 3 class libraries: a BLL, a DAL, and a DATA library (about 15 datasets). Currently 4 [major] applications utilize the functionality in these DLLs. I'm rewriting one of those applications and I need to (1) use some of the existing functionality in the libraries, (2) change some of it, (3) add new functionality, and (4) add new datasets. I'm going back and forth about the best way to do this while keeping my risks at a minimum. Some thoughts: 1) Use the existing projects and don't make any modifications, only additions 2) Make new libraries, bring over the code I can use, and make additions as needed 3) Implement partial classes in the existing projects. Eventually all 4 applications will use the newest functionality, but it will be a slow migration, so the old code can't be deprecated yet. Any thoughts?
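
    One note on option 3, as a hedged illustration: C# partial classes only combine within a single project, so they can't extend the compiled DLLs from outside; they only help if you can add files to the existing library projects. A minimal sketch (the class and member names here are hypothetical, not from the poster's libraries):

    // Existing file in the business-logic library: Customer.cs
    public partial class Customer
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // New file added to the same project: Customer.New.cs
    // New members land here without touching the original file, but both
    // halves still compile into the same DLL, so the other 3 applications
    // keep working against an unchanged surface.
    public partial class Customer
    {
        public string DisplayLabel()
        {
            return Id + ": " + Name;
        }
    }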

    Read the article

  • X11, how to detect I’m the last window/application on the display

    - by ts
    I have an X11 display with a window manager (sadly not a specific one; it could be twm, dtwm, mwm, metacity…), myApp, and other applications with windows. I want to close the display if the other applications are closed and myApp is the only one with windows on the display. I do know the windows of myApp, but how do I distinguish between the windows of the window manager and those of the other applications? I'm currently polling with xwininfo -tree -root -children and comparing this to what I'm expecting, but this only works in a 'well defined' environment. It seems that many of the above-mentioned window managers don't support EWMH.
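
    For what it's worth, a rough sketch of the polling approach described above: shelling out to xwininfo and counting windows that aren't ours. This is an assumption-laden illustration (the window ids are hypothetical), and it inherits the same limitation the question raises, namely that without EWMH there is no reliable way to tell the window manager's own frames apart from application windows.

    using System;
    using System.Diagnostics;
    using System.Linq;

    class WindowPoller
    {
        // Runs xwininfo and returns its raw output.
        static string ListWindows()
        {
            var psi = new ProcessStartInfo("xwininfo", "-tree -root -children")
            {
                RedirectStandardOutput = true,
                UseShellExecute = false
            };
            using (var p = Process.Start(psi))
            {
                string output = p.StandardOutput.ReadToEnd();
                p.WaitForExit();
                return output;
            }
        }

        static void Main()
        {
            // Hypothetical: the window ids belonging to myApp.
            var myWindowIds = new[] { "0x2a00003" };

            // Child-window lines in the tree start with a hex window id;
            // anything not in our list is "foreign" - which, without EWMH,
            // still includes the window manager's own windows.
            int foreign = ListWindows()
                .Split('\n')
                .Count(line => line.TrimStart().StartsWith("0x") &&
                               !myWindowIds.Any(line.Contains));

            Console.WriteLine(foreign == 0
                ? "myApp appears to be the last application on the display"
                : foreign + " foreign window(s) still present");
        }
    }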

    Read the article

  • Big Data: Size isn’t everything

    - by Simon Elliston Ball
    Big Data has a big problem; it’s the word “Big”. These days, a quick Google search will uncover terabytes of negative opinion about the futility of relying on huge volumes of data to produce magical, meaningful insight. There are also many clichéd but correct assertions about the difficulties of correlation versus causation in massive data sets. In reading some of these pieces, I begin to understand how climatologists must feel when people complain ironically about “global warming” during snowfall. Big Data has a name problem. There is a lot more to it than size. Shape, Speed, and…err…Veracity are also key elements (now I understand why Gartner and the gang went with V’s instead of S’s). The need to handle data of different shapes (Variety) is not new. Data developers have always had to mold strange-shaped data into our reporting systems, integrating with semi-structured sources, and even straying into full-text searching. However, what we lacked was an easy way to add semi-structured and unstructured data to our arsenal. New “Big Data” tools such as MongoDB and other NoSQL (Not Only SQL) databases, or a graph database like Neo4j, fill this gap. Still, to many, they simply introduce noise to the clean signal that is their sensibly normalized data structures. What about speed (Velocity)? It’s not just high-frequency trading that generates data faster than a single system can handle. Many other applications need to make trade-offs that traditional databases won’t, in order to cope with high data insert speeds, or to extract quickly the required information from data streams. Unfortunately, many people equate Big Data with the Hadoop platform, whose batch-driven queries and job processing queues have little to do with “velocity”. StreamInsight, Esper and Tibco BusinessEvents are examples of Big Data tools designed to handle high-velocity data streams. Again, the name doesn’t do the discipline of Big Data any favors. Ultimately, though, does analyzing fast-moving data produce insights as useful as the ones we get through a more considered approach, enabled by traditional BI? Finally, we have Veracity and Value. In many ways, these additions to the classic Volume, Velocity and Variety trio acknowledge the criticism that without high-quality data and genuinely valuable outputs, data, big or otherwise, is worthless. As a discipline, Big Data has recognized this, and data quality and cleaning tools are starting to appear to support it. Rather than simply decrying the irrelevance of Volume, we need as a profession to focus on how to improve Veracity and Value. Perhaps we should just declare the ‘Big’ silent, embrace these new data tools and help develop better practices for their use, just as we did with the good old RDBMS? What does Big Data mean to you? Which V gives your business the most pain, or the most value? Do you see these new tools as a useful addition to the BI toolbox, or are they just enabling a dangerous trend to find ghosts in the noise?

    Read the article

  • Building vs. Buying a Master Data Management Solution

    - by david.butler(at)oracle.com
    Many organizations prefer to build their own MDM solutions. The argument is that they know their data quality issues and their data better than anyone. Plus, a focused solution will cost less in the long run than a vendor-supplied general-purpose product. This is not unreasonable if you think of MDM as a point solution for a particular data quality problem. But this approach carries significant risk. We now know that organizations achieve significant competitive advantages when they deploy MDM as a strategic enterprise-wide solution, with the most common best practice being to deploy a tactical MDM solution and grow it into a full information architecture. A build-your-own approach most certainly will not scale to a larger architecture unless it is done correctly with the larger solution in mind. It is possible to build a home-grown point MDM solution in such a way that it will dovetail into broader MDM architectures. A very good place to start is to use the same basic technologies that Oracle uses to build its own MDM solutions. Start with the Oracle 11g database to create a flexible, extensible and open data model to hold the master data and all needed attributes. The Oracle database is the most flexible, highly available and scalable database system on the market. With its Real Application Clusters (RAC) it can even support the mixed OLTP and BI workloads that represent typical MDM data access profiles. Use Oracle Data Integrator (ODI) for batch data movement between applications, MDM data stores, and the BI layer. Use Oracle GoldenGate for more real-time data movement. Use Oracle's SOA Suite for application integration, with its BPEL Process Manager to orchestrate MDM connections to business processes, Identity Management for managing users, WS Manager for managing web services, Business Intelligence Enterprise Edition for analytics, and JDeveloper for creating or extending the MDM management application. Oracle utilizes these technologies to build its MDM Hubs. Customers who build their own MDM solution using these components will easily migrate to Oracle-provided MDM solutions when the home-grown solution runs out of gas. But, even with a full stack of open, flexible MDM technologies, creating a robust MDM application can be a daunting task.
For example, a basic MDM solution will need: a set of data access methods that support master data as a service as well as direct real-time access as well as batch loads and extracts; a data migration service for initial loads and periodic updates; a metadata management capability for items such as business entity matrixed relationships and hierarchies; a source system management capability to fully cross-reference business objects and to satisfy seemingly conflicting data ownership requirements; a data quality function that can find and eliminate duplicate data while ensuring correct data attribute survivorship; a set of data quality functions that can manage structured and unstructured data; a data quality interface to assist with preventing new errors from entering the system even when data entry is outside the MDM application itself; a continuing data cleansing function to keep the data up to date; an internal triggering mechanism to create and deploy change information to all connected systems; a comprehensive role-based data security system to control and monitor data access, update rights, and maintain change history; a flexible business rules engine for managing master data processes such as privacy and data movement; a user interface to support casual users and data stewards; a business intelligence structure to support profiling, compliance, and business performance indicators; and an analytical foundation for directly analyzing master data. Oracle's pre-built MDM Hub solutions are full-featured 3-tier Internet applications designed to participate in the full Oracle technology stack or to run independently in other open IT SOA environments. Building MDM solutions from scratch can take years. Oracle's pre-built MDM solutions can bring quality data to the enterprise in a matter of months. But if you must build, at least build with the world's best technology stack in a way that simplifies the eventual upgrade to Oracle MDM and to the full enterprise-wide information architecture that it enables.

    Read the article

  • SSIS - Connect to Oracle on a 64-bit machine (Updated for SSIS 2008 R2)

    - by jorg
    We recently had a few customers where a connection to Oracle on a 64-bit machine was necessary. A quick search on the internet showed that this could be a big problem. I found all kinds of blog and forum posts of developers complaining about this. A lot of developers will recognize the following error message:

    Test connection failed because of an error in initializing provider. Oracle client and networking components were not found. These components are supplied by Oracle Corporation and are part of the Oracle Version 7.3.3 or later client software installation. Provider is unable to function until these components are installed.

    After a lot of searching, trying and debugging I think I found the right way to do it!

    Problems
    Because BIDS is a 32-bit application, on both 32-bit and 64-bit machines, it cannot see the 64-bit driver for Oracle. Because of this, connecting to Oracle from BIDS on a 64-bit machine will never work when you install the 64-bit Oracle client. Another problem is the "Microsoft Provider for Oracle"; this driver only exists in a 32-bit version, and Microsoft has no plans to create a 64-bit one in the near future. The last problem I know of is in the Oracle client itself: it seems that a connection will never work with the instant client, so always use the full client. There are also a lot of problems with the 10g client, one of them being the fact that this driver can't handle the "(x86)" in the path of SQL Server. So using the 10g client is no option!

    Solution
    Download the Oracle 11g full client. Install the 32-bit AND the 64-bit version of the 11g full client (Installation Type: Administrator) and reboot the server afterwards. The 32-bit version is needed for development from BIDS, which is 32-bit; the 64-bit version is needed for production with the SQL Agent, which is 64-bit. Configure the Oracle clients (both 32- and 64-bit) by editing the files tnsnames.ora and sqlnet.ora. Try to do this with an Oracle DBA or, even better, let him/her do this. Use the "Oracle Provider for OLE DB" from SSIS; don't use the "Microsoft Provider for Oracle" because a 64-bit version of it does not exist. Schedule your packages with the SQL Agent.

    Background information
    Visual Studio (BI Dev Studio) is a 32-bit application. SQL Server Management Studio is a 32-bit application. dtexecui.exe is a 32-bit application. dtexec.exe has both 32-bit and 64-bit versions. There are x64 and x86 versions of the Oracle provider available. SQL Agent is a 64-bit process. My advice to BI consultants is to get an Oracle DBA or professional for the installation and configuration of the 2 full clients (32- and 64-bit). Tell the DBA to download the biggest client available; this way you are sure that they pick the right one ;-) Testing whether the clients have been installed and configured in the right way can be done with the Windows ODBC Data Source Administrator: Start... Programs... Administrative Tools... Data Sources (ODBC)

    ADDITIONAL STEPS FOR SSIS 2008 R2
    It seems that, unfortunately, some additional steps are necessary for SQL Server 2008 R2 installations:
    1. Open REGEDIT (Start… Run… REGEDIT) on the server and search for the following entry (for the 32-bit driver): HKEY_LOCAL_MACHINE\Software\Microsoft\MSDTC\MTxOCI. Make sure the following values are entered (they are shown as a screenshot in the original post).
    2. Next, search for (for the 64-bit driver): HKEY_LOCAL_MACHINE\Software\Wow6432Node\Microsoft\MSDTC\MTxOCI. Make sure the same values as above are entered.
    3. Reboot your server.
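
    For reference, since the required registry values only survive as screenshots in the original post: the values usually cited for this MTxOCI fix with the 11g client are OracleOciLib = oci.dll, OracleSqlLib = orasql11.dll and OracleXaLib = oraclient11.dll. Treat those as an assumption to verify against the original screenshots. A minimal sketch that writes them with the .NET registry API:

    using Microsoft.Win32;

    class MtxOciRegistryFix
    {
        // Writes the MTxOCI values under the given HKEY_LOCAL_MACHINE subkey.
        // The library names are the commonly cited ones for the Oracle 11g
        // full client - assumed values, not confirmed by this post's
        // (missing) screenshots.
        static void SetMtxOciValues(string subKeyPath)
        {
            using (RegistryKey key = Registry.LocalMachine.CreateSubKey(subKeyPath))
            {
                key.SetValue("OracleOciLib", "oci.dll");        // assumed
                key.SetValue("OracleSqlLib", "orasql11.dll");   // assumed
                key.SetValue("OracleXaLib", "oraclient11.dll"); // assumed
            }
        }

        static void Main()
        {
            // The key the post associates with the 32-bit driver.
            SetMtxOciValues(@"Software\Microsoft\MSDTC\MTxOCI");
            // The key the post associates with the 64-bit driver.
            SetMtxOciValues(@"Software\Wow6432Node\Microsoft\MSDTC\MTxOCI");
            // Run this elevated from a 64-bit process so the second path
            // isn't redirected, then reboot the server as step 3 describes.
        }
    }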

    Read the article

  • ArchBeat Link-o-Rama Top 10 for September 2-8, 2012

    - by Bob Rhubart
    The Top 10 items shared on the OTN Facebook Page for the week of September 2-8, 2012. Adding a runtime LOV for a taskflow parameter in WebCenter | Yannick Ongena Oracle ACE Yannick Ongena illustrates how to customize the parameters tab for a taskflow in WebCenter. Tips on Migrating from AquaLogic .NET Accelerator to WebCenter WSRP Producer for .NET | Scott Nelson "It has been a very winding path and this blog entry is intended to share both the lessons learned and relevant approaches that led to those learnings," says Scott Nelson. "Like most journeys of discovery, it was not a direct path, and there are notes to let you know when it is practical to skip a section if you are in a hurry to get from here to there." Free Event: Oracle Technology Network Architect Day – Boston, MA – 9/12/2012 Sure, you could ask a voodoo priestess for help in improving your solution architecture skills. But there's the whole snake thing, and the zombie thing, and other complications. So why not keep it simple and register for Oracle Technology Network Architect Day in Boston, MA. There's no magic, just a full day of technical sessions covering Cloud, SOA, Engineered Systems, and more. Registration is free, but seating is limited. You'll curse yourself if you miss this one. Starting and Stopping Fusion Applications the Right Way | Ronaldo Viscuso While the fastartstop tool that ships with Oracle Fusion Applications does most of the work to start/stop/bounce the Fusion Apps environment, it does not do it all. Oracle Fusion Applications A-Team blogger Ronaldo Viscuso's post "aims to explain all tasks involved in starting and stopping a Fusion Apps environment completely." Article Index: Architect Community Column in Oracle Magazine Did you know that Oracle Magazine features a regular column devoted specifically to the architect community? Every issue includes insight and expertise from architects who regularly work with Oracle technologies. Click here to see a complete list of these articles. Using FMAP and AnalyticsRes in an Oracle BI High Availability Implementation | Art of Business Intelligence "The fmap syntax has been used for a long time in Oracle BI / Siebel Analytics when referencing images inherent in the application as well as custom images," says Oracle ACE Christian Screen. "This syntax is used on Analysis requests and dashboards." Dodeca Customer Feedback - The Rosewood Company | Tim Tow Oracle ACE Director Tim Tow shares anecdotal comments from one of his clients, a company that is deploying Dodeca to replace an aging VBA/Essbase application. Configuring UCM cache to check for external Content Server changes | Martin Deh Oracle WebCenter and ADF A-Team blogger Martin Deh shares the background information and the solution to a recently encountered customer scenario. Attend OTN Architect Day in Los Angeles – by Architects, for Architects – October 25 The OTN Architect Day roadshow stops in Boston next week, then it's on to Los Angeles for another all-architecture, all-day event on Thursday, October 25, 2012 at the Sofitel Los Angeles, 555 Beverly Boulevard, Los Angeles, CA 90048. Like all Architect Day events, this one is absolutely free, so register now. The Role of Oracle VM Server for SPARC In a Virtualization Strategy New OTN article from Matthias Pfutzner.
Thought for the Day "Practicing architects, through education, experience and examples, accumulate a considerable body of contextual sense by the time they're entrusted with solving a system-level problem…" — Eberhardt Rechtin (January 16, 1926 – April 14, 2006) Source: SoftwareQuotes.com

    Read the article

  • Exalytics and Oracle Business Intelligence Enterprise Edition (OBIEE) Partner Workshop

    - by mseika
    Workshop Description
    Oracle Fusion Middleware 11g is the #1 application infrastructure foundation. It enables enterprises to create and run agile and intelligent business applications and maximize IT efficiency by exploiting modern hardware and software architectures. The Oracle Exalytics Business Intelligence Machine is the world's first engineered system specifically designed to deliver high-performance analysis, modeling and planning. Built using industry-standard hardware, market-leading business intelligence software and in-memory database technology, Oracle Exalytics is an optimized system that delivers unmatched speed, visualizations and scalability for Business Intelligence and Enterprise Performance Management applications. This FREE hands-on partner workshop highlights both the hardware and software components that are engineered to work together to deliver Oracle Exalytics - an optimized version of the industry-leading Oracle TimesTen In-Memory Database with analytic extensions, a highly scalable Oracle server designed specifically for in-memory business intelligence, and Oracle's proven Business Intelligence Foundation with enhanced visualization capabilities and performance optimizations. This workshop will provide hands-on experience with Oracle's latest engineered system. Topics covered will include the TimesTen In-Memory Database and the new Summary Advisor for Exalytics, the technical details (including mobile features) of the latest release of visualization enhancements for OBIEE, and technical updates on Essbase. After taking this course, you will be well prepared to architect, build, demo, and implement an end-to-end Exalytics solution. You will also be able to extend your current analytical and enterprise performance management application implementations with numerous Oracle technologies specifically enhanced to take advantage of the compute capacity and in-memory capabilities of Oracle Exalytics. If you are a BI or Data Warehouse architect, developer or consultant, you don't want to miss this 3-day workshop. Register Now!

    Presentations
      Exalytics Architectural Overview
      Upgrade and Lifecycle Management
      TimesTen for Exalytics
      Summary Advisor Utility
      Essbase and EPM System on Exalytics
      Dashboard and Analysis Interactions
      OBIEE 11.1.1.6 Features and Advanced Topics

    Lab Outline
    The labs showcase Oracle Exalytics core components and functionality and provide expertise on Oracle Business Intelligence 11.1.1.6 new features and updates from prior releases. The hands-on activities are based on an Oracle VirtualBox image with software and training samples pre-installed.
      Lab Environment Setup
      Creating and Working with Oracle TimesTen In-Memory Database
      Running Summary Advisor Utility
      Working with Exalytics Visualization Features – Dashboard and Analysis Interactions

    Audience
      Oracle Partners
      BI and EPM Application Developers and Implementers
      System Integrators and Solution Consultants
      Data Warehouse Developers
      Enterprise Architects

    Prerequisites
      Experience and understanding of OBIEE 11g is required
      Previous attendance of the Oracle Business Intelligence Foundation Suite Workshop or the BIEE 11g Introduction Workshop is highly recommended
      Good understanding of data warehousing and data modeling for reporting and analysis purposes
      Strong experience with database technologies preferred

    Equipment Requirements
    This workshop requires attendees to provide their own laptops for this class. Attendee laptops must meet the following minimum hardware/software requirements:

    Hardware
      Minimum 8GB RAM
      60 GB free space (includes staging)
      USB 2.0 port (at least one available)
      It is strongly recommended that you bring a mouse. You will be working in a development environment and using the mouse heavily.

    Software
      One of the following operating systems: 64-bit Windows host/laptop OS, or a 64-bit host/laptop OS with a Windows VM (XP, Server, or Win 7, BIC2g, etc.)
      Internet Explorer 7.x/8.x or Firefox 3.5.x
      WinRAR or 7zip utility to unzip workshop files: downloadable from http://www.win-rar.com/download.html or http://www.7zip.com/
      Oracle VirtualBox 4.0.2 or higher, downloadable from http://www.virtualbox.org/wiki/Downloads
      CPU virtualization mode needs to be enabled. We will provide guidance on the day of the workshop.
    Attendees will be given a VirtualBox image containing a pre-installed Oracle Exalytics environment.

    Schedule
    This workshop is 3 days. Times vary by country!
      9:00am: Sign-in and technical setup
      9:30am: Workshop starts
      5:00pm: Workshop ends

    Oracle Exalytics and Business Intelligence (OBIEE) Workshop, December 11-13, 2012: Oracle BVP, Birmingham, UK. Register Here. Questions? Send email to: [email protected] Oracle Platform Technologies Enablement Services

    Read the article

  • ArchBeat Link-o-Rama Top 10 for August 19-26, 2012

    - by Bob Rhubart
    The Top 10 most popular items shared via the OTN ArchBeat Facebook page for the week of August 19-26, 2012. Now Available: Oracle SQL Developer 3.2 (3.2.09.23) The latest release of Oracle SQL Developer includes UI enhancements, 12c database support, and bug fixes. ADF Tutorial Chapter 3: Creating a Master-Detail taskflow | Yannick Ongena Oracle ACE Yannick Ongena continues his ADF tutorial with a chapter devoted to the view layer and using the data control to build pages that allow users to update reference data. GlassFish Community Event at JavaOne 2012 Don't miss out on this exclusive GlassFish Community Event on Sunday, September 30th from 11:00 a.m. – 1:00 p.m. in Moscone South. Register Now! Part of JavaOne 2012. Oracle BI 11g Book Authors – Podcast #9 | Art of Business Intelligence In this home-grown podcast, authors Christian Screen, Haroun Khan, and Adrian Ward talk about their new book, "Oracle Business Intelligence Enterprise Edition 11g: A Hands-On Tutorial," about their sessions at Oracle OpenWorld, and about their ORACLENERD t-shirts. Oracle Service Bus duplicate message check using Coherence | Jan van Zoggel "Given the fact that every message on our ESB has a unique messageID element in the SOAP header, we could store this on disk, database or in memory," says Jan van Zoggel. "With the help of Oracle Coherence this last option, in memory, is relatively simple." Even simpler with Jan's detailed instructions. Oracle Technology Network Architect Day - Boston - Sept 12 There are easier ways to increase your IT brainpower. Skip the electrodes and register for Oracle Technology Network Architect Day in Boston, September 12, 2012. This free event includes 8 technical sessions, panel Q&A, roundtable discussions—and a free lunch. 8:00 a.m. – 5:00 p.m. at the Boston Marriott Burlington, One Burlington Mall Road, Burlington, MA 01803. Oracle BPM enable BAM | Peter Paul van de Beek "BAM enables you to make decisions based on real-time information gathered from your running processes," says Peter Paul van de Beek. "With BPMN processes you can use the standard Business Indicators that the BPM Suite offers you and use them with BAM without much extra effort." Sample Application for Switching Application Module Data Sources | Andrejus Baranovskis A sample application and how-to guide from Oracle ACE Director and ADF expert Andrejus Baranovskis. ORCLville: Some Basic BI Thoughts "If we'd stop to consider what business intelligence really is, many of us might grow a different perspective about how we implement enterprise apps," says Oracle ACE Director Floyd Teter. "What if we implemented with an eye to what kind of information we'd like to get from our enterprise apps?" Oracle VM VirtualBox 4.1.20 released | Oracle's Virtualization Blog Oracle VM VirtualBox 4.1.20 was just released at the community and Oracle download sites, reports the Fat Bloke. This is a maintenance release containing bug fixes and stability improvements. Thought for the Day "The programmer, like the poet, works only slightly removed from pure thought-stuff. He builds his castles in the air, from air, creating by exertion of the imagination. Few media of creation are so flexible, so easy to polish and rework, so readily capable of realizing grand conceptual structures." — Frederick P. Brooks Source: SoftwareQuotes

    Read the article

  • Crime Scene Investigation: SQL Server

    - by Rodney Landrum
    “The packages are running slower in Prod than they are in Dev.” My week began with this simple declaration from one of our lead BI developers, quickly followed by an emailed spreadsheet demonstrating that, over 5 executions, an extensive ETL process was running on average 630 seconds faster on Dev than on Prod. The situation needed some scientific investigation to determine why the same code, the same data, and the same schema would yield consistently slower results on a more powerful server. Prod had yet to be officially christened with a “Go Live” date, so I had the time, and having recently been binge-watching CSI: New York, I also had the inclination. An inspection of the two systems, Prod and Dev, revealed the first surprise: although Prod was indeed a “bigger” system, with double the amount of RAM of Dev, the latter actually had twice as many processor cores. On neither system did I see much sign of resources being heavily taxed while the ETL process was running. Without any real supporting evidence, I jumped to a conclusion that my years of performance tuning should have helped me avoid: that the hardware differences explained the better performance on Dev. We spent time setting up a Test system, similarly scoped to Prod except with 4 times the cores, and ported everything across. The results of our careful benchmarks left us truly bemused; the ETL process on the new server was slower than on both other systems. We burned more time tweaking server configurations, monitoring IO and network latency, several times believing we’d uncovered the smoking gun, until the results of subsequent test runs pitched us back into confusion. Finally, I decided, enough was enough. Hadn’t I learned very early in my DBA career that almost all bottlenecks were caused by code and database design, not hardware? It was time to get back to basics. With over 100 SSIS packages and hundreds of queries, each handling specific tasks such as file loads, bulk inserts, transforms, logging, and so on, the task seemed formidable. And yet, after barely an hour spent with Profiler, Extended Events, and wait statistics DMVs, I had a lead in the shape of a query that joined three tables containing millions of rows, returned 3279 results, but performed 239K logical reads. As soon as I looked at the execution plans for the query in Dev and Test I saw the culprit: an implicit conversion warning on a join predicate field that was numeric in one table and a varchar(50) in another! I turned this information over to the BI developers, who quickly resolved the data type mismatches and found and fixed “several” others as well. After the schema changes, the same query with the same databases ran in under 1 second on all systems and the logical reads dropped to fewer than 300. The analysis also revealed that on Dev the ETL task was pulling data across a LAN, whereas Prod and Test were connected across a slower WAN, in large part explaining why the same process ran slower on the latter two systems. Loading the data locally on Prod delivered a further 20% gain in performance. As we progress through our DBA careers we learn valuable lessons. Sometimes, with a project deadline looming and pressure mounting, we choose to forget them. I was close to giving in to the temptation to throw more hardware at the problem. I’m pleased at least that I resisted, though I still kick myself for not looking at the code on day one.
It can seem a daunting prospect to return to the fundamentals of the code so close to roll out, but with the right tools, and surprisingly little time, you can collect the evidence that reveals the true problem. It is a lesson I trust I will remember for my next 20 years as a DBA, if I’m ever again tempted to bypass the evidence.

    Read the article

  • Perm SSIS Developer Urgently Required

    - by blakmk
    Job Role
    To provide dedicated data services support to the company by designing, creating, maintaining and enhancing database objects, ensuring data quality, consistency and integrity. Migrating data from various sources to a central SQL 2008 data warehouse will be the primary function:
      Migration of data from bespoke legacy databases to the SQL 2008 data warehouse.
      Understand key business requirements, liaising with various aspects of the company.
      Create advanced transformations of data, with a focus on data cleansing, redundant data and duplication.
      Creating complex business rules regarding data services, migration, integrity and support (best practices).

    Experience
      Minimum 3 years' SSIS experience in a project or BI development role, and involvement in at least 3 full ETL project life cycles, using the following methodologies and tools:
        Excellent knowledge of ETL concepts including data migration and integrity, focusing on SSIS.
        Extensive experience with SQL 2005 products; SQL 2008 desirable.
        Working knowledge of SSRS and its integration with other BI products.
        Extensive knowledge of T-SQL, stored procedures, triggers (table/database), views and functions, in particular coding and querying.
        Data cleansing and harmonisation.
        Understanding and knowledge of indexes, statistics and table structure.
        SQL Agent - scheduling jobs, optimisation, multiple jobs, DTS.
        Troubleshoot, diagnose and tune database and physical server performance.
        Knowledge and understanding of locking, blocks, table and index design and SQL configuration.
      Demonstrable ability to understand and analyse business processes.
      Experience in creating business rules on best practices for data services.
      Experience in working with, supporting and troubleshooting MS SQL servers running enterprise applications.
      Proven ability to work well within a team and liaise with other technical support staff such as networking administrators, system administrators and support engineers.
      Ability to create formal documentation, work procedures, and service level agreements.
      Ability to communicate technical issues at all levels, including to a non-technical audience.
      Good working knowledge of MS Word, Excel, PowerPoint, Visio and Project.

    Location
    Based in Crawley with the possibility of some remote working.

    Contact me for more info: http://sqlblogcasts.com/blogs/blakmk/contact.aspx

    Read the article

  • Olympics data available for all on Windows Azure SQL Database and Power View

    - by jamiet
    Are you looking around for some decent test data for your BI demos? Well, if so, Microsoft have provided some data about all medals won at the Olympic Games (1900 to 2008) in the OlympicsData workbook - Excel, SSIS, Azure sample; it provides analysis over athletes, countries, medal type, sport, discipline and various other dimensions. The data has been provided in an Excel workbook along with instructions on how to load the data into a Windows Azure SQL Database using SQL Server Integration Services (SSIS). Frankly though, the rigmarole of standing up your own Windows Azure SQL Database (ok, SQL Azure database) is both costly (SQL Azure isn’t free) and time consuming (the provided instructions aren’t exactly an idiot’s guide, and getting SSIS to work properly with Excel isn’t a barrel of laughs either). To ease the pain for all you BI folks out there that simply want to party on the data, I have loaded it all into the SQL Azure database that I use for hosting AdventureWorks on Azure. You can read more about AdventureWorks on Azure below; however, I’ll summarise here by saying it is a SQL Azure database provided for the use of the SQL Server community and which is supported by voluntary donations. To view the data the credentials you need are:
      Server: mhknbn2kdz.database.windows.net
      Database: AdventureWorks2012
      User: sqlfamily
      Password: sqlf@m1ly
    Type those into SSMS and away you go; the data is provided in four tables, [olympics].[Sport], [olympics].[Discipline], [olympics].[Event] & [olympics].[Medalist]. I figured this would be a good candidate for a Power View report, so I fired up Excel 2013 and built such a report to slice’n’dice through the data. The screenshots in the original post should give you a flavour of what is available: a view of all the available data; where do all the gymnastics medals go?; which countries do the top ten all-time medal winners come from? You get the idea. There is masses of information here, and if you have Excel 2013 handy, Power View provides a quick and easy way of surfing through it. To save you the bother of setting up the Power View report yourself you can have the one that I took these screenshots from; it is available on my SkyDrive at OlympicsAnalysis.xlsx, so just hit the link and download to play to your heart’s content. Party on, people! As I said above, the data is hosted on a SQL Azure database that I use for hosting “AdventureWorks on Azure”, which I first announced in March 2013 at AdventureWorks2012 now available for all on SQL Azure. I’ll repeat the pertinent parts of that blog post here: I am pleased to announce that as of today … [AdventureWorks2012] now resides on SQL Azure and is available for anyone, absolutely anyone, to connect to and use for their own means. This database is free for you to use, but SQL Azure is of course not free, so before I give you the credentials please lend me your eyes for a short while longer. AdventureWorks on Azure is being provided for the SQL Server community to use, and so I am hoping that that same community will rally around to support this effort by making a voluntary donation to support the upkeep which, going on current pricing, is going to be $119.88 per year. If you would like to contribute to keep AdventureWorks on Azure up and running for that full year please donate via PayPal to [email protected]. Any amount, no matter how small, will help. If those 50+ people that retweeted me beforehand all contributed $2 then that would just about be enough to keep this up for a year.
If the community contributes more than we need then there are a number of additional things that could be done: host additional databases (Northwind anyone??), host in more datacentres (this first one is in Western Europe), or make a charitable donation. That last one, a charitable donation, is something I would really like to do. The SQL community have proved before that they can make a significant contribution to charitable organisations through purchasing the SQL Server MVP Deep Dives book, and I harbour hopes that AdventureWorks on Azure can continue in that vein. So please, if you think AdventureWorks on Azure is something that is worth supporting, please make a contribution. I’d like to emphasize that last point. If my hosting this Olympics data is useful to you, please support this initiative by donating. Thanks in advance. @Jamiet
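
    If you would rather poke at the data from code than from SSMS, here is a minimal ADO.NET sketch using the connection details above. Only the four [olympics] table names mentioned in the post are assumed, no column schema is, and Encrypt=True is my addition since SQL Azure connections are normally encrypted.

    using System;
    using System.Data.SqlClient;

    class OlympicsPeek
    {
        static void Main()
        {
            // Credentials exactly as published in the post above.
            var connStr = "Server=mhknbn2kdz.database.windows.net;" +
                          "Database=AdventureWorks2012;" +
                          "User ID=sqlfamily;Password=sqlf@m1ly;Encrypt=True;";

            using (var conn = new SqlConnection(connStr))
            {
                conn.Open();
                // Row count only - no assumptions about the table's columns.
                var cmd = new SqlCommand(
                    "SELECT COUNT(*) FROM [olympics].[Medalist];", conn);
                Console.WriteLine("Medalist rows: {0}", cmd.ExecuteScalar());
            }
        }
    }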

    Read the article

  • Consumer Oriented Search In Oracle Endeca Information Discovery - Part 2

    - by Bob Zurek
    As discussed in my last blog posting on this topic, Information Discovery, a core capability of the Oracle Endeca Information Discovery solution, enables businesses to search, discover and navigate through a wide variety of big data, including structured, unstructured and semi-structured data. With search as a core advanced capability of our product, it is important to understand some of the key differences and capabilities in the underlying data store of Oracle Endeca Information Discovery, and that is our Endeca Server. In the last post on this subject, we talked about Exploratory Search capabilities along with support for cascading relevance. Additional search capabilities in the Endeca Server, which differentiate it from simple keyword-based "search boxes" in other Information Discovery products, also include: The Endeca Server Supports Set Search. The Endeca Server is organized around set retrieval, which means that it looks at groups of results (all the documents that match a search), as well as the relationship of each individual result to the set. Other approaches only compute the relevance of a document by comparing the document to the search query – not by comparing the document to all the others. For example, a search for "U.S." in another approach might match to the title of a document and get a high ranking. But what if it were a collection of government documents in which "U.S." appeared in many titles, making that clue less meaningful? A set analysis would reveal this and be used to adjust relevance accordingly. The Endeca Server Supports Second-Order Relevance. Unlike simple search interfaces in traditional BI tools, which provide limited relevance ranking, such as a list of results based on keyword matching, Endeca enables users to determine the most salient terms to divide up the result. Determining this second-order relevance is the key to providing effective guidance. Support for Queries and Filters. Search is the most common query type, but hardly complete, and users need to express a wide range of queries. Oracle Endeca Information Discovery also includes navigation, interactive visualizations, analytics, range filters, geospatial filters, and other query types that are more commonly associated with BI tools. Unlike other approaches, these queries operate across structured, semi-structured and unstructured content stored in the Endeca Server. Furthermore, this set is easily extensible because the core engine allows for pluggable features to be added. Like a search engine, queries are answered with a results list, ranked to put the most likely matches first. Unlike "black box" relevance solutions, which generalize one strategy for everyone, we believe that optimal relevance strategies vary across domains. Therefore, it provides line-of-business owners with a set of relevance modules that let them tune the best results based on their content. The Endeca Server query result sets are summarized, which gives users guidance on how to refine and explore further. Summaries include Guided Navigation® (a form of faceted search), maps, charts, graphs, tag clouds, concept clusters, and clarification dialogs. Users don't explicitly ask for these summaries; Oracle Endeca Information Discovery analytic applications provide the right ones, based on configurable controls and rules. For example, the analytic application might guide a procurement agent filtering for in-stock parts by visualizing the results on a map and calculating their average fulfillment time.
Furthermore, the user can interact with summaries and filters without resorting to writing complex SQL queries; the user can simply click to add filters. Within Oracle Endeca Information Discovery, all parts of the summaries are clickable and searchable. We are living in a search-driven society where business users really seem to enjoy entering information into a search box. We do this every day as consumers, and therefore we have gotten used to looking for that box. However, the key to getting the right results is to guide the user in a way that provides additional discovery, beyond what they may have anticipated. This is why these advanced search features inside the Endeca Server are so important. They have helped to guide our great customers to success.

    Read the article

  • Oracle Enterprise Computing Summit and EM seminar announcements (Japanese-language newsletter; title partially lost to encoding damage)

    - by Oracle Japan Marketing
    [Japanese-language promotional newsletter from Oracle Japan Marketing; most of the body text, along with its embedded CSS, was lost to character-encoding damage during extraction. Recoverable content: promotion of the Oracle Enterprise Computing Summit and the Oracle EPM & BI Summit, customer references around Oracle E-Business Suite deployments (including LIXIL), and a July 2011 seminar schedule - 7/6, 7/7 (Java SE 7), 7/13, 7/15, 7/20 (two events), 7/25 (MySQL), and 7/26 (Oracle Enterprise Computing Summit). Copyright © 2011, Oracle. All Rights Reserved.]

    Read the article

  • Run Windows in Ubuntu with VMware Player

    - by Matthew Guay
    Are you an enthusiast who loves their Ubuntu Linux experience but still needs to use Windows programs? Here's how you can get the full Windows experience on Ubuntu with the free VMware Player. Linux has become increasingly consumer friendly, but still, the wide majority of commercial software is only available for Windows and Macs. Dual-booting between Windows and Linux has been a popular option for years, but this is a frustrating solution since you have to reboot into the other operating system each time you want to run a specific application. With virtualization, you'll never have to make this tradeoff. VMware Player makes it quick and easy to install any edition of Windows in a virtual machine. With VMware's great integration tools, you can copy and paste between your Linux and Windows programs and even run native Windows applications side-by-side with Linux ones.

    Getting Started
    Download the latest version of VMware Player for Linux, and select either the 32-bit or 64-bit version, depending on your system. VMware Player is a free download, but requires registration. Sign in with your VMware account, or create a new one if you don't already have one. VMware Player is fairly easy to install on Linux, but you will need to start out the installation from the terminal. First, enter the following to make sure the installer is marked as executable, substituting version/build_number for the version number on the end of the file you downloaded:

    chmod +x ./VMware-Player-version/build_number.bundle

    Then, enter the following to start the install, again substituting your version number:

    gksudo bash ./VMware-Player-version/build_number.bundle

    You may have to enter your administrator password to start the installation, and then the VMware Player graphical installer will open. Choose whether you want to check for product updates and submit usage data to VMware, and then proceed with the install as normal. VMware Player installed in only a few minutes in our tests, and was immediately ready to run, no reboot required. You can now launch it from your Ubuntu menu: click Applications \ System Tools \ VMware Player. You'll need to accept the license agreement the first time you run it. Welcome to VMware Player! Now you can create new virtual machines and run pre-built ones on your Ubuntu desktop.

    Install Windows in VMware Player on Ubuntu
    Now that you've got VMware set up, it's time to put it to work. Click Create a New Virtual Machine as above to start making a Windows virtual machine. In the dialog that opens, select the installer disk or ISO image file that you want to install Windows from. In this example, we're selecting a Windows 7 ISO. VMware will automatically detect the operating system on the disk or image. Click Next to continue. Enter your Windows product key, select the edition of Windows to install, and enter your name and password. You can leave the product key field blank and enter it later. VMware will ask if you want to continue without a product key, so just click Yes to continue. Now enter a name for your virtual machine and select where you want to save it. Note: this will take up at least 15GB of space on your hard drive during the install, so make sure to save it on a drive with sufficient storage space. You can choose how large you want your virtual hard drive to be; the default is 40GB, but you can choose a different size if you wish.
    The entire amount will not be used up on your hard drive initially, but the virtual drive will increase in size up to your maximum as you add files. Additionally, you can choose if you want the virtual disk stored as a single file or as multiple files. You will see the best performance by keeping the virtual disk as one file, but the virtual machine will be more portable if it is broken into smaller files, so choose the option that will work best for your needs. Finally, review your settings, and if everything looks good, click Finish to create the virtual machine. VMware will take over now, and install Windows without any further input using its Easy Install. This is one of VMware's best features, and is the main reason we find it the easiest desktop virtualization solution to use.

    Installing VMware Tools
    VMware Player doesn't include the VMware Tools by default; instead, it automatically downloads them for the operating system you're installing. Once you've downloaded them, it will use those tools anytime you install that OS. If this is your first Windows virtual machine to install, you may be prompted to download and install them while Windows is installing. Click Download and Install so your Easy Install will finish successfully. VMware will then download and install the tools. You may need to enter your administrative password to complete the install. Other than this, you can leave your Windows install unattended; VMware will get everything installed and running on its own. Our test setup took about 30 minutes, and when it was done we were greeted with the Windows desktop ready to use, complete with drivers and the VMware tools. The only thing missing was the Aero glass feature. VMware Player is supposed to support the Aero glass effects in virtual machines, and although this works every time when we use VMware Player on Windows, we could not get it to work in Linux. Other than that, Windows is fully ready to use. You can copy and paste text, images, or files between Ubuntu and Windows, or simply drag-and-drop files between the two.

    Unity Mode
    Using Windows in a window is awkward, and makes your Windows programs feel out of place and hard to use. This is where Unity mode comes in. Click Virtual Machine in VMware's menu, and select Enter Unity. Your Windows desktop will now disappear, and you'll see a new Windows menu underneath your Ubuntu menu. This works the same as your Windows Start Menu, and you can open your Windows applications and files directly from it. By default, programs from Windows will have a colored border and a VMware badge in the corner. You can turn this off from the VMware settings pane. Click Virtual Machine in VMware's menu and select Virtual Machine Settings. Select Unity under the Options tab, and uncheck the Show borders and Show badges boxes if you don't want them. Unity makes your Windows programs feel at home in Ubuntu. Here we have Word 2010 and IE8 open beside the Ubuntu Help application. Notice that the Windows applications show up in the taskbar on the bottom just like the Linux programs. If you're using the Compiz graphics effects in Ubuntu, your Windows programs will use them too, including the popular wobbly windows effect. You can switch back to running Windows inside VMware Player's window by clicking the Exit Unity button in the VMware window. Now, whenever you want to run Windows applications in Linux, you can quickly launch them from VMware Player.

    Conclusion
    VMware Player is a great way to run Windows on your Linux computer.
    It makes it extremely easy to get Windows installed and running, and lets you run your Windows programs seamlessly alongside your Linux ones. VMware products work great in our experience, and VMware Player on Linux was no exception. If you're a Windows user and you'd like to run Ubuntu on Windows, check out our article on how to Run Ubuntu in Windows with VMware Player.

    Links
    Download VMware Player 3 (Registration required)
    Download Windows 7 Enterprise 90-day trial

    Read the article

  • OpenGL or OpenGL ES

    - by zxspectrum
    What should I learn? OpenGL 4.1 or OpenGL ES 2.0? I will be developing desktop applications using Qt but I may start developing mobile applications in a few months, too. I don't know anything about 3D, 3D math, etc and I'd rather spend 100 bucks in a good book than 1 week digging websites and going through trial and error. One problem I see with OpenGL 4.1 is as far as I know there is no book yet (the most recent ones are for OpenGL 3.3 or 4.0), while there are books on OpenGL ES 2.0. On the other hand, from my naive point of view, OpenGL 4.1 seems like OpenGL ES 2.0 + additions, so it looks like it would be easier/better to first learn OpenGL ES 2.0, then go for the shader language, etc Please, don't tell me to use NeHe (it's generally agreed it's full of bad/old practices), the Durian tutorial, etc. Thanks

    Read the article

  • Rendering ASP.NET MVC Views to String

    - by Rick Strahl
    It's not uncommon in my applications that I require longish text output that does not have to be rendered into the HTTP output stream. The most common scenario I have for 'template driven' non-Web text is for emails of all sorts. Logon confirmations and verifications, email confirmations for things like orders, status updates or scheduler notifications - all of which require merged text output both within and sometimes outside of Web applications. On other occasions I also need to capture the output from certain views for logging purposes. Rather than creating text output in code, it's much nicer to use the rendering mechanism that ASP.NET MVC already provides by way of its ViewEngines - using Razor or WebForms views - to render output to a string. This is nice because it uses the same familiar rendering mechanism that I already use for my HTTP output, and it also solves the problem of where to store the templates for rendering this content in nothing more than perhaps a separate view folder. The good news is that ASP.NET MVC's rendering engine is much more modular than the full ASP.NET runtime engine, which was a real pain in the butt to coerce into rendering output to string. With MVC the rendering engine has been separated out from the core ASP.NET runtime, so it's actually a lot easier to get View output into a string.

    Getting View Output from within an MVC Application
    If you need to generate string output from an MVC view and pass some model data to it, the process to capture this output is fairly straightforward and involves only a handful of lines of code. The catch is that this particular approach requires that you have an active ControllerContext that can be passed to the view. This means that the following approach is limited to access from within Controller methods. Here's a class that wraps the process and provides both instance and static methods to handle the rendering:

    /// <summary>
    /// Class that renders MVC views to a string using the
    /// standard MVC View Engine to render the view.
    ///
    /// Note: This class can only be used within MVC
    /// applications that have an active ControllerContext.
    /// </summary>
    public class ViewRenderer
    {
        /// <summary>
        /// Required Controller Context
        /// </summary>
        protected ControllerContext Context { get; set; }

        public ViewRenderer(ControllerContext controllerContext)
        {
            Context = controllerContext;
        }

        /// <summary>
        /// Renders a full MVC view to a string. Will render with the full MVC
        /// View engine including running _ViewStart and merging into _Layout.
        /// </summary>
        /// <param name="viewPath">
        /// The path to the view to render. Either in same controller, shared by
        /// name, or as fully qualified ~/ path including extension
        /// </param>
        /// <param name="model">The model to render the view with</param>
        /// <returns>String of the rendered view or null on error</returns>
        public string RenderView(string viewPath, object model)
        {
            return RenderViewToStringInternal(viewPath, model, false);
        }

        /// <summary>
        /// Renders a partial MVC view to string. Use this method to render
        /// a partial view that doesn't merge with _Layout and doesn't fire
        /// _ViewStart.
        /// </summary>
        /// <param name="viewPath">
        /// The path to the view to render. Either in same controller, shared by
        /// name, or as fully qualified ~/ path including extension
        /// </param>
        /// <param name="model">The model to pass to the viewRenderer</param>
        /// <returns>String of the rendered view or null on error</returns>
        public string RenderPartialView(string viewPath, object model)
        {
            return RenderViewToStringInternal(viewPath, model, true);
        }

        public static string RenderView(string viewPath, object model,
                                        ControllerContext controllerContext)
        {
            ViewRenderer renderer = new ViewRenderer(controllerContext);
            return renderer.RenderView(viewPath, model);
        }

        public static string RenderPartialView(string viewPath, object model,
                                               ControllerContext controllerContext)
        {
            ViewRenderer renderer = new ViewRenderer(controllerContext);
            return renderer.RenderPartialView(viewPath, model);
        }

        protected string RenderViewToStringInternal(string viewPath, object model,
                                                    bool partial = false)
        {
            // first find the ViewEngine for this view
            ViewEngineResult viewEngineResult = null;
            if (partial)
                viewEngineResult = ViewEngines.Engines.FindPartialView(Context, viewPath);
            else
                viewEngineResult = ViewEngines.Engines.FindView(Context, viewPath, null);

            if (viewEngineResult == null)
                throw new FileNotFoundException(Properties.Resources.ViewCouldNotBeFound);

            // get the view and attach the model to view data
            var view = viewEngineResult.View;
            Context.Controller.ViewData.Model = model;

            string result = null;

            using (var sw = new StringWriter())
            {
                var ctx = new ViewContext(Context, view,
                                          Context.Controller.ViewData,
                                          Context.Controller.TempData,
                                          sw);
                view.Render(ctx, sw);
                result = sw.ToString();
            }

            return result;
        }
    }

    The key is the RenderViewToStringInternal method. The method first tries to find the view to render based on its path, which can either be in the current controller's view path or the shared view path, using its simple name (PasswordRecovery) or alternately by its full virtual path (~/Views/Templates/PasswordRecovery.cshtml). This code should work both for Razor and WebForms views, although I've only tried it with Razor views. Note that WebForms views might actually be better for plain text, as Razor adds all sorts of white space into its output when there are code blocks in the template; the Web Forms engine provides more accurate rendering for raw text scenarios. Once a view engine is found, the view to render can be retrieved. Views in MVC render based on data that comes off the controller, like the ViewData, which contains the model along with the actual ViewData and ViewBag. From the View and some of the Context data a ViewContext is created, which is then used to render the view with. The View picks up the Model and other data from the ViewContext internally and processes the View the same way it would be processed if it were to send its output into the HTTP output stream. The difference is that we can override the ViewContext's output stream, which we provide and capture into a StringWriter(). After rendering completes, the result holds the output string. If an error occurs, the error behavior is similar to what you see with regular MVC errors - you get a full yellow screen of death including the view error information with the line of error highlighted. It's your responsibility to handle the error - or let it bubble up to your regular Controller Error filter if you have one. To use the simple class you only need a single line of code if you call the static methods.
Here's an example of some Controller code that is used to send a user notification to a customer via email in one of my applications:

[HttpPost]
public ActionResult ContactSeller(ContactSellerViewModel model)
{
    InitializeViewModel(model);

    var entryBus = new busEntry();
    var entry = entryBus.LoadByDisplayId(model.EntryId);

    if (string.IsNullOrEmpty(model.Email))
        entryBus.ValidationErrors.Add("Email address can't be empty.", "Email");
    if (string.IsNullOrEmpty(model.Message))
        entryBus.ValidationErrors.Add("Message can't be empty.", "Message");

    model.EntryId = entry.DisplayId;
    model.EntryTitle = entry.Title;

    if (entryBus.ValidationErrors.Count > 0)
    {
        ErrorDisplay.AddMessages(entryBus.ValidationErrors);
        ErrorDisplay.ShowError("Please correct the following:");
    }
    else
    {
        string message = ViewRenderer.RenderView(
            "~/views/template/ContactSellerEmail.cshtml", model, ControllerContext);
        string title = entry.Title + " (" + entry.DisplayId + ") - " +
                       App.Configuration.ApplicationName;
        AppUtils.SendEmail(title, message, model.Email, entry.User.Email, false, false);
    }

    return View(model);
}

Simple! The view in this case is just a plain MVC view - here a very simple plain-text email message (edited for brevity) that is created and sent off:

@model ContactSellerViewModel
@{
    Layout = null;
}re: @Model.EntryTitle

@Model.ListingUrl

@Model.Message

** SECURITY ADVISORY - AVOID SCAMS
** Avoid: wiring money, cross-border deals, work-at-home
** Beware: cashier checks, money orders, escrow, shipping
** More Info: @(App.Configuration.ApplicationBaseUrl)scams.html

Obviously this is a very simple view (I edited out more from this page to keep it brief) - but other template views are much more complex HTML documents or long messages that are occasionally updated, and they are a perfect fit for Razor rendering. It even works with nested partial views and _layout pages.

Partial Rendering

Notice that I'm rendering a full View here. In the view I explicitly set Layout = null to avoid pulling in _layout.cshtml for this view. This can also be controlled externally by calling the RenderPartial method instead:

string message = ViewRenderer.RenderPartialView(
    "~/views/template/ContactSellerEmail.cshtml", model, ControllerContext);

With this line of code no layout page (or _viewstart) will be loaded, so the output generated is just what's in the view. I find myself using partials most of the time when rendering templates, since the target of templates usually tends to be emails or other HTML-fragment-like output, so the RenderPartialView() method is definitely useful to me.

Rendering without a ControllerContext

The preceding class is great when you need template rendering from within MVC controller actions or anywhere you have access to the request's Controller. But if you don't have a controller context handy - maybe inside a utility function that is static, a non-Web application, or an operation that runs asynchronously in ASP.NET - the above code can't be used. I haven't found a way to manually create a Controller context to provide the ViewContext() what it needs from outside of the MVC infrastructure. However, there are ways to accomplish this, though they are a bit more complex. It's possible to host the RazorEngine on your own, which side-steps all of the MVC framework and HTTP and just deals with the raw rendering engine. I wrote about this process in Hosting the Razor Engine in Non-Web Applications a long while back.
It's quite a process to create a custom Razor engine and runtime, but it allows for all sorts of flexibility. There's also a RazorEngine CodePlex project that does something similar. I've been meaning to check out the latter but haven't gotten around to it since I have my own code to do this. The trick to hosting the RazorEngine is to have it behave properly inside of an ASP.NET application and properly cache content so templates aren't constantly rebuilt and reparsed. Anyway, in the same app as above I have one scenario where no ControllerContext is available: I have a background scheduler running inside of the app that fires on timed intervals. This process could be external, but because it's lightweight we decided to fire it right inside of the ASP.NET app on a separate thread. In my app the code that renders these templates does something like this:

var model = new SearchNotificationViewModel()
{
    Entries = entries,
    Notification = notification,
    User = user
};

// TODO: Need logging for errors sending
string razorError = null;
var result = AppUtils.RenderRazorTemplate(
    "~/views/template/SearchNotificationTemplate.cshtml", model, ref razorError);

which references a couple of helper functions that set up my RazorFolderHostContainer class:

public static string RenderRazorTemplate(string virtualPath, object model,
                                         ref string errorMessage)
{
    var razor = AppUtils.CreateRazorHost();
    var path = virtualPath.Replace("~/", "").Replace("~", "").Replace("/", "\\");
    var merged = razor.RenderTemplateToString(path, model);
    if (merged == null)
        errorMessage = razor.ErrorMessage;  // error message is passed back via ref
    return merged;
}

// Cached host instance - templates are compiled once and reused
static RazorFolderHostContainer _RazorHost;

/// <summary>
/// Creates a RazorFolderHostContainer and starts it.
/// Call .Stop() when you're done with it.
///
/// This is a static instance
/// </summary>
/// <param name="binBasePath"></param>
/// <param name="forceLoad"></param>
/// <returns></returns>
public static RazorFolderHostContainer CreateRazorHost(string binBasePath = null,
                                                       bool forceLoad = false)
{
    if (binBasePath == null)
    {
        if (HttpContext.Current != null)
            binBasePath = HttpContext.Current.Server.MapPath("~/");
        else
            binBasePath = AppDomain.CurrentDomain.BaseDirectory;
    }

    if (_RazorHost == null || forceLoad)
    {
        if (!binBasePath.EndsWith("\\"))
            binBasePath += "\\";

        //var razor = new RazorStringHostContainer();
        var razor = new RazorFolderHostContainer();
        razor.TemplatePath = binBasePath;

        binBasePath += "bin\\";
        razor.BaseBinaryFolder = binBasePath;
        razor.UseAppDomain = false;
        razor.ReferencedAssemblies.Add(binBasePath + "ClassifiedsBusiness.dll");
        razor.ReferencedAssemblies.Add(binBasePath + "ClassifiedsWeb.dll");
        razor.ReferencedAssemblies.Add(binBasePath + "Westwind.Utilities.dll");
        razor.ReferencedAssemblies.Add(binBasePath + "Westwind.Web.dll");
        razor.ReferencedAssemblies.Add(binBasePath + "Westwind.Web.Mvc.dll");
        razor.ReferencedAssemblies.Add("System.Web.dll");

        razor.ReferencedNamespaces.Add("System.Web");
        razor.ReferencedNamespaces.Add("ClassifiedsBusiness");
        razor.ReferencedNamespaces.Add("ClassifiedsWeb");
        razor.ReferencedNamespaces.Add("Westwind.Web");
        razor.ReferencedNamespaces.Add("Westwind.Utilities");

        _RazorHost = razor;
        _RazorHost.Start();

        //_RazorHost.Engine.Configuration.CompileToMemory = false;
    }

    return _RazorHost;
}

The RazorFolderHostContainer essentially is a full runtime that mimics a folder structure like a typical Web app does, including caching semantics and compiling code only if code changes on disk. It maps a folder hierarchy to views using the ~/ path syntax.
The host is then configured to add assemblies and namespaces. Unfortunately the engine is not exactly like MVC's Razor - the expression expansion and code execution are the same, but some of the support methods like sections, helpers, etc. are not all there, so templates have to be a bit simpler. Other hosts are provided as well to directly execute templates from strings (using RazorStringHostContainer). The following is an example of an HTML email template:

@inherits RazorHosting.RazorTemplateFolderHost<ClassifiedsWeb.SearchNotificationViewModel>
<html>
<head>
    <title>Search Notifications</title>
    <style>
        body { margin: 5px; font-family: Verdana, Arial; font-size: 10pt; }
        h3 { color: SteelBlue; }
        .entry-item { border-bottom: 1px solid grey; padding: 8px; margin-bottom: 5px; }
    </style>
</head>
<body>
    Hello @Model.User.Name,<br />
    <p>Below are your Search Results for the search phrase:</p>
    <h3>@Model.Notification.SearchPhrase</h3>
    <small>since @TimeUtils.ShortDateString(Model.Notification.LastSearch)</small>
    <hr />

You can see that the syntax is a little different. Instead of the familiar @model header, the raw Razor @inherits tag is used to specify the template base class (which you can extend). I took a quick look through the feature set of RazorEngine on CodePlex (now GitHub, I guess) and the template implementation they use is closer to MVC's Razor, but there are other differences. In the end, don't expect exact behavior like MVC templates if you use an external Razor rendering engine. This is not what I would consider an ideal solution, but it works well enough for this project. My biggest concern is the overhead of hosting a second Razor engine in a Web app, and the fact that the differences in template rendering between 'real' MVC Razor views and another RazorEngine really are noticeable.

You win some, you lose some

It's extremely nice to see that if you have a ControllerContext handy (which probably addresses 99% of Web app scenarios) rendering a view to string using the native MVC Razor engine is pretty simple. Kudos on making that happen - as it solves a problem I see in just about every Web application I work on. But it is a bummer that a ControllerContext is required to make this simple code work. It'd be really sweet if there was a way to render views without being so closely coupled to the ASP.NET or MVC infrastructure that requires a ControllerContext. Alternately it'd be nice to have a way for an MVC based application to create a minimal ControllerContext from scratch - maybe somebody's been down that path. I tried for a few hours to come up with a way to make that work but gave up in the soup of nested contexts (MVC/Controller/View/Http). I suspect going down this path would be similar to hosting the ASP.NET runtime requiring a WorkerRequest. Brrr…. The sad part is that it seems to me that a View should really not require much 'context' of any kind to render output to string. Yes, there are a few things that clearly are required, like the virtual path and possibly the disk path to the root of the app, but beyond that view rendering should not require much. But, no such luck.
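For what it's worth, one pattern that gets suggested in community discussions - shown here only as an unverified sketch, and subject to exactly the nested-context pitfalls described above - is to manufacture a minimal ControllerContext from HttpContext.Current with a throwaway controller:

// Unverified sketch: assumes an active ASP.NET request (HttpContext.Current != null),
// so it still won't help outside of a Web request. EmptyController is a
// hypothetical do-nothing controller that exists only to satisfy MVC's
// requirement for a ControllerBase instance.
public class EmptyController : Controller { }

public static ControllerContext CreateMinimalControllerContext()
{
    var httpContext = new HttpContextWrapper(HttpContext.Current);
    var routeData = new RouteData();
    routeData.Values["controller"] = "Empty"; // the view engine needs a controller name
    var requestContext = new RequestContext(httpContext, routeData);
    return new ControllerContext(requestContext, new EmptyController());
}

Whether this holds up for full view rendering with layouts, _ViewStart, and TempData is another matter - treat it as a starting point for experimentation, not a proven solution.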
For now custom RazorHosting seems to be the only way to make Razor rendering go outside of the MVC context…

Resources
- Full ViewRenderer.cs source code from Westwind.Web.Mvc library
- Hosting the Razor Engine for Non-Web Applications
- RazorEngine on GitHub

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in ASP.NET  MVC

    Read the article

  • Windows Azure: Announcing Support for Windows Server 2012 R2 + Some Nice Price Cuts

    - by ScottGu
Today we released some great updates to Windows Azure:
- Virtual Machines: Support for Windows Server 2012 R2
- Cloud Services: Support for Windows Server 2012 R2 and .NET 4.5.1
- Windows Azure Pack: Use Windows Azure features on-premises using Windows Server 2012 R2
- Price Cuts: Up to 22% Price Reduction on Memory-Intensive Instances

Below are more details about each of the improvements:

Virtual Machines: Support for Windows Server 2012 R2

This morning we announced the release of Windows Server 2012 R2 – which is a fantastic update to Windows Server and includes a ton of great enhancements. This morning we are also excited to announce that the general availability image of Windows Server 2012 R2 is now supported on Windows Azure.  Windows Azure is the first cloud provider to offer the final release of Windows Server 2012 R2, and it is incredibly easy to launch your own Windows Server 2012 R2 instance with it. To create a new Windows Server 2012 R2 instance simply choose New->Compute->Virtual Machine within the Windows Azure Management Portal.  You can select the “Windows Server 2012 R2” image and create a new Virtual Machine using the “Quick Create” option: Or alternatively click the “From Gallery” option if you want to customize even more configuration options (endpoints, remote powershell, availability set, etc): Creating and instantiating a new Virtual Machine on Windows Azure is very fast.  In fact, the Windows Server 2012 R2 image now deploys and runs 30% faster than previous versions of Windows Server. Once the VM is deployed you can drill into it to track its health and manage its settings: Clicking the “Connect” button allows you to remote desktop into the VM – at which point you can customize and manage it as a full administrator however you want: If you haven’t tried Windows Server 2012 R2 yet – give it a try with Windows Azure.  There is no easier way to get an instance of it up and running!

Cloud Services: Support for using Windows Server 2012 R2 with Web and Worker Roles

Today’s Windows Azure release also allows you to use Windows Server 2012 R2 and .NET 4.5.1 within Web and Worker Roles within Cloud Service based applications.  Enabling this is easy.  You can configure an existing Cloud Service application to use Windows Server 2012 R2 by updating your Cloud Service Configuration File (.cscfg) to use the new “OS Family 4” setting (the osFamily attribute on the ServiceConfiguration element): Or alternatively you can use the Windows Azure Management Portal to update cloud services that are already deployed on Windows Azure.  Simply choose the configure tab on them and select Windows Server 2012 R2 in the Operating System Family dropdown: The approaches above enable you to immediately take advantage of Windows Server 2012 R2 and .NET 4.5.1 and all the great features they provide.

Windows Azure Pack: Use Windows Azure features on Windows Server 2012 R2

Today we also made generally available the Windows Azure Pack, which is a free download that enables you to run Windows Azure Technology within your own datacenter, an on-premises private cloud environment, or with one of our service provider/hosting partners who run Windows Server. Windows Azure Pack enables you to use a management portal that has the exact same UI as the Windows Azure Management Portal, and within which you can create and manage Virtual Machines, Web Sites, and Service Bus – all of which can run on Windows Server and System Center.
The services provided with the Windows Azure Pack are consistent with the services offered within our Windows Azure public cloud offering.  This consistency enables organizations and developers to build applications and solutions that can run in any hosting environment – and which use the same development and management approach.  The end result is an offering with incredible flexibility. You can learn more about Windows Azure Pack and download/deploy it today here. Price Cuts: Up to 22% Reduction on Memory Intensive Instances Today we are also reducing prices by up to 22% on our memory-intensive VM instances (specifically our A5, A6, and A7 instances).  These price reductions apply to both Windows and Linux VM instances, as well as for Cloud Service based applications: These price reductions will take effect in November, and will enable you to run applications that demand larger memory (such as SharePoint, Databases, in-memory analytics, etc) even more cost effectively. Summary Today’s release enables you to start using Windows Server 2012 R2 within Windows Azure immediately, and take advantage of our Cloud OS vision both within our datacenters – and using the Windows Azure Pack within both your existing datacenters and those of our partners. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Install the Ajax Control Toolkit from NuGet

    - by Stephen Walther
    The Ajax Control Toolkit is now available from NuGet. This makes it super easy to add the latest version of the Ajax Control Toolkit to any Web Forms application. If you haven’t used NuGet yet, then you are missing out on a great tool which you can use with Visual Studio to add new features to an application. You can use NuGet with both ASP.NET MVC and ASP.NET Web Forms applications. NuGet is compatible with both Websites and Web Applications and it works with both C# and VB.NET applications. For example, I habitually use NuGet to add the latest version of ELMAH, Entity Framework, jQuery, jQuery UI, and jQuery Templates to applications that I create. To download NuGet, visit the NuGet website at: http://NuGet.org Imagine, for example, that you want to take advantage of the Ajax Control Toolkit RoundedCorners extender to create cross-browser compatible rounded corners in a Web Forms application. Follow these steps. Right click on your project in the Solution Explorer window and select the option Add Library Package Reference. In the Add Library Package Reference dialog, select the Online tab and enter AjaxControlToolkit in the search box: Click the Install button and the latest version of the Ajax Control Toolkit will be installed. Installing the Ajax Control Toolkit makes several modifications to your application. First, a reference to the Ajax Control Toolkit is added to your application. In a Web Application Project, you can see the new reference in the References folder: Installing the Ajax Control Toolkit NuGet package also updates your Web.config file. The tag prefix ajaxToolkit is registered so that you can easily use Ajax Control Toolkit controls within any page without adding a @Register directive to the page. <configuration> <system.web> <compilation debug="true" targetFramework="4.0" /> <pages> <controls> <add tagPrefix="ajaxToolkit" assembly="AjaxControlToolkit" namespace="AjaxControlToolkit" /> </controls> </pages> </system.web> </configuration> You should do a rebuild of your application by selecting the Visual Studio menu option Build, Rebuild Solution so that Visual Studio picks up on the new controls (You won’t get Intellisense for the Ajax Control Toolkit controls until you do a build). After you add the Ajax Control Toolkit to your application, you can start using any of the 40 Ajax Control Toolkit controls in your application (see http://www.asp.net/ajax/ajaxcontroltoolkit/samples/ for a reference for the controls). <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="WebForm1.aspx.cs" Inherits="WebApplication1.WebForm1" %> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head runat="server"> <title>Rounded Corners</title> <style type="text/css"> #pnl1 { background-color: gray; width: 200px; color:White; font: 14pt Verdana; } #pnl1_contents { padding: 10px; } </style> </head> <body> <form id="form1" runat="server"> <div> <asp:Panel ID="pnl1" runat="server"> <div id="pnl1_contents"> I have rounded corners! </div> </asp:Panel> <ajaxToolkit:ToolkitScriptManager ID="sm1" runat="server" /> <ajaxToolkit:RoundedCornersExtender TargetControlID="pnl1" runat="server" /> </div> </form> </body> </html> The page contains the following three controls: Panel – The Panel control named pnl1 contains the content which appears with rounded corners. ToolkitScriptManager – Every page which uses the Ajax Control Toolkit must contain a single ToolkitScriptManager. 
The ToolkitScriptManager loads all of the JavaScript files used by the Ajax Control Toolkit. RoundedCornersExtender – This Ajax Control Toolkit extender targets the Panel control. It makes the Panel control appear with rounded corners. You can control the “roundiness” of the corners by modifying the Radius property. Notice that you get Intellisense when typing the Ajax Control Toolkit tags. As soon as you type <ajaxToolkit, all of the available Ajax Control Toolkit controls appear: When you open the page in a browser, the contents of the Panel appear with rounded corners. The advantage of using the RoundedCorners extender is that it is cross-browser compatible. It works great with Internet Explorer, Opera, Firefox, Chrome, and Safari even though different browsers implement rounded corners in different ways. The RoundedCorners extender even works with an ancient browser such as Internet Explorer 6.

Getting the Latest Version of the Ajax Control Toolkit

The Ajax Control Toolkit continues to evolve at a rapid pace. We are hard at work at fixing bugs and adding new features to the project. We plan to have a new release of the Ajax Control Toolkit each month. The easiest way to get the latest version of the Ajax Control Toolkit is to use NuGet. You can open the NuGet Add Library Package Reference dialog at any time to update the Ajax Control Toolkit to the latest version.
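If you prefer typing over dialogs, the same install and update can also be done from the NuGet Package Manager Console (Tools > Library Package Manager > Package Manager Console in Visual Studio):

Install-Package AjaxControlToolkit
Update-Package AjaxControlToolkit

And as a small illustration of the Radius property mentioned above - the value here is just an example, not a recommended setting:

<ajaxToolkit:RoundedCornersExtender TargetControlID="pnl1" Radius="8" runat="server" />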

    Read the article

  • 12.04 LTS: unity --reset hangs

    - by Gregory R. Pace
Nearly every time I reboot my machine, the system panel and integrated app menus fail to load. At a terminal, when issuing 'unity --reset', I get the following errors:

... Initializing widget options...done
Initializing winrules options...done
Initializing wobbly options...done
ERROR 2012-11-05 04:36:48 unity.glib-gobject <unknown>:0 g_object_unref: assertion `G_IS_OBJECT (object)' failed
ERROR 2012-11-05 04:36:48 unity.gtk <unknown>:0 gtk_window_resize: assertion `width > 0' failed
WARN 2012-11-05 04:37:14 unity <unknown>:0 Unable to fetch children: No such interface `org.ayatana.bamf.view' on object at path /org/ayatana/bamf/application885622223
ERROR 2012-11-05 04:37:21 unity.glib-gobject <unknown>:0 g_object_set_qdata: assertion `G_IS_OBJECT (object)' failed
Setting Update "main_menu_key"
Setting Update "run_key"
WARN 2012-11-05 04:38:06 unity.iconloader IconLoader.cpp:438 Unable to load icon stock-person at size 24
WARN 2012-11-05 04:38:26 unity.glib.dbusproxy GLibDBusProxy.cpp:182 Unable to connect to proxy: Error calling StartServiceByName for com.canonical.Unity.Lens.Applications: Timeout was reached
WARN 2012-11-05 04:38:26 unity.glib.dbusproxy GLibDBusProxy.cpp:182 Unable to connect to proxy: Error calling StartServiceByName for com.canonical.Unity.Lens.Applications: Timeout was reached

The procedure hangs at this point. Any ideas on how to solve these problems? Thanks in advance.

    Read the article

  • New MySQL Cluster 7.3 Previews: Foreign Keys, NoSQL Node.js API and Auto-Tuned Clusters

    - by Mat Keep
At this week's MySQL Connect conference, Oracle previewed an exciting new wave of developments for MySQL Cluster, further extending its simplicity and flexibility by expanding the range of use-cases, adding new NoSQL options, and automating configuration. What’s new:
- Development Release 1: MySQL Cluster 7.3 with Foreign Keys
- Early Access “Labs” Preview: MySQL Cluster NoSQL API for Node.js
- Early Access “Labs” Preview: MySQL Cluster GUI-Based Auto-Installer

In this blog, I'll introduce you to the features being previewed. Review the blogs listed below for more detail on each of the specific features discussed. Save the date!: A live webinar is scheduled for Thursday 25th October at 0900 Pacific Time / 1600UTC where we will discuss each of these enhancements in more detail. Registration will be open soon and published to the MySQL webinars page.

MySQL Cluster 7.3: Development Release 1

The first MySQL Cluster 7.3 Development Milestone Release (DMR) previews Foreign Keys, bringing powerful new functionality to MySQL Cluster while eliminating development complexity. Foreign Key support has been one of the most requested enhancements to MySQL Cluster – enabling users to simplify their data models and application logic – while extending the range of use-cases for both custom projects requiring referential integrity and packaged applications, such as eCommerce, CRM, CMS, etc.

Implementation

The Foreign Key functionality is implemented directly within the MySQL Cluster data nodes, allowing any client API accessing the cluster to benefit from it – whether SQL or one of the NoSQL interfaces (Memcached, C++, Java, JPA, HTTP/REST or the new Node.js API - discussed later.) The core referential actions defined in the SQL:2003 standard are implemented: CASCADE, RESTRICT, NO ACTION, SET NULL. In addition, the MySQL Cluster implementation supports the online adding and dropping of Foreign Keys, ensuring the Cluster continues to serve both read and write requests during the operation. This represents a further enhancement to MySQL Cluster's support for online schema changes, i.e. adding and dropping indexes, adding columns, etc. Read this blog for a demonstration of using Foreign Keys with MySQL Cluster (a short illustrative SQL sketch also appears at the end of this post).

Getting Started with MySQL Cluster 7.3 DMR1: Users can download either the source or binary and evaluate the MySQL Cluster 7.3 DMR with Foreign Keys now! (Select the Development Release tab).

MySQL Cluster NoSQL API for Node.js

Node.js is hot! In a little over 3 years, it has become one of the most popular environments for developing next generation web, cloud, mobile and social applications. Bringing JavaScript from the browser to the server, the design goal of Node.js is to build new real-time applications supporting millions of client connections, serviced by a single CPU core. Making it simple to further extend the flexibility and power of Node.js to the database layer, we are previewing the Node.js JavaScript API for MySQL Cluster as an Early Access release, available for download now from http://labs.mysql.com/. Select the following build: MySQL-Cluster-NoSQL-Connector-for-Node-js. Alternatively, you can clone the project at the MySQL GitHub page.  Implemented as a module for the V8 engine, the new API provides Node.js with a native, asynchronous JavaScript interface that can be used to both query and receive results sets directly from MySQL Cluster, without transformations to SQL.
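To give a rough feel for the programming model, here is a usage sketch. This is illustrative only - the module name and call signatures are assumptions based on the Early Access announcement and its tutorial, and may differ in the actual build:

// Illustrative sketch only - names and signatures are assumptions,
// not a verified reference for the Early Access connector.
var nosql = require('mysql-js');

// Connect natively to the data nodes, bypassing the SQL layer
var dbProperties = nosql.ConnectionProperties('ndb');

nosql.openSession(dbProperties, null, function(err, session) {
  if (err) throw err;
  // Look up a row by primary key - no SQL involved
  session.find('towns', 'Maidenhead', function(err, town) {
    if (err) throw err;
    console.log(town);
    session.close();
  });
});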
Figure 1: MySQL Cluster NoSQL API for Node.js enables end-to-end JavaScript development.

Rather than just presenting a simple interface to the database, the Node.js module integrates the MySQL Cluster native API library directly within the web application itself, enabling developers to seamlessly couple their high performance, distributed applications with a high performance, distributed, persistence layer delivering 99.999% availability. The new Node.js API joins a rich array of NoSQL interfaces available for MySQL Cluster. Whichever API is chosen for an application, SQL and NoSQL can be used concurrently across the same data set, providing the ultimate in developer flexibility.  Get started with the MySQL Cluster NoSQL API for Node.js tutorial.

MySQL Cluster GUI-Based Auto-Installer

Compatible with both MySQL Cluster 7.2 and 7.3, the Auto-Installer makes it simple for DevOps teams to quickly configure and provision highly optimized MySQL Cluster deployments – whether on-premise or in the cloud. Implemented with a standard HTML GUI and Python-based web server back-end, the Auto-Installer intelligently configures MySQL Cluster based on application requirements and auto-discovered hardware resources.

Figure 2: Automated Tuning and Configuration of MySQL Cluster

Developed by the same engineering team responsible for the MySQL Cluster database, the installer provides standardized configurations that make it simple, quick and easy to build stable and high performance clustered environments. The auto-installer is previewed as an Early Access release, available for download now from http://labs.mysql.com/, by selecting the MySQL-Cluster-Auto-Installer build. You can read more about getting started with the MySQL Cluster auto-installer here. Watch the YouTube video for a demonstration of using the MySQL Cluster auto-installer.

Getting Started with MySQL Cluster

If you are new to MySQL Cluster, the Getting Started guide will walk you through installing an evaluation cluster on a single host (these guides reflect MySQL Cluster 7.2, but apply equally well to 7.3 and the Early Access previews). Or use the new MySQL Cluster Auto-Installer! Download the Guide to Scaling Web Databases with MySQL Cluster (to learn more about its architecture, design and ideal use-cases). Post any questions to the MySQL Cluster forum where our Engineering team and the MySQL Cluster community will attempt to assist you. Post any bugs you find to the MySQL bug tracking system (select MySQL Cluster from the Category drop-down menu). And if you have any feedback, please post it to the Comments section here or in the blogs referenced in this article.

Summary

MySQL Cluster 7.2 is the GA, production-ready release of MySQL Cluster. The first Development Release of MySQL Cluster 7.3 and the Early Access previews give you the opportunity to preview and evaluate future developments in the MySQL Cluster database, and we are very excited to be able to share that with you. Let us know how you get along with MySQL Cluster 7.3, and other features that you want to see in future releases, by using the comments of this blog.
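As promised above, here is a short illustrative sketch of what Foreign Keys on NDB tables look like - the table and column names are made up for the example:

-- Illustrative only: a parent/child pair of NDB tables with a
-- cascading foreign key, as previewed in MySQL Cluster 7.3
CREATE TABLE customers (
  id INT NOT NULL PRIMARY KEY,
  name VARCHAR(100)
) ENGINE=ndbcluster;

CREATE TABLE orders (
  id INT NOT NULL PRIMARY KEY,
  customer_id INT NOT NULL,
  CONSTRAINT fk_customer FOREIGN KEY (customer_id)
    REFERENCES customers (id)
    ON DELETE CASCADE
) ENGINE=ndbcluster;

-- Deleting a customer now cascades to that customer's orders
DELETE FROM customers WHERE id = 42;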

    Read the article

  • UCM 11g is 4 days old!

    - by kyle.hatlestad
Ok...so I missed posting a blog entry when UCM 11g and the entire ECM suite released on Tuesday. Hopefully you've already seen the announcements on any number of the Oracle ECM blogs out there such as ECM Alerts, Fusion ECM, bex huff, or C4. So I won't bore you with the same talking points like 179 million check-ins per day or 124 web site page hits per second. Instead, I thought I'd show some screenshots of the new features in UCM and URM 11g.

WebLogic Server and Enterprise Manager

So probably the biggest change in 11g is that UCM and URM now run on top of the WebLogic Server application server. This is a huge step, as ECM is now on a standard platform with the rest of Oracle Fusion Middleware, which makes installation, configuration, and integration consistent among all the products. From a feature perspective, it's also beneficial because it's now integrated with Oracle Enterprise Manager. Enterprise Manager provides a lot of provisioning control over servers as well as performance monitoring and access to logs and debugging information.

Desktop Integration Suite

Desktop Integration Suite got a complete overhaul for 11g. It exposes a lot more features within Windows Explorer such as saved searches, workflow queue, and checked-out items. It also now supports metadata pop-up screens to let users fill in additional metadata when they drag-n-drop files in! And the integration within Office applications has changed significantly by introducing a dedicated UCM menu to do open, save, compare, etc.

Site Studio for External Applications

In UCM Site Studio 10gR4, a major architectural shift was introduced which brought several new objects such as elements, region definitions, region templates, and placeholder definitions. This truly separated the content from the display and from the definition. It also allowed separation of the content from needing to be rendered on a complete Site Studio page. Well, the new Site Studio for External Applications takes advantage of that architecture and introduces pre-built tags and plug-ins to JDeveloper that let you go from simply adding a content area to your web application page to building an entire web site, just like you would have done in Site Studio Designer. In addition to these changes, enhancements to the core Site Studio have been added as well. One of the big ones is called Designer Mode, which allows power-users to bypass the standard rules defined by the placeholder definition or template and perform any number of additional actions. This reduces the need to go back to Site Studio Designer or JDeveloper to make more advanced changes to the site.

Dashboards

As part of the updated records management functionality in both UCM and URM, users can now set a dashboard view on their home page to surface common functions in a single view. It has pre-built "portlets" users can choose from to display and organize the way they want. Behind the scenes, these dashboards are stored as Content Folios. So the dashboards themselves are content items that can be revisioned and shared between users. And new dashboard portlets can be easily added (like the User Profile one in the screenshots) by getting a copy of an existing one, modifying the display, and then checking it in as a new one to select from.

URM Interface Enhancements

URM includes several new UI and usability enhancements in 11g. There is a new view for physical records, a place to configure "favorite" items to quickly get to, and new placement of the records management menu.
BI Publisher Reports

Records management in UCM and URM now offers reports generated through embedded BI Publisher. Templates are controlled by rich text files checked directly into the repository, so they can be easily modified.

Other Features

A new Inbound Refinery conversion option is available that does native Microsoft Office HTML conversion. If your IBR is on Windows and you have the native applications loaded, the IBR can use them to produce HTML. A new GUI template editor for Dynamic Converter is available. It's written in Java, so it's available across all the supported browsers and platforms. The original ActiveX based editor is also still available. The Component Manager interface has changed to help provide an easier and more descriptive way to enable core components that are installed along with UCM. All of the supported components are immediately available to turn on and do not have to be installed separately as in previous versions. My Downloads is located in the My Content Server menu and provides for easy download of client installs including Desktop Integration Suite and Site Studio Designer. Well, hopefully that gives you a taste for some of the new things in 11g. We're all pretty excited here at Oracle about all the new changes and enhancements. Over the next few months I hope to highlight some of these features more in-depth, so keep your eye out for those posts.

    Read the article

  • New Product: Oracle Java ME Embedded 3.2 – Small, Smart, Connected

    - by terrencebarr
The Internet of Things (IoT) is coming. And, with today's launch of the Oracle Java ME Embedded 3.2 product, Java is going to play an even greater role in it.

Java in the Internet of Things

By all accounts, intelligent embedded devices are penetrating the world around us – driving industrial processes, monitoring environmental conditions, providing better health care, analyzing and processing data, and much more. And these devices are becoming increasingly connected, adding another dimension of utility. Welcome to the Internet of Things. As I blogged yesterday, this is a huge opportunity for the Java technology and ecosystem. To enable and utilize these billions of devices effectively you need a programming model, tools, and protocols which provide a feature-rich, consistent, scalable, manageable, and interoperable platform. Java technology is ideally suited to address these technical and business problems, enabling you to eliminate many of the typical challenges in designing embedded solutions. By using Java you can focus on building smarter, more valuable embedded solutions faster. To wit, Java technology is already powering around 10 billion devices worldwide. Delivering on this vision and accelerating the growth of embedded Java solutions, Oracle is today announcing a brand-new product: Oracle Java Micro Edition (ME) Embedded 3.2, accompanied by an update release of the Java ME Software Development Kit (SDK) to version 3.2.

What is Oracle Java ME Embedded 3.2?

Oracle Java ME Embedded 3.2 is a complete Java runtime client, optimized for ARM architecture connected microcontrollers and other resource-constrained systems. The product provides dedicated embedded functionality and is targeted for low-power, limited memory devices requiring support for a range of network services and I/O interfaces.

What features and APIs are provided by Oracle Java ME Embedded 3.2?

Oracle Java ME Embedded 3.2 is a Java ME runtime based on CLDC 1.1 (JSR-139) and IMP-NG (JSR-228). The runtime and virtual machine (VM) are highly optimized for embedded use. Also included in the product are the following optional JSRs and Oracle APIs:
- File I/O APIs (JSR-75)
- Wireless Messaging APIs (JSR-120)
- Web Services (JSR-172)
- Security and Trust Services subset (JSR-177)
- Location APIs (JSR-179)
- XML APIs (JSR-280)
- Device Access API
- Application Management System (AMS) API
- AccessPoint API
- Logging API

Additional embedded features are:
- Remote application management system
- Support for continuous 24×7 operation
- Application monitoring, auto-start, and system recovery
- Application access to peripheral interfaces such as GPIO, I2C, SPIO, memory mapped I/O
- Application level logging framework, including option for remote logging
- Headless on-device debugging – source level Java application debugging over IP Connection
- Remote configuration of the Java VM

What type of platforms are targeted by Oracle Java ME 3.2 Embedded?

The product is designed for embedded, always-on, resource-constrained, headless (no graphics/no UI), connected (wired or wireless) devices with a variety of peripheral I/O.
The high-level system requirements are as follows:
- System based on ARM architecture SOCs
- Memory footprint (approximate) from 130 KB RAM/350 KB ROM (for a minimal, customized configuration) to 700 KB RAM/1500 KB ROM (for the full, standard configuration)
- Very simple embedded kernel, or a more capable embedded OS/RTOS
- At least one type of network connection (wired or wireless)

The initial release of the product is delivered as a device emulation environment for x86/Windows desktop computers, integrated with the Java ME SDK 3.2. A standard binary of Oracle Java ME Embedded 3.2 for ARM KEIL development boards based on ARM Cortex M-3/4 (KEIL MCBSTM32F200 using ST Micro SOC STM32F207IG) will soon be available for download from the Oracle Technology Network (OTN).

What types of applications can I develop with Oracle Java ME Embedded 3.2?

The Oracle Java ME Embedded 3.2 product is a full-featured embedded Java runtime supporting applications based on the IMP-NG application model, which is derived from the well-known MIDP 2 application model. The runtime supports execution of multiple concurrent applications, remote application management, versatile connectivity, and a rich set of APIs and features relevant for embedded use cases, including the ability to interact with peripheral I/O directly from Java applications. This rich feature set, coupled with familiar and best-in-class software development tools, allows developers to quickly build and deploy sophisticated embedded solutions for a wide range of use cases. Target markets well supported by Oracle Java ME Embedded 3.2 include wireless modules for M2M, industrial and building control, smart grid infrastructure, home automation, and environmental sensors and tracking.

What tools are available for embedded application development for Oracle Java ME Embedded 3.2?

Along with the release of Oracle Java ME Embedded 3.2, Oracle is also making available an updated version of the Java ME Software Development Kit (SDK), together with plug-ins for the NetBeans and Eclipse IDEs, to deliver a complete development environment for embedded application development.

OK – sounds great! Where can I find out more? And how do I get started?

There is a complete set of information, data sheet, API documentation, “Getting Started Guide”, FAQ, and download links available:
- For an overview of Oracle Embeddable Java, see here.
- For the Oracle Java ME Embedded 3.2 press release, see here.
- For the Oracle Java ME Embedded 3.2 data sheet, see here.
- For the Oracle Java ME Embedded 3.2 landing page, see here.
- For the Oracle Java ME Embedded 3.2 documentation page, including a “Getting Started Guide” and FAQ, see here.
- For the Oracle Java ME SDK 3.2 landing and download page, see here.
- Finally, to ask more questions, please see the OTN “Java ME Embedded” forum.

To get started, grab the “Getting Started Guide” and download the Java ME SDK 3.2, which includes the Oracle Java ME Embedded 3.2 device emulation.

Can I learn more about Oracle Java ME Embedded 3.2 at JavaOne and/or Java Embedded @ JavaOne?

Glad you asked! Both conferences, JavaOne and Java Embedded @ JavaOne, will feature a host of content and information around the new Oracle Java ME Embedded 3.2 product, from technical and business sessions, to hands-on tutorials, and demos. Stay tuned, I will post details shortly.

Cheers, – Terrence

Filed under: Mobile & Embedded Tagged: "Oracle Java ME Embedded", Connected, embedded, Embedded Java, Java Embedded @ JavaOne, JavaOne, Smart

    Read the article

  • What a Performance! MySQL 5.5 and InnoDB 1.1 running on Oracle Linux

    - by zeynep.koch(at)oracle.com
The MySQL performance team in Oracle has recently completed a series of benchmarks comparing Read/Write and Read-Only performance of MySQL 5.5 with the InnoDB and MyISAM storage engines. Compared to MyISAM, InnoDB delivered 35x higher throughput on the Read/Write test and 5x higher throughput on the Read-Only test, with 90% scalability across 36 CPU cores. A full analysis of results and MySQL configuration parameters is documented in a new whitepaper. In addition to the benchmark, the new whitepaper also includes:
- A discussion of the use-cases for each storage engine
- Best practices for users considering the migration of existing applications from MyISAM to InnoDB
- A summary of the performance and scalability enhancements introduced with MySQL 5.5 and InnoDB 1.1

The benchmark itself was based on Sysbench, running on AMD Opteron "Magny-Cours" processors, and Oracle Linux with the Unbreakable Enterprise Kernel. You can learn more about MySQL 5.5 and InnoDB 1.1 from here and download it from here to test whether you witness performance gains in your real-world applications.  By Mat Keep

    Read the article

  • LIVE WEBCAST March 24 2pm PT- Why Switch from Red Hat and SUSE Linux to Oracle Linux?

    - by Zeynep Koch
Oracle has been offering affordable Linux support since 2006 and more than 6,000 customers already use it. Oracle's Unbreakable Linux support program draws on the expertise of a world-class support organization that understands how to diagnose and solve Linux issues integrated with the applications being deployed on it. Find out how you can save 50-90% on your support costs. Join Oracle's Monica Kumar, Sr. Director of Linux, Oracle VM and MySQL, and Avi Miller, Principal Sales Consultant, Linux and Virtualization, on Thursday, March 24, 2pm PT to hear:
- The "Why and how" of switching to Oracle Linux
- Testing and integration with systems and applications
- Free management and high availability tools
- Real life customer scenarios

If you are going to get free access to the most advanced Linux operating system, along with world-class support at a fraction of the cost, better testing and integration with your server and applications, why wouldn't you do it? Register Now

    Read the article
