Search Results

Search found 29959 results on 1199 pages for 'enterprise development'.


  • Advice on developing a social network [on hold]

    - by Siraj Mansour
    I am researching how to assemble a team, choose the right tools, and estimate the cost of developing a highly responsive social network capable of handling a large number of users. The concept is similar to Facebook, but limited to a basic feature set for now: profiles, friends, posts, updates, media upload/download, streaming, chat, and inbox messaging. We certainly do not expect it to be as popular as Facebook or to handle the same volume of users and requests, but within its own niche it has to be a monster, and expandable later on. Setting aside the hosting and server side, I am looking for technical advice and opinions: What kind of team do I need, how many developers, and with what expertise? What are the right tools, languages, frameworks, and environments? Any ideas about the infrastructure, or quick thoughts on the development process? Please cite references if you have any to support your ideas. A rough estimate of the development cost, excluding the cost of servers, would also help. I know my question is broad, but my knowledge is very limited and I need detailed help; thank you in advance for anything you can offer.

    Read the article

  • Oracle BI Server Modeling, Part 1- Designing a Query Factory

    - by bob.ertl(at)oracle.com
      Welcome to Oracle BI Development's BI Foundation blog, focused on helping you get the most value from your Oracle Business Intelligence Enterprise Edition (BI EE) platform deployments.  In my first series of posts, I plan to show developers the concepts and best practices for modeling in the Common Enterprise Information Model (CEIM), the semantic layer of Oracle BI EE.  In this segment, I will lay the groundwork for the modeling concepts.  First, I will cover the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing. Oracle BI EE Query Cycle The purpose of the Oracle BI Server is to bridge the gap between the presentation services and the data sources.  There are typically a variety of data sources in a variety of technologies: relational, normalized transaction systems; relational star-schema data warehouses and marts; multidimensional analytic cubes and financial applications; flat files, Excel files, XML files, and so on. Business datasets can reside in a single type of source, or, most of the time, are spread across various types of sources. Presentation services users are generally business people who need to be able to query that set of sources without any knowledge of technologies, schemas, or how sources are organized in their company. They think of business analysis in terms of measures with specific calculations, hierarchical dimensions for breaking those measures down, and detailed reports of the business transactions themselves.  Most of them create queries without knowing it, by picking a dashboard page and some filters.  Others create their own analysis by selecting metrics and dimensional attributes, and possibly creating additional calculations. The BI Server bridges that gap from simple business terms to technical physical queries by exposing just the business focused measures and dimensional attributes that business people can use in their analyses and dashboards.   After they make their selections and start the analysis, the BI Server plans the best way to query the data sources, writes the optimized sequence of physical queries to those sources, post-processes the results, and presents them to the client as a single result set suitable for tables, pivots and charts. The CEIM is a model that controls the processing of the BI Server.  It provides the subject areas that presentation services exposes for business users to select simplified metrics and dimensional attributes for their analysis.  It models the mappings to the physical data access, the calculations and logical transformations, and the data access security rules.  The CEIM consists of metadata stored in the repository, authored by developers using the Administration Tool client.     Presentation services and other query clients create their queries in BI EE's SQL-92 language, called Logical SQL or LSQL.  The API simply uses ODBC or JDBC to pass the query to the BI Server.  Presentation services writes the LSQL query in terms of the simplified objects presented to the users.  The BI Server creates a query plan, and rewrites the LSQL into fully-detailed SQL or other languages suitable for querying the physical sources.  For example, the LSQL on the left below was rewritten into the physical SQL for an Oracle 11g database on the right. 
Logical SQL (left):

    SELECT "D0 Time"."T02 Per Name Month" saw_0,
           "D4 Product"."P01  Product" saw_1,
           "F2 Units"."2-01  Billed Qty  (Sum All)" saw_2
    FROM "Sample Sales"
    ORDER BY saw_0, saw_1

Physical SQL (right):

    WITH SAWITH0 AS (
      select T986.Per_Name_Month as c1,
             T879.Prod_Dsc as c2,
             sum(T835.Units) as c3,
             T879.Prod_Key as c4
      from Product T879 /* A05 Product */ ,
           Time_Mth T986 /* A08 Time Mth */ ,
           FactsRev T835 /* A11 Revenue (Billed Time Join) */
      where ( T835.Prod_Key = T879.Prod_Key
          and T835.Bill_Mth = T986.Row_Wid )
      group by T879.Prod_Dsc, T879.Prod_Key, T986.Per_Name_Month
    )
    select SAWITH0.c1 as c1, SAWITH0.c2 as c2, SAWITH0.c3 as c3
    from SAWITH0
    order by c1, c2

Probably everybody reading this blog can write SQL or MDX. However, the trick in designing the CEIM is that you are modeling a query-generation factory. Rather than hand-crafting individual queries, you model behavior and relationships, thus configuring the BI Server machinery to manufacture millions of different queries in response to random user requests. This mass production requires a different mindset and approach than when you are designing individual SQL statements in tools such as Oracle SQL Developer, Oracle Hyperion Interactive Reporting (formerly Brio), or Oracle BI Publisher.

The Structure of the Common Enterprise Information Model (CEIM)

The CEIM has a unique structure specifically for modeling the relationships and behaviors that fill the gap from logical user requests to physical data source queries and back to the result. The model divides the functionality into three specialized layers, called Presentation, Business Model and Mapping, and Physical, as shown below. Presentation services clients can generally only see the presentation layer, and the objects in the presentation layer are normally the only ones used in the LSQL request. When a request comes into the BI Server from presentation services or another client, the relationships and objects in the model allow the BI Server to select the appropriate data sources, create a query plan, and generate the physical queries. That's the left-to-right flow in the diagram below. When the results come back from the data source queries, the right-to-left relationships in the model show how to transform the results and perform any final calculations and functions that could not be pushed down to the databases.

Business Model

Think of the business model as the heart of the CEIM you are designing. This is where you define the analytic behavior seen by the users, and the superset library of metric and dimension objects available to the user community as a whole. It also provides the baseline business-friendly names and user-readable dictionary. For these reasons, it is often called the "logical" model--it is a virtual database schema that persists no data, but can be queried as if it is a database. The business model always has a dimensional shape (more on this in future posts), and its simple shape and terminology hides the complexity of the source data models. Besides hiding complexity and normalizing terminology, this layer adds most of the analytic value, as well. This is where you define the rich, dimensional behavior of the metrics and complex business calculations, as well as the conformed dimensions and hierarchies.
It contributes to the ease of use for business users, since the dimensional metric definitions apply in any context of filters and drill-downs, and the conformed dimensions enable dashboard-wide filters and guided analysis links that bring context along from one page to the next.  The conformed dimensions also provide a key to hiding the complexity of many sources, including federation of different databases, behind the simple business model. Note that the expression language in this layer is LSQL, so that any expression can be rewritten into any data source's query language at run time.  This is important for federation, where a given logical object can map to several different physical objects in different databases.  It is also important to portability of the CEIM to different database brands, which is a key requirement for Oracle's BI Applications products. Your requirements process with your user community will mostly affect the business model.  This is where you will define most of the things they specifically ask for, such as metric definitions.  For this reason, many of the best-practice methodologies of our consulting partners start with the high-level definition of this layer. Physical Model The physical model connects the business model that meets your users' requirements to the reality of the data sources you have available. In the query factory analogy, think of the physical layer as the bill of materials for generating physical queries.  Every schema, table, column, join, cube, hierarchy, etc., that will appear in any physical query manufactured at run time must be modeled here at design time. Each physical data source will have its own physical model, or "database" object in the CEIM.  The shape of each physical model matches the shape of its physical source.  In other words, if the source is normalized relational, the physical model will mimic that normalized shape.  If it is a hypercube, the physical model will have a hypercube shape.  If it is a flat file, it will have a denormalized tabular shape. To aid in query optimization, the physical layer also tracks the specifics of the database brand and release.  This allows the BI Server to make the most of each physical source's distinct capabilities, writing queries in its syntax, and using its specific functions. This allows the BI Server to push processing work as deep as possible into the physical source, which minimizes data movement and takes full advantage of the database's own optimizer.  For most data sources, native APIs are used to further optimize performance and functionality. The value of having a distinct separation between the logical (business) and physical models is encapsulation of the physical characteristics.  This encapsulation is another enabler of packaged BI applications and federation.  It is also key to hiding the complex shapes and relationships in the physical sources from the end users.  Consider a routine drill-down in the business model: physically, it can require a drill-through where the first query is MDX to a multidimensional cube, followed by the drill-down query in SQL to a normalized relational database.  The only difference from the user's point of view is that the 2nd query added a more detailed dimension level column - everything else was the same. Mappings Within the Business Model and Mapping Layer, the mappings provide the binding from each logical column and join in the dimensional business model, to each of the objects that can provide its data in the physical layer.  
When there is more than one option for a physical source, rules in the mappings are applied to the query context to determine which of the data sources should be hit, and how to combine their results if more than one is used.  These rules specify aggregate navigation, vertical partitioning (fragmentation), and horizontal partitioning, any of which can be federated across multiple, heterogeneous sources.  These mappings are usually the most sophisticated part of the CEIM. Presentation You might think of the presentation layer as a set of very simple relational-like views into the business model.  Over ODBC/JDBC, they present a relational catalog consisting of databases, tables and columns.  For business users, presentation services interprets these as subject areas, folders and columns, respectively.  (Note that in 10g, subject areas were called presentation catalogs in the CEIM.  In this blog, I will stick to 11g terminology.)  Generally speaking, presentation services and other clients can query only these objects (there are exceptions for certain clients such as BI Publisher and Essbase Studio). The purpose of the presentation layer is to specialize the business model for different categories of users.  Based on a user's role, they will be restricted to specific subject areas, tables and columns for security.  The breakdown of the model into multiple subject areas organizes the content for users, and subjects superfluous to a particular business role can be hidden from that set of users.  Customized names and descriptions can be used to override the business model names for a specific audience.  Variables in the object names can be used for localization. For these reasons, you are better off thinking of the tables in the presentation layer as folders than as strict relational tables.  The real semantics of tables and how they function is in the business model, and any grouping of columns can be included in any table in the presentation layer.  In 11g, an LSQL query can also span multiple presentation subject areas, as long as they map to the same business model. Other Model Objects There are some objects that apply to multiple layers.  These include security-related objects, such as application roles, users, data filters, and query limits (governors).  There are also variables you can use in parameters and expressions, and initialization blocks for loading their initial values on a static or user session basis.  Finally, there are Multi-User Development (MUD) projects for developers to check out units of work, and objects for the marketing feature used by our packaged customer relationship management (CRM) software.   The Query Factory At this point, you should have a grasp on the query factory concept.  When developing the CEIM model, you are configuring the BI Server to automatically manufacture millions of queries in response to random user requests. You do this by defining the analytic behavior in the business model, mapping that to the physical data sources, and exposing it through the presentation layer's role-based subject areas. While configuring mass production requires a different mindset than when you hand-craft individual SQL or MDX statements, it builds on the modeling and query concepts you already understand. The following posts in this series will walk through the CEIM modeling concepts and best practices in detail.  
We will initially review dimensional concepts so you can understand the business model, and then present a pattern-based approach to learning the mappings from a variety of physical schema shapes and deployments to the dimensional model.  Along the way, we will also present the dimensional calculation template, and learn how to configure the many additivity patterns.
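To make the query cycle at the top of this post concrete, here is a minimal sketch (not from the original post) of a client submitting the Logical SQL example over ODBC; the DSN, credentials, and driver setup are placeholders that vary by deployment, and the BI Server takes care of the physical rewrites:

    import pyodbc  # any ODBC-capable client plays the same role as presentation services

    # Connect to the BI Server through an ODBC DSN configured for Oracle BI EE
    # (the DSN name and credentials below are placeholders).
    conn = pyodbc.connect("DSN=AnalyticsWeb;UID=weblogic;PWD=secret")
    cursor = conn.cursor()

    # The query is Logical SQL: it references only presentation-layer objects.
    # The BI Server plans the query and rewrites it into physical SQL or MDX.
    cursor.execute(
        'SELECT "D0 Time"."T02 Per Name Month" saw_0, '
        '"D4 Product"."P01  Product" saw_1, '
        '"F2 Units"."2-01  Billed Qty  (Sum All)" saw_2 '
        'FROM "Sample Sales" ORDER BY saw_0, saw_1'
    )

    for month, product, billed_qty in cursor.fetchall():
        print(month, product, billed_qty)

    conn.close()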

    Read the article

  • help on developing enterprise level software solutions

    - by wefwgeweg
    There is a specific niche that I would like to target by providing a complete enterprise-level software solution. The problem is: where do I begin? I come from writing just desktop software in VB/ASP.NET/PHP/MySQL, and suddenly unfamiliar terms pop up like Oracle, SAP Business Information Warehouse, and J2EE. Obviously, something is pointing towards Java. Is it common for software suites or solutions to be developed 100% on Java technology and standards? Are there any other platforms to build enterprise-level software on? I still lack an understanding of what exactly "enterprise level" means. What is a sufficient condition for software that sells for $199 to suddenly cost $19,999 as an "enterprise" package? I don't understand why there is such a huge discrepancy between the "standard" and "enterprise" versions of software. Is it just an attempt to bag large corporations on a spending spree? So why does one choose to develop so-called "enterprise" software? Is it because of the large, inflated price tag you can justify? I would also like some entrepreneurial resources on starting your own enterprise software company in a niche. Thank you for reading; I am still trying to find the right questions.

    Read the article

  • Development on Windows 7; Web server on Linux - How to share Apache web root?

    - by TheKeys
    I've got a LAMP server that I want to use as a local web server, and a Windows 7 machine that I want to use as my development machine. The machines will be on the same LAN (or the Windows box will be VPNed into the LAN). My question is: what is the best way of sharing the web root of the LAMP server so that I can edit the files from the remote Windows 7 machine, and how do I go about configuring this on the Linux machine (Fedora 16)? I would like the solution to be as easy to use as possible, preferably with no extra steps required to save/edit/upload files from my IDE on the Windows 7 machine. I'm thinking either a Samba or an NFS share is the way to go, but I'm concerned I'm going to run into issues with permissions and Unix/Windows file handling. Is one better than the other for my use case, or is there a better alternative solution? I'm currently using Windows 7 Professional, which doesn't have NFS support, but I would upgrade to Ultimate, which does have NFS support, if that's the best solution.
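    A minimal sketch of the two approaches being weighed, assuming the web root is /var/www/html (the share name, user account, network range, and drive letter below are placeholders, and permissions still have to line up with the Apache user):

        # Samba: export the web root from the Fedora box (/etc/samba/smb.conf)
        [webroot]
            path = /var/www/html
            valid users = devuser
            read only = no
            create mask = 0664
            directory mask = 0775

        # Map it from Windows 7:
        #   net use W: \\fedora-box\webroot /user:devuser

        # NFS alternative: export the web root (/etc/exports) ...
        /var/www/html  192.168.1.0/24(rw,sync,no_subtree_check)

        # ... and mount it from Windows (requires the NFS client feature, e.g. in Ultimate):
        #   mount \\fedora-box\var\www\html W:

    Whichever share type is used, the files need group write access for the editing user (or a UID/GID mapping that matches the Apache user) so that saves from the Windows side stick without breaking the web server's permissions.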

    Read the article

  • What a Performance! MySQL 5.5 and InnoDB 1.1 running on Oracle Linux

    - by zeynep.koch(at)oracle.com
    The MySQL performance team in Oracle has recently completed a series of benchmarks comparing Read/Write and Read-Only performance of MySQL 5.5 with the InnoDB and MyISAM storage engines. Compared to MyISAM, InnoDB delivered 35x higher throughput on the Read/Write test and 5x higher throughput on the Read-Only test, with 90% scalability across 36 CPU cores. A full analysis of the results and the MySQL configuration parameters is documented in a new whitepaper. In addition to the benchmark, the whitepaper also includes:
    - A discussion of the use cases for each storage engine
    - Best practices for users considering the migration of existing applications from MyISAM to InnoDB
    - A summary of the performance and scalability enhancements introduced with MySQL 5.5 and InnoDB 1.1
    The benchmark itself was based on Sysbench, running on AMD Opteron "Magny-Cours" processors and Oracle Linux with the Unbreakable Enterprise Kernel. You can learn more about MySQL 5.5 and InnoDB 1.1 here and download it from here to test whether you see similar performance gains in your real-world applications. By Mat Keep
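    As a rough illustration of the MyISAM-to-InnoDB migration covered by the whitepaper's best practices (the schema and table names below are placeholders), candidate tables can be listed and converted with standard MySQL statements:

        SELECT table_name, engine
          FROM information_schema.tables
         WHERE table_schema = 'mydb'
           AND engine = 'MyISAM';

        -- Convert one table at a time; the ALTER rebuilds it under InnoDB.
        ALTER TABLE mydb.orders ENGINE = InnoDB;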

    Read the article

  • Future of Programmers [closed]

    - by Brian Paul
    Possible Duplicate: Will programmers be around in a few years? I have a passion for web development, but lately I have been wondering about the future of web programming, and of programming in general. An example to illustrate: most companies are now willing to spend more money implementing enterprise-level products from big vendors than on hiring a programmer, because over the long term, instead of paying that programmer and being tied to his ideas and skills, it seems better to buy a product with guaranteed high-level functionality and support. So what will the future hold for programmers?

    Read the article

  • Is it my responsibility to code for errors on a completely separate website and domain when redirecting or doing a single sign on?

    - by kappasims
    If my application is responsible for redirecting/doing a single sign on to a destination managed by a third party, in general, where should I draw the line for error handling during this process? If an error happens on the other application's end, is it reasonable for my stakeholder to expect the application I am working with to share responsibility for handling these scenarios? Notes: I am going to keep solutions limited to those that entail only one request--I am familiar with the "do an xmlhttprequest and see how that fares before doing anything else" approach. I am speaking in terms of an enterprise-level application with fairly decent customer traffic.

    Read the article

  • What are requirements for a successful SOA?

    - by Amir Rezaei
    I’m an EA in an organisation with 10,000+ employees. Strategically we are heading towards SOA. I’m currently researching SOA and creating a road map, and I have come across many blogs that talk about "SOA is dead". We can all agree that SOA is not just web services. The problem is that I find it hard to locate any information on the reasons behind the SOA failure stories in enterprises. What went badly and what went well? My questions are: What are the common SOA mistakes in enterprises that make SOA fail in the long term? Are there any best practices for SOA? What are the most important requirements for a successful SOA in an enterprise? This would be good feedback for our SOA strategy in this organisation. I have tried to narrow down the question, but that is hard due to its nature.

    Read the article

  • Improve Engineering & Construction Project Productivity

    - by [email protected]
    Driving successful project delivery and providing greater value and return for all stakeholders are key goals for firms in the engineering and construction industry. However, increasingly complex construction projects, compressed schedules, ineffective collaboration among project stakeholders, and limited interoperability can get in the way of these goals and lead to reduced productivity. What E&C firms need are solutions that will improve global team collaboration, optimize processes and better communicate electronic project data. Check out the AutoVue for Engineering and Construction Solution Brief and learn how AutoVue enterprise visualization solutions can:
    - improve global project collaboration and communication
    - improve data interoperability
    - support virtual design and construction projects
    - improve change management and maintain accountability

    Read the article

  • Premera Blue Cross Deploys PeopleSoft Enterprise 9.1 Human Capital Management, Financial Management, Enterprise Learning Management and Enterprise Portal Solutions

    - by jay.richey
    Optimum Solutions Implements Oracle's PeopleSoft Enterprise 9.1 at Premera Blue Cross
    Premera chose to upgrade to the latest version of PeopleSoft to help the company achieve its strategic goals, which include building and maintaining a skilled employee team that enables the company to deliver highly efficient and valuable service to plan subscribers, sponsors, and healthcare providers. Its decision was influenced by the key capabilities in PeopleSoft Talent Management 9.1, as well as the common technology enhancements of the PeopleSoft PeopleTools 8.50 toolset across all business process areas, which have helped Premera maximize process automation, increase ease of use, and minimize long-term IT support overhead. Read more...

    Read the article

  • Web 2.0 Extension for ASP.NET

    - by Visual WebGui
    ASP.NET is now greatly extended to support line-of-business and data-centric applications, providing Web 2.0 rich user interfaces within a native web environment. New capabilities enabled by the Visual WebGui extension turn Visual Studio into a rapid development tool for the web, leveraging the wide set of ASP.NET web infrastructure at runtime and extending its paradigms to support highly interactive applications.
    Taking advantage of the ASP.NET infrastructures
    Using the native ASP.NET ISAPI filter: aspnet_isapi...(read more)

    Read the article

  • Including additional DLL’s in an MSBuild script for Module Packaging

    - by Chris Hammond
    Late last year I created a blog post and video about a new version of the module development template that I released on CodePlex. This new template uses MSBuild scripts instead of NAnt scripts to automate the packaging process for the modules built with the template. The MSBuild script works well out of the box; to package your module you simply switch to RELEASE mode and then execute the build. If your project contains references to DLLs (in the website’s BIN folder) that you also need to package...(read more)

    Read the article

  • New Visual Studio 2012 Project Templates for DotNetNuke

    - by Chris Hammond
    Earlier this month Microsoft put the bits up for Visual Studio 2012 RTM out on MSDN Subscriber downloads, and during the first two weeks of September they will officially be releasing Visual Studio 2012. I started working with VS2012 late in the release candidate cycle, doing some DNN module development using my templates at http://christoctemplate.codeplex.com . These templates work fine in Visual Studio 2012 from my testing, but they still face the same problem that they had in Visual Studio 2008...(read more)

    Read the article

  • What are the tools used by modern desktop/"native" application developers? [closed]

    - by kunjaa
    Besides the usual editor and debugger, what do modern desktop (Windows and Linux) application developers use in their development? I am most interested in profilers, code analyzers, memory analyzers, packaging tools, GUI frameworks, libraries, and any other handy tools and secrets you couldn't live without. For example, as a web application developer, I have Firebug and its extensions, Wireshark, jQuery and its extensions, client-side and server-side MVC frameworks, Selenium tests, jsFiddle, etc. Edit: OK, let us constrain this by saying you are using C++.

    Read the article

  • As the current draft stands, what is the most significant change the "National Strategy for Trusted Identities in Cyberspace" will provoke?

    - by mfg
    A current draft of the "National Strategy for Trusted Identities in Cyberspace" has been posted by the Department of Homeland Security. This question is not asking about privacy or constitutionality, but about how this strategy will impact developers' business models and development strategies. When the post was made I was reminded of Jeff's November blog post regarding an internet driver's license. Whether or not that is a perfect model, both approaches are attempting to handle a shared problem (of both developers and end users): how do we establish an online identity? The question I ask here is: with respect to the various burdens that would be imposed on developers and users, what are some of the major, foreseeable implementation issues that will arise from the current U.S. Government's proposed solution? For a quick primer on the setup, jump to page 12 for the infrastructure components; here are two stand-outs:
    An Identity Provider (IDP) is responsible for the processes associated with enrolling a subject, and establishing and maintaining the digital identity associated with an individual or NPE. These processes include identity vetting and proofing, as well as revocation, suspension, and recovery of the digital identity. The IDP is responsible for issuing a credential, the information object or device used during a transaction to provide evidence of the subject’s identity; it may also provide linkage to authority, roles, rights, privileges, and other attributes. The credential can be stored on an identity medium, which is a device or object (physical or virtual) used for storing one or more credentials, claims, or attributes related to a subject. Identity media are widely available in many formats, such as smart cards, security chips embedded in PCs, cell phones, software based certificates, and USB devices. Selection of the appropriate credential is implementation specific and dependent on the risk tolerance of the participating entities.
    Here are the first considered actionable components of the draft:
    Action 1: Designate a Federal Agency to Lead the Public/Private Sector Efforts Associated with Achieving the Goals of the Strategy
    Action 2: Develop a Shared, Comprehensive Public/Private Sector Implementation Plan
    Action 3: Accelerate the Expansion of Federal Services, Pilots, and Policies that Align with the Identity Ecosystem
    Action 4: Work Among the Public/Private Sectors to Implement Enhanced Privacy Protections
    Action 5: Coordinate the Development and Refinement of Risk Models and Interoperability Standards
    Action 6: Address the Liability Concerns of Service Providers and Individuals
    Action 7: Perform Outreach and Awareness Across all Stakeholders
    Action 8: Continue Collaborating in International Efforts
    Action 9: Identify Other Means to Drive Adoption of the Identity Ecosystem across the Nation

    Read the article

  • Enterprise Manager 12c and the Sun ZFS Storage Appliance

    - by user13138569
    Oracle Enterprise Manager 12c can monitor the Sun ZFS Storage Appliance through a dedicated plug-in. The plug-in and its documentation can be downloaded from My Oracle Support or the Oracle Technology Network: Oracle ZFS Storage Appliance Plugin Downloads. Page 3 of the guide describes the workflow that has to be enabled on the Sun ZFS Storage Appliance itself, and page 10 describes the deployment on the Enterprise Manager side, with notes on how this relates to the earlier Enterprise Manager 11g plug-in. Once the plug-in is deployed, the Sun ZFS Storage Appliance can be monitored from Enterprise Manager together with the databases that use it.

    Read the article

  • What's New in Database Lifecycle Management in Enterprise Manager 12c Release 3

    - by HariSrinivasan
    Enterprise Manager 12c Release 3 includes improvements and enhancements across every area of the product. This blog provides an overview of the new and enhanced features in the Database Lifecycle Management area. I will dive into specific features in more depth in subsequent posts.

    "What's New?" In this release, we focused on four things:
    1. Lifecycle Management support for the new Database 12c - Pluggable Databases
    2. Management of long running processes, such as a security patch cycle (Change Activity Planner)
    3. Management of large numbers of systems by
       - Leveraging new framework capabilities for lifecycle operations, such as the new advanced 'emcli' script option
       - Refining features such as configuration search and compliance
    4. Minor improvements and quality fixes to existing features
       - Rollback support for single instance databases
       - Improved "OFFLINE" patching experience
       - Faster collection of ORACLE_HOME configurations

    Lifecycle Management Support for new Database 12c - Pluggable Databases

    Database 12c introduces Pluggable Databases (PDBs), the brand new addition to help you achieve your consolidation goals. Pluggable databases offer unprecedented consolidation at the database level and native lifecycle verbs for creating, plugging and unplugging the databases on a container database (CDB). Enterprise Manager can supplement the capabilities of pluggable databases by offering workflows for migrating, provisioning and cloning them using the software library and the deployment procedures. For example, Enterprise Manager can migrate an existing database to a PDB or clone a PDB by storing a versioned copy in the software library. One can also manage the planned downtime related to patching by migrating the PDBs to a new CDB. While pluggable databases offer these exciting features, they can also pose configuration management and compliance challenges if not managed properly. Enterprise Manager features like inventory management, topology associations and configuration search can mitigate the sprawl of PDBs and also lock them to predefined golden standards using configuration comparison and compliance rules. Learn More ...

    Management of Long Running Datacenter Processes - Change Activity Planner (CAP)

    Currently, customers resort to cumbersome methods to create, execute, track and monitor change activities within their data center. Some customers use traditional tools such as spreadsheets, project planners and in-house custom built solutions. Customers often have weekly sync up meetings across stakeholders to collect status and updates. Some of the change activities, for example the quarterly patch set update (PSU) patch rollouts, are not single tasks but processes with multiple tasks. Some of those tasks are performed within Enterprise Manager Cloud Control (for example Patch) and some are performed outside of Enterprise Manager Cloud Control. These tasks often run for a longer period of time and involve multiple people or teams. Enterprise Manager Cloud Control supports core data center operations such as configuration management, compliance management, and automation. Enterprise Manager Cloud Control release 12.1.0.3 leverages these capabilities and introduces the Change Activity Planner (CAP). CAP provides the ability to plan, execute, and track change activities in real time. It covers the typical datacenter activities that are spread over a long period of time, across multiple people and multiple targets (even target types).
Here are some examples of Change Activity Processes in a datacenter:
- Patching large environments (PSU/CPU patching cycles)
- Upgrading large numbers of database environments
- Rolling out compliance rules
- Database consolidation to Exadata environments
These mappings aside, CAP provides user flows for Compliance Officers/Managers (including lead administrators) and Operators (DBAs and admins). Managers can create change activity plans for various projects and allocate resources, targets, and the groups affected. Upon activation of the plan, tasks are created and automatically assigned to individual administrators based on target ownership. Administrators (DBAs) can identify their tasks and understand the context, schedules, and priorities. They can complete tasks using Enterprise Manager Cloud Control automation features such as patch plans (or in some cases outside Enterprise Manager). Upon completion, compliance is evaluated for validation, and the status of the tasks and the plans is updated. Learn More about CAP ...

Improved Configuration & Compliance Management of a Large Number of Systems

Improved configuration comparison: Get to the configuration comparison results faster for simple ad-hoc comparisons. When performing a 1-to-1 comparison, Enterprise Manager will perform the comparison immediately and take the user directly to the results, without having to wait for a job to be submitted and executed. Flattened system comparisons reduce comparison setup time and complexity. In addition to the previously existing topological comparison, users now have an option to compare using a "flattened" methodology. Flattening means removing duplicate target instances within the systems and removing the hierarchy of member targets. The result is that differences are much easier to spot, particularly for specific use cases like comparing patch levels between complex systems like RAC and Fusion Apps.

Improved configuration search & advanced EMCLI script option for mass automation: Enterprise Manager 12c introduces a new framework-level capability to script and stitch together multiple tasks using EMCLI. This powerful capability can be leveraged for lifecycle operations, especially when executing a task over a large number of targets. Specific usages include retrieving a qualified list of targets using configuration search and then using the result set for automation. Another example would be executing a patching operation and then re-executing it on targets where it may have failed. This is complemented by other enhancements, such as better usability for designing reusable configuration searches. In EM 12c Release 3, a simplified UI makes building ad-hoc searches even easier. Searching for missing patches is a common use of configuration search; this required the use of the advanced options, which are now clearly defined and easy to use.

You can also perform a configuration search using the EMCLI. Users can find and execute configuration searches from the EMCLI, which can be extremely useful for building sophisticated automation scripts. For example, run the search named "Oracle Databases on Exadata", which finds all database targets running on top of Exadata, and further refine the results by options like name, host, etc.:

    emcli get_targets -config_search="Databases on Exadata" -target_name="exa%"

Use this in powerful mass automation operations using the new EMCLI script option. For example, to solve the use case of finding all DBs running on Exadata that house E-Biz, and patching them.
Create a Python script with emcli functions and invoke it in the new EMCLI script option shell. You can invoke the script with the new EMCLI script option directly (a rough sketch of such a script appears at the end of this post):

    $<path to emcli>/emcli @myPSU_Patch.py

Richer compliance content: There are now over 50 Oracle-provided compliance standards, including new standards for Pluggable Database, Fusion Applications, Oracle Identity Manager, Oracle VM and Internet Directory, plus 9 Oracle-provided real-time monitoring standards containing over 900 compliance rules across 500 facets. These new real-time compliance standards cover both Exadata compute nodes and Linux servers. The result is increased Oracle software coverage and faster time to compliance monitoring on Exadata.

Enhancements to Patch Management

Overhauled "OFFLINE" patching experience: The patch upload UI has been simplified to improve the offline patching experience. There is now a single-step process to get patches into the software library. Customers often maintain local repositories of patches, sometimes called software depots, where they host the patches downloaded from My Oracle Support. In the past, you had to move these patches to your desktop and then upload them to Enterprise Manager's software library through the Enterprise Manager Cloud Control user interface. You can now use the following EMCLI command to upload multiple patches directly from a remote location within the data center:

    $emcli upload_patches -location <Path to Patch directory> -from_host <HOSTNAME>

The upload process filters all of the new patches, automatically selects the relevant metadata files from the location, and uploads the patches to the software library.

Other improvements: Patch rollback for single instance databases is a new option in the patch plan to roll back the patches added to the plan; upon execution, the procedure rolls back the patch and the SQL applied to the single instance databases. Improved and faster configuration collection of Oracle Home targets enables more reliable automation of higher-level functions like provisioning, patching or Database as a Service.

Just to recap, here is a list of database lifecycle management features:
- Discovery, inventory tracking and reporting
- Database provisioning, including
  - Migration to pluggable databases
  - Plugging and unplugging of pluggable databases
  - Gold image based cloning
  - Scaling of RAC nodes
- Schema and data change management
- End-to-end patch management in online and offline modes, including
  - Patch advisories in online (connected with My Oracle Support) and offline mode
  - Patch pre-deployment analysis, deployment and rollback (currently only for single instance databases)
  - Reporting
- Upgrade planning and execution of the upgrade process
- Configuration management
- Compliance management with out-of-box content
- Change Activity Planner for planning, designing and tracking long running processes

For more information on Enterprise Manager's database lifecycle management capabilities, visit http://www.oracle.com/technetwork/oem/lifecycle-mgmt/index.html
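For reference, a rough sketch of what a script like the myPSU_Patch.py mentioned above might look like is below. This assumes the EMCLI script option exposes verbs such as get_targets as Python functions returning a response object; the exact function signatures and output format are assumptions to verify against the EMCLI documentation:

    # myPSU_Patch.py -- hypothetical sketch; run as: emcli @myPSU_Patch.py
    # Assumes EMCLI script mode exposes verbs as Python functions.

    # Reuse the saved configuration search to find all databases on Exadata.
    resp = get_targets(config_search="Databases on Exadata", target_name="exa%")

    for line in resp.out().splitlines():
        # Each output line describes one matching target; take the last column
        # as the target name (adjust the parsing to the actual output format).
        target = line.split()[-1]
        print("Candidate for the PSU patch plan: " + target)
        # ...further emcli verbs would go here to build and submit a patch plan.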

    Read the article

  • Google I/O 2010 - OpenSocial in the Enterprise

    Google I/O 2010 - Best practices for implementing OpenSocial in the Enterprise (Social Web, Enterprise 201). Speakers: Mark Weitzel, Matt Tucker, Mark Halvorson, Helen Chen, Chris Schalk. Enterprise deployments of OpenSocial technologies bring an additional set of considerations that may not be apparent in a traditional social network implementation. In this session, several enterprise vendors demonstrate how they have been working together to address these issues in a collection of "best practices". The session also provides a review of existing challenges for enterprise implementations of OpenSocial. For all I/O 2010 sessions, please go to code.google.com. (From GoogleDevelopers, 38:23)

    Read the article

  • September issue of the Enterprise Manager Indepth Newsletter

    - by Javier Puerta
    The September issue of the Enterprise Manager Indepth Newsletter is now available here. Featured articles include:
    Oracle OpenWorld Preview: Don't-Miss Sessions, Hands-on Labs, and More
    Because of the rapid and widespread adoption of Oracle Enterprise Manager 12c since its launch at Oracle OpenWorld 2011, conference organizers are expecting Oracle Enterprise Manager sessions to attract record crowds at Oracle OpenWorld 2012. Read More
    Oracle Cloud Builder Summit: Zero to Enterprise Cloud in Two Hours
    In August, Oracle launched the worldwide Oracle Cloud Builder Summit series, an event where attendees learn firsthand how to plan, deploy, and manage an enterprise private cloud using Oracle Enterprise Manager 12c, all in a few hours. Read More
    WEBCASTS
    Reduce Database Testing Efforts While Maximizing ROI
    Watch this on-demand Webcast demonstrating how to manage database and system changes with confidence using Oracle Real Application Testing. Viewers will be among the first to hear results from Forrester Consulting's commissioned, multicustomer study, "Total Economic Impact of Oracle Real Application Testing."

    Read the article

  • Oracle Enterprise Manager 12c (EM12c): Exadata Monitoring and Management

    - by Kumiko Fujita
    Monitoring and Managing Exadata with Enterprise Manager
    This post looks at how Oracle Exadata can be monitored with Oracle Enterprise Manager 12c, including the Exadata Storage Servers and the InfiniBand network, and how this differs from the earlier Oracle Enterprise Manager 11g setup. It walks through three aspects of Exadata monitoring and management:
    1. A schematic view of the Exadata hardware and its components from within Enterprise Manager.
    2. Performance monitoring, including CPU utilization and I/O across the database machine's processors and storage cells.
    3. Fault monitoring and alerting for Exadata components: the Storage Servers (griddisks and celldisks), Flash Cache, BIOS and InfiniBand switch firmware, and the database server operating system.
    A recorded "Exadata Monitoring" session is also available (PDF slides, with the video in WMV and MP4 formats).

    Read the article

  • IOUG Enterprise Manager SIG Webinar: Performance Tuning your Database Cloud in Oracle Enterprise Manager 12c Cloud Control - 360 Degrees

    - by Patrick Rood
    October 25, 2013 EM 12c Sales Blast | IOUG Enterprise Manager SIG
    WEBINAR: Performance Tuning your Database Cloud in Oracle Enterprise Manager 12c Cloud Control - 360 Degrees
    Last year, the Independent Oracle User Group (IOUG) established a fast-growing Special Interest Group (SIG) devoted to Enterprise Manager, and has sponsored quarterly newsletters and Webinars about EM. To drive more interest in EM and the SIG, IOUG would like Oracle to invite customers to its latest techcast. Your customers will learn how to leverage Oracle Enterprise Manager 12c for tuning, troubleshooting and monitoring their Oracle Database Cloud ecosystem. The session covers lessons learned, tips and tricks, recommendations, best practices, "gotchas" and a whole lot more on how to effectively use Oracle Enterprise Manager 12c Cloud Control for quick, easy and intuitive performance tuning of an Oracle Database Cloud.
    Session objectives:
    - Leveraging Enterprise Manager 12c Cloud Control for Oracle Database tuning/monitoring
    - Limited deep-dive on Automatic Workload Repository (AWR)
    - Oracle Database Cloud performance tuning
    - Best practices for Database Cloud maintenance and monitoring
    Featured speakers: Tariq Farooq, CEO, BrainSurface, and Mike Ault
    Date & time: Wednesday, October 30, 12:00 PM - 1:00 PM Central Time (USA)
    Register Here

    Read the article

  • How to become an expert web-developer?

    - by John Smith
    I am currently a junior PHP developer and I really love it. I have loved the internet since I first got online; I have always loved smartly built websites, always wondered how it all works, and always admired sites with good design and rich functionality. Now I am finally creating websites on my own and it feels really great. My goals are to become an expert web developer (aiming at websites for small and medium businesses, not enterprise-sized systems), to have a great full-time job, to do freelance work, and to create my own startup in the future. General question: what do I do to become an expert, professional, and in-demand web programmer? More concrete questions: 1) How do I choose the languages and technologies to learn? I know that every web developer must know HTML, CSS, JS, AJAX and jQuery, and I do some design as well because I like it and need it for freelance work. But what about back-end languages? I currently picked PHP because it is the most in demand in my area and most of the web uses it, but what about the future? Say, in three years I am good at PHP and PHP frameworks, but some other language has become the most popular. Do I switch to it? I know that being a good programmer is not about languages and frameworks but about the ability to learn and to reach goals, but I still think that learning the frameworks for a given language can take quite some time. Am I wrong? 2) In general, what are the basic guidelines for becoming an expert web developer? What are the most important things I should focus on? Thank you!

    Read the article

  • How to manage the security of self-hosted Web APIs so that only authenticated requests can access data?

    - by Husrat Mehmood
    Let's pretend I am going to work on an enterprise application. Say the application has 11 modules, and I have to develop dashboards for every role in the organization for which the application is being built. We decided to use ASP.NET Web API and return JSON data from our APIs. We are going to include 11 self-hosted Web API projects in the application (one self-hosted Web API per module). All 11 modules connect to one SQL Server 2012 database. Once the APIs are ready, we will create business dashboards (based on the roles in the organization). So the Web API client is an ASP.NET MVC application, and it will consume those Web APIs. Here is the part all of this explanation leads to: How should I manage the security of all 11 self-hosted Web APIs? How should I ensure that only authenticated requests come through? If I authenticate a user by login and password, redirect the user to the dashboard designed for that user's role, and load data by consuming the Web APIs, how should I ensure that each request for data is authenticated?

    Read the article

  • Are applications on the Web/Cloud the way to go, over desktop apps?

    - by jiewmeng
    I am currently mainly a web developer, but I am quite attracted to the performance and the great OS integration (e.g. Windows 7 jump lists, taskbar thumbnails, etc.) that something like WPF/C# can provide to the user, improving workflow and productivity. Privacy and performance seem like the major downsides of web/cloud apps compared to desktop apps.
    Applications on the web/cloud:
    - work on the go, which matters given the increased popularity of smartphones and netbooks
    - the majority of users may not benefit as much from the increased performance of desktop apps (e.g. internet surfing, word processing), and probably benefit more from decreased startup times, lower costs, and data on the cloud
    Desktop applications:
    - increased performance benefits power users, such as 3D rendering, HD video/photo editing, and gaming (I wonder if such processing might be offloaded to cloud processing)
    - integration with the OS increases productivity (maybe such features can be adapted to a web version, perhaps with a local desktop app working against a web app API)
    - more control over privacy (maybe fixed by encryption?)
    - local data access (especially large files) is guaranteed and fast (though YouTube HD is fast enough most of the time)
    - work is not affected by intermittent, slow, or unavailable internet connections (I know this is changing, though)
    What do you think?

    Read the article

  • Beginning Game Programming - Advice

    - by udsha
    I would like to plan my future career, and I am looking to choose from a few paths I am interested in: computer game programming, networking (security), etc. It would be good to know the risks involved in those fields and how to begin as a game programmer. I would appreciate that kind of advice and guidance. Can anyone show me the path?

    Read the article
