Search Results

Search found 11601 results on 465 pages for 'obiee installs and config'.

Page 17/465 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • web.config to redirect except some given IPs

    - by Alvin
    I'm looking for a web.config that is equivalent to the .htaccess file below. <IfModule mod_rewrite.c> RewriteEngine on RewriteCond %{REMOTE_HOST} !^123\.123\.123\.123 RewriteCond %{REMOTE_HOST} !^321\.321\.321\.321 RewriteCond %{REQUEST_URI} !/coming-soon\.html$ RewriteRule (.*)$ /coming-soon.html [R=302,L] </IfModule> It redirects everyone to a coming-soon page except for the given IPs. Unfortunately I'm not familiar with IIS. Thank you
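
    A possible IIS equivalent (an untested sketch) uses the URL Rewrite module, which must be installed separately on IIS 7; the rule below mirrors the three RewriteCond lines above, with REMOTE_ADDR standing in for REMOTE_HOST:

      <system.webServer>
        <rewrite>
          <rules>
            <rule name="ComingSoon" stopProcessing="true">
              <match url=".*" />
              <conditions logicalGrouping="MatchAll">
                <add input="{REMOTE_ADDR}" pattern="^123\.123\.123\.123$" negate="true" />
                <add input="{REMOTE_ADDR}" pattern="^321\.321\.321\.321$" negate="true" />
                <add input="{REQUEST_URI}" pattern="coming-soon\.html$" negate="true" />
              </conditions>
              <action type="Redirect" url="/coming-soon.html" redirectType="Found" />
            </rule>
          </rules>
        </rewrite>
      </system.webServer>

    redirectType="Found" issues a 302, matching the [R=302] flag in the .htaccess rule.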

    Read the article

  • Value of AppDomain.CurrentDomain.SetupInformation.ConfigurationFile changes based on whether the file exists

    - by Dan Neely
    I have a check to make sure the app.config file exists and to report an error if it does not: System.Windows.Forms.MessageBox.Show(AppDomain.CurrentDomain.SetupInformation.ConfigurationFile); if (!File.Exists(AppDomain.CurrentDomain.SetupInformation.ConfigurationFile)) { throw new ConfigurationErrorsException("Unable to find configuration file. File is expected at location: " + AppDomain.CurrentDomain.SetupInformation.ConfigurationFile + "\n"); } When I build, the app.config file in my solution is added to the output directory as AppName.exe.config, and when run from outside Visual Studio, AppDomain.CurrentDomain.SetupInformation.ConfigurationFile contains the path C:...\AppName.exe.config (from within VS it's C:..\AppName.vshost.exe.config). If I delete AppName.exe.config, the value is C:..\Appname.config (no .exe). I did a bit of further experimentation, and if Appname.config exists, that file will also work to load my setting values. What's going on here? I need to have everything consistent for error reporting purposes.
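
    For what it's worth, one way to sidestep the shifting value (a sketch, not from the original post) is to compute the conventional path yourself from the entry assembly and report against that:

      // Hedged sketch: derive <exe path> + ".config" instead of trusting
      // SetupInformation.ConfigurationFile, whose value changes with which files exist.
      using System.IO;
      using System.Reflection;

      static string ExpectedConfigPath()
      {
          // GetEntryAssembly() can be null in unusual hosts (test runners, COM),
          // and under the VS vshost the runtime actually reads AppName.vshost.exe.config;
          // this assumes a plain .exe launch.
          string exePath = Assembly.GetEntryAssembly().Location;
          return exePath + ".config";
      }

      // usage:  if (!File.Exists(ExpectedConfigPath())) { /* report the missing file */ }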

    Read the article

  • Does <httpRedirect> in web.config work in a Mono setup, or is it IIS7-specific?

    - by Crystal
    We restructured some content recently and I'd like to put some redirect rules into web.config so bookmarks to the old pages get routed to their new locations. I tried this approach: <location path="~/product/productA.aspx"> <system.webServer> <httpRedirect enabled="true" destination="~/product/category/productA.aspx" exactDestination="false" childOnly="true" httpResponseStatus="Permanent" /> </system.webServer> </location> But all I get when I go to "http://www.oursite.com/product/productA.aspx" is our HTTP 404 page. Am I doing something wrong, or is the httpRedirect tag in web.config not supported in Mono? (A workaround I tried worked, but it isn't ideal for what we want to achieve.) Thank you :)
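
    If <httpRedirect> turns out not to be honoured by the Mono host, one fallback (a sketch, not tested on Mono) is to issue the redirect at the ASP.NET level in Global.asax, which does not rely on any IIS module:

      // Global.asax(.cs) - framework-level 301 for the moved page; the paths are
      // the ones from the question. Works wherever ASP.NET itself runs (xsp/mod_mono).
      protected void Application_BeginRequest(object sender, EventArgs e)
      {
          string path = Request.Url.AbsolutePath;
          if (path.Equals("/product/productA.aspx", StringComparison.OrdinalIgnoreCase))
          {
              Response.StatusCode = 301;
              Response.AddHeader("Location", "/product/category/productA.aspx");
              Response.End();
          }
      }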

    Read the article

  • How to version control config files pragmatically?

    - by erenon
    Suppose we have a config file with sensitive passwords. I'd like to version control the whole project, including the config file, but I don't want to share my passwords. It would be good if this config file: password=secret foo=bar became password=* foo=bar in the repository, and the other users of the VCS could then set up the password on their own. Simply ignoring the file isn't a good approach, because the developers should be aware when the config file changes. Example: Local version: password=own_secret foo=bar Config file in VCS: password=* foo=bar Then suddenly the config file changes: password=* foo=bar baz=foo And the local version for each developer would become: password=own_secret foo=bar baz=foo That is the behaviour I'm after. How could I achieve it? How do you store your config files? Is there a way to do this, or should I hack something together?
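
    One common pattern that gives roughly this behaviour (a sketch assuming Git; other VCSs have equivalents, and the file name app.config is just an example) is to version a template with placeholder values and keep the real file untracked - developers still see every structural change because the template itself is under version control:

      # Track a placeholder template, ignore the real config that holds real passwords.
      cp app.config app.config.template        # then replace secret values with * by hand
      echo "app.config" >> .gitignore
      git rm --cached app.config               # stop tracking the real file if it was committed
      git add app.config.template .gitignore
      git commit -m "Version config template; keep real config local"

    When a new key such as baz=foo lands in the template, each developer copies it into their local file; some teams add a build step that fails if the local config is missing keys present in the template.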

    Read the article

  • Is it possible to programmatically control C# health monitoring without using the web.config file?

    - by Adam
    I have developed my own custom provider for health monitoring; however, I use parameters in the constructor, and that is not allowed when the health monitoring is configured from the web.config file. Does anyone know if I can turn the monitoring on/off and have it watch properly through code (possibly in my Global.asax file on application startup)? Or is it possible for me to create my own watcher that does the same thing as the health monitor? Or, finally, can I just pass variables from the web.config setup (I'm not familiar with the public-token part of the provider type declaration)? Thanks in advance
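
    On the "raise events from code" part, a custom event can at least be defined and raised entirely programmatically (a hedged sketch - note that which provider ultimately receives it is still governed by the healthMonitoring rules in configuration, so this only covers part of the question):

      using System.Web.Management;

      // Hypothetical custom health-monitoring event raised from code.
      public class StartupEvent : WebBaseEvent
      {
          public StartupEvent(string message, object source)
              : base(message, source, WebEventCodes.WebExtendedBase + 1) { }
      }

      // e.g. in Global.asax Application_Start:
      //   new StartupEvent("Application started", this).Raise();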

    Read the article

  • Can StructureMap be configured so that one can use different .config settings based on whether the project is built in debug or release mode?

    - by Mark Rogers
    I know that in StructureMap I can read from my *.config files (or files referenced by them) when I want to pass specific arguments to an object's constructor: ForRequestedType<IConfiguration>() .TheDefault.Is.OfConcreteType<SqlServerConfiguration>() .WithCtorArg("db_server_address") .EqualToAppSetting("data.db_server_address") But what I would like to do is read from one config setting in debug mode and another in release mode. Sure, I could surround the .EqualToAppSetting("data.db_server_address") call with #if DEBUG, but for some reason those statements make me cringe a little when I put them in. I'd like to know if there is some way to do this with the StructureMap library itself. So can I feed my objects different settings based on whether the project is built in debug or release mode?
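
    If the goal is mainly to keep the conditional compilation out of the registration chain, one compromise (a sketch in the same StructureMap 2.x style as above; the *.debug key name is an invention) confines the #if to choosing the appSettings key inside a registry:

      using StructureMap.Configuration.DSL;

      public class ConfigurationRegistry : Registry
      {
          public ConfigurationRegistry()
          {
      #if DEBUG
              const string serverKey = "data.db_server_address.debug";   // hypothetical key
      #else
              const string serverKey = "data.db_server_address";
      #endif
              ForRequestedType<IConfiguration>()
                  .TheDefault.Is.OfConcreteType<SqlServerConfiguration>()
                  .WithCtorArg("db_server_address")
                  .EqualToAppSetting(serverKey);
          }
      }

    Another option, if this is a web project on Visual Studio 2010 or later, is to leave the registration alone and vary the appSetting value itself with web.config transforms (Web.Debug.config / Web.Release.config).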

    Read the article

  • Can you pull the connectionString for a log4net AdoNetAppender from elsewhere in a web.config file?

    - by mrsharps
    Hey all. This is my first Stack Overflow question. I already have a db connection string in my web.config file. I scanned the log4net docs, but can't seem to find a way to use it within the log4net section of my web.config file. Is it possible to do something like this? <connectionStrings> <add name="connStr" connectionString="Data Source=localhost; ..." /> </connectionStrings> <log4net> <appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender"> <connectionString connectionStringName="connStr"/> ... </log4net> Thanks!
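
    In newer log4net builds the appender can reference the standard section by name through its ConnectionStringName property (hedged - this arrived around log4net 1.2.11, so check the version you have; older versions need the full string repeated in a <connectionString> element):

      <appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender">
        <connectionStringName value="connStr" />
        <connectionType value="System.Data.SqlClient.SqlConnection, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
        <!-- commandText, parameters, layout, etc. as usual -->
      </appender>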

    Read the article

  • How do I use Python wx.Config to access the Windows registry?

    - by GreenAsJade
    I've read http://wxpython.org/docs/api/wx.ConfigBase-class.html and I've done some basic things like the session appended below. What I can see is that Config.Create() returns me some sort of configuration object, which has information about Python in it. But clearly that's not what I'm looking for: I seem to be missing the magic to say "give me a Config that is the Windows registry"... Thanks! GaJ

      import wx
      from wx import Config
      app = wx.App(False)
      config = Config.Create()

      config.HasGroup("HKEY_CURRENT_USER")   # False
      config.GetFirstEntry()                 # (0, u'', -1)
      config.GetFirstGroup()                 # (1, u'PythonCore', 1)
      config.GetNextGroup(1)                 # (0, u'', -1)
      config.GetNumberOfGroups()             # 1
      config.GetPath()                       # u''
      config.HasEntry("PythonCore")          # False
      config.GetFirstGroup()                 # (1, u'PythonCore', 1)
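
    For what it's worth, a sketch of the distinction (the app/vendor names and the registry path below are only illustrative): wx.Config("MyApp", "MyVendor") maps to HKEY_CURRENT_USER\Software\MyVendor\MyApp on Windows, i.e. it is scoped to your own application's keys rather than being a browser for the whole registry. For arbitrary registry paths, the standard library is the usual tool:

      # _winreg on Python 2 (winreg on Python 3); "Software\Python" is just an example path.
      import _winreg

      key = _winreg.OpenKey(_winreg.HKEY_CURRENT_USER, r"Software\Python")
      try:
          print(_winreg.EnumKey(key, 0))   # first subkey, e.g. 'PythonCore'
      finally:
          _winreg.CloseKey(key)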

    Read the article

  • using an encrypted web.config file

    - by regy
    My aim is to make the web.config unreadable to external users while my application can still access it. Is there any way to do this? I want to encrypt my web.config file so that others cannot open it with an editor like Notepad, but my application should still be able to use the same web.config file. What I have tried so far: I encrypt the web.config file, decrypt it inside the application, and save the entire contents to a string. Now I want to use this string variable in place of the web.config file (which is now encrypted and can no longer be read by the application directly) - how do I set the application to use the string instead of web.config?
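
    A more conventional route for this requirement (hedged - it encrypts individual sections rather than the whole file) is ASP.NET protected configuration: aspnet_regiis encrypts a section in place, and the runtime decrypts it transparently, so the application keeps reading web.config normally and no string-variable workaround is needed. For example, from the framework directory (the site path is a placeholder; use the Framework64/v4.x folder if that matches your app):

      rem Encrypt the connectionStrings section in place:
      cd %windir%\Microsoft.NET\Framework\v2.0.50727
      aspnet_regiis -pef "connectionStrings" "C:\inetpub\wwwroot\MyApp" -prov "DataProtectionConfigurationProvider"

      rem Decrypt it again while developing:
      aspnet_regiis -pdf "connectionStrings" "C:\inetpub\wwwroot\MyApp"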

    Read the article

  • Cucumber-rails on JRuby installs the gem into my app's root directory with Bundler

    - by brad
    I just installed cucumber 0.7.2 and cucumber-rails 0.3.1 with jruby-1.4.0 on OS X. When I run a bundle install, it places a cucumber-rails directory in my main app with all of the gem code/dependencies. First off, this is definitely not what I want, and I'm not sure why it happens for cucumber-rails only. Second, if I delete this folder and just manually install cucumber-rails, then running script/generate feature blah gives me /Users/bradrobertson/.rvm/rubies/jruby-1.4.0/lib/ruby/site_ruby/1.8/rubygems/source_index.rb:344:in `refresh!': source index not created from disk (RuntimeError) from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/rails/vendor_gem_source_index.rb:34:in `refresh!' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/rails/vendor_gem_source_index.rb:29:in `initialize' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/rails/gem_dependency.rb:21:in `new' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/rails/gem_dependency.rb:21:in `add_frozen_gem_path' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/initializer.rb:298:in `add_gem_load_paths' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/initializer.rb:132:in `process' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/initializer.rb:113:in `run' from /Users/bradrobertson/Repos/app/source/trunk/config/environment.rb:13 from /Users/bradrobertson/Repos/app/source/trunk/config/environment.rb:1:in `require' from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/commands/generate.rb:1 from /Users/bradrobertson/.rvm/gems/jruby-1.4.0/gems/rails-2.3.5/lib/commands/generate.rb:3:in `require' from script/generate:3 Similarly, running rake cucumber gives me "rake aborted! source index not created from disk", so something obviously doesn't work. If I add that cucumber-rails directory back in, then rake cucumber actually runs. Can someone tell me why it would need to install the gem right in my Rails app? I've never seen this before. Setup: jruby-1.4.0, cucumber-0.7.2, cucumber-rails 0.3.1, bundler 0.9.23, webrat 0.7.1

    Read the article

  • Use an INI/appconfig file or a SQL Server file to store user config?

    - by h2g2java
    I know that the preference for INI or appconfig XML is their human readability. Let's say the user preferences stored for my app are hierarchical and number about a thousand items, and it would be really confusing for a user to edit an INI file to change things anyway. I have always been using a combination of INI with appconfig. I am leaning towards using a SQL Server db file now. Every time the user changes a preference while using the app, it would be stored into the db file - that's my line of thought. I am also thinking that such a config db file could move around with the app, just like an INI. Before I do that, any advice on: 1. whether there are any disadvantages to using a db file over INI or appconfig; 2. if a shop uses MySQL or Oracle, do you think your colleagues would raise a pro-MySQL or pro-Oracle eyebrow, questioning why you would use SQL Server technology in a MySQL or Oracle shop? I mean, I am just using it like an INI file or app.config anyway, right?
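
    If the db-file route wins, the schema does not need to be elaborate; a minimal sketch of a self-referencing key/value table for hierarchical preferences (names are illustrative, plain SQL Server syntax) could look like:

      CREATE TABLE UserSetting (
          SettingId INT IDENTITY(1,1) PRIMARY KEY,
          ParentId  INT NULL REFERENCES UserSetting(SettingId),  -- NULL = top-level group
          Name      NVARCHAR(200)  NOT NULL,
          Value     NVARCHAR(4000) NULL
      );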

    Read the article

  • New Communications Industry Data Model with "Factory Installed" Predictive Analytics using Oracle Data Mining

    - by charlie.berger
    Oracle Introduces Oracle Communications Data Model to Provide Actionable Insight for Communications Service Providers   We've integrated pre-installed analytical methodologies with the new Oracle Communications Data Model to deliver automated, simple, yet powerful predictive analytics solutions for customers.  Churn, sentiment analysis, identifying customer segments - all things that can be anticipated and hence preconceived and implemented inside an application.  Read on for more information! TM Forum Management World, Nice, France - 18 May 2010 News Facts To help communications service providers (CSPs) manage and analyze rapidly growing data volumes cost effectively, Oracle today introduced the Oracle Communications Data Model. With the Oracle Communications Data Model, CSPs can achieve rapid time to value by quickly implementing a standards-based enterprise data warehouse that features communications industry-specific reporting, analytics and data mining. The combination of the Oracle Communications Data Model, Oracle Exadata and the Oracle Business Intelligence (BI) Foundation represents the most comprehensive data warehouse and BI solution for the communications industry. Also announced today, Hong Kong Broadband Network enhanced their data warehouse system, going live on Oracle Communications Data Model in three months. The leading provider increased its subscriber base by 37 percent in six months and reduced customer churn to less than one percent. Product Details Oracle Communications Data Model provides industry-specific schema and embedded analytics that address key areas such as customer management, marketing segmentation, product development and network health. CSPs can efficiently capture and monitor critical data and transform it into actionable information to support development and delivery of next-generation services using: More than 1,300 industry-specific measurements and key performance indicators (KPIs) such as network reliability statistics, provisioning metrics and customer churn propensity. Embedded OLAP cubes for extremely fast dimensional analysis of business information. Embedded data mining models for sophisticated trending and predictive analysis. Support for multiple lines of business, such as cable, mobile, wireline and Internet, which can be easily extended to support future requirements. With Oracle Communications Data Model, CSPs can jump start the implementation of a communications data warehouse in line with communications-industry standards including the TM Forum Information Framework (SID), formerly known as the Shared Information Model. Oracle Communications Data Model is optimized for any Oracle Database 11g platform, including Oracle Exadata, which can improve call data record query performance by 10x or more. Supporting Quotes "Oracle Communications Data Model covers a wide range of business areas that are relevant to modern communications service providers and is a comprehensive solution - with its data model and pre-packaged templates including BI dashboards, KPIs, OLAP cubes and mining models. It helps us save a great deal of time in building and implementing a customized data warehouse and enables us to leverage the advanced analytics quickly and more effectively," said Yasuki Hayashi, executive manager, NTT Comware Corporation. "Data volumes will only continue to grow as communications service providers expand next-generation networks, deploy new services and adopt new business models. 
They will increasingly need efficient, reliable data warehouses to capture key insights on data such as customer value, network value and churn probability. With the Oracle Communications Data Model, Oracle has demonstrated its commitment to meeting these needs by delivering data warehouse tools designed to fill communications industry-specific needs," said Elisabeth Rainge, program director, Network Software, IDC. "The TM Forum Conformance Mark provides reassurance to customers seeking standards-based, and therefore, cost-effective and flexible solutions. TM Forum is extremely pleased to work with Oracle to certify its Oracle Communications Data Model solution. Upon successful completion, this certification will represent the broadest and most complete implementation of the TM Forum Information Framework to date, with more than 130 aggregate business entities," said Keith Willetts, chairman and chief executive officer, TM Forum. Supporting Resources Oracle Communications Oracle Communications Data Model Data Sheet Oracle Communications Data Model Podcast Oracle Data Warehousing Oracle Communications on YouTube Oracle Communications on Delicious Oracle Communications on Facebook Oracle Communications on Twitter Oracle Communications on LinkedIn Oracle Database on Twitter The Data Warehouse Insider Blog

    Read the article

  • Oracle BI Server Modeling, Part 1- Designing a Query Factory

    - by bob.ertl(at)oracle.com
      Welcome to Oracle BI Development's BI Foundation blog, focused on helping you get the most value from your Oracle Business Intelligence Enterprise Edition (BI EE) platform deployments.  In my first series of posts, I plan to show developers the concepts and best practices for modeling in the Common Enterprise Information Model (CEIM), the semantic layer of Oracle BI EE.  In this segment, I will lay the groundwork for the modeling concepts.  First, I will cover the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing. Oracle BI EE Query Cycle The purpose of the Oracle BI Server is to bridge the gap between the presentation services and the data sources.  There are typically a variety of data sources in a variety of technologies: relational, normalized transaction systems; relational star-schema data warehouses and marts; multidimensional analytic cubes and financial applications; flat files, Excel files, XML files, and so on. Business datasets can reside in a single type of source, or, most of the time, are spread across various types of sources. Presentation services users are generally business people who need to be able to query that set of sources without any knowledge of technologies, schemas, or how sources are organized in their company. They think of business analysis in terms of measures with specific calculations, hierarchical dimensions for breaking those measures down, and detailed reports of the business transactions themselves.  Most of them create queries without knowing it, by picking a dashboard page and some filters.  Others create their own analysis by selecting metrics and dimensional attributes, and possibly creating additional calculations. The BI Server bridges that gap from simple business terms to technical physical queries by exposing just the business focused measures and dimensional attributes that business people can use in their analyses and dashboards.   After they make their selections and start the analysis, the BI Server plans the best way to query the data sources, writes the optimized sequence of physical queries to those sources, post-processes the results, and presents them to the client as a single result set suitable for tables, pivots and charts. The CEIM is a model that controls the processing of the BI Server.  It provides the subject areas that presentation services exposes for business users to select simplified metrics and dimensional attributes for their analysis.  It models the mappings to the physical data access, the calculations and logical transformations, and the data access security rules.  The CEIM consists of metadata stored in the repository, authored by developers using the Administration Tool client.  Presentation services and other query clients create their queries in BI EE's SQL-92 language, called Logical SQL or LSQL.  The API simply uses ODBC or JDBC to pass the query to the BI Server.  Presentation services writes the LSQL query in terms of the simplified objects presented to the users.  The BI Server creates a query plan, and rewrites the LSQL into fully-detailed SQL or other languages suitable for querying the physical sources.  For example, the LSQL below was rewritten into the physical SQL for an Oracle 11g database, shown after it.

Logical SQL:

SELECT "D0 Time"."T02 Per Name Month" saw_0,
       "D4 Product"."P01  Product" saw_1,
       "F2 Units"."2-01  Billed Qty  (Sum All)" saw_2
FROM "Sample Sales"
ORDER BY saw_0, saw_1

Physical SQL:

WITH SAWITH0 AS (
  select T986.Per_Name_Month as c1, T879.Prod_Dsc as c2,
         sum(T835.Units) as c3, T879.Prod_Key as c4
  from Product T879 /* A05 Product */ ,
       Time_Mth T986 /* A08 Time Mth */ ,
       FactsRev T835 /* A11 Revenue (Billed Time Join) */
  where ( T835.Prod_Key = T879.Prod_Key and T835.Bill_Mth = T986.Row_Wid)
  group by T879.Prod_Dsc, T879.Prod_Key, T986.Per_Name_Month
)
select SAWITH0.c1 as c1, SAWITH0.c2 as c2, SAWITH0.c3 as c3
from SAWITH0
order by c1, c2

Probably everybody reading this blog can write SQL or MDX.  However, the trick in designing the CEIM is that you are modeling a query-generation factory.  Rather than hand-crafting individual queries, you model behavior and relationships, thus configuring the BI Server machinery to manufacture millions of different queries in response to random user requests.  This mass production requires a different mindset and approach than when you are designing individual SQL statements in tools such as Oracle SQL Developer, Oracle Hyperion Interactive Reporting (formerly Brio), or Oracle BI Publisher.   The Structure of the Common Enterprise Information Model (CEIM) The CEIM has a unique structure specifically for modeling the relationships and behaviors that fill the gap from logical user requests to physical data source queries and back to the result.  The model divides the functionality into three specialized layers, called Presentation, Business Model and Mapping, and Physical, as shown below. Presentation services clients can generally only see the presentation layer, and the objects in the presentation layer are normally the only ones used in the LSQL request.  When a request comes into the BI Server from presentation services or another client, the relationships and objects in the model allow the BI Server to select the appropriate data sources, create a query plan, and generate the physical queries.  That's the left to right flow in the diagram below.  When the results come back from the data source queries, the right to left relationships in the model show how to transform the results and perform any final calculations and functions that could not be pushed down to the databases.   Business Model Think of the business model as the heart of the CEIM you are designing.  This is where you define the analytic behavior seen by the users, and the superset library of metric and dimension objects available to the user community as a whole.  It also provides the baseline business-friendly names and user-readable dictionary.  For these reasons, it is often called the "logical" model--it is a virtual database schema that persists no data, but can be queried as if it is a database. The business model always has a dimensional shape (more on this in future posts), and its simple shape and terminology hides the complexity of the source data models. Besides hiding complexity and normalizing terminology, this layer adds most of the analytic value, as well.  This is where you define the rich, dimensional behavior of the metrics and complex business calculations, as well as the conformed dimensions and hierarchies.  
It contributes to the ease of use for business users, since the dimensional metric definitions apply in any context of filters and drill-downs, and the conformed dimensions enable dashboard-wide filters and guided analysis links that bring context along from one page to the next.  The conformed dimensions also provide a key to hiding the complexity of many sources, including federation of different databases, behind the simple business model. Note that the expression language in this layer is LSQL, so that any expression can be rewritten into any data source's query language at run time.  This is important for federation, where a given logical object can map to several different physical objects in different databases.  It is also important to portability of the CEIM to different database brands, which is a key requirement for Oracle's BI Applications products. Your requirements process with your user community will mostly affect the business model.  This is where you will define most of the things they specifically ask for, such as metric definitions.  For this reason, many of the best-practice methodologies of our consulting partners start with the high-level definition of this layer. Physical Model The physical model connects the business model that meets your users' requirements to the reality of the data sources you have available. In the query factory analogy, think of the physical layer as the bill of materials for generating physical queries.  Every schema, table, column, join, cube, hierarchy, etc., that will appear in any physical query manufactured at run time must be modeled here at design time. Each physical data source will have its own physical model, or "database" object in the CEIM.  The shape of each physical model matches the shape of its physical source.  In other words, if the source is normalized relational, the physical model will mimic that normalized shape.  If it is a hypercube, the physical model will have a hypercube shape.  If it is a flat file, it will have a denormalized tabular shape. To aid in query optimization, the physical layer also tracks the specifics of the database brand and release.  This allows the BI Server to make the most of each physical source's distinct capabilities, writing queries in its syntax, and using its specific functions. This allows the BI Server to push processing work as deep as possible into the physical source, which minimizes data movement and takes full advantage of the database's own optimizer.  For most data sources, native APIs are used to further optimize performance and functionality. The value of having a distinct separation between the logical (business) and physical models is encapsulation of the physical characteristics.  This encapsulation is another enabler of packaged BI applications and federation.  It is also key to hiding the complex shapes and relationships in the physical sources from the end users.  Consider a routine drill-down in the business model: physically, it can require a drill-through where the first query is MDX to a multidimensional cube, followed by the drill-down query in SQL to a normalized relational database.  The only difference from the user's point of view is that the 2nd query added a more detailed dimension level column - everything else was the same. Mappings Within the Business Model and Mapping Layer, the mappings provide the binding from each logical column and join in the dimensional business model, to each of the objects that can provide its data in the physical layer.  
When there is more than one option for a physical source, rules in the mappings are applied to the query context to determine which of the data sources should be hit, and how to combine their results if more than one is used.  These rules specify aggregate navigation, vertical partitioning (fragmentation), and horizontal partitioning, any of which can be federated across multiple, heterogeneous sources.  These mappings are usually the most sophisticated part of the CEIM. Presentation You might think of the presentation layer as a set of very simple relational-like views into the business model.  Over ODBC/JDBC, they present a relational catalog consisting of databases, tables and columns.  For business users, presentation services interprets these as subject areas, folders and columns, respectively.  (Note that in 10g, subject areas were called presentation catalogs in the CEIM.  In this blog, I will stick to 11g terminology.)  Generally speaking, presentation services and other clients can query only these objects (there are exceptions for certain clients such as BI Publisher and Essbase Studio). The purpose of the presentation layer is to specialize the business model for different categories of users.  Based on a user's role, they will be restricted to specific subject areas, tables and columns for security.  The breakdown of the model into multiple subject areas organizes the content for users, and subjects superfluous to a particular business role can be hidden from that set of users.  Customized names and descriptions can be used to override the business model names for a specific audience.  Variables in the object names can be used for localization. For these reasons, you are better off thinking of the tables in the presentation layer as folders than as strict relational tables.  The real semantics of tables and how they function is in the business model, and any grouping of columns can be included in any table in the presentation layer.  In 11g, an LSQL query can also span multiple presentation subject areas, as long as they map to the same business model. Other Model Objects There are some objects that apply to multiple layers.  These include security-related objects, such as application roles, users, data filters, and query limits (governors).  There are also variables you can use in parameters and expressions, and initialization blocks for loading their initial values on a static or user session basis.  Finally, there are Multi-User Development (MUD) projects for developers to check out units of work, and objects for the marketing feature used by our packaged customer relationship management (CRM) software.   The Query Factory At this point, you should have a grasp on the query factory concept.  When developing the CEIM model, you are configuring the BI Server to automatically manufacture millions of queries in response to random user requests. You do this by defining the analytic behavior in the business model, mapping that to the physical data sources, and exposing it through the presentation layer's role-based subject areas. While configuring mass production requires a different mindset than when you hand-craft individual SQL or MDX statements, it builds on the modeling and query concepts you already understand. The following posts in this series will walk through the CEIM modeling concepts and best practices in detail.  
We will initially review dimensional concepts so you can understand the business model, and then present a pattern-based approach to learning the mappings from a variety of physical schema shapes and deployments to the dimensional model.  Along the way, we will also present the dimensional calculation template, and learn how to configure the many additivity patterns.

    Read the article

  • Oracle Financial Analytics for SAP Certified with Oracle Data Integrator EE

    - by denis.gray
    Two days ago Oracle announced the release of Oracle Financial Analytics for SAP.  With the amount of press this has garnered in the past two days, there's a key detail that can't be missed.  This release is certified with Oracle Data Integrator EE - now making the combination of Data Integration and Business Intelligence a force to contend with.  Within the Oracle Press Release there were two important bullets: • Oracle Financial Analytics for SAP includes a pre-packaged ABAP code compliant adapter and is certified with Oracle Data Integrator Enterprise Edition to integrate SAP Financial Accounting data directly with the analytic application.  • Helping to integrate SAP financial data and disparate third-party data sources is Oracle Data Integrator Enterprise Edition which delivers fast, efficient loading and transformation of timely data into a data warehouse environment through its high-performance Extract Load and Transform (E-LT) technology. This is very exciting news, demonstrating Oracle's overall commitment to Oracle Data Integrator EE.   This is a great way to start off the new year and we look forward to building on this momentum throughout 2011.   The following links contain additional information and media responses about the Oracle Financial Analytics for SAP release. IDG News Service (also appeared in PC World, Computer World, CIO): "Oracle is moving further into rival SAP's turf with Oracle Financial Analytics for SAP, a new BI (business intelligence) application that can crunch ERP (enterprise resource planning) system financial data for insights." Information Week: "Oracle talks a good game about the appeal of an optimized, all-Oracle stack. But the company also recognizes that we live in a predominantly heterogeneous IT world" CRN: "While some businesses with SAP Financial Accounting already use Oracle BI, those integrations had to be custom developed. The new offering provides pre-built integration capabilities." ECRM Guide:  "Among other features, Oracle Financial Analytics for SAP helps front-line managers improve financial performance and decision-making with what the company says is comprehensive, timely and role-based information on their departments' expenses and revenue contributions."   SAP Getting Started Guide for ODI on OTN: http://www.oracle.com/technetwork/middleware/data-integrator/learnmore/index.html For more information on ODI and its SAP connectivity please review the Oracle® Fusion Middleware Application Adapters Guide for Oracle Data Integrator 11g Release 1 (11.1.1)

    Read the article

  • New DevExpress Web.Config Settings

    Starting with DXperience v2010.1, we're making a small and useful change: we're adding a new section to the web.config file for settings used by DevExpress ASP.NET controls. New Section - Here's the default section that you'll find at the bottom of a new web project using the DXperience v2010.1 release: <devExpress> <compression enableHtmlCompression="false" enableCallbackCompression="true" enableResourceCompression="true" ...

    Read the article

  • Permanent redirect domain to www subdomain without web.config

    - by Lord Simpson
    I've just set up a site via 1and1 and have run into an issue: I want to accomplish the simple task of redirecting the root domain to the www subdomain, but due to complications I can't seem to find a way to get it to work. I'm on a Microsoft (ASP.NET) package, so I can't use .htaccess, and the IIS server they provide doesn't have the URL Rewrite module installed (so I can't use <rewrite> in web.config). They have built-in HTTP forwarding options, but if I set the root domain to redirect to the www subdomain it just redirects infinitely. Hopefully there is some obvious option/method I've missed during the past two days of searching!
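
    One workaround that needs neither .htaccess nor the web.config rewrite module (a sketch - example.com stands in for the real domain) is to issue the redirect from code in Global.asax; it only fires when the request arrives on the bare domain, so it cannot loop:

      protected void Application_BeginRequest(object sender, EventArgs e)
      {
          if (Request.Url.Host.Equals("example.com", StringComparison.OrdinalIgnoreCase))
          {
              string target = "http://www.example.com" + Request.RawUrl;  // keep path + query
              Response.StatusCode = 301;                                   // permanent redirect
              Response.AddHeader("Location", target);
              Response.End();
          }
      }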

    Read the article

  • Cannot log into Cinnamon after deleting ~/.config

    - by msoa
    After I removed "./.config" from Home folder, I can not log in to the cinnamon session: failed to load session "cinnamon" What do I do? xsession-error: Xsession: X session started for tux at Fri Oct 26 06:35:58 IRST 2012 localuser:tux being added to access control list Setting IM through im-switch for locale=en_US. Start IM through /etc/X11/xinit/xinput.d/all_ALL linked to /etc/X11/xinit/xinput.d/default. Failed to connect to the VirtualBox kernel service Failed to connect to the VirtualBox kernel service Failed to connect to the VirtualBox kernel service Failed to connect to the VirtualBox kernel service I am runing Cinnamon on a local machine, no on Virtualbox. but virtualbox is installed for some usage. !?

    Read the article

  • How do I configure multiple monitors with Optimus?

    - by irrational
    I have an Acer Aspire 8951G running 12.04 (Precise Pangolin) with Bumblebee working beautifully. My problem is that when I connect my projector to either the VGA port or the HDMI port, I can't see any way to properly set up the resolutions or colours. The default basic display driver sees the projector correctly, but messes up colours and resolutions on HDMI, and resolutions on VGA. (It's a 1280 x 720 projector.) Am I missing some sort of Xorg configuration? nvidia-xconfig does not seem to exist, and running optirun nvidia-settings -c :8 opens the settings, but of course only for the one display. I just want a way to set a default config for my projector via VGA or, preferably, HDMI. Any help would be wonderful.
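
    For what it's worth, on most Optimus laptops the Intel GPU drives the external outputs, so the projector mode is often set with plain xrandr rather than the NVIDIA tools (a sketch - the output names HDMI1/LVDS1 vary per machine, so list yours first):

      xrandr                                            # list connected outputs and their modes
      xrandr --output HDMI1 --mode 1280x720 --right-of LVDS1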

    Read the article

  • Free Live Webinar on Oracle Primavera P6 Analytics

    - by mark.kromer
    We are having a free live webinar to introduce customers to the new Oracle Primavera P6 Analytics built on the Oracle Business Intelligence platform. Here is the registration link for this webinar which is on June 18 @ 2 PM EDT: https://event.on24.com/eventRegistration/EventLobbyServlet?target=registration.jsp&eventid=209488&sessionid=1&key=DC3994754137CE4292161B2041C0E35D&partnerref=homepagebanner Hope to see you there! Best, Mark

    Read the article

  • Backing up system config files

    - by David ???
    I'm planning on installing the nVidia proprietary drivers on my Ubuntu 10.10. Historically this always ends up leaving me with no graphical interface, no ability to revert - and reinstalling the whole system. So now, before trying this anew, I wish to back up all the relevant config files. I'll try 1 or 2 methods; I'll list each one's commands. I'd appreciate it if anyone can tell me how to back up the relevant files, or what the reverse of each operation is. 10x, David Method I - as described here: apt-get --purge remove xserver-xorg-video-nouveau As described in this answer: edit /etc/default/grub and add the line GRUB_CMDLINE_LINUX="nouveau.modeset=0" sudo update-grub Reboot Install the original drivers downloaded from the nVidia site. Method II - as described here: sudo apt-get purge nvidia* [possibly 'sudo gedit /etc/modprobe.d/blacklist.conf', adding 'vga16fb' and 'nouveau'] sudo apt-get install nvidia-glx-185 sudo modprobe nvidia sudo lsmod | grep -i nvidia sudo nvidia-xconfig
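
    As a concrete starting point for the backup itself (a sketch - the file list is an assumption; drop any path that does not exist on your 10.10 install):

      # Archive the X/boot/driver-related config before experimenting:
      sudo tar czf ~/driver-config-backup-$(date +%F).tar.gz \
          /etc/X11/xorg.conf /etc/default/grub /etc/modprobe.d

      # Restore from a console or recovery shell after a failed driver install:
      sudo tar xzf ~/driver-config-backup-YYYY-MM-DD.tar.gz -C /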

    Read the article

  • Top 10 Security Vulnerabilities in the web.config - Part 1

    - by Jason Ulloa
    Over the course of these posts, I will show the 10 configuration problems or mistakes in web.config that cause major vulnerabilities in applications. Most of these mistakes come from not knowing our applications' configuration sections in depth. In this first part, we will look at the first 5 of them. 1. The Custom Errors mode This is the first on our list, since it is one of the settings we almost always enable while developing a web application, and it needs careful handling. A common, vulnerable form of this setting would be <configuration> <system.web> <customErrors mode="Off"> One way to fix the vulnerability this exposes would be to change the tag to <configuration> <system.web> <customErrors mode="RemoteOnly">

    Read the article
