Search Results

Search found 59142 results on 2366 pages for 'data annotation'.

Page 19/2366 | < Previous Page | 15 16 17 18 19 20 21 22 23 24 25 26  | Next Page >

  • Java @Contended annotation to help reduce false sharing

    - by Dave
    See this posting by Aleksey Shipilev for details -- @Contended is something we've wanted for a long time. The JVM provides automatic layout and placement of fields. Usually it'll (a) sort fields by descending size to improve footprint, and (b) pack reference fields so the garbage collector can process a contiguous run of reference fields when tracing. @Contended gives the program a way to provide more explicit guidance with respect to concurrency and false sharing. Using this facility we can sequester hot, frequently written shared fields away from other mostly read-only or cold fields. The simple rule is that read-sharing is cheap, and write-sharing is very expensive. We can also pack fields together that tend to be written together by the same thread at about the same time. More generally, we're trying to influence relative field placement to minimize coherency misses. Fields that are accessed closely together in time should be placed proximally in space to promote cache locality; that is, temporal locality should condition spatial locality. That having been said, we have to be careful to avoid false sharing and excessive invalidation from coherence traffic. As such, we try to cluster or otherwise sequester fields that tend to be written at approximately the same time by the same thread onto the same cache line. Note that there's a tension at play: if we try too hard to minimize single-threaded capacity misses then we can end up with excessive coherency misses when running in a parallel environment. There's no single layout that is optimal for both single-threaded and multithreaded environments, and the ideal layout problem itself is NP-hard. Ideally, a JVM would employ hardware monitoring facilities to detect sharing behavior and change the layout on the fly. That's a bit difficult as we don't yet have the right plumbing to provide efficient and expedient information to the JVM. Hint: we need to disintermediate the OS and hypervisor. Another challenge is that raw field offsets are used in the unsafe facility, so we'd need to address that issue, possibly with an extra level of indirection. Finally, I'd like to be able to pack final fields together as well, as those are known to be read-only.
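
    As a rough illustration (not taken from the post above), here is a minimal sketch of how @Contended might be applied. It assumes JDK 8, where the annotation lives in sun.misc and ordinarily requires running with -XX:-RestrictContended for non-JDK classes; the class and field names are hypothetical.

        // Isolate a hot, frequently written counter from cold, read-mostly fields
        // so that concurrent writers do not false-share a cache line with readers.
        import sun.misc.Contended;

        public class RequestStats {
            // Read-mostly configuration fields can happily share a cache line.
            final String name = "requests";
            final int capacity = 1024;

            // Hot field written by many threads; @Contended asks the JVM to pad
            // it away from its neighbours.
            @Contended
            volatile long hitCount;
        }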

    Read the article

  • In Search Data Structure And Algorithm Project Title Based on Topic

    - by Salehin Suhaimi
    As the title says, my lecturer gave me a project that I need to finish in the 3 weeks before final semester exams, so I thought I would start now. The requirement is to "build a simple program that has a GUI, based on all the chapters we've learned." But I got stuck on WHAT program I should build. Any idea for a program related to the chapters I've covered? Any input will help. List, array list, linked list, vectors, stacks, queues, ADT, hashing, binary search tree, AVL tree: that's about all I can remember. Any idea where I can start looking?

    Read the article

  • Am I sending large amounts of data sensibly?

    - by Sofus Albertsen
    I am about to design a video conversion service that is scalable on the conversion side. The architecture is as follows: (1) a webpage for video upload; (2) when the upload is done, a message is sent to one of several resizing servers; (3) the server locates the video, saves it to disk, and converts it to several formats and resolutions; (4) the resizing server uploads the output to a content server and messages back that the conversion is done. Messaging is something I have covered, but right now I am transferring via FTP and wonder if there is a better way. Is there something faster or more reliable? All the servers will sit on the same gigabit switch or a neighboring switch, so fast transfers are expected.

    Read the article

  • Using replacement to get possible outcomes to then search through HUGE amount of data

    - by Samuel Cambridge
    I have a database table holding 40 million records (table A). Each record has a string a user can search for. I also have a table with a list of character replacements (table B), e.g. i = Y, I = 1, etc. I need to be able to take the string a user is searching for, iterate through each letter and create an array of every possible outcome (the user's string, then each outcome with alternative letters used). I need to check for alternatives on both lower and uppercase letters in the word. A search string can be no longer than 10 characters long. I'm using PHP and a MySQL database. Does anyone have any thoughts / articles / guidance on doing this in an efficient way?
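
    The expansion step itself is a small combinatorial walk over the string. A rough sketch of generating every variant (in Java rather than the asker's PHP; the replacement map and its contents are hypothetical stand-ins for table B):

        import java.util.ArrayList;
        import java.util.List;
        import java.util.Map;

        public class VariantExpander {
            // Hypothetical stand-in for table B: character -> replacement.
            static final Map<Character, Character> REPLACEMENTS = Map.of('i', 'Y', 'I', '1');

            // Every string reachable by independently keeping or replacing each
            // character; worst case 2^length variants, bounded by the 10-char limit.
            static List<String> expand(String input) {
                List<String> variants = new ArrayList<>();
                variants.add("");
                for (char c : input.toCharArray()) {
                    List<String> next = new ArrayList<>();
                    for (String prefix : variants) {
                        next.add(prefix + c);
                        Character alt = REPLACEMENTS.get(c);
                        if (alt != null) {
                            next.add(prefix + alt);
                        }
                    }
                    variants = next;
                }
                return variants;
            }

            public static void main(String[] args) {
                System.out.println(expand("Hi")); // [Hi, HY] with the map above
            }
        }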

    Read the article

  • Is A Web App Feasible For A Heavy Use Data Entry System?

    - by Rob
    Looking for opinions on this: we're working on a project that is essentially a data entry system for a production line, with heavy data input by users who normally work in Excel or other thick client data systems. We've been told (as a consequence) that we have to develop this as a thick client using .NET. Our argument was to develop it as a web app, as that resolves a lot of issues and would be easier to write and maintain. Their argument against the web is that (supposedly) the web is not ready yet for a heavy-duty data entry system, and that the web in a browser does not offer the speed, responsiveness, and fluid experience for the end user that a thick client can (citing things such as drag and drop, rapid auto-entry and data navigation, etc.). Personally, I think that with good form design and jQuery/AJAX, a web app could do everything a thick client does just as well, and they just don't know what they're talking about. The irony is that a thick client has to go to a lot more effort to manage deployment and connectivity back to the central data server than a web app would need to, so in terms of speed I would expect a web app to be faster. What are the thoughts of those out there? Are there any heavy data entry systems currently in production use that were developed as web apps? Appreciate any feedback. Regards, Rob.

    Read the article

  • Pulling Data out of an object in Javascript

    - by PerryCS
    I am having a problem retrieving data out of an object passed back from PHP. I've tried many different ways to access this data and none work. In Firebug I see the following... (it looks nicer in Firebug) - I tried to make this look as close to Firebug as possible results Object { data="{"formName":"form3","formData":"data goes here"}", phpLiveDebug="<...s: 198.91.215.227"} data "{"formName":"form3","formData":"data goes here"}" phpLiveDebug "<...s: 198.91.215.227" I can access phpLiveDebug no problem, but the data portion is an object. I have tried the following... success: function(results) { //$("#formName").val(results.data.formName); //$("#formName").val(results.data[0].formName); //$("#formName").val(results.data[0]); //$("#formName").val(results.data[1]); //$("#formName").val(results.data[0]["formName"]); var tmp = results.data[formName]; alert("!" + tmp + "!"); $("#formName").val(tmp); $("#jqueryPHPDebug").val(results.phpLiveDebug); } This line works in the example above... $("#jqueryPHPDebug").val(results.phpLiveDebug); but... I can't figure out how to get at the data inside the results.data portion... as you can see above, I have been trying different things and more not even listed there. I was really hoping this line would work :) var tmp = results.data[formName]; But it doesn't. So, after many days of reading, tinkering, my solution was to re-write it to return data similar to the phpLiveDebug but then I thought... it's gotta be something simple I'm overlooking... Thank you for your time. Please try and explain why my logic (my horrible attempts at trying to figure out the proper method) above is wrong if you can?

    Read the article

  • Building dynamic OLAP data marts on-the-fly

    - by DrJohn
    At the forthcoming SQLBits conference, I will be presenting a session on how to dynamically build an OLAP data mart on-the-fly. This blog entry is intended to clarify exactly what I mean by an OLAP data mart, why you may need to build them on-the-fly and finally outline the steps needed to build them dynamically. In subsequent blog entries, I will present exactly how to implement some of the techniques involved. What is an OLAP data mart? In data warehousing parlance, a data mart is a subset of the overall corporate data provided to business users to meet specific business needs. Of course, the term does not specify the technology involved, so I coined the term "OLAP data mart" to identify a subset of data which is delivered in the form of an OLAP cube which may be accompanied by the relational database upon which it was built. To clarify, the relational database is specifically created and loaded with the subset of data and then the OLAP cube is built and processed to make the data available to the end-users via standard OLAP client tools. Why build OLAP data marts? Market research companies sell data to their clients to make money. To gain competitive advantage, market research providers like to "add value" to their data by providing systems that enhance analytics, thereby allowing clients to make best use of the data. As such, OLAP cubes have become a standard way of delivering added value to clients. They can be built on-the-fly to hold specific data sets and meet particular needs and then hosted on a secure intranet site for remote access, or shipped to clients' own infrastructure for hosting. Even better, they support a wide range of different tools for analytical purposes, including the ever popular Microsoft Excel. Extension Attributes: The Challenge One of the key challenges in building multiple OLAP data marts based on the same 'template' is handling extension attributes. These are attributes that meet the client's specific reporting needs, but do not form part of the standard template. Now clearly, these extension attributes have to come into the system via additional files and ultimately be added to relational tables so they can end up in the OLAP cube. However, processing these files and filling dynamically altered tables with SSIS is a challenge as SSIS packages tend to break as soon as the database schema changes. There are two approaches to this: (1) dynamically build an SSIS package in memory to match the new database schema using C#, or (2) have the extension attributes provided as name/value pairs so the file's schema does not change and can easily be loaded using SSIS. The problem with the first approach is the complexity of writing an awful lot of complex C# code. The problem with the second approach is that name/value pairs are useless to an OLAP cube, so they have to be pivoted back into a proper relational table somewhere in the data load process WITHOUT breaking SSIS. How this can be done will be the subject of a future blog entry. What is involved in building an OLAP data mart? There are a great many steps involved in building OLAP data marts on-the-fly. The key point is that all the steps must be automated to allow for the production of multiple OLAP data marts per day (i.e. many thousands, each with its own specific data set and attributes). Now most of these steps have a great deal in common with standard data warehouse practices. The key difference is that the databases are all built to order.
    The only permanent database is the metadata database (shown in orange), which holds all the metadata needed to build everything else (i.e. client orders, configuration information, connection strings, client-specific requirements and attributes, etc.). The staging database (shown in red) has a short life: it is built, populated and then ripped down as soon as the OLAP data mart has been populated. In the diagram below, the OLAP data mart comprises the two blue components: the Data Mart, which is a relational database, and the OLAP Cube, which is an OLAP database implemented using Microsoft Analysis Services (SSAS). The client may receive just the OLAP cube or both components together depending on their reporting requirements. So, in broad terms, the steps required to fulfil a client order are as follows:
    Step 1: Prepare metadata. Create a set of database names unique to the client's order; modify all package connection strings used by SSIS to point to the new databases and file locations.
    Step 2: Create relational databases. Create the staging and data mart relational databases using dynamic SQL and set the database recovery mode to SIMPLE, as we do not need the overhead of logging anything; execute SQL scripts to build all database objects (tables, views, functions and stored procedures) in the two databases.
    Step 3: Load staging database. Use SSIS to load all data files into the staging database in a parallel operation; load extension files containing name/value pairs, which will provide client-specific attributes in the OLAP cube.
    Step 4: Load data mart relational database. Load the data from staging into the data mart relational database, again in parallel where possible; allocate surrogate keys and use SSIS to perform surrogate key lookups during the load of fact tables.
    Step 5: Load extension tables and attributes. Pivot the extension attributes from their native name/value pairs into proper relational tables; add the extension attributes to the views used by the OLAP cube.
    Step 6: Deploy and process OLAP cube. Deploy the OLAP database directly to the server using a C# script task in SSIS; modify the connection string used by the OLAP cube to point to the data mart relational database; modify the cube structure to add the extension attributes to both the data source view and the relevant dimensions; remove any standard attributes that are not required; process the OLAP cube.
    Step 7: Backup and drop databases. Drop the staging database as it is no longer required; back up the data mart relational and OLAP databases and ship these to the client's infrastructure; drop the data mart relational and OLAP databases from the build server; mark the order complete; start processing the next order, ad infinitum.
    So my future blog posts and my forthcoming session at the SQLBits conference will all focus on some of the more interesting aspects of building OLAP data marts on-the-fly, such as handling the load of extension attributes and how to dynamically alter the structure of an OLAP cube using C#.
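
    To make Step 2 a little more concrete, here is a rough sketch of the dynamic database creation. It is shown in Java/JDBC purely for illustration (the post itself drives this from SSIS), and the database names and connection string are hypothetical:

        import java.sql.Connection;
        import java.sql.DriverManager;
        import java.sql.Statement;

        public class BuildOrderDatabases {
            public static void main(String[] args) throws Exception {
                // Names generated per client order in Step 1 (hypothetical).
                String staging = "Stg_Order_12345";
                String dataMart = "Mart_Order_12345";

                try (Connection con = DriverManager.getConnection(
                        "jdbc:sqlserver://buildserver;databaseName=master;integratedSecurity=true");
                     Statement stmt = con.createStatement()) {
                    for (String db : new String[] { staging, dataMart }) {
                        // Dynamic SQL: create the database, then drop the logging overhead.
                        stmt.executeUpdate("CREATE DATABASE [" + db + "]");
                        stmt.executeUpdate("ALTER DATABASE [" + db + "] SET RECOVERY SIMPLE");
                    }
                    // Next: run the scripted DDL for tables, views, functions and procedures.
                }
            }
        }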

    Read the article

  • The Best Data Integration for Exadata Comes from Oracle

    - by maria costanzo
    Oracle Data Integrator and Oracle GoldenGate offer unique and optimized data integration solutions for Oracle Exadata. For example, customers that choose to feed their data warehouse or reporting database with near real-time data throughout the day can do so without decreasing the performance or availability of source and target systems. And if you ask why real-time, the short answer is: in today's fast-paced, always-on world, business decisions need to use more relevant, timely data to be able to act fast and seize opportunities. A longer response to the "why real-time" question can be found in a related blog post. If we look at the solution architecture, as shown in the diagram below, Oracle Data Integrator and Oracle GoldenGate are both uniquely designed to take full advantage of the power of the database and to eliminate unnecessary middle-tier components. Oracle Data Integrator (ODI) is the best bulk data loading solution for Exadata. ODI is the only ETL platform that can leverage the full power of Exadata, integrate directly on the Exadata machine without any additional hardware, and by far provides the simplest setup and fastest overall performance on an Exadata system. We regularly see customers achieving a 5-10 times boost when they move their ETL to ODI on Exadata. For some companies the performance gain is even higher. For example, a large insurance company did a proof of concept comparing ODI to a traditional ETL tool (one of the market leaders) on Exadata. The same process that took 5 hours and 11 minutes to complete using the competing ETL product took 7 minutes and 20 seconds with ODI. Oracle Data Integrator was 42 times faster than the conventional ETL when running on Exadata. This shows that Oracle's own data integration offering helps you gain the most out of your Exadata investment with a truly optimized solution. GoldenGate is the best solution for streaming data from heterogeneous sources into Exadata in real time. Oracle GoldenGate can also be used together with Data Integrator for hybrid use cases that also demand non-invasive capture and high-speed real-time replication. Oracle GoldenGate enables real-time data feeds from heterogeneous sources non-invasively, and delivers to the staging area on the target Exadata system. ODI runs directly on Exadata to use the database engine power to perform in-database transformations. Enterprise Data Quality is integrated with Oracle Data Integrator and enables ODI to load trusted data into the data warehouse tables. Only Oracle can offer all these technical benefits wrapped into a single intelligent data warehouse solution that runs on Exadata. Compared to traditional ETL with add-on CDC, this solution offers: non-invasive data capture from heterogeneous sources that avoids any performance impact on the source; no mid-tier, since set-based transformations use database power; mini-batches throughout the day or bulk processing nightly, which means maximum availability for the DW; and an integrated solution with Enterprise Data Quality that enables leveraging trusted data in the data warehouse. In addition to Starwood Hotels and Resorts, Morrison Supermarkets, the United Kingdom's fourth-largest food retailer, has seen the power of this solution for their new BI platform and shared their story with us. Morrisons needed to analyze data across a large number of manufacturing, warehousing, retail, and financial applications with the goal of achieving a single view into operations for improved customer service.
The retailer deployed Oracle GoldenGate and Oracle Data Integrator to bring new data into Oracle Exadata in near real-time and replicate the data into reporting structures within the data warehouse—extending visibility into operations. Using Oracle's data integration offering for Exadata, Morrisons produced financial reports in seconds, rather than minutes, and improved staff productivity and agility. You can read more about Morrison’s success story here and hear from Starwood here. From an Irem Radzik article.

    Read the article

  • Data access pattern, combining push and pull?

    - by andlju
    I need some advice on what kind of pattern(s) I should use for pushing/pulling data into my application. I'm writing a rule-engine that needs to hold quite a large amount of data in-memory in order to be efficient enough. I have some rather conflicting requirements; It is not acceptable for the engine to always have to wait for a full pre-load of all data before it is functional. Only fetching and caching data on-demand will lead to the engine taking too long before it is running quickly enough. An external event can trigger the need for specific parts of the data to be reloaded. Basically, I think I need a combination of pushing and pulling data into the application. A simplified version of my current "pattern" looks like this (in psuedo-C# written in notepad): // This interface is implemented by all classes that needs the data interface IDataSubscriber { void RegisterData(Entity data); } // This interface is implemented by the data access class interface IDataProvider { void EnsureLoaded(Key dataKey); void RegisterSubscriber(IDataSubscriber subscriber); } class MyClassThatNeedsData : IDataSubscriber { IDataProvider _provider; MyClassThatNeedsData(IDataProvider provider) { _provider = provider; _provider.RegisterSubscriber(this); } public void RegisterData(Entity data) { // Save data for later StoreDataInCache(data); } void UseData(Key key) { // Make sure that the data has been stored in cache _provider.EnsureLoaded(key); Entity data = GetDataFromCache(key); } } class MyDataProvider : IDataProvider { List<IDataSubscriber> _subscribers; // Make sure that the data for key has been loaded to all subscribers public void EnsureLoaded(Key key) { if (HasKeyBeenMarkedAsLoaded(key)) return; PublishDataToSubscribers(key); MarkKeyAsLoaded(key); } // Force all subscribers to get a new version of the data for key public void ForceReload(Key key) { PublishDataToSubscribers(key); MarkKeyAsLoaded(key); } void PublishDataToSubscribers(Key key) { Entity data = FetchDataFromStore(key); foreach(var subscriber in _subscribers) { subscriber.RegisterData(data); } } } // This class will be spun off on startup and should make sure that all data is // preloaded as quickly as possible class MyPreloadingThread { IDataProvider _provider; MyPreloadingThread(IDataProvider provider) { _provider = provider; } void RunInBackground() { IEnumerable<Key> allKeys = GetAllKeys(); foreach(var key in allKeys) { _provider.EnsureLoaded(key); } } } I have a feeling though that this is not necessarily the best way of doing this.. Just the fact that explaining it seems to take two pages feels like an indication.. Any ideas? Any patterns out there I should have a look at?

    Read the article

  • Import csv data (SDK iphone)

    - by Ni
    I am new to Cocoa. I have been working on this stuff for a few days. For the following code, I can read all the data in the string and successfully get the data for the plot. NSMutableArray *contentArray = [NSMutableArray array]; NSString *filePath = @"995,995,995,995,995,995,995,995,1000,997,995,994,992,993,992,989,988,987,990,993,989"; NSArray *myText = [filePath componentsSeparatedByString:@","]; NSInteger idx; for (idx = 0; idx < myText.count; idx++) { NSString *data =[myText objectAtIndex:idx]; NSLog(@"%@", data); id x = [NSNumber numberWithFloat:0+idx*0.002777778]; id y = [NSDecimalNumber decimalNumberWithString:data]; [contentArray addObject: [NSMutableDictionary dictionaryWithObjectsAndKeys:x, @"x", y, @"y", nil]]; } self.dataForPlot = contentArray; Then I try to load the data from the CSV file. The data in the Data.csv file has the same values and the same format as 995,995,995,995,995,995,995,995,1000,997,995,994,992,993,992,989,988,987,990,993,989. I run the code, and it is supposed to give the same graph output. However, it seems that the data is not loaded from the CSV file successfully. I cannot figure out what's wrong with my code. NSMutableArray *contentArray = [NSMutableArray array]; NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Data" ofType:@"csv"]; NSString *Data = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil ]; if (Data) { NSArray *myText = [Data componentsSeparatedByString:@","]; NSInteger idx; for (idx = 0; idx < myText.count; idx++) { NSString *data =[myText objectAtIndex:idx]; NSLog(@"%@", data); id x = [NSNumber numberWithFloat:0+idx*0.002777778]; id y = [NSDecimalNumber decimalNumberWithString:data]; [contentArray addObject: [NSMutableDictionary dictionaryWithObjectsAndKeys:x, @"x", y, @"y",nil]]; } self.dataForPlot = contentArray; } The only difference is NSString *filePath = [[NSBundle mainBundle] pathForResource:@"Data" ofType:@"csv"]; NSString *Data = [NSString stringWithContentsOfFile:filePath encoding:NSUTF8StringEncoding error:nil ]; if (data){ } Did I do anything wrong here? Thanks for your help!

    Read the article

  • POST data not being received

    - by Alexander
    I've got an iPhone app that is supposed to send POST data to my server to register the device in a MySQL database so we can send notifications, etc., to it. It sends its unique identifier, device name, token, and a few other small things like passwords and usernames as a POST request to our server. The problem is that sometimes the server doesn't receive the data. And by this I mean it's not just receiving blank values for the POST inputs; it's not receiving ANY POST data at all. I am logging all POST inputs on my server into some log files, and when the script that relies on the POST data from the device fails (detects no data) I notice that it's because NO POST data was sent. Is this a problem on the server, like refusing data or something, or does this have to be on the client's side? What could be causing this?

    Read the article

  • Oracle Big Data Learning Library - Click on LEARN BY PRODUCT to Open Page

    - by chberger
    Oracle Big Data Learning Library... Learn about Oracle Big Data, Data Science, Learning Analytics, Oracle NoSQL Database, and more! Oracle Big Data Essentials Attend this Oracle University Course! Using Oracle NoSQL Database Attend this Oracle University class! Oracle and Big Data on OTN See the latest resource on OTN. Oracle Big Data Appliance Oracle Big Data and Data Science Basics Meeting the Challenge of Big Data Oracle Big Data Tutorial Video Series Oracle MoviePlex - a Big Data End-to-End Series of Demonstrations Oracle Big Data Overview Oracle Big Data Essentials Data Mining Oracle NoSQL Database Tutorial Videos Oracle NoSQL Database Tutorial Series Oracle NoSQL Database Release 2 New Features Using Oracle NoSQL Database Exalytics Enterprise Manager 12c R3: Manage Exalytics Setting Up and Running Summary Advisor Oracle R Enterprise Oracle R Enterprise Tutorial Series Oracle Big Data Connectors Integrate All Your Data with Oracle Big Data Connectors Using Oracle Direct Connector for HDFS to Read the Data from HDFS Using Oracle R Connector for Hadoop to Analyze Data Oracle NoSQL Database Oracle NoSQL Database Tutorial Videos Oracle NoSQL Database Tutorial Series Oracle NoSQL Database Release 2 New Features Using Oracle NoSQL Database Oracle Business Intelligence Enterprise Edition Oracle Business Intelligence Oracle BI 11g R1: Create Analyses and Dashboards - 4 day class Oracle BI Publisher 11g R1: Fundamentals - 3 day class Oracle BI 11g R1: Build Repositories - 5 day class

    Read the article

  • Let's introduce the Oracle Enterprise Data Quality family!

    - by Sarah Zanchetti
    The Oracle Enterprise Data Quality family of products helps you to achieve maximum value from your business applications by delivering fit-for-purpose data. OEDQ is a state-of-the-art collaborative data quality profiling, analysis, parsing, standardization, matching and merging product, designed to help you understand, improve, protect and govern the quality of the information your business uses, all from a single integrated environment. Oracle Enterprise Data Quality products are: Oracle Enterprise Data Quality Profile and Audit; Oracle Enterprise Data Quality Parsing and Standardization; Oracle Enterprise Data Quality Match and Merge; Oracle Enterprise Data Quality Address Verification Server; Oracle Enterprise Data Quality Product Data Parsing and Standardization; and Oracle Enterprise Data Quality Product Data Match and Merge. Also, the following are some of the key features of OEDQ: integrated data profiling, auditing, cleansing and matching; browser-based client access; the ability to handle all types of data (for example customer, product, asset, financial, operational); connection to any JDBC-compliant data sources and targets; multi-user project support (role-based access, issue tracking, process annotation, and version control); Services Oriented Architecture (SOA) support for designing processes that may be exposed to external applications as a service; a design that can process large data volumes; a single repository to hold data along with gathered statistics and project tracking information, with shared access; an intuitive graphical user interface designed to help you solve real-world information quality issues quickly; easy, data-led creation and extension of validation and transformation rules; and a fully extensible architecture allowing the insertion of any required custom processing. If you need to learn more about EDQ, or get assistance for any kind of issue, the Oracle Technology Network offers a huge range of resources on Oracle software. Discuss technical problems and solutions on the Discussion Forums. Get hands-on step-by-step tutorials with Oracle By Example. Download Sample Code. Get the latest news and information on any Oracle product. You can also get further help and information with Oracle software from My Oracle Support and Oracle Support Services. An Information Center is available, where you can find technical information and fast solutions to the most common already solved issues: Information Center: Oracle Enterprise Data Quality [ID 1555073.2]

    Read the article

  • strange data annotations issue in MVC 2

    - by femi
    Hello, I came across something strange when creating an edit form with MVC 2. i realised that my error messages come up on form sumission even when i have filled ut valid data! i am using a buddy class which i have configured correctly ( i know that cos i can see my custom errors). Here is the code from the viewmodel that generates this; <%@ Control Language="C#" Inherits="System.Web.Mvc.ViewUserControl<TG_Careers.Models.Applicant>" %> <script src="/Scripts/MicrosoftAjax.js" type="text/javascript"></script> <script src="/Scripts/MicrosoftMvcAjax.js" type="text/javascript"></script> <script src="/Scripts/MicrosoftMvcValidation.js" type="text/javascript"></script> <%= Html.ValidationSummary() %> <% Html.EnableClientValidation(); %> <% using (Html.BeginForm()) {%> <div class="confirm-module"> <table cellpadding="4" cellspacing="2"> <tr> <td><%= Html.LabelFor(model => model.FirstName) %> </td> <td><%= Html.EditorFor(model => model.FirstName) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.FirstName) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.MiddleName) %></td> <td><%= Html.EditorFor(model => model.MiddleName) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.MiddleName) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.LastName) %></td> <td><%= Html.EditorFor(model => model.LastName) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.LastName) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.Gender) %></td> <td><%= Html.EditorFor(model => model.Gender) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.Gender) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.MaritalStatus) %></td> <td> <%= Html.EditorFor(model => model.MaritalStatus) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.MaritalStatus) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.DateOfBirth) %></td> <td><%= Html.EditorFor(model => model.DateOfBirth) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.DateOfBirth) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.Address) %></td> <td><%= Html.EditorFor(model => model.Address) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.Address) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.City) %></td> <td><%= Html.EditorFor(model => model.City) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.City) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.State) %></td> <td><%= Html.EditorFor(model => model.State) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.State) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.StateOfOriginID) %></td> <td><%= Html.DropDownList("StateOfOriginID", new SelectList(ViewData["States"] as IEnumerable, "StateID", "Name", Model.StateOfOriginID))%></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.StateOfOriginID) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.CompletedNYSC) %></td> <td><%= Html.EditorFor(model => model.CompletedNYSC) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.CompletedNYSC) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.YearsOfExperience) %></td> <td><%= Html.EditorFor(model => model.YearsOfExperience) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.YearsOfExperience) %></td> </tr> <tr> <td><%= 
Html.LabelFor(model => model.MobilePhone) %></td> <td><%= Html.EditorFor(model => model.MobilePhone) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.MobilePhone) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.DayPhone) %></td> <td> <%= Html.EditorFor(model => model.DayPhone) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.DayPhone) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.CVFileName) %></td> <td><%= Html.EditorFor(model => model.CVFileName) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.CVFileName) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.CurrentPosition) %></td> <td><%= Html.EditorFor(model => model.CurrentPosition) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.CurrentPosition) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.EmploymentCommenced) %></td> <td><%= Html.EditorFor(model => model.EmploymentCommenced) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.EmploymentCommenced) %></td> </tr> <tr> <td><%= Html.LabelFor(model => model.DateofTakingupCurrentPosition) %></td> <td><%= Html.EditorFor(model => model.DateofTakingupCurrentPosition) %></td> </tr> <tr> <td colspan="2"><%= Html.ValidationMessageFor(model => model.DateofTakingupCurrentPosition) %></td> </tr> <tr> <td>&nbsp;</td> <td>&nbsp;</td> </tr> <tr> <td colspan="2">&nbsp;</td> </tr> </table> <p> <input type="submit" value="Save Profile Details" /> </p> </div> <% } %> Any ideas on this one please? Thanks

    Read the article

  • Oracle Enterprise Data Quality - Geared Up and Ready for OpenWorld 2012

    - by Mala Narasimharajan
    10 days and counting till Oracle OpenWorld 2012 is upon us.  Enterprise data quality is key to every information integration and consolidation initiative. At this year's OpenWorld, hear how Oracle Enterprise Data Quality provides the critical piece to achieving trusted, reliable master data and increases the value of data integration initiatives. Here are the different ways you can learn and experience Enterprise Data Quality at OpenWorld:  Conference sessions: Oracle Enterprise Data Quality: Product Overview and Roadmap - Monday 10/1/12, 1:45-2:45 PM - Moscone West - 3006 Data Preparation and Ongoing Governance with the Oracle Enterprise Data Quality Platform - Wednesday 10/3/2012, 1:15-2:15 PM - Moscone West - 3000  Data Acquisition, Migration and Integration with the Oracle Enterprise Data Quality Platform - Thursday 10/4/2012, 12:45-1:45 PM - Moscone West - 3005  Hands on Labs: Introduction to Oracle Enterprise Data Quality Platform -  Monday 10/2/2012, 4:45-5:45 PM - Marriot Marquis - Salon 1/2 Demos:  Trusted Data with Oracle Enterprise Data Quality - Moscone South, Right - S-243 (note: proceed to Middleware Demo grounds) For a list of Master Data Management and Data Quality sessions and other events click here. 

    Read the article

  • SL3/SL4 - Ado.Net Data Services Error during new DataServiceCollection<T>(queryResponse)

    - by Soulhuntre
    Hey all, I have two functions in a SL project (VS2010) that do almost exactly the same thing, yet one throws an error and the other does not. It seems to be related to the projections, but I am unsure about the best way to resolve. The function that works is... public void LoadAllChunksExpandAll(DataHelperReturnHandler handler, string orderby) { DataServiceCollection<CmsChunk> data = null; DataServiceQuery<CmsChunk> theQuery = _dataservice .CmsChunks .Expand("CmsItemState") .AddQueryOption("$orderby", orderby); theQuery.BeginExecute( delegate(IAsyncResult asyncResult) { _callback_dispatcher.BeginInvoke( () => { try { DataServiceQuery<CmsChunk> query = asyncResult.AsyncState as DataServiceQuery<CmsChunk>; if (query != null) { //create a tracked DataServiceCollection from the result of the asynchronous query. QueryOperationResponse<CmsChunk> queryResponse = query.EndExecute(asyncResult) as QueryOperationResponse<CmsChunk>; data = new DataServiceCollection<CmsChunk>(queryResponse); handler(data); } } catch { handler(data); } } ); }, theQuery ); } This compiles and runs as expected. A very, very similar function (shown below) fails... public void LoadAllPagesExpandAll(DataHelperReturnHandler handler, string orderby) { DataServiceCollection<CmsPage> data = null; DataServiceQuery<CmsPage> theQuery = _dataservice .CmsPages .Expand("CmsChildPages") .Expand("CmsParentPage") .Expand("CmsItemState") .AddQueryOption("$orderby", orderby); theQuery.BeginExecute( delegate(IAsyncResult asyncResult) { _callback_dispatcher.BeginInvoke( () => { try { DataServiceQuery<CmsPage> query = asyncResult.AsyncState as DataServiceQuery<CmsPage>; if (query != null) { //create a tracked DataServiceCollection from the result of the asynchronous query. QueryOperationResponse<CmsPage> queryResponse = query.EndExecute(asyncResult) as QueryOperationResponse<CmsPage>; data = new DataServiceCollection<CmsPage>(queryResponse); handler(data); } } catch { handler(data); } } ); }, theQuery ); } Clearly the issue is the Expand projections that involve a self referencing relationship (pages can contain other pages). This is under SL4 or SL3 using ADONETDataServices SL3 Update CTP3. I am open to any work around or pointers to goo information, a Google search for the error results in two hits, neither particularly helpful that I can decipher. The short error is "An item could not be added to the collection. When items in a DataServiceCollection are tracked by the DataServiceContext, new items cannot be added before items have been loaded into the collection." The full error is... System.Reflection.TargetInvocationException was caught Message=Exception has been thrown by the target of an invocation. 
StackTrace: at System.RuntimeMethodHandle.InvokeMethodFast(IRuntimeMethodInfo method, Object target, Object[] arguments, SignatureStruct& sig, MethodAttributes methodAttributes, RuntimeType typeOwner) at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture, Boolean skipVisibilityChecks) at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) at System.Reflection.MethodBase.Invoke(Object obj, Object[] parameters) at System.Data.Services.Client.ClientType.ClientProperty.SetValue(Object instance, Object value, String propertyName, Boolean allowAdd) at System.Data.Services.Client.AtomMaterializer.ApplyItemsToCollection(AtomEntry entry, ClientProperty property, IEnumerable items, Uri nextLink, ProjectionPlan continuationPlan) at System.Data.Services.Client.AtomMaterializer.ApplyFeedToCollection(AtomEntry entry, ClientProperty property, AtomFeed feed, Boolean includeLinks) at System.Data.Services.Client.AtomMaterializer.MaterializeResolvedEntry(AtomEntry entry, Boolean includeLinks) at System.Data.Services.Client.AtomMaterializer.Materialize(AtomEntry entry, Type expectedEntryType, Boolean includeLinks) at System.Data.Services.Client.AtomMaterializer.DirectMaterializePlan(AtomMaterializer materializer, AtomEntry entry, Type expectedEntryType) at System.Data.Services.Client.AtomMaterializerInvoker.DirectMaterializePlan(Object materializer, Object entry, Type expectedEntryType) at System.Data.Services.Client.ProjectionPlan.Run(AtomMaterializer materializer, AtomEntry entry, Type expectedType) at System.Data.Services.Client.AtomMaterializer.Read() at System.Data.Services.Client.MaterializeAtom.MoveNextInternal() at System.Data.Services.Client.MaterializeAtom.MoveNext() at System.Linq.Enumerable.d_b11.MoveNext() at System.Data.Services.Client.DataServiceCollection1.InternalLoadCollection(IEnumerable1 items) at System.Data.Services.Client.DataServiceCollection1.StartTracking(DataServiceContext context, IEnumerable1 items, String entitySet, Func2 entityChanged, Func2 collectionChanged) at System.Data.Services.Client.DataServiceCollection1..ctor(DataServiceContext context, IEnumerable1 items, TrackingMode trackingMode, String entitySetName, Func2 entityChangedCallback, Func2 collectionChangedCallback) at System.Data.Services.Client.DataServiceCollection1..ctor(IEnumerable1 items) at Phinli.Dashboard.Silverlight.Helpers.DataHelper.<>c__DisplayClass44.<>c__DisplayClass46.<LoadAllPagesExpandAll>b__43() InnerException: System.InvalidOperationException Message=An item could not be added to the collection. When items in a DataServiceCollection are tracked by the DataServiceContext, new items cannot be added before items have been loaded into the collection. StackTrace: at System.Data.Services.Client.DataServiceCollection1.InsertItem(Int32 index, T item) at System.Collections.ObjectModel.Collection`1.Add(T item) InnerException: Thanks for any help!

    Read the article

  • iphone xcode annotation pin drop with slider value change also remove

    - by chirag
    i have to add annotation pin on location with UIslider Value change .... this is code where i add annotation (MKAnnotationView *) mapView:(MKMapView *)mapView1 viewForAnnotation:(id ) annotation{ MKAnnotationView* annotationView = nil; NSString* identifier = @"Pin"; MyAnnotationView* annView = (MyAnnotationView*)[mapView1 dequeueReusableAnnotationViewWithIdentifier:identifier]; // annotationView.leftCalloutAccessoryView = myImage; //myImage = [UIButton buttonWithType:UIButtonTypeCustom]; // [myImage setImage:[UIImage imageNamed:@"mark.png"]forState:UIControlStateNormal]; //Property_Photo UIButton *mybtn = [UIButton buttonWithType:UIButtonTypeCustom]; if([annotation isKindOfClass:[AddressAnnotation class]]){ AddressAnnotation x=(AddressAnnotation)annotation; mybtn.frame = CGRectMake(0, 0, 35, 35); mybtn.contentVerticalAlignment = UIControlContentVerticalAlignmentCenter; mybtn.contentHorizontalAlignment = UIControlContentHorizontalAlignmentCenter; [mybtn setImage:[UIImage imageNamed:@"btn.png"] forState:UIControlStateNormal]; [mybtn setTitle:[NSString stringWithFormat:@"%@",[x getID]] forState:UIControlStateDisabled]; [mybtn addTarget:self action:@selector(btnShowProperty:) forControlEvents:UIControlEventTouchUpInside]; ((IMOVEISAppDelegate*)[[UIApplication sharedApplication]delegate]).strPropertyPrice = [[myTblArray objectAtIndex:imgIndex]valueForKey:@"Property_Price"]; NSLog(@"property price: %@",((IMOVEISAppDelegate*)[[UIApplication sharedApplication]delegate]).strPropertyPrice); if(nil == annView) { ///if(annView!=nil && [annView retainCount]>0){ [annView release]; annView=nil; } annView = [[[MyAnnotationView alloc] initWithAnnotation:x reuseIdentifier:identifier] autorelease]; if(Objslider.value==10){ [myMapView removeAnnotations:myMapView.annotations]; } } NSURL *imgURL = [NSURL URLWithString:[[myTblArray objectAtIndex:imgIndex]valueForKey:@"Property_Photo"]]; UIImage *imgPhoto = [UIImage imageWithData:[NSData dataWithContentsOfURL:imgURL]]; UIImageView *pinImgView = [[UIImageView alloc]initWithFrame:CGRectMake(0, 0,35, 35)]; imgIndex++; [pinImgView setImage:imgPhoto]; annView.rightCalloutAccessoryView = mybtn; annView.leftCalloutAccessoryView = pinImgView; [annView setBackgroundColor:[UIColor clearColor]]; } [annView setEnabled:YES]; annView.canShowCallout = YES; annView.calloutOffset = CGPointMake(-5, 5); annotationView = annView; return annotationView; }

    Read the article

  • Xml configuration versus Annotation based configuration

    - by Abarax
    In a few large projects I have been working on lately, it seems increasingly important to choose one or the other (XML or annotations). As projects grow, consistency is very important for maintainability. My question is: what do people prefer? Do you prefer XML-based or annotation-based configuration, or both? Everybody talks about XML configuration hell and how annotations are the answer, but what about annotation configuration hell?
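
    For readers new to the distinction, a minimal Spring-flavoured sketch of the same wiring in both styles (the class and bean names here are hypothetical, not from the question):

        // Annotation-based: the configuration lives next to the code it configures.
        import org.springframework.beans.factory.annotation.Autowired;
        import org.springframework.stereotype.Service;

        interface OrderRepository { }

        @Service
        public class OrderService {
            private final OrderRepository repository;

            @Autowired
            public OrderService(OrderRepository repository) {
                this.repository = repository;
            }
        }

        // XML-based: the same wiring kept out of the source, in a central file:
        // <bean id="orderService" class="com.example.OrderService">
        //     <constructor-arg ref="orderRepository"/>
        // </bean>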

    Read the article

  • 3rd party data - Store in Data Warehouse or Primary database?

    - by brydgesk
    This is mostly a data warehouse philosophy question. My project involves an Oracle forms application, and a Teradata Data Warehouse for reporting and ad-hoc purposes. In addition to the primary data created by the users of our application, we also require data from various other sources. Currently, this 3rd party data comes via FTPd flat files directly to our Data Warehouse. To access the data, our users must use a series of custom BusinessObjects reports. My question is, would it make more sense for this data to be sent to our source Oracle system instead? Is it ever appropriate for a Data Warehouse to be the point of origin for users to access raw data? In short, is it more important that the operational database contain only the data created by your project, or that the data warehouse remain dedicated solely to reporting and analysis?

    Read the article

  • JBoss ignores @RemoteBinding annotation

    - by tputkonen
    I would like to specify the JNDI name for an EJB3 bean using an annotation, but JBoss 5.1.0 GA seems to ignore the annotation completely. The bean's annotations are: @Remote(Foobar.class) @Stateless(name = "Foobar") @TransactionManagement(TransactionManagementType.BEAN) @RemoteBinding(jndiBinding="ejb/Foobar") public class FoobarBean implements Foobar { ... I also tested deploying using the @RemoteBindings annotation, but the result was the same: @RemoteBindings({@RemoteBinding(jndiBinding="ejb/Foobar")}) The bean does not get bound to JNDI with the specified name, and the log file doesn't give any clues.
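
    For context, a minimal sketch of the client-side lookup that the binding above is meant to enable (JNDI provider properties are omitted; the names follow the snippet in the question):

        import javax.naming.InitialContext;
        import javax.naming.NamingException;

        public class FoobarClient {
            public static void main(String[] args) throws NamingException {
                // Expects the bean to be bound at the custom name "ejb/Foobar";
                // if the binding annotation is ignored, this lookup fails.
                InitialContext ctx = new InitialContext();
                Foobar foobar = (Foobar) ctx.lookup("ejb/Foobar");
                System.out.println("Looked up: " + foobar);
            }
        }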

    Read the article

  • Wanted: How to reliably, consistently select an MKMapView annotation

    - by jdandrea
    After calling MKMapView's setCenterCoordinate:animated: method (without animation), I'd like to call selectAnnotation:animated: (with animation) so that the annotation pops out from the newly-centered pushpin. For now, I simply watch for mapViewDidFinishLoadingMap: and then select the annotation. However, this is problematic. For instance, this method isn't called when there's no need to load additional map data. In those cases, my annotation isn't selected. :( Very well. I could call this immediately after setting the center coordinate instead. Ahh, but in that case it's possible that there is map data to load (but it hasn't finished loading yet). I'd risk calling it too soon, with the animation becoming spotty at best. Thus, if I understand correctly, it's not a matter of knowing if my coordinate is visible, since it's possible to stray almost a screenful of distance and have to load new map data. Rather, it's a matter of knowing if new map data needs to be loaded, and then acting accordingly. Any ideas on how to accomplish this, or how to otherwise (reliably) select an annotation after re-centering the map view on the coordinate where that annotation lives? Clues appreciated - thanks!

    Read the article
