Search Results

Search found 59969 results on 2399 pages for 'data dictionary'.

Page 6 of 2399

  • Master Data Management – A Foundation for Big Data Analysis

    - by Manouj Tahiliani
    While Master Data Management has crossed the proverbial chasm and is on its way to becoming mainstream, businesses are being hammered by a new megatrend called Big Data. Big Data is characterized by massive volumes, high frequency, a variety of less structured data sources such as email, sensors, smart meters, social networks, and weblogs, and the need to analyze vast amounts of data to derive value and improve management decisions.

    Businesses that have embraced MDM to get a single, enriched, and unified view of master data - by resolving semantic discrepancies and augmenting the explicit master data from within the enterprise with implicit data from outside it, such as social profiles - will have a leg up in embracing Big Data solutions. This is especially true for large and medium-sized businesses in industries like retail, communications, and financial services, which would find it very challenging to get comprehensive analytical coverage and derive long-term success without resolving the limitations of a heterogeneous topology that leads to disparate, fragmented, and incomplete master data.

    For analytical success from Big Data - in other words, ROI from Big Data investments - businesses need to acquire, organize, and analyze the deluge of data to make better decisions. Structured and unstructured data will need to coexist, with a tight link maintained between the two to extract maximum insight. MDM is the catalyst that maintains that tight linkage by providing an understanding of the identity and characteristics of the persons, companies, products, suppliers, etc. associated with the Big Data, and thereby helps accelerate ROI.

    In my next post I will discuss patterns for coexisting Big Data solutions and MDM. Feel free to share comments and thoughts on the above, as well as on integration or architectural patterns.

    Read the article

  • How to Assure an Effective Data Model

    As a general rule, in my opinion, the effectiveness of a data model is directly related to the accuracy and completeness of a project’s requirements. There is no need to work on very detailed data models when the details surrounding them have not been defined or even clarified. Developing data models when the clarity of project requirements is limited tends to introduce design issues, because the details needed to create an effective data model are not yet known. One way to avoid this is to create data models that correspond to the maturity of the existing project requirements, so that when requirements are updated, new data models can be created based on any new, fine-grained discoveries. This allows data models composed of general entities to be created initially, while a project’s requirements are still vague, and those entities to be refined as new and more substantial requirements are defined or redefined. It also promotes communication among all stakeholders within a project as they go through the process of defining and finalizing project requirements.

    In addition, here are some general data-modeling tips that can be applied to projects:

    • Initially model all data generally, and gradually refactor the data model as new requirements and business constraints are applied to the project.
    • Ensure that data modelers have the proper tools and training they need to design a data model accurately.
    • Create a common location for all project documents so that everyone can review a project’s data models along with any other project documentation.
    • All data models should follow a clear naming scheme that tells readers the intended purpose of the data and how it will be applied within the project.

    Read the article

  • SQL SERVER – Why Do We Need Data Quality Services – Importance and Significance of Data Quality Services (DQS)

    - by pinaldave
    Databases are awesome.  I’m sure my readers know my opinion about this – I have made SQL Server my life’s work, after all!  I love technology and all things computer-related.  Of course, even with my love for technology, I have to admit that it has its limits.  For example, it takes a human brain to notice that data has been input incorrectly.  Computer “brains” might be faster than humans, but human brains are still better at pattern recognition.  A human brain will notice that “300” is a ridiculous age for a human to be, but to a computer it is just a number.  A human will also notice similarities between “P. Dave” and “Pinal Dave,” but this would stump most computers.

    In a database, these sorts of anomalies are incredibly important.  Databases are often used by multiple people who rely on the data to be true and accurate, so data quality is key.  That is why SQL Server’s Master Data Management story includes Data Quality Services.  This service has the ability to recognize and flag anomalies like out-of-range numbers and similarities between data.  This allows a human brain, with its pattern-recognition abilities, to double-check and confirm that P. Dave is the same as Pinal Dave.

    A nice feature of Data Quality Services is that once you set the rules for the program to follow, it will not only keep your data organized in the future, but also go back and “fix up” any data that has already been entered.  It also allows you to combine data from multiple places and applies these rules across the board, so that you don’t have any weird issues cropping up when trying to fit a round peg into a square hole.

    There are two parts of Data Quality Services that help you accomplish all these neat things.  The first is the DQS Server, which you can think of as the back-end component of the system.  It is installed alongside SQL Server (it must be installed separately, after SQL Server itself) and runs quietly in the background, performing all its cleanup services.  The DQS Client is the user interface that you interact with to set the rules and check over your data.  There are three main aspects of the Client: knowledge base management, data quality projects, and administration.  Knowledge base management is the part of the system that allows you to set the rules, or program the “knowledge base,” so that your database is clean and consistent.  Data quality projects run in the background and clean up the data that is already present.  Administration allows you to check what the DQS Client is doing, change rules, and generally oversee the entire process.  The whole process is user-friendly and a pleasure to use.  I highly recommend implementing Data Quality Services in your database.

    Here are a few of my blog posts related to Data Quality Services, and I encourage you to try it out.
    • SQL SERVER – Installing Data Quality Services (DQS) on SQL Server 2012
    • SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS
    • SQL SERVER – DQS Error – Cannot connect to server – A .NET Framework error occurred during execution of user-defined routine or aggregate “SetDataQualitySessions” – SetDataQualitySessionPhaseTwo
    • SQL SERVER – Configuring Interactive Cleansing Suggestion Min Score for Suggestions in Data Quality Services (DQS) – Sensitivity of Suggestion
    • SQL SERVER – Unable to DELETE Project in Data Quality Projects (DQS)

    Reference: Pinal Dave (http://blog.SQLAuthority.com)

    Read the article

  • Welcome Oracle Data Integration 12c: Simplified, Future-Ready Solutions with Extreme Performance

    - by Irem Radzik
    The big day for the Oracle Data Integration team has finally arrived! It is my honor to introduce you to Oracle Data Integration 12c. Today we announced the general availability of the 12c release for Oracle’s key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends, including cloud computing, big data analytics, and real-time business intelligence. With the 12c release, Oracle becomes the new leader in data integration and replication technologies, as no other vendor offers such a complete set of data integration capabilities for pervasive, continuous access to trusted data across Oracle platforms as well as third-party systems and applications.

    The Oracle Data Integration 12c release addresses data-driven organizations’ critical and evolving data integration requirements under three key themes: future-ready solutions, extreme performance, and fast time-to-value. There are many new features that support these key differentiators for Oracle Data Integrator 12c and Oracle GoldenGate 12c. In this first 12c blog post, I will highlight only a few:

    • Future-Ready Solutions to Support Current and Emerging Initiatives: Oracle Data Integration offers robust and reliable solutions for key technology trends, including cloud computing, big data analytics, real-time business intelligence, and continuous data availability. Via its tight integration with Oracle’s database, middleware, and application offerings, Oracle Data Integration will continue to support new features and capabilities right away as these products evolve and provide advanced features.

    • Extreme Performance: Both GoldenGate and Data Integrator are known for their high performance, and the new release widens the gap even further against the competition. Oracle GoldenGate 12c’s Integrated Delivery feature enables higher throughput via a special application programming interface into Oracle Database. As mentioned in the press release, customers already report up to 5X higher performance compared to earlier versions of GoldenGate. Oracle Data Integrator 12c introduces parallelism that significantly increases its performance as well.

    • Fast Time-to-Value via Higher IT Productivity and Simplified Solutions: Oracle Data Integrator 12c’s new flow-based declarative UI brings superior developer productivity, ease of use, and ultimately fast time to market for end users. Its ability to seamlessly reuse mapping logic also speeds development. Oracle GoldenGate 12c’s Integrated Delivery feature automatically and optimally tunes the process, saving time while improving performance.

    This is just a quick glimpse into Oracle Data Integrator 12c and Oracle GoldenGate 12c. On November 12th we will reveal much more about the new release in our video webcast "Introducing 12c for Oracle Data Integration". Our customer and partner speakers, including SolarWorld, BT, and Rittman Mead, will join us in launching the new release. Please join us at this free event to learn more from our executives about the 12c release, hear our customers’ perspectives on the new features, and ask your questions to our experts in the live Q&A. Also, please continue to follow our blogs, tweets, and Facebook updates as we unveil more about the new features of the latest release.

    Read the article

  • New Slides - and a discussion about Dictionary Statistics

    - by Mike Dietrich
    First of all, we have just uploaded a new version of the Upgrade and Migration Workshop slides with some added information, so please feel free to download them from here. The slides contain one new interesting piece of information which led to a discussion I've had in the past days with a very large customer regarding their upgrades - and internally on the mailing list targeting an EBS database upgrade from Oracle 10.2 to Oracle 11.2.

    Why are we creating dictionary statistics during upgrade? I believe this forced dictionary statistics creation was introduced with the desupport of the Rule Based Optimizer in Oracle 10g. The goal: as the RBO is not supported anymore, we have to make sure that the data dictionary has fresh, non-stale statistics. (In Oracle 9i, fresh dictionary statistics could actually lead to strange behaviour in some databases, so there this was strongly discouraged.) The upgrade scripts got hardcoded to create these stats. But during tests we had the following finding: it's important to create dictionary statistics the night before the upgrade. Not two weeks before, not 60 minutes before your downtime begins, but very close to the upgrade. From Oracle 10g onwards you'd just say:

    $ execute DBMS_STATS.GATHER_DICTIONARY_STATS;

    This is important to make sure you have fresh dictionary statistics during the upgrade, for performance reasons. Tests have shown that running an upgrade without valid dictionary statistics can slow down the whole upgrade by a factor of 2x-3x. It is also a great idea to gather fresh dictionary statistics again after the upgrade, especially if you suppressed the statistics creation during the upgrade process. Suppress? Yes, you can set this underscore parameter in the init.ora to suppress the forced dictionary statistics collection during an upgrade:

    _optim_dict_stats_at_db_cr_upg=FALSE

    We strongly believe that people (a) use the default statistics creation process, which gathers dictionary statistics by default, and (b) gather fresh statistics on the dictionary right before the upgrade. Once you have followed that advice, we consider it safe to use the underscore parameter during the upgrade. And we've taken that forced statistics collection during upgrade out of the next release of the database.

    Please note: if you are using the DBUA for the upgrade, it will remove underscore parameters for the upgrade run to improve performance - which is generally a good idea. So you'll have to start the DBUA with this call:

    $ dbua -initParam "_optim_dict_stats_at_db_cr_upg"=FALSE

    -Mike

    Read the article

  • Add collection or array to WPF resource dictionary

    - by Chris Cap
    I've searched high and low and can't find an answer to this. I have two questions.

    1. How do you create an array or collection in XAML? I've got an array I want to put in there and bind to a combo box. My first idea was to put an ItemsControl in a resource dictionary, but the ItemsSource of a combo box expects an IEnumerable, so that didn't work. Here's what I've tried in my resource dictionary, and neither works: <ItemsControl x:Key="stateList"> <sys:String>AL</sys:String> <sys:String>CA</sys:String> <sys:String>CN</sys:String> </ItemsControl> <ItemsControl x:Key="stateList2"> <ComboBoxItem>AL</ComboBoxItem> <ComboBoxItem>CA</ComboBoxItem> <ComboBoxItem>CN</ComboBoxItem> </ItemsControl> and here's how I bind to it: <ComboBox SelectedValue="{Binding Path=State}" ItemsSource="{Binding Source={StaticResource stateList2}}" > </ComboBox> EDIT: UPDATED - I got this first part to work this way: <col:ArrayList x:Key="stateList3"> <sys:String>AL</sys:String> <sys:String>CA</sys:String> <sys:String>CN</sys:String> </col:ArrayList> However, I'd rather not use an ArrayList; I'd like to use a generic list, so if anyone knows how, please let me know. EDIT UPDATE: I guess XAML has very limited support for generics, so maybe an ArrayList is the best I can do for now, but I would still like help on the second question if anyone has an answer.

    2. I've tried referencing a merged resource dictionary in my XAML and had problems, because under Window.Resources I had more than just the dictionary, so it required me to add x:Key. Once I add the key, the system can no longer find the items in my resource dictionary. I had to move the merged dictionary to Grid.Resources instead. Ideally I'd like to reference the merged dictionary in App.xaml, but I have the same problem. Here's some sample code. This first part required an x:Key to compile, because I have a converter and it complained that every item must have a key if there is more than one: <UserControl.Resources> <win:BooleanToVisibilityConverter x:Key="VisibilityConverter" /> <ResourceDictionary> <ResourceDictionary.MergedDictionaries> <ResourceDictionary Source="/ResourcesD.xaml" /> </ResourceDictionary.MergedDictionaries> </ResourceDictionary> </UserControl.Resources> I had to change it to this: <UI:BaseStep.Resources> <win:BooleanToVisibilityConverter x:Key="VisibilityConverter" /> </UI:BaseStep.Resources> <Grid> <Grid.Resources> <ResourceDictionary> <ResourceDictionary.MergedDictionaries> <ResourceDictionary Source="/ResourcesD.xaml" /> </ResourceDictionary.MergedDictionaries> </ResourceDictionary> </Grid.Resources> </Grid> Thank you

    Read the article

  • C# Dictionary as a ListBox.DataSource

    - by Steve H.
    I am trying to bind a dictionary as a DataSource to a ListBox. The solution in "How to bind a dictionary to a ListBox in winforms" will not work for me, because my dictionary is a class-level variable and not a method-level variable, so I cannot use var. When I pass the class-level variable to new BindingSource(...) with null as the second argument, I get an ArgumentNullException. How do I bind a class-level dictionary as a data source for a list box? I don't like the List<KeyValuePair<string, string>> work-around, because Where(...) and First(...) are ugly, complicated, and confusing compared to TryGetValue(...) and the rest of the Dictionary functionality.
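
    One approach that may work here (a sketch, not the canonical fix - the MainForm class and the listBox1 control are illustrative) is to materialize the class-level dictionary into a List<KeyValuePair<string, string>> just for binding, while keeping lookups on the dictionary itself:

        using System.Collections.Generic;
        using System.Linq;
        using System.Windows.Forms;

        public partial class MainForm : Form
        {
            // Class-level dictionary, as in the question.
            private readonly Dictionary<string, string> _lookup =
                new Dictionary<string, string>
                {
                    { "key1", "value1" },
                    { "key2", "value2" }
                };

            public MainForm()
            {
                InitializeComponent();

                // listBox1 is assumed to be defined in the designer file.
                // ToList() produces a List<KeyValuePair<string, string>>,
                // which the ListBox can bind to directly.
                listBox1.DataSource = _lookup.ToList();
                listBox1.DisplayMember = "Value"; // shown to the user
                listBox1.ValueMember = "Key";     // returned by SelectedValue
            }
        }

    Binding a snapshot like this means TryGetValue(...) still works against _lookup; the list exists only to satisfy the data-binding layer, so it has to be rebuilt if the dictionary changes.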

    Read the article

  • NameValueCollection vs Dictionary<string,string>

    - by frankadelic
    Is there any reason I should use Dictionary<string, string> instead of NameValueCollection? (in C# / .NET Framework) Option 1, using NameValueCollection: //enter values: NameValueCollection nvc = new NameValueCollection() { {"key1", "value1"}, {"key2", "value2"}, {"key3", "value3"} }; // retrieve values: foreach(string key in nvc.AllKeys) { string value = nvc[key]; // do something } Option 2, using Dictionary<string, string>: //enter values: Dictionary<string, string> dict = new Dictionary<string, string>() { {"key1", "value1"}, {"key2", "value2"}, {"key3", "value3"} }; // retrieve values: foreach (KeyValuePair<string, string> kvp in dict) { string key = kvp.Key; string val = kvp.Value; // do something } For these use cases, is there any advantage to using one versus the other? Any difference in performance, memory use, sort order, etc.?
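
    One concrete behavioral difference worth knowing (a small sketch; the values are illustrative): NameValueCollection permits multiple values per key and returns null for a missing key, while Dictionary<string, string> enforces unique keys and throws on a missing one:

        using System;
        using System.Collections.Generic;
        using System.Collections.Specialized;

        class Program
        {
            static void Main()
            {
                var nvc = new NameValueCollection();
                nvc.Add("key1", "a");
                nvc.Add("key1", "b");                 // duplicate key is allowed
                Console.WriteLine(nvc["key1"]);       // "a,b" - values joined with a comma
                Console.WriteLine(nvc["missing"] == null); // True - no exception

                var dict = new Dictionary<string, string> { { "key1", "a" } };
                // dict.Add("key1", "b");             // would throw ArgumentException
                // string v = dict["missing"];        // would throw KeyNotFoundException
                string value;
                Console.WriteLine(dict.TryGetValue("missing", out value)); // False - safe probe
            }
        }

    Dictionary<string, string> is also strongly typed and generally the faster lookup; NameValueCollection preserves insertion order when enumerating AllKeys, whereas Dictionary's enumeration order is unspecified.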

    Read the article

  • Python: Filter a dictionary

    - by Adam Matan
    Hi, I have a dictionary of points, say: >>> points={'a':(3,4), 'b':(1,2), 'c':(5,5), 'd':(3,3)} I want to create a new dictionary with all the points whose x and y values are smaller than 5, i.e. points 'a', 'b' and 'd'. According to the book, each dictionary has the items() function, which returns a list of (key, value) tuples: >>> points.items() [('a', (3, 4)), ('c', (5, 5)), ('b', (1, 2)), ('d', (3, 3))] So I have written this: >>> for item in [i for i in points.items() if i[1][0]<5 and i[1][1]<5]: ... points_small[item[0]]=item[1] ... >>> points_small {'a': (3, 4), 'b': (1, 2), 'd': (3, 3)} Is there a more elegant way? I was expecting Python to have some super-awesome dictionary.filter(f) function... Adam

    Read the article

  • "Verbose Dictionary" in C#, 'override new' this[] or implement IDictionary

    - by Benjol
    All I want is a dictionary which tells me which key it couldn't find, rather than just saying The given key was not present in the dictionary. I briefly considered doing a subclass with override new this[TKey key], but felt it was a bit hacky, so I've gone with implementing the IDictionary interface, and passing everything through directly to an inner Dictionary, with the only additional logic being in the indexer: public TValue this[TKey key] { get { ThrowIfKeyNotFound(key); return _dic[key]; } set { ThrowIfKeyNotFound(key); _dic[key] = value; } } private void ThrowIfKeyNotFound(TKey key) { if(!_dic.ContainsKey(key)) throw new ArgumentOutOfRangeException("Can't find key [" + key + "] in dictionary"); } Is this the right/only way to go? Would newing over the this[] really be that bad?
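
    For what it's worth, a variant of that indexer (a sketch of just the wrapper class, not the full IDictionary implementation) can use TryGetValue to avoid the double lookup that ContainsKey-then-index performs, and throw KeyNotFoundException, which is the exception type callers already expect from a dictionary:

        using System.Collections.Generic;

        public class VerboseDictionary<TKey, TValue>
        {
            private readonly Dictionary<TKey, TValue> _dic =
                new Dictionary<TKey, TValue>();

            public TValue this[TKey key]
            {
                get
                {
                    TValue value;
                    // One hash lookup instead of ContainsKey + indexer.
                    if (!_dic.TryGetValue(key, out value))
                        throw new KeyNotFoundException(
                            "Can't find key [" + key + "] in dictionary");
                    return value;
                }
                // Note: letting set add missing keys keeps the usual
                // Dictionary<,> semantics; throwing in the setter, as the
                // original does, makes the wrapper behave differently.
                set { _dic[key] = value; }
            }
        }

    Whether to mirror or change the setter behavior is the real design choice here. Hiding the base indexer with new would also work, but member hiding is not polymorphic, so it silently breaks when the object is used through a Dictionary<TKey, TValue>-typed reference.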

    Read the article

  • English dictionary as txt or xml file with support of synonyms

    - by Simon
    Can someone point me to where I can download an English dictionary as a txt or xml file? I am building a simple app for myself and am looking for something I could start using immediately, without learning a complex API. Support for synonyms would be great; that is, it should be easy to retrieve all the synonyms for a particular word. It would be absolutely fantastic if the dictionary listed both British and American spellings of words where they differ. Even a small dictionary (a few thousand words) is ok, I only need it for a small project. I would even be willing to buy one if the price is reasonable and the dictionary is easy to use - a simple xml file would be great. Any directions, please.

    Read the article

  • How to get values after sorting a dictionary by value with LINQ

    - by user301639
    Hey, I have a dictionary which I sorted by value with LINQ. How can I get those sorted values from the sorted result? Here's what I've done so far: Dictionary<char, int> lettersAcurr = new Dictionary<char, int>(); //sort by int value var sortedDict = (from entry in lettersAcurr orderby entry.Value descending select entry); During debugging I can see that sortedDict has KeyValuePair elements, but I can't access them. Thanks for the help.
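
    A sketch of how the result can be consumed (assuming lettersAcurr has been populated; the sample values are illustrative): the query yields an IEnumerable<KeyValuePair<char, int>>, so each element exposes .Key and .Value directly, and the values alone can be projected out with Select:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var lettersAcurr = new Dictionary<char, int>
                {
                    { 'a', 3 }, { 'b', 7 }, { 'c', 5 }
                };

                var sortedDict = from entry in lettersAcurr
                                 orderby entry.Value descending
                                 select entry;

                foreach (KeyValuePair<char, int> pair in sortedDict)
                    Console.WriteLine("{0}: {1}", pair.Key, pair.Value); // b: 7, c: 5, a: 3

                // Just the sorted values:
                List<int> values = sortedDict.Select(p => p.Value).ToList();
            }
        }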

    Read the article

  • .NET Dictionary as a Property

    - by Anand
    Can someone point me to some C# code examples, or provide some code, where a Dictionary has been used as a property of a class? The examples I have seen so far don't cover all the aspects, viz. how to declare the dictionary as a property and how to add, remove, and retrieve elements through it.
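
    A minimal sketch (the Customer class and Attributes property are illustrative names, not from any particular library) covering declaration, add, remove, and retrieval through a dictionary-typed property:

        using System.Collections.Generic;

        public class Customer
        {
            // Backing field stays private; only the property is public.
            private readonly Dictionary<string, string> _attributes =
                new Dictionary<string, string>();

            // Exposing IDictionary<,> rather than the concrete Dictionary<,>
            // keeps callers from depending on the implementation type.
            public IDictionary<string, string> Attributes
            {
                get { return _attributes; }
            }
        }

        class Program
        {
            static void Main()
            {
                var customer = new Customer();
                customer.Attributes["Color"] = "Blue";   // add or update
                customer.Attributes.Remove("Color");     // remove

                string color;                            // retrieve safely
                if (customer.Attributes.TryGetValue("Color", out color))
                {
                    // use color
                }
            }
        }

    A read-only property over a mutable dictionary is the common pattern; if callers should not be able to modify the contents at all, wrapping the field in a ReadOnlyDictionary<TKey, TValue> is the stricter alternative.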

    Read the article

  • C++ and Scripting.Dictionary from scrrun.dll

    - by MaxFX
    Hello. I am having some trouble with Scripting.Dictionary in C++. I'm trying to use the IDictionary interface via a smart pointer, but the methods for adding entries don't work and I can't understand why. CoInitialize(NULL); IDictionaryPtr dict; dict.CreateInstance(__uuidof(Dictionary)); _variant_t num1 = 1; _variant_t num2 = 2; dict->Add(&num1, &num2); long i; dict->get_Count(&i); cout << i << "\n"; The Add method does not work, and the count of elements in the dictionary is always 0. What is the correct way to use Scripting.Dictionary in this case? PS: I'm getting the Scripting interfaces via #import "scrrun.dll"

    Read the article

  • Does my Dictionary need a locking mechanism?

    - by theateist
    Many threads have access to summary, and each thread will use a unique key when accessing the dictionary: Dictionary<string, List<Result>> summary; Do I need locking for the following operations? (1) summary[key] = new List<Result>() (2) summary[key].Add(new Result()); It seems that I don't need locking, because each thread accesses the dictionary with a different key - but won't (1) be problematic, because it concurrently adds new records to the dictionary while other threads do the same?
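
    Yes: adding a key mutates the dictionary's shared internal state (its bucket array, including resizes), so distinct keys do not make concurrent writes safe. On .NET 4 and later, one sketch of an alternative is ConcurrentDictionary (the Result stub and Record method below stand in for the types in the question):

        using System.Collections.Concurrent;
        using System.Collections.Generic;

        class Result { }

        class Example
        {
            static readonly ConcurrentDictionary<string, List<Result>> summary =
                new ConcurrentDictionary<string, List<Result>>();

            static void Record(string key)
            {
                // GetOrAdd makes the insert race-safe across threads.
                List<Result> list = summary.GetOrAdd(key, _ => new List<Result>());

                // Safe *only* because each key belongs to exactly one thread,
                // as stated in the question; List<Result> itself is not
                // thread-safe, so shared keys would need a lock here.
                list.Add(new Result());
            }
        }

    On older framework versions, wrapping operation (1) in a lock over the dictionary achieves the same thing; operation (2) stays lock-free as long as the one-thread-per-key invariant holds.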

    Read the article

  • How to use Delegate in C# for Dictionary<int, List<string>>

    - by Emanuel
    The code: private delegate void ThreadStatusCallback(ReceiveMessageAction action, Dictionary<int, List<string>> message); ... Dictionary<int, List<string>> messagesForNotification = new Dictionary<int, List<string>>(); ... Invoke(new ThreadStatusCallback(ReceivesMessagesStatus), ReceiveMessageAction.Notification, messagesForNotification); ... private void ReceivesMessagesStatus(ReceiveMessageAction action, object value) { ... } How can I pass the two values, of type ReceiveMessageAction and Dictionary<int, List<string>> respectively, to the ReceivesMessagesStatus method? Thanks.
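
    A sketch of one likely fix (assuming ReceivesMessagesStatus is meant to be the delegate's target, as in the question): declare its second parameter with the dictionary type instead of object, so the method's signature matches the delegate exactly and no cast is needed inside:

        // Inside the same Form class as the question's code.
        // Delegate as in the question:
        private delegate void ThreadStatusCallback(
            ReceiveMessageAction action, Dictionary<int, List<string>> message);

        // Matching signature - the dictionary arrives strongly typed:
        private void ReceivesMessagesStatus(
            ReceiveMessageAction action, Dictionary<int, List<string>> message)
        {
            foreach (KeyValuePair<int, List<string>> entry in message)
            {
                int id = entry.Key;              // e.g. a message id
                List<string> lines = entry.Value;
                // handle the notification...
            }
        }

        // Marshalling from a worker thread to the UI thread stays as written:
        // Invoke(new ThreadStatusCallback(ReceivesMessagesStatus),
        //        ReceiveMessageAction.Notification, messagesForNotification);

    If the object parameter must stay (say, for reuse with other payloads), the alternative is a cast inside the method: var message = (Dictionary<int, List<string>>)value;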

    Read the article

  • KeyNotFound Exception in Dictionary(of T)

    - by C Patton
    I'm about ready to bang my head against the wall. I have a class called Map which has a dictionary called tiles: class Map { public Dictionary<Location, Tile> tiles = new Dictionary<Location, Tile>(); public Size mapSize; public Map(Size size) { this.mapSize = size; } //etc... I fill this dictionary temporarily to test some things: public void FillTemp(Dictionary<int, Item> itemInfo) { Random r = new Random(); for(int i =0; i < mapSize.Width; i++) { for(int j=0; j<mapSize.Height; j++) { Location temp = new Location(i, j, 0); int rint = r.Next(0, (itemInfo.Count - 1)); Tile t = new Tile(new Item(rint, rint)); tiles[temp] = t; } } } and in my main program code: Map m = new Map(10, 10); m.FillTemp(iInfo); Tile t = m.GetTile(new Location(2, 2, 0)); //The problem line Now, if I add a breakpoint in my code, I can clearly see that my instance (m) of the Map class is filled with pairs via the function above, but when I try to access a value with the GetTile function: public Tile GetTile(Location location) { if(this.tiles.ContainsKey(location)) { return this.tiles[location]; } else { return null; } } it ALWAYS returns null. Again, if I view inside the Map object and find the Location key where x=2, y=2, z=0, I clearly see the value being a Tile that FillTemp generated. Why is it doing this? I've had no problems with a Dictionary like this so far, and I have no idea why it's returning null. Again, when debugging, I can CLEARLY see that the Map instance contains the Location key it says it does not... very frustrating. Any clues? Need any more info? Help would be greatly appreciated :)
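
    A very common cause of exactly this symptom (an assumption about the Location type, which isn't shown in the question): Dictionary<TKey, TValue> locates keys via GetHashCode and Equals, so a freshly constructed Location(2, 2, 0) will never match the stored key unless Location defines value equality. A sketch of what that could look like:

        public struct Location
        {
            public readonly int X, Y, Z;

            public Location(int x, int y, int z) { X = x; Y = y; Z = z; }

            public override bool Equals(object obj)
            {
                if (!(obj is Location)) return false;
                Location other = (Location)obj;
                return X == other.X && Y == other.Y && Z == other.Z;
            }

            public override int GetHashCode()
            {
                // Any stable combination of the fields works; this is one option.
                return X ^ (Y << 10) ^ (Z << 20);
            }
        }

    If Location is a class without these overrides, reference equality applies, which reproduces the behavior in the question exactly: the debugger shows a key with identical field values, but ContainsKey still reports it missing.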

    Read the article

  • get dictionary key by value

    - by loviji
    How do I get a Dictionary key by value in C#? Dictionary<string, string> types = new Dictionary<string, string>() { {"1", "one"}, {"2", "two"}, {"3", "three"} }; I want something like this: getByValueKey(string value); getByValueKey("one") must return "1". What is the best way to do this? Maybe a Hashtable or SortedList?
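
    A sketch of two options (the variable names follow the question): a one-off reverse lookup is a linear scan with LINQ, and if reverse lookups happen often and the values are unique, inverting the dictionary once gives O(1) lookups afterwards:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Program
        {
            static void Main()
            {
                var types = new Dictionary<string, string>()
                {
                    { "1", "one" }, { "2", "two" }, { "3", "three" }
                };

                // One-off: O(n) scan; .Key is null if no match is found.
                string key = types.FirstOrDefault(p => p.Value == "one").Key;
                Console.WriteLine(key); // "1"

                // Frequent lookups: invert once (throws if values repeat).
                Dictionary<string, string> byValue =
                    types.ToDictionary(p => p.Value, p => p.Key);
                Console.WriteLine(byValue["one"]); // "1"
            }
        }

    Neither Hashtable nor SortedList changes the fundamental trade-off: a structure indexed by key cannot answer by-value queries without either scanning or maintaining the inverse mapping.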

    Read the article

  • Generic Dictionary - Getting Conversion Error

    - by pm_2
    The following code is giving me an error: // GetDirectoryList() returns Dictionary<string, DirectoryInfo> Dictionary<string, DirectoryInfo> myDirectoryList = GetDirectoryList(); // The following line gives a compile error foreach (Dictionary<string, DirectoryInfo> eachItem in myDirectoryList) The error it gives is as follows: Cannot convert type 'System.Collections.Generic.KeyValuePair<string,System.IO.DirectoryInfo>' to 'System.Collections.Generic.Dictionary<string,System.IO.DirectoryInfo>'. My question is: why is it trying to perform this conversion? Can I not use a foreach loop on this type of object?
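
    Enumerating a Dictionary<K, V> yields KeyValuePair<K, V> elements, not nested dictionaries, so the loop variable's type is what needs to change. A sketch (the stub stands in for the question's GetDirectoryList()):

        using System.Collections.Generic;
        using System.IO;

        class Example
        {
            // Stub standing in for the question's GetDirectoryList():
            static Dictionary<string, DirectoryInfo> GetDirectoryList()
            {
                return new Dictionary<string, DirectoryInfo>();
            }

            static void Main()
            {
                Dictionary<string, DirectoryInfo> myDirectoryList = GetDirectoryList();

                // foreach sees the dictionary's element type, which is
                // KeyValuePair<string, DirectoryInfo> - not a nested dictionary:
                foreach (KeyValuePair<string, DirectoryInfo> eachItem in myDirectoryList)
                {
                    string name = eachItem.Key;
                    DirectoryInfo dir = eachItem.Value;
                }
            }
        }

    The compile error appears because the original loop declares each element as a whole Dictionary<string, DirectoryInfo>, and no conversion exists from a pair to a dictionary.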

    Read the article

  • When I add elements to a dictionary, the elements are also added to another dictionary (C# + XNA)

    - by sFuller
    I have some code that looks like this: public static class Control { public static Dictionary<PlayerIndex, GamePadState> gamePadState = new Dictionary<PlayerIndex,GamePadState>(); public static Dictionary<PlayerIndex, GamePadState> oldGamePadState = new Dictionary<PlayerIndex, GamePadState>(); public static void UpdateControlls() { gamePadState.Clear(); foreach (PlayerIndex pIndex in pIndexArray) { gamePadState.Add(pIndex, GamePad.GetState(pIndex)); } } } As I stepped through the code in the debugger, when I called gamePadState.Add(...), it also added to oldGamePadState, even though I never called oldGamePadState.Add(...). This is very strange. Thanks, everybody!
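
    The usual cause of this symptom (an assumption, since the assignment wouldn't appear in the snippet shown): somewhere else, one field was assigned to the other, e.g. oldGamePadState = gamePadState;. That copies the reference, not the contents, so both names point at the same dictionary object and every Add shows up under both. A sketch of the difference:

        // Aliasing: one dictionary, two names - changes appear under "both".
        oldGamePadState = gamePadState;

        // Snapshot: construct a new dictionary from the current one instead.
        oldGamePadState = new Dictionary<PlayerIndex, GamePadState>(gamePadState);

    With the copy constructor, clearing and refilling gamePadState in UpdateControlls no longer touches the snapshot held in oldGamePadState.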

    Read the article

  • Big Data – Final Wrap and What Next – Day 21 of 21

    - by Pinal Dave
    In yesterday’s blog post we explored various resources related to learning Big Data, and in this blog post we will wrap up this 21-day series on Big Data. I have been exploring various terms and technologies related to Big Data this entire month. It was indeed fun to write about Big Data in 21 days, but the subject of Big Data is much bigger than anyone can cover in 21 days. My first goal was to write about the basics, and I think we have got that covered pretty well. During these 21 days I have received many questions related to Big Data. I have covered a few of them in this series, and a few more I will be covering in the coming months.

    Now, after understanding the Big Data basics, here is what I am personally going to do next. I thought I would share the list with you, as it gives a good idea of how to continue the journey of Big Data:

    • Build a schedule to read the various Apache documentation
    • Watch all Pluralsight courses
    • Explore the HortonWorks Sandbox
    • Start building a presentation about Big Data - this is a great way to learn something new
    • Present on Big Data topics at user group meetings
    • Write more blog posts about Big Data

    I am going to continue learning about Big Data, and I want you to continue learning Big Data too. Please leave a comment describing how you are going to continue learning about Big Data; I will publish all the informative comments on this blog with due credit. I want to end this series with the infographic by UMUC.

    Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • Partner Webcast - Focus on Oracle Data Profiling and Data Quality 11g

    - by lukasz.romaszewski(at)oracle.com
    Partner Webcast: Focus on Oracle Data Profiling and Data Quality 11g
    February 24th, 12am CET

    Oracle offers an integrated suite of Data Quality software architected to discover and correct today's data quality problems and establish a platform prepared for tomorrow's yet unknown data challenges. Oracle Data Profiling provides data investigation, discovery, and profiling in support of quality, migration, integration, stewardship, and governance initiatives. It includes a broad range of features that expand upon basic profiling, including automated monitoring, business-rule validation, and trend analysis. Oracle Data Quality for Data Integrator provides cleansing, standardization, matching, address validation, location enrichment, and linking functions for global customer data and operational business data. It ensures that data adheres to established standards that are adaptable to fit each organization's specific needs. Both single-byte and double-byte data are processed in local languages to provide a unique and centralized view of customers, products, and services.

    During this briefing, Data Integration Solution Specialists will provide a technical overview and a walkthrough.

    Agenda:
    • Oracle Data Integration Strategy overview
    • A focus on Oracle Data Profiling and Oracle Data Quality for Data Integrator:
      - Oracle Data Profiling
      - Oracle Data Quality for Data Integrator
      - Live demo
      - Q&A

    Delivery format: this FREE online LIVE eSeminar will be delivered over the Web and conference call. Registrations received less than 24 hours prior to the start time may not receive confirmation to attend. To register, click here. For any questions please contact [email protected]

    Read the article
