Search Results

Search found 83878 results on 3356 pages for 'google data api'.

Page 29/3356 | < Previous Page | 25 26 27 28 29 30 31 32 33 34 35 36  | Next Page >

  • SQL SERVER – Why Do We Need Data Quality Services – Importance and Significance of Data Quality Services (DQS)

    - by pinaldave
    Databases are awesome. I'm sure my readers know my opinion about this – I have made SQL Server my life's work, after all! I love technology and all things computer-related. Of course, even with my love for technology, I have to admit that it has its limits. For example, it takes a human brain to notice that data has been input incorrectly. Computer "brains" might be faster than human ones, but human brains are still better at pattern recognition: a human will notice that "300" is a ridiculous age for a person, but to a computer it is just a number. A human will also notice the similarity between "P. Dave" and "Pinal Dave," but this would stump most computers.

    In a database, these sorts of anomalies are incredibly important. Databases are often used by multiple people who rely on the data to be true and accurate, so data quality is key. That is why SQL Server's improved Master Data Management story includes Data Quality Services. This service can recognize and flag anomalies like out-of-range numbers and similarities between data, which allows a human brain, with its pattern-recognition abilities, to double-check and confirm that P. Dave is the same as Pinal Dave.

    A nice feature of Data Quality Services is that once you set the rules for the program to follow, it will not only keep your data organized in the future, but also go back and "fix up" any data that has already been entered. It also allows you to combine data from multiple places and applies these rules across the board, so you don't run into the weird issues that crop up when trying to fit a round peg into a square hole.

    There are two parts of Data Quality Services that help you accomplish all these neat things. The first is DQS Server, the server-side component of the system. It is installed alongside SQL Server (it needs to be installed separately, after SQL Server itself) and runs quietly in the background, performing all its cleanup services. DQS Client is the user interface you interact with to set the rules and check over your data. There are three main aspects of the Client: knowledge base management, data quality projects, and administration. Knowledge base management is the part of the system that lets you set the rules – program the "knowledge base" – so that your database stays clean and consistent. Data quality projects run in the background and clean up the data that is already present. Administration lets you check on what DQS Client is doing, change rules, and generally oversee the entire process. The whole thing is user-friendly and a pleasure to use. I highly recommend implementing Data Quality Services in your database.

    Here are a few of my blog posts related to Data Quality Services; I encourage you to try it out:
    SQL SERVER – Installing Data Quality Services (DQS) on SQL Server 2012
    SQL SERVER – Step by Step Guide to Beginning Data Quality Services in SQL Server 2012 – Introduction to DQS
    SQL SERVER – DQS Error – Cannot connect to server – A .NET Framework error occurred during execution of user-defined routine or aggregate "SetDataQualitySessions" – SetDataQualitySessionPhaseTwo
    SQL SERVER – Configuring Interactive Cleansing Suggestion Min Score for Suggestions in Data Quality Services (DQS) – Sensitivity of Suggestion
    SQL SERVER – Unable to DELETE Project in Data Quality Projects (DQS)
    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: Data Quality Services, DQS

    Read the article

  • Welcome Oracle Data Integration 12c: Simplified, Future-Ready Solutions with Extreme Performance

    - by Irem Radzik
    The big day for the Oracle Data Integration team has finally arrived! It is my honor to introduce you to Oracle Data Integration 12c. Today we announced the general availability of the 12c release for Oracle's key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence. With the 12c release, Oracle becomes the new leader in data integration and replication technologies, as no other vendor offers such a complete set of data integration capabilities for pervasive, continuous access to trusted data across Oracle platforms as well as third-party systems and applications.

    The Oracle Data Integration 12c release addresses data-driven organizations' critical and evolving data integration requirements under three key themes:
    Future-Ready Solutions
    Extreme Performance
    Fast Time-to-Value

    There are many new features that support these key differentiators for Oracle Data Integrator 12c and for Oracle GoldenGate 12c. In this first 12c blog post, I will highlight only a few.

    Future-Ready Solutions to Support Current and Emerging Initiatives: Oracle Data Integration offers robust and reliable solutions for key technology trends including cloud computing, big data analytics, real-time business intelligence, and continuous data availability. Via the tight integration with Oracle's database, middleware, and application offerings, Oracle Data Integration will continue to support new features and capabilities right away as these products evolve and provide advanced features.

    Extreme Performance: Both GoldenGate and Data Integrator are known for their high performance, and the new release widens the gap even further against the competition. Oracle GoldenGate 12c's Integrated Delivery feature enables higher throughput via a special application programming interface into Oracle Database. As mentioned in the press release, customers already report up to 5X higher performance compared to earlier versions of GoldenGate. Oracle Data Integrator 12c introduces parallelism that significantly increases its performance as well.

    Fast Time-to-Value via Higher IT Productivity and Simplified Solutions: Oracle Data Integrator 12c's new flow-based declarative UI brings superior developer productivity, ease of use, and ultimately fast time to market for end users. The ability to seamlessly reuse mapping logic also speeds development. Oracle GoldenGate 12c's Integrated Delivery feature automatically and optimally tunes the process, saving time while improving performance.

    This is just a quick glimpse into Oracle Data Integrator 12c and Oracle GoldenGate 12c. On November 12th we will reveal much more about the new release in our video webcast "Introducing 12c for Oracle Data Integration". Our customer and partner speakers, including SolarWorld, BT, and Rittman Mead, will join us in launching the new release. Please join us at this free event to learn more from our executives about the 12c release, hear our customers' perspectives on the new features, and ask your questions to our experts in the live Q&A. Also, please continue to follow our blogs, tweets, and Facebook updates as we unveil more about the new features of the latest release.

    Read the article

  • Google Analytics - Google Adwords [closed]

    - by Fiona
    Hi there, I have a number of Google Analytics accounts and one AdWords account. At the moment I've linked my AdWords account to one of my GA accounts; however, I'd like to link it to my other GA accounts as well. Can this be done, and if so, how? Thanks, Fiona

    Read the article

  • How can I tell if a user came to a page via a Google Adwords PPC campaign?

    - by Mike Crittenden
    I have a form with a hidden "Came from Adwords" field that will be marked true (via javascript) if the user came from a PPC campaign and will stay false if not. That way, when the user submits the form, we will have each submission stored with info about whether that submission came from adwords or not, all without the user knowing. How can I fetch this info? I know that Google sets a cookie called Conversion whenever you click a PPC link to a page, but the cookie's content is just random alphanumeric characters. Is there something in the Analytics/Adwords API that will let me test for this? Do I have to resort to adding ?ref=adwords or something onto the PPC URLs so that I can test that way?
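    One sketch of the URL-tagging approach (with the caveat that the parameter names below are assumptions, not an official AdWords mechanism): AdWords auto-tagging appends a gclid parameter to destination URLs, so the landing page's script can test for that, or for a manually added tag like ref=adwords, and flip the hidden field.

        // Minimal sketch: detect likely AdWords PPC traffic from the landing-page URL.
        // Assumes auto-tagging is on (gclid) or the PPC destination URLs carry a
        // manual tag such as ?ref=adwords; both parameter names are assumptions.
        function cameFromAdwords() {
          return /[?&](gclid=|ref=adwords)/.test(window.location.search);
        }

        window.onload = function () {
          if (cameFromAdwords()) {
            // "came_from_adwords" is a hypothetical id for the hidden field.
            document.getElementById('came_from_adwords').value = 'true';
          }
        };

    Note that gclid survives only on the initial landing page; if the form lives elsewhere, the flag would need to be carried along in a cookie or session.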

    Read the article

  • Is there a Google 3d Warehouse API?

    - by Jayesh
    Does anyone know if there is an official or unofficial API for the Google 3D Warehouse? I know of the iPhone app NaviCAD, which shows Collada models from Google Warehouse – it has search, most-viewed, and most-recent functionality, so I guess it is using some sort of API to get that data. But I couldn't find any such API after searching around. Do you know if there is any?

    Read the article

  • how to: dynamically load google ajax api into chrome extension content script

    - by Hoff
    Hi there, I'm trying to make use of Google's AJAX APIs in a Chrome extension's "content script". On a regular HTML page, I would just do this:

        <script src="http://www.google.com/jsapi"></script>
        <script>
          google.load("language", "1");
        </script>

    But since I'm trying to load the translation library dynamically from JS code, I've tried:

        script = document.createElement("script");
        script.src = "http://www.google.com/jsapi";
        script.type = "text/javascript";
        document.getElementsByTagName("head")[0].appendChild(script);
        google.load('language','1')

    but the last line throws the following error: Uncaught TypeError: Object # has no method 'load'. Funny enough, when I enter the same "google.load('language','1')" in Chrome's JS console, it works as intended... I've also tried with jQuery's .getScript() but the same problem persists... Does anybody have any clue what might be the problem and how it could be solved? Many thanks in advance! Martin
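    The error is consistent with a timing problem: appending a script element loads it asynchronously, so google is not yet defined when the next line runs. A sketch of one fix, deferring the call to the script's onload handler and using the loader's callback option (and note that in a Chrome content script, an injected script tag runs in the page's context rather than the content script's isolated world, which can produce exactly this "works in the console, fails in the script" symptom):

        // Sketch: call google.load only after jsapi has actually loaded.
        var script = document.createElement("script");
        script.src = "http://www.google.com/jsapi";
        script.type = "text/javascript";
        script.onload = function () {
          // After page load, google.load cannot use document.write, so it
          // needs a callback option to signal when the library is ready.
          google.load("language", "1", {
            callback: function () {
              // The translation API is usable from this point on.
              google.language.translate("Hello", "en", "de", function (result) {
                console.log(result.translation);
              });
            }
          });
        };
        document.getElementsByTagName("head")[0].appendChild(script);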

    Read the article

  • Google Reader API - feed/[FEEDURL]/ is coming back as Not found

    - by JustinXXVII
    There is one feed I'm subscribed to which always turns up as NOT FOUND when I try to use the API. I return an array of dictionaries, containing 3 objects. The first in the list represents the user himself, like so:

        {
          FeedID = "user/MY_UNIQUE_NUMBER/state/com.google/reading-list";
          Timestamp = 1273448807271463;
          Unread = 59;
        }

    The Unread count is very important. My client depends on downloading 59 items from Google before it refreshes. If a feed doesn't download properly, the count is off and the client won't update. An example of a working feed is here:

        {
          FeedID = "feed/http://arstechnica.com/index.rssx";
          Timestamp = 1273447158484528;
          Unread = 13;
        }

    The FeedID value combines with a specially formatted URL string and gives back a list of articles. The above example works fine. However, the following feed always returns NOT FOUND on Google, and if I paste the URL verbatim into a browser, it never turns up. See here:

        {
          FeedID = "feed/http://www.peopleofwalmart.com/?feed=rss2";
          Timestamp = 1273424138183529;
          Unread = 6;
        }

        http://www.google.com/reader/api/0/stream/contents/feed/http://www.peopleofwalmart.com/?feed=rss2?ot=1&r=n&xt=user/-/state/com.google/read&n=6&ck=1273449028&client=testClient

    If you are at all proficient with the API, can you please help me? Like I said, since Google always says NOT FOUND when I search for that feed, my download count is off by N articles and won't update. I would rather not hack around it, honestly. Thanks!
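    A plausible culprit (an educated guess from the URL shown, not confirmed Reader behavior): the feed URL is spliced into the request unescaped, so its own ?feed=rss2 query string collides with the API call's parameters. Percent-encoding the feed URL before building the request keeps the two query strings apart:

        // Sketch: escape the feed URL before embedding it in the API path so
        // that "?feed=rss2" is not parsed as parameters of the Reader request.
        var feedUrl = "http://www.peopleofwalmart.com/?feed=rss2";
        var apiUrl = "http://www.google.com/reader/api/0/stream/contents/feed/" +
            encodeURIComponent(feedUrl) +
            "?ot=1&r=n&xt=user/-/state/com.google/read&n=6&client=testClient";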

    Read the article

  • Using the Google AJAX Search API for SEO Purposes

    - by User
    I am looking at writing a .NET application that uses the Google AJAX Search API to determine where our website ranks for a given term compared to a competitor. I can find a lot about the old SOAP API, but for the new AJAX API I cannot find any information on the following: Is this sort of use allowed, given that the terms of use are vague? Is there a limit to the number of requests per day? As you can only get a maximum of 8 results at a time, is the only way to get the top 100 results to keep requesting the next set, and is that an issue? Thanks
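    On the paging mechanics specifically, the AJAX Search API also has a REST endpoint that takes a start offset, so stepping through results is straightforward (a sketch only; it does not answer the terms-of-use question, and as I recall the service capped each query at roughly 64 results, so a full top 100 may not be reachable):

        // Sketch: page through web results 8 at a time via the REST endpoint.
        // rsz=large returns 8 results per request; start is the result offset.
        function fetchPage(query, start, callback) {
          var url = "http://ajax.googleapis.com/ajax/services/search/web" +
              "?v=1.0&rsz=large&q=" + encodeURIComponent(query) +
              "&start=" + start;
          var xhr = new XMLHttpRequest();
          xhr.open("GET", url, true);
          xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
              callback(JSON.parse(xhr.responseText).responseData.results);
            }
          };
          xhr.send();
        }

        // Walk the first few pages and look for your site's URL in each batch.
        for (var start = 0; start < 64; start += 8) {
          fetchPage("example search term", start, function (results) {
            // Each result carries url, title, etc.; compare result.url here.
          });
        }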

    Read the article

  • Move file or folder to a different folder in google document using api problem

    - by Minh Nguyen
    In Google Documents I have this structure:

        Folder1
        +------Folder1-1
        +------+------File1-1-1
        +------Folder1-2
        +------File1-1
        Folder2

    I want to move "File1-1" to "Folder2" using the .NET Google API library (Google Data API SDK):

        public static void moveFolder(string szUserName, string szPassword,
                                      string szResouceID, string szToFolderResourceID)
        {
            string szSouceUrl = "https://docs.google.com/feeds/default/private/full" + "/" +
                HttpContext.Current.Server.UrlEncode(szResouceID);
            Uri sourceUri = new Uri(szSouceUrl);
            // create an atom entry pointing at the source document
            AtomEntry atom = new AtomEntry();
            atom.Id = new AtomId(szSouceUrl);
            // default target is the root folder's contents feed
            string szTargetUrl = "http://docs.google.com/feeds/default/private/full/folder%3Aroot/contents/";
            if (szToFolderResourceID != "")
            {
                szTargetUrl = "https://docs.google.com/feeds/default/private/full" + "/" +
                    HttpContext.Current.Server.UrlEncode(szToFolderResourceID) + "/contents";
            }
            Uri targetUri = new Uri(szTargetUrl);
            DocumentsService service = new DocumentsService(SERVICENAME);
            ((GDataRequestFactory)service.RequestFactory).KeepAlive = false;
            service.setUserCredentials(szUserName, szPassword);
            // POST the entry to the target folder's contents feed
            service.EntrySend(targetUri, atom, GDataRequestType.Insert);
        }

    After running this function I have:

        Folder1
        +------Folder1-1
        +------+------File1-1-1
        +------Folder1-2
        +------File1-1
        Folder2
        +------File1-1

    "File1-1" shows up in both "Folder1" and "Folder2", and when I delete it from one folder it is deleted from the other folder as well. (Expected: "File1-1" appears only in "Folder2".) What is happening? How can I solve this problem?

    Read the article

  • How to get the name of a placemark using google earth api?

    - by user1444402
    I'm trying to create a web page where a user can manage different placemarks. The management is based on creating, dragging and dropping, and deleting placemarks. At the moment, I've managed to create multiple placemarks and drag and drop them, but I'm not able to delete them because I cannot identify them individually. I'm using the Google Earth API examples, but I cannot find this functionality. I want to get the name of the different placemarks – any idea?
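    One way to make placemarks identifiable (a sketch against the Google Earth plugin API, assuming ge is the plugin instance): assign each placemark a unique name when it is created, then walk the feature list and match on getName() when one has to be removed.

        // Sketch: name placemarks on creation so they can be found and deleted later.
        function createNamedPlacemark(name, lat, lng) {
          var placemark = ge.createPlacemark('');
          placemark.setName(name); // the handle used to identify it later
          var point = ge.createPoint('');
          point.setLatitude(lat);
          point.setLongitude(lng);
          placemark.setGeometry(point);
          ge.getFeatures().appendChild(placemark);
          return placemark;
        }

        function deletePlacemarkByName(name) {
          var children = ge.getFeatures().getChildNodes();
          for (var i = 0; i < children.getLength(); i++) {
            var feature = children.item(i);
            if (feature.getName() === name) {
              ge.getFeatures().removeChild(feature);
              return true;
            }
          }
          return false;
        }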

    Read the article

  • DB API for shell scripting (any shell)

    - by foampile
    I am faced with some legacy shell scripts that run batch data-processing jobs in Oracle using SQL*Plus. For the most part, the data tier does not have to communicate back to the script with retrieved data for shell-level processing, but in a few cases it does. The problem is, SQL*Plus is really meant to be an end-user app, not an API that can communicate with other clients programmatically. That is why people have invented APIs such as DBI/DBD for Perl, JDBC for Java, ODBC, etc. The way it is done now is to invoke SQL*Plus and then parse the output, which is clearly designed for human-eye consumption, using tools like sed and awk. The whole thing is at best a hack and very prone to bugs. Since this client is rather conservative with their technology, they don't want to scale their scripts up to Perl or Python, where there are data access APIs. So I am wondering whether there are similar APIs for a shell, e.g. ksh or bash. What I would like is an API that returns data in a 2-dimensional array of strings (for the lack of typing), so that I can just read DB data like that. The way they do it now is akin to parsing a regular web page's HTML to get a single stock quote rather than cleanly calling a web service and being done with it. Anybody know of a product I can use? Thanks

    Read the article

  • How do I use an API?

    - by GRardB
    Background
    I have no idea how to use an API. I know that all APIs are different, but I've been doing research and I don't fully understand the documentation that comes along with them. There's a programming competition at my university in a month and a half that I want to compete in (revolving around APIs), but nobody on my team has ever used one. We're computer science majors, so we have experience programming, but we've just never been exposed to an API. I tried looking at Twitter's documentation, but I'm lost. Would anyone be able to give me some tips on how to get started? Maybe a very easy API with examples, or explaining essential things about common elements of different APIs? I don't need a full-blown tutorial on Stack Overflow; I just need to be pointed in the right direction.
    Update
    The programming languages that I'm most fluent in are C (simple text editor usually) and Java (Eclipse). In an attempt to be more specific with my question: I understand that APIs (and yes, external libraries are what I was referring to) are simply sets of functions.
    Question
    I guess what I'm trying to ask is how I would go about accessing those functions. Do I need to download specific files and include them in my programs, or do they need to be accessed remotely, etc.?
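    Both flavors come down to the same thing: a set of functions someone else wrote. For an external library, you download the files and include them in your program (a .jar on the classpath in Java, a header plus linked library in C), then call its functions directly. For a web API, the "functions" are URLs you access remotely over HTTP. A sketch of the second kind, in JavaScript for brevity, against a purely hypothetical endpoint:

        // Sketch: calling a web API is just an HTTP request plus parsing the
        // response. The endpoint below is made up for illustration.
        var xhr = new XMLHttpRequest();
        xhr.open("GET", "http://api.example.com/users/1", true);
        xhr.onreadystatechange = function () {
          if (xhr.readyState === 4 && xhr.status === 200) {
            var user = JSON.parse(xhr.responseText); // most web APIs return JSON
            console.log(user.name);
          }
        };
        xhr.send();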

    Read the article

  • Save searches in Google Reader

    - by pinniger
    OK, I'm trying to find a way to search my RSS feeds in Google Reader every day for certain phrases. If any of the phrases are found, I want to be notified. I thought Google Alerts would do this, no problem, but it does not. Does anybody know of any services or any other way of doing this?

    Read the article

  • Is Google Desktop Search Safe?

    - by JW
    Somebody told me that Google Desktop Search is unsafe and that Google saves some kind of list of user data about the files on my PC... can anybody tell me something about this? Any experience? Greetz, JW

    Read the article

  • svn path in google apps script

    - by deepasun
    Hi, I want to write a Google Apps Script in a Google Docs spreadsheet that automatically updates the SVN revision of a particular component (stored in one cell of that spreadsheet) when I run the script.
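    One possible shape for this (a sketch with several assumptions: the repository is served over HTTP by mod_dav_svn, whose index page title reads "Revision NNN: /...", and the URL and cell reference are placeholders):

        // Sketch: fetch the repository's HTTP index page, pull the latest
        // revision number out of it, and write it into a spreadsheet cell.
        function updateSvnRevision() {
          var html = UrlFetchApp.fetch("http://svn.example.com/repo/").getContentText();
          var match = html.match(/Revision (\d+):/);
          if (match) {
            SpreadsheetApp.getActiveSheet().getRange("B2").setValue(match[1]);
          }
        }

    The function can then be run from the script editor, or wired to a custom menu or button in the spreadsheet.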

    Read the article

  • Fast way to search stackoverflow.com using google

    - by eSKay
    Every time I have to search something on stackoverflow.com using Google, I have to type the rather long <search term> site:stackoverflow.com. Is there some way to speed up the process, so that I need not type the whole 23 characters of site:stackoverflow.com each and every time? I am using Google Chrome.

    Read the article

  • Friday Fun: Play Air Hockey in Google Chrome

    - by Asian Angel
    Do you like the challenge of fast-paced games? Then get ready to put yourself to the test with the Air Hockey extension for Google Chrome during company time.
    Air Hockey in Action
    There are two ways that you can play Air Hockey: using the drop-down window or opening the game in a new tab. For our example we chose to play in a new tab. Before starting the game you can choose the difficulty level, enable or disable the sound, and/or go to full screen if desired. Note: screenshot of the "Full Screen" version shown below. While playing you really have to stay on top of things – the computer player will beat you rather quickly if you do not. Hustle hustle hustle! With a little bit of practice it does become easier, but even the "Easy Level" on this game will keep you busy. If the normal-size game screen seems just a bit small, you can easily get a larger version using the "Full Screen" link below the game window. Whether your browser is non-maximized as shown here or totally maximized, it will fill the entire browser window area.
    Conclusion
    If you like fast-paced games then the Air Hockey extension certainly fits that criteria and will keep you on your toes. Make sure to keep the sound off while playing during Friday afternoon though!
    Links
    Download the Air Hockey extension (Google Chrome Extensions)

    Read the article

  • Big Data – Final Wrap and What Next – Day 21 of 21

    - by Pinal Dave
    In yesterday's blog post we explored various resources related to learning Big Data, and in this blog post we will wrap up this 21-day series on Big Data. I have been exploring various terms and technologies related to Big Data this entire month. It was indeed fun to write about Big Data over 21 days, but the subject of Big Data is much bigger than anyone can cover in 21 days. My first goal was to write about the basics, and I think we have got that covered pretty well. During these 21 days I have received many questions related to Big Data. I have covered a few of them in this series, and a few more I will be covering in the coming months.

    Now, having covered the Big Data basics, here is the list of things I am personally going to do next. I thought I would share it with you, as it gives a good idea of how to continue the journey with Big Data:
    Build a schedule to read various Apache documentation
    Watch all Pluralsight courses
    Explore the HortonWorks Sandbox
    Start building a presentation about Big Data – a great way to learn something new
    Present on Big Data topics at user group meetings
    Write more blog posts about Big Data

    I am going to continue learning about Big Data, and I want you to continue learning Big Data too. Please leave a comment on how you are going to continue learning about Big Data. I will publish all the informative comments on this blog with due credit. I want to end this series with the infographic by UMUC.
    Reference: Pinal Dave (http://blog.sqlauthority.com)
    Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL

    Read the article

< Previous Page | 25 26 27 28 29 30 31 32 33 34 35 36  | Next Page >