Search Results

Search found 59036 results on 2362 pages for 'fake data'.

Page 124/2362

  • Microsoft Press Deal of the Day - 4/April/2012 - Accessing Data with Microsoft® .NET Framework 4

    - by TATWORTH
The Microsoft Press half-price deal of the day at http://shop.oreilly.com/product/9780735627390.do is the training material for the MCTS Self-Paced Training Kit (Exam 70-516), "Accessing Data with Microsoft® .NET Framework 4": "EXAM PREP GUIDE Ace your preparation for the skills measured by MCTS Exam 70-516—and on the job—with this official Microsoft study guide. Work at your own pace through a series of lessons and reviews that fully cover each exam objective. Then, reinforce and apply what you’ve learned through real-world case scenarios and practice exercises. Maximize your performance on the exam by mastering the skills and experience measured by these objectives: modeling data; managing connections and context; querying data; manipulating data; developing and deploying reliable applications."

    Read the article

  • SQL – Quick Start with Explorer Sections of NuoDB – Query NuoDB Database

    - by Pinal Dave
This is the third post in the series of blog posts I am writing about NuoDB. NuoDB is a very innovative and easy-to-use product. I can clearly see how one can scale out NuoDB with so much ease and confidence. In my very first blog post we discussed how we can install NuoDB (link), and in my second post I discussed how we can manage the NuoDB database transaction engines and storage managers with a few clicks (link). Note: You can download NuoDB from here. In this post, we will learn how we can use the Explorer feature of NuoDB to do various SQL operations. NuoDB has a browser-based Explorer, which is very powerful and has many of the features any IDE would normally have. Let us see how it works in the following step-by-step tutorial. Let us go to the NuoDB Console by typing the following URL in your browser: http://localhost:8080/ It will bring you to the QuickStart screen. Make sure that you have created the sample database. If you have not created the sample database, click on Create Database and create it successfully. Now go to the NuoDB Explorer by clicking on the main tab, and it will ask you for your domain username and password. Enter the username domain and the password bird; alternatively, you can enter the username quickstart and the password quickstart. Once you enter the password you will be able to see the databases. In our example we have installed the sample database, hence you will see the Test database in the Database Hierarchy screen. When you click on the database it will ask for the database login. Note that the database login is different from the domain login, and you will have to enter your database login here. In our case the database username is dba and the password is goalie. Once you enter a valid username and password it will display your database. Further expand your database and you will notice various objects in it. Once you have explored the various objects, select any object and click on Open. When you click on Execute, it will display the SQL script to select the data from the table. The autogenerated script displays the entire result set from the database. The NuoDB Explorer is very powerful and makes the life of developers very easy. If you click on List SQL Statements it will list all the available SQL statements right away in the Query Editor. You can see the popup window in the following image. Here is the cool thing for geeks: you can even click on Query Plan and it will display the text-based query plan as well. In the case of a simple SELECT the query plan will be much simpler; however, when we write complex queries it will be very interesting. We can use the Query Plan tab for performance tuning of the database. Here is another feature: when we click on List Tables in NuoDB Explorer, it lists all the available tables in the query editor. This is very helpful when we are writing a long, complex query. Here is a relatively complex example I have built using INNER JOIN syntax. Right below it I have displayed the query plan, which shows all the little details related to the query. Well, we just wrote a multi-table query and executed it against the NuoDB database. You can use the NuoDB Admin section to do various analyses of the query and its performance. NuoDB is a distributed database built on a patented emergent architecture with full support for SQL and ACID guarantees. It allows you to add Transaction Engine processes to a running system to improve the performance of your system. You can also add a second Storage Manager to your running system for redundancy purposes. Conversely, you can shut down processes when you don’t need the extra database resources. NuoDB also provides developers and administrators with a single intuitive interface for centrally monitoring deployments. If you have read my blog posts and have not tried out NuoDB, I strongly suggest that you download it today and catch up on the learning with me. Trust me: though the product is very powerful, it is extremely easy to learn and use. Reference: Pinal Dave (http://blog.sqlauthority.com)   Filed under: Big Data, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: NuoDB
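
The post shows the Explorer screenshots rather than the SQL itself, so here is a small, hypothetical example of the kind of multi-table INNER JOIN you could run in the query editor, sketched through NuoDB's Python driver (pynuodb). The table and column names are invented, and the exact connect() arguments are an assumption to verify against the driver documentation; the quickstart credentials (dba/goalie) are the ones mentioned in the post.

```python
import pynuodb  # NuoDB's Python DB-API driver; the connection details below are assumptions

# Credentials mirror the quickstart values from the post (dba / goalie).
# The schema, table, and column names are purely illustrative.
connection = pynuodb.connect(database="test", host="localhost",
                             user="dba", password="goalie")
cursor = connection.cursor()

# A simple two-table INNER JOIN, the kind of query whose text-based plan
# the Explorer can show under its Query Plan tab.
cursor.execute("""
    SELECT o.order_id, c.customer_name, o.order_total
    FROM   orders o
    INNER JOIN customers c ON c.customer_id = o.customer_id
    WHERE  o.order_total > 100
""")
for row in cursor.fetchall():
    print(row)

connection.close()
```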

    Read the article

  • How to Import a Data Taxonomy Into the Managed Metadata Service in SharePoint 2010

    - by Wayne
    First, open the Term Store Management Tool (Site Actions > Site Settings > Term Store Management) and download the sample import file. (Remember, Service Applications are configured on a per-Web-Application basis, so use any site collection inside a WebApp configured with your MMS.) Second, insert your data. In the photo below, I demonstrate creating a term called USA. Under that, I create the term Alabama. Under that, 4 cities. Then under USA, a term called Alaska. The point is that we have a hierarchy. Using the import file, we can go 7 layers deep. The last steps are to save the file, head back into the Term Store Management Tool, select or create a group, and import the file.
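
The sample import file is a flat CSV in which each row repeats the path from the top-level term down to the deepest one, with one column per level (up to the 7 levels mentioned above). The column layout below is an assumption that should be checked against the sample file you download in step one, and the term set name and city names are hypothetical placeholders. A minimal sketch that generates the USA/Alabama/Alaska hierarchy from the post:

```python
import csv

# Column layout assumed to match the sample import file provided by the
# Term Store Management Tool; verify it against the file you actually download.
HEADER = ["Term Set Name", "Term Set Description", "LCID", "Available for Tagging",
          "Term Description", "Level 1 Term", "Level 2 Term", "Level 3 Term",
          "Level 4 Term", "Level 5 Term", "Level 6 Term", "Level 7 Term"]

# The USA > Alabama > cities and USA > Alaska hierarchy described in the post.
# "Locations" and the city names are invented for illustration.
rows = [
    ["Locations", "Sample geography term set", "1033", "TRUE", "", "USA", "", "", "", "", "", ""],
    ["", "", "", "TRUE", "", "USA", "Alabama", "", "", "", "", ""],
    ["", "", "", "TRUE", "", "USA", "Alabama", "Birmingham", "", "", "", ""],
    ["", "", "", "TRUE", "", "USA", "Alabama", "Montgomery", "", "", "", ""],
    ["", "", "", "TRUE", "", "USA", "Alaska", "", "", "", "", ""],
]

with open("ImportTermSet.csv", "w", newline="") as handle:
    writer = csv.writer(handle)
    writer.writerow(HEADER)
    writer.writerows(rows)
```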

    Read the article

  • Google I/O 2012 - Powering Your Application's Data using Google Cloud Storage

    Google I/O 2012 - Powering Your Application's Data using Google Cloud Storage, presented by Navneet Joneja and Nathan Herring. Since opening its doors to all developers at Google I/O last year, the Google Cloud Storage team has shipped several features that let you use Google Cloud Storage for a variety of advanced use cases. This session will open with a quick introduction to the product, then shift focus to implementing a variety of advanced applications using new features in Google Cloud Storage. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers. Time: 58:32.

    Read the article

  • Displaying XML data using XSLT transformations in an ASP.Net site

    - by nikolaosk
    In this post, I will try to show you how to display XML data in an ASP.NET website after making some XSLT transformations. You will need to know a few things about XSLT; the best place to find out about it is this link. I am going to explain a few things about XSLT elements and functions in this post anyway. You will see the namespaces we are going to use and some of the main classes and methods. All of these come with the FCL (Framework Class Library), and we just have to know what they do. We will...(read more)
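
The excerpt is cut off before the code, but the underlying idea (load an XML document, load an XSLT stylesheet, apply the transform, and emit the result) is easy to sketch. The article itself uses the ASP.NET/System.Xml classes; the snippet below shows the same transform step in Python with lxml instead, purely as an illustration, and the file names are made up:

```python
from lxml import etree  # lxml bundles an XSLT 1.0 processor

# Hypothetical input document and stylesheet; the article works with the
# .NET Framework classes rather than Python.
xml_doc = etree.parse("books.xml")
xslt_doc = etree.parse("books.xslt")

transform = etree.XSLT(xslt_doc)   # compile the stylesheet
html_result = transform(xml_doc)   # apply it to the XML data

# The transformed markup could then be injected into the page.
print(etree.tostring(html_result, pretty_print=True).decode())
```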

    Read the article

  • Oracle Database 11g Underground Advice for Database Administrators, by April C. Sims

    - by alejandro.vargas
    Recently I received a request to review the book "Oracle Database 11g Underground Advice for Database Administrators" by April C. Sims. I was happy to have the opportunity to learn some details about the author; she is an active contributor to the Oracle DBA community through her blog "Oracle High Availability". The book is a serious and interesting work. I think it provides a good study and reference guide for DBAs who want to understand and implement highly available environments. She starts by walking through the more general aspects and skills required of a DBA, and then goes on to explain the steps required to implement Data Guard, use RMAN, upgrade to 11g, etc.

    Read the article

  • Kicking yourself because you missed the Oracle OpenWorld and Oracle Develop Call for Papers?

    - by charlie.berger
    Here's a great opportunity! If you missed the Oracle OpenWorld and Oracle Develop Call for Papers, here is another chance to submit a paper to present. Submit a paper and ask your colleagues, the Oracle Mix community, friends, and anyone else you know to vote for your session. As applications of data mining and predictive analytics are always interesting, your chances of getting accepted by votes are higher. Note, only Oracle Mix members are allowed to vote. Voting is open from the end of May through June 20. For the most part, the top-voted sessions will be selected for the program (although we may choose sessions in order to balance the content across the program). Please note that Oracle reserves the right to decline sessions that are not appropriate for the conference, such as subjects that are competitive in nature or sessions that cover outdated versions of products. Oracle OpenWorld and Oracle Develop Suggest-a-Session: https://mix.oracle.com/oow10/proposals FAQ: https://mix.oracle.com/oow10/faq

    Read the article

  • Exam 71-516: Accessing Data with Microsoft .NET Framework 4

    - by Ricardo Peres
    I had the chance to take the beta version of exam 71-516 today. Here are my thoughts on it: first, I was rather annoyed to discover that I will only know whether I passed about 8 weeks after the beta period expires (July 2), which probably means September. It was a difficult exam, especially since I don't have any practice with some of the new Entity Framework options. The items covered, from the most covered to the least covered, were: Entity Framework (50-50 for POCO/non-POCO); LINQ to SQL; WCF Data Services; classic ADO.NET (DataSets, DataTables, DataAdapters, TableAdapters, Connections and Commands); LINQ to XML; Sync Framework (surprise!). All added up, I think it was a difficult exam. My advice is that you practice a lot! I will post the result as soon as I know it.

    Read the article

  • Video: Telerik Silverlight Chart showing live data from SharePoint 2010

    - by Sahil Malik
    In this video, I demonstrate: the process of writing, authoring, deploying, configuring, and debugging a custom WCF service in SharePoint 2010; integrating it with the Telerik Silverlight RAD Chart, which shows live data from the server (CPU usage of your web front end, and it can be enhanced to show whatever else you want); and doing all this in Visual Studio 2010, including how you'd put your project together and how you go about diagnosing and debugging it, the whole bit. The whole presentation is about 45 minutes, and it's mostly all code, so plenty of juicy stuff here! At the end of this, you have a pretty sexy app running... just fast forward to the end of the video below, and you'll see what I'm talking about. :) You can watch the video here.

    Read the article

  • How to Send Custom data in ADF/XML format

    - by Kaidul Islam Sazal
    I have built a lead generator form (here) which will send email in ADF/XML format. But all I found on the internet about ADF is that there is a limited set of tags, like contact, customer, vendor, and vehicle, and their corresponding sub-tags. I have to send this information via ADF/XML: Your Company Name, App Name, Description for iTunes and Google, Keywords, Image or logo for icon, First splash page image, Second splash page image, About Company, Work schedule, List up to 10 services your company provides, Company Email, Company Phone, Company Website, Company Facebook, Company Youtube, Company Twitter, Company Google, Comments. My question is: how can I send all this data in ADF/XML? What will the tags for these be? What will the format be? I didn't find any specific answer to this on the internet.

    Read the article

  • Script to embed image and CSS data in HTML files

    - by andreas-h
    I have an HTML page which references external stylesheets and shows some images. I'm looking for an easy way to include all referenced external resources in the HTML file directly, so that it doesn't have any external references any more. (Images should be included in the HTML file using the <img src="data:image/jpg;base64,[...]"> method.) EDIT: I want to do this so that my web proxy can deliver a nice-looking error page if the backend is down, so I have to assume all my webservers cannot deliver the static content which would normally be linked to from my websites (CSS, images).
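
One way to do this is a small script that reads the page, base64-encodes each referenced file, and splices the result back in as a data URI. The sketch below is a minimal Python version under the assumption that the stylesheets and images are local files referenced with simple relative href/src attributes; a production version should use a real HTML parser rather than regular expressions, and the file name at the bottom is hypothetical.

```python
import base64
import mimetypes
import re
from pathlib import Path

def data_uri(path: Path) -> str:
    """Return the file contents as a base64 data URI."""
    mime = mimetypes.guess_type(path.name)[0] or "application/octet-stream"
    payload = base64.b64encode(path.read_bytes()).decode("ascii")
    return f"data:{mime};base64,{payload}"

def inline_resources(html_path: str) -> str:
    base = Path(html_path).parent
    html = Path(html_path).read_text(encoding="utf-8")

    # Replace <link ... href="*.css"> tags with inline <style> blocks.
    def css_repl(match):
        css = (base / match.group(1)).read_text(encoding="utf-8")
        return f"<style>{css}</style>"
    html = re.sub(r'<link[^>]+href="([^"]+\.css)"[^>]*/?>', css_repl, html)

    # Replace <img src="..."> references with base64 data URIs.
    def img_repl(match):
        return match.group(0).replace(match.group(1), data_uri(base / match.group(1)))
    html = re.sub(r'<img[^>]+src="([^"]+)"', img_repl, html)

    return html

if __name__ == "__main__":
    print(inline_resources("error.html"))
```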

    Read the article

  • Manchester UG Presentation Video

    In July I was invited to speak at the UK SQL Server UG event in Manchester. I spoke about Excel being a good data mining client. I was a little rushed at the end, as Chris Testa-O'Neill told me I had only 5 minutes to go when I had only been talking for 10 minutes. Apparently I have a reputation for running over my time allocation. At the event we also had a product demo from SQL Sentry around their BI monitoring dashboard solution; this includes SSIS, but the main thrust was SSAS. Then came Chris with a look at Analysis Services. If you have never heard Chris talk then take the opportunity now; he is a top-class presenter and I am often found sat at the back of his classes. Here is the video link

    Read the article

  • A deque based on binary trees

    - by Greg Ros
    This is a simple immutable deque based on binary trees. What do you think about it? Does this kind of data structure, or possibly an improvement thereof, seem useful? How could I improve it, preferably without getting rid of its strengths (not in the sense of more operations, but in the sense of a different design)? Does this sort of thing have a name? In the diagrams, red nodes are newly instantiated and blue ones are reused; nodes aren't actually red or anything, it's just for emphasis.
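
The diagrams aren't reproduced here, but the core idea the colours illustrate (each operation allocates a few new nodes while the untouched subtree is shared between the old and new versions) can be sketched in a handful of lines. This is a deliberately simplified, unbalanced illustration of structure sharing in Python, not the poster's actual design:

```python
class Leaf:
    """A leaf holds one element of the deque."""
    __slots__ = ("value",)
    def __init__(self, value):
        self.value = value

class Node:
    """An internal node pairs two subtrees; elements read left to right."""
    __slots__ = ("left", "right")
    def __init__(self, left, right):
        self.left = left
        self.right = right

def push_front(tree, value):
    # The new node is freshly allocated; the old tree is reused untouched.
    return Leaf(value) if tree is None else Node(Leaf(value), tree)

def push_back(tree, value):
    return Leaf(value) if tree is None else Node(tree, Leaf(value))

def pop_front(tree):
    """Return (first_value, remaining_tree); only the left spine is rebuilt."""
    if isinstance(tree, Leaf):
        return tree.value, None
    value, rest = pop_front(tree.left)
    return value, tree.right if rest is None else Node(rest, tree.right)

# Usage: build the deque 1, 2, 3 and pop from the front.
d = None
for x in (2, 3):
    d = push_back(d, x)
d = push_front(d, 1)
first, d = pop_front(d)
print(first)  # 1
```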

    Read the article

  • MySQL - ODBC Data Connector

    - by Stuart Brierley
    Having previously installed and then configured MySQL, you may now need to install the ODBC Data Connector driver in order to connect to your MySQL database. Following the splash screen, the first thing to choose is the Setup Type for your installation. As usual I chose Custom so that I could see the components that were actually being installed; in this case the custom setup screen allows you to choose to install the driver and the documentation. Finally, you can complete the installation. Assuming it completes okay, you have now installed the MySQL ODBC driver. My intention in installing all these MySQL components is so that I can now attempt to get BizTalk 2009 talking to the MySQL database for a solution that I am currently working on. For this I will next be looking at the Community ODBC Adapter.
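
The post's end goal is BizTalk connectivity, but once the driver is installed it can be sanity-checked from any ODBC client. A minimal sketch using Python's pyodbc is shown below; the DRIVER name is an assumption (it must match whatever the Connector/ODBC installer registered on your machine, visible in the ODBC Data Source Administrator), and the server, database, and credentials are placeholders.

```python
import pyodbc  # generic ODBC client, used here only to verify the driver works

# The DRIVER value is an assumption for an older 5.x install; check the exact
# name in the ODBC Data Source Administrator after installation.
conn_str = (
    "DRIVER={MySQL ODBC 5.1 Driver};"
    "SERVER=localhost;"
    "DATABASE=test;"
    "UID=myuser;"
    "PWD=mypassword;"
)

connection = pyodbc.connect(conn_str)
cursor = connection.cursor()
cursor.execute("SELECT VERSION()")   # a simple round trip to prove connectivity
print(cursor.fetchone()[0])
connection.close()
```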

    Read the article

  • Best Practices for SOA 11g Multi Data Center Active-Active Deployment – White Paper

    - by JuergenKress
    Best practice for high availability: this paper describes the recommended Active-Active solutions that can be used for protecting an Oracle Fusion Middleware 11g SOA system against downtime across multiple locations (referred to as the SOA Active-Active Disaster Recovery Solution or SOA Multi Data Center Active-Active Deployment). It provides the required configuration steps for setting up the recommended topologies and guidance about the performance and failover implications of such a configuration. Get the white paper here. SOA & BPM Partner Community: for regular information on Oracle SOA Suite, become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Technorati Tags: high availability, best practice, active deployment, SOA Community, Oracle SOA, Oracle BPM, Community, OPN, Jürgen Kress

    Read the article

  • April 18: Learn about Oracle Hyperion Data Relationship Management

    - by Theresa Hickman
    Do you have multiple charts of accounts on different application instances? Would you like an easy way to synchronize your charts of accounts across instances? If you answered yes, then please join us in an informal reference call with Johnson Controls, who were able to synchronize their charts of accounts across 5 HFM (Hyperion Financial Management) instances using Hyperion Data Relationship Management (DRM). Johnson Controls is a global technology and industrial leader with 162,000 employees, serving customers in more than 150 countries. This call will include a brief overview of Johnson Controls and their solution, followed by a candid discussion and an open question-and-answer session. When: April 18, 2012. Time: 8:00 am PST. Duration: 1 hour. Speaker: Raymond Chontos, HFM Application Manager, Global Financial Systems. Click here to register.

    Read the article

  • SQL Server Data Tools–BI for Visual Studio 2013 Re-released

    - by Greg Low
    Customers used to complain that the tooling for creating BI projects (Analysis Services MD and Tabular, Reporting Services, and Integration Services) was based on earlier versions of Visual Studio than the ones they were using for their other work in Visual Studio (such as C#, VB, and ASP.NET projects). To alleviate that problem, the shipment of those tools has been decoupled from the shipment of the SQL Server product. In SQL Server 2014, the BI tooling isn’t even included in the released version of SQL Server. This allows the team to keep up to date with the releases of Visual Studio. A little while back, I was really pleased to see that the Visual Studio 2013 update for SSDT-BI (SQL Server Data Tools for Business Intelligence) had been released. Unfortunately, it then had to be withdrawn. The good news is that it’s back, and you can get the latest version from here: http://www.microsoft.com/en-us/download/details.aspx?id=42313

    Read the article

  • Showing schema.org aggregate rating in Google rich snippets

    - by nickh
    After adding schema.org microdata markup for reviews and aggregate ratings, I expected review and rating information to show up in rich snippets. Unfortunately, neither is being shown. Google's Structured Data Testing Tool finds the microdata, and there are no errors or warnings on the page. Any idea what's wrong with the microdata markup? Example 1: Live Page: http://www.shelflife.net/ljn-thundercats-series-3/bengali Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fljn-thundercats-series-3%2Fbengali Example 2: Live Page: http://www.shelflife.net/star-wars-mighty-muggs/asajj-ventress Google Test: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.shelflife.net%2Fstar-wars-mighty-muggs%2Fasajj-ventress

    Read the article

  • How to Calculate TCP Socket Buffer Sizes for Data Guard Environments

    - by alejandro.vargas
    The MAA best practices contain an example of how to calculate the optimal TCP socket buffer sizes, which is quite important for very busy Data Guard environments. This document, Formula to Calculate TCP Socket Buffer Sizes.pdf, contains an example of using the instructions provided in the best practices document. In order to execute the calculation you need to know the bandwidth of your network interface (usually this will be 1Gb; in my example it is a 10Gb network) and the round-trip time (RTT), which is the time it takes for a packet to travel to the other end of the network and come back. In my example that was provided by the network administrator and was 3 ms.
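
The heart of the calculation is the bandwidth-delay product (bandwidth multiplied by RTT), to which the MAA guidance applies a multiplier when sizing the socket buffers; the factor of 3 used below is the commonly cited value, but treat it as an assumption to confirm against the linked document. A quick sketch using the 10Gb / 3 ms figures from this example:

```python
# Bandwidth-delay product sketch for Data Guard socket buffer sizing.
# The x3 multiplier is an assumption based on commonly cited MAA guidance;
# confirm the exact factor in the "Formula to Calculate TCP Socket Buffer Sizes" document.

bandwidth_bits_per_sec = 10 * 10**9   # 10 Gb/s network interface (from this example)
rtt_seconds = 3 / 1000                # 3 ms round-trip time (from this example)
multiplier = 3

bdp_bytes = bandwidth_bits_per_sec * rtt_seconds / 8
buffer_bytes = bdp_bytes * multiplier

print(f"Bandwidth-delay product: {bdp_bytes / 1024 / 1024:.2f} MB")    # ~3.58 MB
print(f"Suggested socket buffer: {buffer_bytes / 1024 / 1024:.2f} MB") # ~10.73 MB
```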

    Read the article

  • OTP Bank uses Oracle Warehouse Builder

    - by Fekete Zoltán
    The following document has just appeared among the customer success stories on Oracle.com: OTP Bank Data Warehouse Development Team Improves Service Level and Lowers Reporting Lead Time for Business Fields by 80%. In other words, OTP Bank uses the Oracle Warehouse Builder ETL-ELT tool for its data warehouse development. OTP Bank's Transactional Data Warehouse development team raised the quality of the services it provides to internal customers; one result is that it reduced the lead time of reporting processes across the business lines by 80%. The Hungarian-language success story can be downloaded from here. The most important results relating to OWB: improved data quality achieved through the standardization of ETL processes (OWB); more efficient cooperation between the business areas and IT development with Oracle Business Intelligence EE; standardized ETL and reporting processes; deliberate business metadata management driven by fixed report sets; unified terminology; precise knowledge of complex banking processes for both the business areas and the IT developers; efficient cooperation within the bank; reduced duration of the processes from request to data publication; and ad-hoc report preparation cut by 80%, from the earlier 1.5 weeks to an average of 2 working days.

    Read the article

  • HDWF [was OHDF]

    - by Glen McCallum
    Acronyms, acronyms... same name (Oracle Healthcare Data Warehouse Foundation). Now it goes by HDWF. Don't ask me why. HDWF Version 2.0 was released quietly on 12 May. I'm told it is available on eDelivery. I've been spending more time working on HDWF this month. There's no question Oracle is moving at full steam on this one. I've even spent a few nights this week working on India time with the team over there. We're busy moving Oracle's Operating Room Analytics application onto the new HDWF enterprise healthcare model. It's really been a great illustration of the comprehensiveness of the model; it was easily able to accommodate all of the information required by ORA downstream.

    Read the article

  • Hurricanes Since 1851 [Visualization]

    - by Jason Fitzpatrick
    Much like you can map out volcanic eruptions to create a neat pattern around the Pacific Ring of Fire, you can also map out hurricanes and tropical storms. Check out this high-resolution visualization to see the pattern formed by a century and a half of storms. Courtesy of UXBlog and data from the National Oceanic and Atmospheric Administration, the above projection shows the paths of tropical storms around the equator (the perspective, if the map looks unfamiliar to you, is bottom-up, with Antarctica and the lower portion of South America in the center). For a full-resolution copy of the image and more information about how it was rendered, hit up the link below. Hurricanes Since 1851 [via Cool Infographics]

    Read the article

  • 12/14 IDC Webcast on Insurance Distribution Strategies -- Manage Data and Engage Customers

    - by charles.knapp
    The insurance industry faces unprecedented challenges from new competition, more rigorous regulatory obligations, tighter capital restrictions, and more demanding customers. The winners will be those insurers that can successfully manage complex and disparate data resources to engage with their customers, building trust through outstanding, multi-channel customer service with the insurer and its agents. At the heart of all these issues is the ability of insurers to engage directly with agents and customers using their preferred channels; measure risk and profitability accurately and quickly to enable swift decision-making; and transform aging IT infrastructure so that the business can drive down costs and protect eroding margins. In this one-hour webcast, moderated by Insurance & Technology Magazine Executive Editor Anthony O'Donnell, you will learn about critical distribution management strategies that work. Join Peter Farley of analyst firm IDC Financial Insights, Scott Mampre of Capgemini, and Srini Venkat of Oracle Insurance to learn ways to maximize improvements to competitiveness, customer service, operating efficiencies - and ultimately profitability and growth. Please join us!

    Read the article

  • Closest location - Heapify or Build-heap

    - by Trevor Adams
    So let's say we have a set of GPS data points and your current location. If asked to give the closest point to your current location, we can utilize a heap with the distance as the key. Now if we update the current location, I suspect that only a few of the keys will change enough to violate the heap property. Would it be more efficient to rebuild the heap after recalculating the keys, or to run heapify (assuming that only a few of the keys have changed enough)? It is assumed that we don't jump around with the new location (the new current location is close to the last current location).
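
For reference, the "rebuild" option is already linear: recomputing every key and running a bottom-up build-heap is O(n), which is hard to beat unless you can cheaply identify exactly which entries now violate the heap property. A small sketch of the rebuild approach using Python's heapq; the points and the planar distance function are invented for illustration (a real implementation would use haversine or similar).

```python
import heapq
import math

# Hypothetical GPS points (latitude, longitude), treated as planar for simplicity.
points = [(40.7128, -74.0060), (34.0522, -118.2437), (41.8781, -87.6298)]

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def build_heap(current_location):
    """Recompute every key and build the heap in O(n)."""
    heap = [(distance(current_location, p), p) for p in points]
    heapq.heapify(heap)  # linear-time build-heap
    return heap

heap = build_heap((40.0, -75.0))
print("closest:", heap[0][1])

# After the current location moves slightly, the simplest correct option is to
# rebuild; repairing the heap in place would require knowing which few entries
# changed enough to violate the heap property.
heap = build_heap((40.1, -75.1))
print("closest:", heap[0][1])
```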

    Read the article

  • Javascript and PHP: how should I en/decode my data?

    - by Ron
    Hello everyone. I would like to know what encoding I should use for my data, and why. First of all, I use the GET method because it is a search engine inside the website. Second, I use an RTL language (Hebrew). And third, which is basically why I ask this question: Firefox and Safari (as I understand it) encode and decode URLs automatically, so if I encode a URL, in Firefox I will see it decoded, which is good; but if I copy-paste the URL into the address bar and then enter the site, Firefox encodes the unencoded URL to UTF (I think). Anyway, what encoding/decoding should I use, and how can I overcome Firefox's automatic encoding/decoding?
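
The question is about JavaScript and PHP, but the mechanics are the same in any stack: query-string values travel as percent-encoded bytes, and the least surprising convention is to encode them as UTF-8 on the way out and decode them as UTF-8 on the way in, which matches what the browser typically does when it re-encodes a pasted URL. A small Python sketch, used here only to show what a percent-encoded Hebrew value looks like and that decoding restores the original text; the search term itself is made up.

```python
from urllib.parse import quote, unquote

# A made-up Hebrew search term ("shalom"), used only to illustrate the round trip.
term = "\u05e9\u05dc\u05d5\u05dd"

encoded = quote(term, encoding="utf-8")   # percent-encoded UTF-8 bytes, safe in a GET URL
print(encoded)                            # %D7%A9%D7%9C%D7%95%D7%9D

decoded = unquote(encoded, encoding="utf-8")
print(decoded == term)                    # True: decoding restores the original text
```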

    Read the article
