Search Results

Search found 15930 results on 638 pages for 'analysis services'.

Page 264/638 | < Previous Page | 260 261 262 263 264 265 266 267 268 269 270 271  | Next Page >

  • Use a Fake Http Channel to Unit Test with HttpClient

    - by Steve Michelotti
    Applications get data from lots of different sources. The most common is to get data from a database or a web service. Typically, we encapsulate calls to a database in a Repository object and we create some sort of IRepository interface as an abstraction to decouple between layers and enable easier unit testing by leveraging faking and mocking. This works great for database interaction. However, when consuming a RESTful web service, this is not always the best approach. The WCF Web APIs that are available on CodePlex (current drop is Preview 3) provide a variety of features to make building HTTP REST services more robust. When you download the latest bits, you'll also find a new HttpClient which has been updated for .NET 4.0 as compared to the one that shipped for 3.5 in the original REST Starter Kit. The HttpClient currently provides the best API for consuming REST services on the .NET platform, and the WCF Web APIs provide a number of extension methods which extend HttpClient and make it even easier to use. Let's say you have a client application that is consuming an HTTP service – this could be Silverlight, WPF, or any UI technology, but for my example I'll use an MVC application:

         1: using System;
         2: using System.Net.Http;
         3: using System.Web.Mvc;
         4: using FakeChannelExample.Models;
         5: using Microsoft.Runtime.Serialization;
         6:
         7: namespace FakeChannelExample.Controllers
         8: {
         9:     public class HomeController : Controller
        10:     {
        11:         private readonly HttpClient httpClient;
        12:
        13:         public HomeController(HttpClient httpClient)
        14:         {
        15:             this.httpClient = httpClient;
        16:         }
        17:
        18:         public ActionResult Index()
        19:         {
        20:             var response = httpClient.Get("Person(1)");
        21:             var person = response.Content.ReadAsDataContract<Person>();
        22:
        23:             this.ViewBag.Message = person.FirstName + " " + person.LastName;
        24:
        25:             return View();
        26:         }
        27:     }
        28: }

    On line #20 of the code above you can see I'm performing an HTTP GET request to a Person resource exposed by an HTTP service. On line #21, I use the ReadAsDataContract() extension method provided by the WCF Web APIs to deserialize the response into a Person object. In this example, the HttpClient is being passed into the constructor by MVC's dependency resolver – in this case, I'm using StructureMap as an IoC container and my StructureMap initialization code looks like this:

         1: using StructureMap;
         2: using System.Net.Http;
         3:
         4: namespace FakeChannelExample
         5: {
         6:     public static class IoC
         7:     {
         8:         public static IContainer Initialize()
         9:         {
        10:             ObjectFactory.Initialize(x =>
        11:             {
        12:                 x.For<HttpClient>().Use(() => new HttpClient("http://localhost:31614/"));
        13:             });
        14:             return ObjectFactory.Container;
        15:         }
        16:     }
        17: }

    My controller code currently depends on a concrete instance of the HttpClient. Now I *could* create some sort of interface, wrap the HttpClient in this interface, and use that object inside my controller instead – however, there are a few reasons why that is not desirable: For one thing, the API provided by the HttpClient provides nice features for dealing with HTTP services. I don't really *want* these to look like C# RPC method calls – when HTTP services have REST features, I may want to inspect HTTP response headers and hypermedia contained within the message so that I can make intelligent decisions as to what to do next in my workflow (although I don't happen to be doing these things in my example above) – this type of workflow is common in hypermedia REST scenarios.
    If I just encapsulate HttpClient behind some IRepository interface and make it look like a C# RPC method call, it will become difficult to take advantage of these types of things. Second, it could get pretty mind-numbing to have to create interfaces all over the place just to wrap the HttpClient. Then you're probably going to have to hard-code HTTP knowledge into your code to formulate requests rather than just "following the links" that the hypermedia in a message might provide. Third, at first glance it might appear that we need to create an interface to facilitate unit testing, but actually it's unnecessary. Even though the code above is dependent on a concrete type, it's actually very easy to fake the data in a unit test. The HttpClient provides a Channel property (of type HttpMessageChannel) which allows you to create a fake message channel which can be leveraged in unit testing. In this case, what I want is to be able to write a unit test that just returns fake data. I also want this to be as re-usable as possible for my unit testing. I want to be able to write a unit test that looks like this:

         1: [TestClass]
         2: public class HomeControllerTest
         3: {
         4:     [TestMethod]
         5:     public void Index()
         6:     {
         7:         // Arrange
         8:         var httpClient = new HttpClient("http://foo.com");
         9:         httpClient.Channel = new FakeHttpChannel<Person>(new Person { FirstName = "Joe", LastName = "Blow" });
        10:
        11:         HomeController controller = new HomeController(httpClient);
        12:
        13:         // Act
        14:         ViewResult result = controller.Index() as ViewResult;
        15:
        16:         // Assert
        17:         Assert.AreEqual("Joe Blow", result.ViewBag.Message);
        18:     }
        19: }

    Notice on line #9, I'm setting the Channel property of the HttpClient to be a fake channel. I'm also specifying the fake object that I want to be in the response on my "fake" Http request. I don't need to rely on any mocking frameworks to do this. All I need is my FakeHttpChannel. The code to do this is not complex:

         1: using System;
         2: using System.IO;
         3: using System.Net.Http;
         4: using System.Runtime.Serialization;
         5: using System.Threading;
         6: using FakeChannelExample.Models;
         7:
         8: namespace FakeChannelExample.Tests
         9: {
        10:     public class FakeHttpChannel<T> : HttpClientChannel
        11:     {
        12:         private T responseObject;
        13:
        14:         public FakeHttpChannel(T responseObject)
        15:         {
        16:             this.responseObject = responseObject;
        17:         }
        18:
        19:         protected override HttpResponseMessage Send(HttpRequestMessage request, CancellationToken cancellationToken)
        20:         {
        21:             return new HttpResponseMessage()
        22:             {
        23:                 RequestMessage = request,
        24:                 Content = new StreamContent(this.GetContentStream())
        25:             };
        26:         }
        27:
        28:         private Stream GetContentStream()
        29:         {
        30:             var serializer = new DataContractSerializer(typeof(T));
        31:             Stream stream = new MemoryStream();
        32:             serializer.WriteObject(stream, this.responseObject);
        33:             stream.Position = 0;
        34:             return stream;
        35:         }
        36:     }
        37: }

    The HttpClientChannel provides a Send() method which you can override to return any HttpResponseMessage that you want. You can see I'm using the DataContractSerializer to serialize the object and write it to a stream. That's all you need to do. In the example above, the only thing I've chosen to do is to provide a way to return different response objects. But there are many more features you could add to your own re-usable FakeHttpChannel. For example, you might want to provide the ability to add HTTP headers to the message. You might want to use a different serializer other than the DataContractSerializer.
    You might want to provide custom hypermedia in the response rather than just an object, or set HTTP response codes. The list goes on. This is just one example of the really cool features being added to the next version of WCF to enable various HTTP scenarios. The code sample for this post can be downloaded here.
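    As a sketch of one such extension (this variation is not part of the original sample), a fake channel that also lets the test dictate the HTTP status code might look something like the following. It reuses the same types shown above; the status-code constructor parameter is purely illustrative, and it assumes HttpResponseMessage exposes a settable StatusCode property:

        using System.IO;
        using System.Net;
        using System.Net.Http;
        using System.Runtime.Serialization;
        using System.Threading;

        namespace FakeChannelExample.Tests
        {
            // Illustrative variation of the FakeHttpChannel above: the test supplies
            // the status code as well as the response object.
            public class FakeHttpChannelWithStatus<T> : HttpClientChannel
            {
                private readonly T responseObject;
                private readonly HttpStatusCode statusCode;

                public FakeHttpChannelWithStatus(T responseObject, HttpStatusCode statusCode)
                {
                    this.responseObject = responseObject;
                    this.statusCode = statusCode;
                }

                protected override HttpResponseMessage Send(HttpRequestMessage request, CancellationToken cancellationToken)
                {
                    return new HttpResponseMessage()
                    {
                        RequestMessage = request,
                        StatusCode = this.statusCode,   // e.g. HttpStatusCode.NotFound for negative tests
                        Content = new StreamContent(this.GetContentStream())
                    };
                }

                private Stream GetContentStream()
                {
                    // Identical to the GetContentStream() in the sample above.
                    var serializer = new DataContractSerializer(typeof(T));
                    Stream stream = new MemoryStream();
                    serializer.WriteObject(stream, this.responseObject);
                    stream.Position = 0;
                    return stream;
                }
            }
        }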

    Read the article

  • User Group meeting in Copenhagen for #powerpivot

    - by Marco Russo (SQLBI)
    Next Monday, March 21st, I will join a special event organized by the Danish SQL Server User Group, Excelbi.dk and the Swedish SQL Server User Group. The meeting will start at 18:00 at the Radisson Royal Blu in Copenhagen, and this is the topic we will discuss: PowerPivot / BISM and the future of a BI Solution. The next version of Analysis Services will offer the BI Semantic Model (BISM), which is based on Vertipaq, the same engine that runs PowerPivot. DAX and PowerPivot have been created as...(read more)

    Read the article

  • Combining Shared Secret and Username Token – Azure Service Bus

    - by Michael Stephenson
    As discussed in the introduction article, this walkthrough will explain how you can implement WCF security with the Windows Azure Service Bus to ensure that you can protect your endpoint in the cloud with a shared secret but also flow through a username token, so that in your listening WCF service you will be able to identify who sent the message. This could either be an application or a user, depending on how you want to use your token.

    Prerequisites
    Before going into the walkthrough I want to explain a few assumptions about the scenario we are implementing, but to keep the article shorter I am not going to walk through all of the steps in how to set some of this up. In the solution we have a simple console application which will represent the client application. There is also the services WCF application which contains the WCF service we will expose via the Windows Azure Service Bus. The WCF Service application in this example was hosted in IIS 7 on Windows 2008 R2 with AppFabric Server installed and configured to auto-start the WCF listening services. I am not going to go through significant detail around the IIS setup because it should not matter in relation to this article; however, if you want to understand more about how to configure WCF and IIS for such a scenario, please refer to the following paper, which goes into a lot of detail about how to configure this. The link is: http://tinyurl.com/8s5nwrz

    The Service Component
    To begin with, let's look at the service component and how it can be configured to listen to the service bus using a shared secret but also accept a username token from the client. In the sample the service component is called Acme.Azure.ServiceBus.Poc.UN.Services. It has a single service, which is the default Visual Studio template you get when you add a new WCF Service Application, so we have a service called Service1 with its Echo method. Nothing special so far! The next step is to look at the web.config file to see how we have configured the WCF service. In the services section of the WCF configuration you can see I have created my service and a local endpoint, which I simply used to do a little bit of diagnostics and to check it was working, but more importantly there is the Windows Azure endpoint which is using the ws2007HttpRelayBinding (note that this should also work just the same if you're using netTcpRelayBinding). The key points to note in the above picture are the service behavior called MyServiceBehaviour and the service bus endpoint behavior called MyEndpointBehaviour. We will go into these in more detail later.

    The Relay Binding
    The relay binding for the service has been configured to use the TransportWithMessageCredential security mode. This is the important bit: the transport security relates to the interaction between the service and the Azure Service Bus it is listening to, and the message credential is where we will use our username token, as specified in the message/clientCredentialType attribute. Note also that we have left the relayClientAuthenticationType set to RelayAccessToken. This means that authentication will be made against ACS for accessing the service bus, and messages will not be accepted from any sender who has not been authenticated by ACS.
    The Endpoint Behaviour
    In the below picture you can see the endpoint behavior, which is configured to use the shared secret client credential for accessing the service bus; for diagnostic purposes I have also included the service registry element. If you are familiar with the Windows Azure Service Bus relay feature the above should be very familiar to you, and this is a very common setup for this section. There is nothing specific to the username token implementation here.

    The Service Behaviour
    Now we come to the bit with most of the username token bits in it. When configuring the service behavior I have included the serviceCredentials element and set it up to use userNameAuthentication, and you can see that I have created my own custom username token validator. This setup means that WCF will hand off to my class for validating the username token details. I have also added the serviceSecurityAudit element to give me a simple auditing-of-access capability.

    My UsernamePassword Validator
    The below picture shows the details of the username password validator class I have implemented. WCF will hand off to this class when validating the token and give me a nice way to check the token credentials against an on-premise store. You have all of the validation features available in a non-service-bus WCF implementation, such as validating the username and password against Active Directory or ASP.NET membership features, or, as in my case above, something much simpler.

    The Client
    Now let's take a look at the client side of this solution and how we can configure the client to authenticate against ACS but also send a username token over to the service component so it can implement additional security checks on-premise. I have a console application, and in the program class I want to use the proxy generated with Add Service Reference to send a message via the Azure Service Bus. You can see in my WCF client configuration below that I have set up my details for the Azure Service Bus URL and am using the ws2007HttpRelayBinding. Next is my configuration for the relay binding. You can see below that I have configured security to use TransportWithMessageCredential, so we will flow the username token with the message, and also the RelayAccessToken relayClientAuthenticationType, which means the component will validate against ACS before being allowed to access the relay endpoint to send a message. After the binding we need to configure the endpoint behavior as in the below picture. This is the normal configuration to use a shared secret for accessing a Service Bus endpoint. Finally, below we have the code of the client in the console application which will call the service bus. You can see that we have created our proxy and then made a normal call to a WCF service, but this time we have also set the ClientCredentials to use the appropriate username and password, which will be flowed through the service bus to our service, which will validate them.

    Conclusion
    As you can see from the above walkthrough, it is not too difficult to configure a service to use both a shared secret and a username token at the same time. This gives you the power and protection offered by the Access Control Service in the cloud but also the ability to flow additional tokens to the on-premise component so that additional security features can be implemented.

    Sample
    The sample used in this post is available at the following location: https://s3.amazonaws.com/CSCBlogSamples/Acme.Azure.ServiceBus.Poc.UN.zip
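    The custom validator described in the "My UsernamePassword Validator" section is shown in the original post as a screenshot; in essence it is a class deriving from WCF's UserNamePasswordValidator, wired up through the serviceCredentials/userNameAuthentication configuration mentioned above. The sketch below is a minimal illustration of that shape; the class name and the hard-coded credential check are illustrative assumptions, not the actual implementation from the sample:

        using System.IdentityModel.Selectors;
        using System.IdentityModel.Tokens;

        namespace Acme.Azure.ServiceBus.Poc.UN.Services
        {
            // Minimal sketch of a custom username/password validator. WCF calls
            // Validate() for each incoming username token; throwing rejects the call.
            public class MyUserNameValidator : UserNamePasswordValidator
            {
                public override void Validate(string userName, string password)
                {
                    // Illustrative check only - a real implementation would validate the
                    // credentials against Active Directory, ASP.NET membership, or
                    // another on-premise store.
                    if (userName != "DemoUser" || password != "DemoPassword")
                    {
                        throw new SecurityTokenException("Unknown username or incorrect password.");
                    }
                }
            }
        }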

    Read the article

  • SolidQ Journal - free SQL goodness for February

    - by Greg Low
    The SolidQ Journal for February just made it out, right at the end of the month on February 28th. But again, it's great to see the content appearing. I've included the second part of the article on controlling the execution context of stored procedures. The first part was in December. Also this month, along with Fernando Guerrero's editorial, Analysis Services guru Craig Utley has written about aggregations, Herbert Albert and Gianluca Holz have continued their double-act and described how to automate database migrations,...(read more)

    Read the article

  • SQL SERVER – SSAS – Multidimensional Space Terms and Explanation

    - by pinaldave
    I was presenting a SQL Server session at one of the Tech Ed On Road events in India. I was asked a very interesting question during the 'Stump the Speaker' session. I am sharing the same with all of you over here.
    Question: Can you tell me in simple words what dimension, member and the other terms of multidimensional space are? There is no simple example for it.
    This is an extremely fundamental question if you know Analysis Services. Those who have no exposure to the same and have not yet started on this subject may find it a bit difficult. I really liked his question, so I decided to answer him there as well as blog about the same over here.
    Answer: Here are the most important terms of multidimensional space – dimension, member, value, attribute and size.
    Dimension – It describes the point of interest for analysis.
    Member – It is one of the points of interest in the dimension.
    Value – It uniquely describes the member.
    Attribute – It is a collection of multiple members.
    Size – It is the total number for any dimension.
    Let us understand this in further detail by taking the example of a space. I am going to take distance as the space in our example.
    Dimension – Distance is a dimension for us.
    Member – Kilometer – We can measure distance in Kilometers.
    Value – 4 – We can measure distance in the kilometer unit and the value of the unit can be 4.
    Attribute – Kilometer, Miles, Meter – The complete set of members is called an attribute.
    Size – 100 KM – The maximum size decided for the dimension is called the size.
    The same example can also be defined by using time space. Here is the example using time space:
    Dimension – Time
    Member – Date
    Value – 25
    Attribute – 1, 2, 3…31
    Size – 31
    I hope it is now clear what the various terms of multidimensional space are.
    Reference: Pinal Dave (http://blog.SQLAuthority.com)
    Filed under: Business Intelligence, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Software Developers

    A Software Developer is a person who analyzes the problem and gathers the information about a particular program. And then on the basis of the analysis the programmer makes error free software which ... [Author: Petter Martine - Computers and Internet - April 11, 2010]

    Read the article

  • Book Review: Fast Track to MDX

    - by Greg Low
    Another book that I re-read while travelling last week was Fast Track to MDX . I still think that it's the best book that I've seen for introducing the core concepts of MDX. SolidQ colleague Mark Whitehorn, along with Mosha Pasumansky and Robert Zare do an amazing job of building MDX knowledge throughout the book. I had dinner with Mark in London a few years back and I was pestering him to update this book. The biggest limitation of the book is that it was written for SQL Server 2000 Analysis Services,...(read more)

    Read the article

  • Why Keyword Research Should Come First

    Every online business should begin with keywords research and analysis. Unfortunately, there are a great number of online entrepreneurs that focus on site construction first before they think of this fundamental element. Not tackling keyword units first can lead to certain difficulties.

    Read the article

  • 7 of the Best Free Linux Bible Software

    <b>LinuxLinks:</b> "Now, let's explore the 7 Bible software at hand. For each title we have compiled its own portal page, providing a screenshot of the software in action, a full description with an in-depth analysis of its features, together with links to relevant resources and reviews."

    Read the article

  • How to keep AST for feature access?

    - by greenoldman
    Consider code like this (let's say it is C++): Foo::Bar.get().X. How should one keep the AST for this -- as a tree with the root at the left, Foo(Bar(get(X))), or with the root at the right, (((Foo)Bar)get)X? Or maybe as a flat structure (a list)? The first one seems more convenient when resolving names, the second when working with it as an expression. I set the tag "parsing", but I am really asking from a semantic-analysis point of view (there is no such tag).
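    For illustration (this is not part of the original question, and is sketched in C# purely for brevity), the "root at the right" shape is what a left-associative expression parser naturally builds, since each member access or call wraps the node constructed so far:

        // Hypothetical node types, for illustration only.
        abstract record Node;
        record Name(string Id) : Node;                      // Foo, Bar, get, X
        record Access(Node Target, string Member) : Node;   // target::member or target.member
        record Call(Node Callee) : Node;                    // callee()

        class Demo
        {
            static void Main()
            {
                // "Root at the right": the outermost node is the final ".X".
                Node rootAtRight =
                    new Access(
                        new Call(
                            new Access(
                                new Access(new Name("Foo"), "Bar"),
                                "get")),
                        "X");

                // The "root at the left" reading, Foo(Bar(get(X))), would instead make Foo
                // the outermost node; a flat alternative is simply the sequence
                // [Foo, ::Bar, .get, (), .X].
                System.Console.WriteLine(rootAtRight);
            }
        }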

    Read the article

  • SharePoint Planning/Design Worksheet Links

    - by Mike Huguet
    I ran across a blog entry with a consolidated list of links to the SharePoint 2007 planning worksheets.  These are good starting points for your discovery, analysis, and design and are provided by Microsoft.  I would suggest tweaking them to meet your organizational needs.  http://itfootprint.wordpress.com/2007/10/05/sharepoint-planning-worksheets-in-one-place/ TechNet provides a consolidated list of planning worksheets for SharePoint 2010.  http://technet.microsoft.com/en-us/library/cc262451.aspx  Technorati Tags: SharePoint,planning,design

    Read the article

  • Why does my WIFI connection drop every 5-20 minutes?

    - by Benett Freeman
    There are other questions on this but none have been taken very far in their analysis. I get disconnects very often - sometimes every 20 mins or so, but sometimes even every few minutes. The way I have been getting around it is to disconnect from the Wifi connection and reconnect - and then it works fine again until the next disconnect. I am running 11.04 on an ASUS A52F. Can anyone help me please?

    Read the article

  • WIF, ADFS 2 and WCF&ndash;Part 2: The Service

    - by Your DisplayName here!
    OK – so let's first start with a simple WCF service and connect that to ADFS 2 for authentication. The service itself simply echoes back the user's claims – just so we can make sure it actually works and to see how the ADFS 2 issuance rules emit claims for the service:

        [ServiceContract(Namespace = "urn:leastprivilege:samples")]
        public interface IService
        {
            [OperationContract]
            List<ViewClaim> GetClaims();
        }

        public class Service : IService
        {
            public List<ViewClaim> GetClaims()
            {
                var id = Thread.CurrentPrincipal.Identity as IClaimsIdentity;
                return (from c in id.Claims
                        select new ViewClaim
                        {
                            ClaimType = c.ClaimType,
                            Value = c.Value,
                            Issuer = c.Issuer,
                            OriginalIssuer = c.OriginalIssuer
                        }).ToList();
            }
        }

    The ViewClaim data contract is simply a DTO that holds the claim information. Next is the WCF configuration – let's have a look step by step. First I mapped all my http based services to the federation binding. This is achieved by using .NET 4.0's protocol mapping feature (this can also be done the 3.x way – but in that scenario all services will be federated):

        <protocolMapping>
          <add scheme="http" binding="ws2007FederationHttpBinding" />
        </protocolMapping>

    Next, I provide a standard configuration for the federation binding:

        <bindings>
          <ws2007FederationHttpBinding>
            <binding>
              <security mode="TransportWithMessageCredential">
                <message establishSecurityContext="false">
                  <issuerMetadata address="https://server/adfs/services/trust/mex" />
                </message>
              </security>
            </binding>
          </ws2007FederationHttpBinding>
        </bindings>

    This binding points to our ADFS 2 installation metadata endpoint. This is all that is needed for svcutil (aka "Add Service Reference") to generate the required client configuration. I also chose mixed mode security (SSL + basic message credential) for best performance. This binding also disables session – you can control that via the establishSecurityContext setting on the binding. This has its pros and cons. Something for a separate blog post, I guess. Next, the behavior section adds support for metadata and WIF:

        <behaviors>
          <serviceBehaviors>
            <behavior>
              <serviceMetadata httpsGetEnabled="true" />
              <federatedServiceHostConfiguration />
            </behavior>
          </serviceBehaviors>
        </behaviors>

    The next step is to add the WIF specific configuration (in <microsoft.identityModel />). First we need to specify the key material that we will use to decrypt the incoming tokens. This is optional for web applications but for web services you need to protect the proof key – so this is mandatory (at least for symmetric proof keys, which is the default):

        <serviceCertificate>
          <certificateReference storeLocation="LocalMachine"
                                storeName="My"
                                x509FindType="FindBySubjectDistinguishedName"
                                findValue="CN=Service" />
        </serviceCertificate>

    You also have to specify which incoming tokens you trust. This is accomplished by registering the thumbprint of the signing keys you want to accept.
    You get this information from the signing certificate configured in ADFS 2:

        <issuerNameRegistry type="...ConfigurationBasedIssuerNameRegistry">
          <trustedIssuers>
            <add thumbprint="d1 … db"
                 name="ADFS" />
          </trustedIssuers>
        </issuerNameRegistry>

    The last step (promised) is to add the allowed audience URIs to the configuration – WCF clients use (by default – and we'll come back to this) the endpoint address of the service:

        <audienceUris>
          <add value="https://machine/soapadfs/service.svc" />
        </audienceUris>

    OK – that's it – now we have a basic WCF service that uses ADFS 2 for authentication. The next step will be to set up ADFS to issue tokens for this service. Afterwards we can explore various options on how to use this service from a client. Stay tuned… (if you want to have a look at the full source code or peek at the upcoming parts – you can download the complete solution here)
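    The ViewClaim data contract itself is not shown in this excerpt; based on the projection in GetClaims() above, a plausible sketch (the property types and attribute usage are assumptions) would be:

        using System.Runtime.Serialization;

        // Assumed shape of the ViewClaim DTO returned by GetClaims(); the four
        // properties match the projection in the service code above.
        [DataContract]
        public class ViewClaim
        {
            [DataMember]
            public string ClaimType { get; set; }

            [DataMember]
            public string Value { get; set; }

            [DataMember]
            public string Issuer { get; set; }

            [DataMember]
            public string OriginalIssuer { get; set; }
        }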

    Read the article

  • Extremely large spike in traffic on the 1st - 4th of every month from mobile browsers

    - by wsanville
    I've noticed that on the 1st - 4th of the recent months (since January), several sites I maintain are getting thousands of requests from mobile browsers, whereas throughout the rest of the month, the numbers are in the single or double digits. Has anybody else noticed this sort of behavior? I don't have the exact user agents logged, but my analysis software (WebTrends) reports the traffic as mostly iPhone/iPad/iPod, Android, and Blackberry.

    Read the article

  • Increase Site Search Ranking With Search Engines

    Increasing your website's search ranking should start with an effective SEO strategy with keyword analysis playing a pivotal role in SEO. However, it can be argued that the less your website needs to rely on search engines for traffic, the more the search engines want to rely on your website.

    Read the article

  • NDepend 4.0 Released

    - by Anthony Trudeau
    Last week version 4.0 of NDepend was released. NDepend is a Visual Studio add-in designed for intense code analysis with the goal of high quality code. A month ago I wrapped up my evaluation of the previous version of NDepend. The new version contains many minor changes, several bug fixes, and adds about 50 new code rules. The version also adds support for Visual Studio 11, .NET Framework 4.5, and Silverlight 5.0. But the biggest change was the shift from CQL to CQLinq.

    Introducing CQLinq
    The latest version replaces the CQL rules language with CQLinq (CQL is still an option although the editor is buried). As you might guess, CQLinq is a flavor of Linq designed specifically for the code rules. The best way to illustrate the differences is with an example. I used the following CQL example in Part 3 of my review:

        WARN IF Count > 0 IN SELECT TYPES WHERE IsInterface AND !NameLike "I"

    This same query looks like this when implemented in CQLinq:

        warnif count > 0
        from t in Types
        where t.IsInterface == true && !t.NameLike("I")
        select t

    I like the syntax and it is a natural fit, but I found writing the queries frustrating in the Queries and Rules Edit window. The Queries and Rules Edit window replaces the CQL Query Edit window. The new editor has the same style of Intellisense as the previous editor. However, it has a few annoyances. The error indicator is a red block. It has the tendency of obscuring your cursor. Additionally, writing CQLinq queries is like writing plain old Linq queries, so the fact that the editor uses Enter to select from Intellisense instead of Tab is jarring. These issues can be an obstacle to writing queries quickly. CQLinq makes it possible to write rules that weren't possible before. Additionally, a JustMyCode domain is now possible, making it easy to eliminate generated code from the analysis.

    Should you Buy?
    I recommend NDepend overall. It has some rough points for me that I have detailed in my earlier evaluation (starting here). But it's definitely worth the money. The bigger question is: should I pay for the upgrade to 4.0? At this point I'm on the fence, but I would go for it if you need support for Visual Studio 11, .NET Framework 4.5, or Silverlight 5.0; or if you need one of the many rules that weren't possible before CQLinq.

    Disclaimer: Patrick Smacchia contacted me about reviewing NDepend. I received a free license in return for sharing my experiences and talking about the capabilities of the add-in on this site. There is no expectation of a positive review elicited from the author of NDepend.

    Resources: NDepend Release Notes
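    To give a flavour of the kind of rule CQLinq enables, here is a sketch of a "methods too big" style rule. It assumes the standard CQLinq vocabulary (Methods, NbLinesOfCode) and is purely illustrative rather than one of NDepend's shipped rules:

        warnif count > 0
        from m in Methods
        where m.NbLinesOfCode > 30
        select m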

    Read the article

  • 24 Extra Hot Free Linux Games (Part 1 of 3)

    <b>LinuxLinks:</b> " Now, let's scrutinize the 8 games at hand. For each game we have compiled its own portal page, providing screenshots of the game in action, a full description of the game, with an in-depth analysis of the features of the game, together with links to relevant resources and reviews."

    Read the article

  • The Other Side of XBRL

    - by john.orourke(at)oracle.com
    With the United States SEC's mandate for XBRL filings entering its third year, and impacting over 7000 additional companies in 2011, there's a lot of buzz in the industry about how companies should address the new reporting requirements.  Should they outsource the XBRL tagging process to a third party publisher, handle the process in-house with a bolt-on XBRL tool, or should they integrate XBRL tagging with the financial close and reporting process?  Oracle is recommending the latter approach, in fact  here's a link to a recent webcast that I did with CFO.com on this topic: http://www.cfo.com/webcasts/index.cfm/l_eventarchive/14548560 But production of XBRL-based filings is only half of the story. The other half is consumption of XBRL by regulators, academics, financial analysts and investors.  As I mentioned in my December article on the XBRL US conference, the feedback from these groups is that they are not really leveraging XBRL for analysis of companies due to a lack of tools and historic XBRL-based data on public companies.   The good news here is that the historic data problem is getting better as large, accelerated filers enter their third year of XBRL filings.  And the situation is getting better on the reporting and analysis tools side of the equation as well - and Oracle is leading the way. In early January, Oracle released the Oracle XBRL Extension for Oracle Database 11g.  This is a "no cost option" on top of the latest Oracle Database 11.2.0.2.0 release. With this added functionality organizations will have the ability to create one or more back-end XBRL repositories based on Oracle Database, which provide XBRL storage and query-ability with a set of XBRL-specific services.  The XBRL Extension to Oracle XML DB integrates easily with Oracle Business Intelligence Suite Enterprise Edition (OBIEE) for analytics and with interactive development environments (IDEs) and design tools for creating and editing XBRL taxonomies. The Oracle XBRL Extension to Oracle Database 11g should be attractive to regulators, stock exchanges, universities and other organizations that need to collect, analyze and disseminate XBRL-based filings.  It should also be attractive to organizations that produce XBRL filings, and need a way to store and compare their own XBRL-based financial filings to those of their peers and competitors. If you would like more information, here's a link to a web page on the Oracle Technology Network with the details about Oracle XBRL Extension for Oracle Database 11g, including data sheet, white paper, presentation, demos and other information: http://www.oracle.com/technetwork/database/features/xmldb/index-087631.html

    Read the article

  • Google Analytics Intelligence

    As we are all aware, Google Analytics has always held a good position among analysis tools. It has been improving every day and recently launched a new feature known as Google Analytics Intelligence.

    Read the article

  • Google I/O 2012 - Enterprise Geospatial in the Cloud

    Google I/O 2012 - Enterprise Geospatial in the Cloud (Sean Maday, Mano Marks). Google now offers a powerful and versatile cloud hosting solution for geospatial data and analysis. Learn how your business can exploit this potential to reduce costs, increase productivity, and deliver services to your employees and developers using familiar tools like Google Earth and the Google Maps API. For all I/O 2012 sessions, go to developers.google.com. From: GoogleDevelopers | Views: 790 | Time: 55:03

    Read the article

  • Who Are the BI Users in Your Neighborhood?

    - by [email protected]
    By Brian Dayton on March 19, 2010 10:52 PM
    Forrester's Boris Evelson recently wrote a blog titled "Who are the BI Personas?" that I enjoyed for a number of reasons. It's a quick read, easy to grasp and (refreshingly) focuses on the users of technology VS the technology. As Evelson admits, he meant to keep the reference chart at a high level because there are too many different permutations and additional sub-categories to make such a chart useful. For me, I wouldn't head into the technical permutations but more the contextual use of BI and the issues that users experience. My thoughts brought up more questions than answers, such as:
    Context:
    - HOW: With the exception of the "Power User" persona--likely some sort of business or operations analyst?
    - WHEN: Are they using the information to make real-time decisions on the front lines (a customer service manager or shipping/logistics VP) or are they using this information for cumulative analysis and business planning? Or both?
    - WHERE: What areas of the business are more or less likely to rely on BI across an organization? Human Resources, Operations, Facilities, Finance--and why are some more prone to use data-driven analysis than others?
    Issues:
    - DELAYS & DRAG ON IT?: One of the persona characteristics Evelson calls out is a reliance on IT. Every persona except for the "Power User" has a heavy reliance on IT for support. What business issues or delays does that cause to users? What is the drag on IT resources who could potentially be creating instead of reporting?
    - HOW MANY CLICKS: If BI is being used within the context of a transaction (a sales manager looking for upsell opportunities, as an example), is that person getting the information within the context of that action or transaction? Or are they minimizing screens, logging into another application or reporting tool, running queries, etc.?
    Who are the BI Users in your neighborhood or line of business? Do Evelson's personas resonate--and do the tools that he calls out (he refers to it as "BI Style") resonate with what your personas have or need? Finally, I'm very interested if BI use is viewed as a bolt-on...or an integrated part of your daily enterprise processes?

    Read the article

  • Free Webinar - Using Enterprise Data Integration Dashboards

    - by andyleonard
    Join Kent Bradshaw and me as we present Using Enterprise Data Integration Dashboards Tuesday 11 Dec 2012 at 10:00 AM ET! If data is the life of the modern organization, data integration is the heart of an enterprise. Data circulation is vital. Data integration dashboards provide enterprise ETL (Extract, Transform, and Load) teams near-real-time status supported with historical performance analysis. Join Linchpins Kent Bradshaw and Andy Leonard as they demonstrate and discuss the benefits of data...(read more)

    Read the article

  • What Precalculus knowledge is required before learning Discrete Math Computer Science topics?

    - by Ein Doofus
    Below I've listed the chapters from a Precalculus book as well as the author-recommended Computer Science chapters from a Discrete Mathematics book. Although these chapters are from two specific books on these subjects, I believe the topics are generally the same between any Precalc or Discrete Math book. What Precalculus topics should one know before starting these Discrete Math Computer Science topics?
    Discrete Mathematics CS Chapters:
    1.1 Propositional Logic, 1.2 Propositional Equivalences, 1.3 Predicates and Quantifiers, 1.4 Nested Quantifiers, 1.5 Rules of Inference, 1.6 Introduction to Proofs, 1.7 Proof Methods and Strategy
    2.1 Sets, 2.2 Set Operations, 2.3 Functions, 2.4 Sequences and Summations
    3.1 Algorithms, 3.2 The Growth of Functions, 3.3 Complexity of Algorithms, 3.4 The Integers and Division, 3.5 Primes and Greatest Common Divisors, 3.6 Integers and Algorithms, 3.8 Matrices
    4.1 Mathematical Induction, 4.2 Strong Induction and Well-Ordering, 4.3 Recursive Definitions and Structural Induction, 4.4 Recursive Algorithms, 4.5 Program Correctness
    5.1 The Basics of Counting, 5.2 The Pigeonhole Principle, 5.3 Permutations and Combinations, 5.6 Generating Permutations and Combinations
    6.1 An Introduction to Discrete Probability, 6.4 Expected Value and Variance
    7.1 Recurrence Relations, 7.3 Divide-and-Conquer Algorithms and Recurrence Relations, 7.5 Inclusion-Exclusion
    8.1 Relations and Their Properties, 8.2 n-ary Relations and Their Applications, 8.3 Representing Relations, 8.5 Equivalence Relations
    9.1 Graphs and Graph Models, 9.2 Graph Terminology and Special Types of Graphs, 9.3 Representing Graphs and Graph Isomorphism, 9.4 Connectivity, 9.5 Euler and Hamilton Paths
    10.1 Introduction to Trees, 10.2 Applications of Trees, 10.3 Tree Traversal
    11.1 Boolean Functions, 11.2 Representing Boolean Functions, 11.3 Logic Gates, 11.4 Minimization of Circuits
    12.1 Languages and Grammars, 12.2 Finite-State Machines with Output, 12.3 Finite-State Machines with No Output, 12.4 Language Recognition, 12.5 Turing Machines
    Precalculus Chapters:
    R.1 The Real-Number System, R.2 Integer Exponents, Scientific Notation, and Order of Operations, R.3 Addition, Subtraction, and Multiplication of Polynomials, R.4 Factoring, R.5 Rational Expressions, R.6 Radical Notation and Rational Exponents, R.7 The Basics of Equation Solving
    1.1 Functions, Graphs, Graphers, 1.2 Linear Functions, Slope, and Applications, 1.3 Modeling: Data Analysis, Curve Fitting, and Linear Regression, 1.4 More on Functions, 1.5 Symmetry and Transformations, 1.6 Variation and Applications, 1.7 Distance, Midpoints, and Circles
    2.1 Zeros of Linear Functions and Models, 2.2 The Complex Numbers, 2.3 Zeros of Quadratic Functions and Models, 2.4 Analyzing Graphs of Quadratic Functions, 2.5 Modeling: Data Analysis, Curve Fitting, and Quadratic Regression, 2.6 Zeros and More Equation Solving, 2.7 Solving Inequalities
    3.1 Polynomial Functions and Modeling, 3.2 Polynomial Division; The Remainder and Factor Theorems, 3.3 Theorems about Zeros of Polynomial Functions, 3.4 Rational Functions, 3.5 Polynomial and Rational Inequalities
    4.1 Composite and Inverse Functions, 4.2 Exponential Functions and Graphs, 4.3 Logarithmic Functions and Graphs, 4.4 Properties of Logarithmic Functions, 4.5 Solving Exponential and Logarithmic Equations, 4.6 Applications and Models: Growth and Decay
    5.1 Systems of Equations in Two Variables, 5.2 Systems of Equations in Three Variables, 5.3 Matrices and Systems of Equations, 5.4 Matrix Operations, 5.5 Inverses of Matrices, 5.6 Systems of Inequalities and Linear Programming, 5.7 Partial Fractions
    6.1 The Parabola, 6.2 The Circle and Ellipse, 6.3 The Hyperbola, 6.4 Nonlinear Systems of Equations

    Read the article

  • How do you track display impressions in Google Analytics on non Google networks?

    - by dee
    Google Analytics has a Multi-Channel Funnels analysis feature that we'd like to use to understand assisted conversions and how each channel has impacted conversion beyond just last-interaction attribution. My current understanding is that the impression tracking part of this feature works really well when playing within Google's search and display networks. Outside of Google's network, I suspect that impression tracking will no longer "just work" and feed back into GA appropriately. What are our options for tracking display impressions on other advertising networks so that we can attribute value correctly in GA?

    Read the article
