Search Results

Search found 69987 results on 2800 pages for 'wcf data services'.

Page 9/2800 | < Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >

  • Data Mining Resources

    - by Dejan Sarka
    There are many different types of analyses, each one with its own pros and cons.

    Relational reports have a predefined structure, and end users cannot change it. They are simple for end users to use. Reports can use real-time data or snapshots of data to show the state of the data at specific points in time. One of the drawbacks is that report authoring is limited to IT pros and advanced users, and any kind of dynamic restructuring is very limited. If real-time data is used for a report, the report has a negative impact on the performance of the source system. Processing of the reports might be slow because the data comes from relational database management systems, which are not optimized exclusively for reporting. If you create a semantic model of your data, your end users can create ad-hoc report structures. However, the development is more complex because a developer is needed to create these semantic models.

    For OLAP, you typically use specialized database management systems. You get lightning speed of analyses. End users can use rich and thin clients to interactively change the structure of the report; typically, they do it graphically. However, the development of an OLAP system is often quite complex. It involves the preparation and maintenance of an enterprise data warehouse and OLAP cubes. In order to exploit the possibility of real-time restructuring of reports, the users must be both active and educated. The data is usually stale, as it is loaded into data warehouses and OLAP cubes with a scheduled process.

    With data mining, a structure is not selected in advance; the algorithms search for the structure. As a result, data mining can give you the most valuable results, because you can discover patterns you did not expect. A data mining model structure is limited only by the attributes that you use to train the model. One of the drawbacks is that a lot of knowledge is needed for a successful data mining project. End users have to understand the results. Subject matter experts and IT professionals need to understand the business problem thoroughly. The development might sometimes be even more complex than the development of OLAP cubes.

    Each type of analysis has its own place in an enterprise system. SQL Server has tools for all kinds of analyses. However, data mining is the most advanced way of analyzing the data; this is the "I" in BI. In order to get the most out of it, you need to learn quite a lot. In this blog post, I am gathering together resources for learning, including forthcoming events.

    Books
    Multiple authors: SQL Server MVP Deep Dives – I wrote an introductory data mining chapter there.
    Erik Veerman, Teo Lachev and Dejan Sarka: MCTS Self-Paced Training Kit (Exam 70-448): Microsoft SQL Server 2008 - Business Intelligence Development and Maintenance – you can find a good overview of a complete BI solution, including data mining, in this book.
    Jamie MacLennan, ZhaoHui Tang, and Bogdan Crivat: Data Mining with Microsoft SQL Server 2008 – can't miss this book if you want to mine your data with SQL Server tools.
    Michael Berry, Gordon Linoff: Mastering Data Mining: The Art and Science of Customer Relationship Management – data mining from both business and technical perspectives.
    Dorian Pyle: Data Preparation for Data Mining – an in-depth book about data preparation.
    Thomas and Ronald Wonnacott: Introductory Statistics – if you thought that you could get away without statistics, then you are not serious about data mining.
    Jiawei Han and Micheline Kamber: Data Mining Concepts and Techniques – in-depth explanation of the most popular data mining algorithms.
    Michael Berry and Gordon Linoff: Data Mining Techniques – another book that explains data mining algorithms, more from a business perspective.
    Paolo Giudici: Applied Data Mining – a very mathematical book, only if you enjoy statistics and mathematics in general.

    Forthcoming presentations
    I am presenting two data mining related sessions during the PASS Summit in Charlotte, NC:
    Wednesday, October 16th, 2013 - Fraud Detection: Notes from the Field – I am showing how to use data mining for a specific business problem. The presentation is based on real-life projects.
    Friday, October 18th: Excel 2013 Advanced Analytics – I am focusing on the Excel Data Mining Add-ins, and how to use them together with Power Pivot and other add-ins. This is the most you can get out of Excel.
    Sinergija 2013, Belgrade, Serbia - Tuesday, October 22nd: Excel 2013 Analytics to the Max – another presentation focusing on the most advanced analytics you can get in Excel.
    SQL Rally Amsterdam, Netherlands - Thursday, November 7th: Advanced Analytics in Excel 2013 – and again I am presenting about data mining in Excel. Why three different titles for the same presentation? I don't know; I guess I forgot the name I proposed every time right after I sent the proposal.

    Courses
    Data Mining with SQL Server 2012 – I wrote a 3-day course for SolidQ. If you are interested in this course, which I could also deliver as a shorter seminar, you can contact your closest SolidQ subsidiary or, of course, me directly at [email protected] or [email protected]. This course could also complement the existing courseware portfolio of training providers, who are welcome to contact me as well.

    OK, now you know: no more excuses. Start learning data mining and get the most out of your data.

    Read the article

  • Looking for Cutting-Edge Data Integration: 2010 Innovation Awards

    - by dain.hansen
    This year's Oracle Fusion Middleware Innovation Awards will honor customers and partners who are creatively using various products across Oracle Fusion Middleware. Brand new to this year's awards is a category for Data Integration. Think you have something unique and innovative with one of our Oracle Data Integration products? We'd love to hear from you! Please submit your nomination today. The deadline for nominations is 5 p.m. PT, Friday, August 6th 2010, and winning organizations will be notified by late August 2010. What you win:
    A FREE pass to Oracle OpenWorld 2010 in San Francisco for select winners in each category
    Recognition by Oracle executives at an awards ceremony held during Oracle OpenWorld 2010 in San Francisco
    An Oracle Middleware Innovation Award Winner plaque
    1-3 meetings with Oracle executives during Oracle OpenWorld 2010
    Feature article placement in Oracle Magazine and placement in an Oracle press release
    A customer snapshot and video testimonial opportunity, to be hosted on oracle.com
    A podcast interview opportunity with a senior Oracle executive

    Read the article

  • Data Integration 12c Raising the Big Data Roof at Oracle OpenWorld

    - by Tanu Sood
    Author: Dain Hansen, Director, Oracle

    It was an exciting OpenWorld 2013 for us in the Data Integration track. Our theme this year was all about 'being future ready', previewing one of our biggest releases this year: Oracle Data Integration 12c. Just this week we followed up on this preview by announcing the general availability of the 12c release for Oracle's key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence.

    Mark Hurd's keynote on day one set the tone for the Data Integration sessions. Mark focused on big data analytics and changing consumer expectations. Real-time insight, in particular, is a key theme for Oracle overall and for the data integration products. In Mark Hurd's keynote we heard from key customers, such as Airbus and Thomson Reuters, how real-time analysis of operational data, including machine data, creates value and in some cases even saves lives. Thomas Kurian gave a deeper look into Oracle's big data and fast data solutions.

    In the opening Data Integration track session, Brad Adelberg, VP of Development, presented Oracle's Data Integration 12c product strategy based on key trends from the opening OpenWorld keynotes. Brad talked about how Oracle's data integration products address the new data integration requirements that evolved with cloud computing, big data, and changing consumer expectations, and how they set the key themes in our products' road map. Brad explained why and how fast time to value, high performance, and future-ready solutions are the top focus areas for product development. If you were not able to attend OpenWorld or this session, I recommend reading the white paper: Five New Data Integration Requirements and How to Meet them with Oracle Data Integration, which provides an in-depth look into how Oracle addresses the new trends in the DI market.

    Following Brad's session, Nick Wagner provided an in-depth review of Oracle GoldenGate's latest features and roadmap. Nick discussed how Oracle GoldenGate's tight integration with Oracle Database sets the product apart from the competition. We also heard that heterogeneity of the product is still a major focus for GoldenGate's development, and that there will be more news on that front with the next major release.
    After GoldenGate's product strategy session, Denis Gray from the PM team presented Oracle Data Integrator's product strategy session, talking about the latest and greatest on ODI. Another good session was delivered by a long-time GoldenGate user, Comcast. Jason Hurd and Amit Patel of Comcast talked about the various use cases in which they deploy Oracle GoldenGate throughout their enterprise, from database upgrades and feeding reporting systems to active-active database synchronization. The Comcast team shared many good tips on how to use GoldenGate for both zero-downtime upgrades and active-active replication with conflict management requirements.

    One of our other important goals this year for the Data Integration track at OpenWorld was hearing from our customers. We ended day 1 on just that, with a wonderful ceremony for the Oracle Excellence Awards for Oracle Fusion Middleware Innovation. The ceremony was held in the Yerba Buena Center for the Arts. Congratulations to Royal Bank of Scotland and The Yalumba Wine Company, the winners in the Data Integration category, selected for their innovative use of Oracle's Data Integration products. You can find more information on the award and the winners in our previous blog post: 2013 Oracle Excellence Awards for Fusion Middleware Innovation.

    Royal Bank of Scotland's Markets and International Banking division provides clients across the globe with seamless trading and competitive pricing, underpinned by a deep knowledge of risk management across the full spectrum of financial products. They handle millions of transactions daily to keep the lifeblood of their clients' businesses flowing, whether through payment management solutions or through bespoke trade finance solutions. Royal Bank of Scotland is leveraging Oracle GoldenGate and Oracle Data Integrator, along with Oracle Business Intelligence Enterprise Edition and Oracle Database, for a variety of solutions. Mainly, Oracle GoldenGate and Oracle Data Integrator are used to feed their data warehouse, providing a real-time data integration solution that delivers transactional data to their analytics system in minutes and enables improved decision making with timely, accurate data for their business users. Oracle Data Integrator's in-database transformation capabilities and its ability to integrate with Oracle GoldenGate for real-time data capture are the foundation of this implementation. With this solution, changes made in the operational system are available in the analytics system the same day they are deployed, with 100% data quality guaranteed. Additionally, the solution has helped to reduce their operational database size from 150GB to 10GB. Impressive! Now what if I told you this solution was built in 3 months and had a less than 6 month return on investment? That's outstanding!

    The Yalumba Wine Company is situated in the Barossa Valley of Australia.
    It is the oldest family-owned winery in Australia, with a unique way of aging their wines in specially crafted 100 liter barrels. Did you know that "Yalumba" is Aboriginal for "all the land around"? The Yalumba Wine Company is growing rapidly, and needed to introduce a more modern standard to its existing manufacturing processes to meet globalization demands, overall time-to-market, and the operational efficiency objectives of product development. The Yalumba Wine Company worked with a partner, Bristlecone, to develop a unique solution whereby Oracle Data Integrator is leveraged to pull data from Salesforce.com and JD Edwards, in addition to their other pre-existing source systems, for consumption into their data warehouse. They have emphasized the overall ease of developing integration workflows with Oracle Data Integrator. The solution has brought better visibility for the business users, faster data loading and transformation into their data warehouse with rapid incorporation of new data sources, and a solid, future-proof foundation for their organization. Moving forward, they plan on leveraging more from Oracle's Data Integration portfolio. Terrific!

    In addition to these two customers, we featured many other important Oracle Data Integrator and Oracle GoldenGate customers. On Tuesday the GoldenGate panel included Land O'Lakes, Smuckers, and Veolia Water. Besides giving us yummy nutrition and healthy water, these companies have another aspect in common: they all use GoldenGate to boost their ERP applications. Please read the recap by Irem Radzik. On Wednesday, the ODI panel included Barry Ralston and Ryan Weber of Infinity Insurance, Paul Stracke of Paychex Inc., and Ian Wall of Vertex Pharmaceuticals, for a session filled with interesting projects, use cases and approaches to leveraging Oracle Data Integrator. Please read the recap by Sandrine Riley for more.

    Thanks to everyone who joined us, and we hope to stay connected! To hear more about our Data Integration 12c products, join us in an upcoming webcast. Follow us at www.twitter.com/ORCLGoldenGate or go to our website at www.oracle.com/goto/dataintegration

    Read the article

  • Implementing a modern web application with Web API on top of old services

    - by Gaui
    My company has many WCF services which may or may not be replaced in the near future. The old web application is written in WebForms and communicates directly with these services via SOAP, and they return DataTables. Now I am designing a new web application in a modern style: an AngularJS client which communicates with an ASP.NET Web API via JSON. The Web API then communicates with the WCF services via SOAP. In the future I want to let the Web API handle all requests and go straight to the database, but because the business logic implemented in the WCF services is complicated, it's going to take some time to rewrite and replace it.

    Now to the problem: I'm trying to make it easy in the near future to replace the WCF services with some other data storage, e.g. another endpoint, database or whatever. I also want to make it easy to unit test the business logic. That's why I have structured the Web API with a repository layer and a service layer. The repository layer communicates directly with the data storage (WCF service, database, or whatever) and the service layer then uses the repository (via Dependency Injection) to get the data. It doesn't care where the data comes from. Later on I can be in control of and structure the data returned from the data storage (DataTable to POCO), and be able to test the logic in the service layer with some mock repository (using Dependency Injection).

    Below is some code to explain where I'm going with this. But my question is: does this all make sense? Am I making this overly complicated, and could it be simplified in any way? Does this design make things too complicated to maintain? My main goal is to make it as easy as possible to switch to another data storage later on, e.g. an ORM, and to be able to test the logic in the service layer. And because the majority of the business logic is implemented in these WCF services (and they return DataTables), I want to be in control of the data and the structure returned to the client. Any advice is greatly appreciated.

    Update 20/08/14: I created a repository factory, so services all share repositories. Now it's easy to mock a repository, add it to the factory and create a provider using that factory. Any advice is much appreciated. I want to know if I'm making things more complicated than they should be. So it looks like this:

    1. Repository Factory

        public class RepositoryFactory
        {
            private Dictionary<Type, IServiceRepository> repositories;

            public RepositoryFactory()
            {
                this.repositories = new Dictionary<Type, IServiceRepository>();
            }

            public void AddRepository<T>(IServiceRepository repo) where T : class
            {
                if (this.repositories.ContainsKey(typeof(T)))
                {
                    this.repositories.Remove(typeof(T));
                }
                this.repositories.Add(typeof(T), repo);
            }

            public dynamic GetRepository<T>()
            {
                if (this.repositories.ContainsKey(typeof(T)))
                {
                    return this.repositories[typeof(T)];
                }
                throw new RepositoryNotFoundException("No repository found for " + typeof(T).Name);
            }
        }

    I'm not very fond of dynamic, but I don't know how to retrieve that repository otherwise.

    2. Repository and service

        // Service repository interface
        // All repository interfaces extend this
        public interface IServiceRepository { }

        // Invoice repository interface
        // Makes it easy to mock the repository later on
        public interface IInvoiceServiceRepository : IServiceRepository
        {
            List<Invoice> GetInvoices();
        }

        // Invoice repository
        // Connects to some data storage to retrieve invoices
        public class InvoiceServiceRepository : IInvoiceServiceRepository
        {
            public List<Invoice> GetInvoices()
            {
                // Get the invoices from somewhere
                // This could be a WCF service, a database, or whatever
                using (InvoiceServiceClient proxy = new InvoiceServiceClient())
                {
                    return proxy.GetInvoices();
                }
            }
        }

        // Invoice service
        // Handles talking to a real or a mock repository
        public class InvoiceService
        {
            // Repository factory
            RepositoryFactory repoFactory;

            // Default constructor
            // By default connects to the real repository
            public InvoiceService(RepositoryFactory repo)
            {
                repoFactory = repo;
            }

            // Service method that gets all invoices from some repository (mock or real)
            public List<Invoice> GetInvoices()
            {
                // Query the repository
                return repoFactory.GetRepository<IInvoiceServiceRepository>().GetInvoices();
            }
        }
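    For what it's worth, a minimal sketch of how the factory above could be exercised in a unit test with a mock repository; the mock class, the canned data, and the assumption that Invoice has a parameterless constructor are all invented for illustration:

        // Hypothetical mock standing in for the WCF-backed repository.
        public class MockInvoiceRepository : IInvoiceServiceRepository
        {
            public List<Invoice> GetInvoices()
            {
                // Canned data; no WCF call is made.
                return new List<Invoice> { new Invoice(), new Invoice() };
            }
        }

        // In a test: register the mock, then resolve it through the service.
        var factory = new RepositoryFactory();
        factory.AddRepository<IInvoiceServiceRepository>(new MockInvoiceRepository());
        var service = new InvoiceService(factory);
        List<Invoice> invoices = service.GetInvoices(); // returns the canned data

    On the dynamic concern: since every value is stored under typeof(T), one option is to declare the method as public T GetRepository<T>() where T : class, IServiceRepository and cast with return (T)this.repositories[typeof(T)];, which keeps the call sites statically typed.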

    Read the article

  • WCF client encrypt message to JAVA WS using username_token with message protection client policy

    - by Alex
    I am trying to create a WCF client app that consumes a Java WS that uses a username_token with message protection client policy. There is a private key installed on the server, and a public certificate file was exported from the JKS keystore file. I have installed the public key into the certificate store via MMC under Personal certificates. I am trying to create a binding that will encrypt the message and pass the username as part of the payload. I have been researching and trying different configurations for about a day now. I found a similar situation on the MSDN forum: http://social.msdn.microsoft.com/Forums/en/wcf/thread/ce4b1bf5-8357-4e15-beb7-2e71b27d7415

    This is the configuration that I am using in my app.config:

        <customBinding>
          <binding name="certbinding">
            <security authenticationMode="UserNameOverTransport">
              <secureConversationBootstrap />
            </security>
            <httpsTransport requireClientCertificate="true" />
          </binding>
        </customBinding>
        <endpoint address="https://localhost:8443/ZZZService?wsdl" binding="customBinding"
                  bindingConfiguration="certbinding" contract="XXX.YYYPortType"
                  name="ServiceEndPointCfg" />

    And this is the client code that I am using:

        EndpointAddress endpointAddress = new EndpointAddress(url + "?wsdl");
        P6.WCF.Project.ProjectPortTypeClient proxy =
            new P6.WCF.Project.ProjectPortTypeClient("ServiceEndPointCfg", endpointAddress);
        proxy.ClientCredentials.UserName.UserName = UserName;
        proxy.ClientCredentials.ClientCertificate.SetCertificate(
            StoreLocation.CurrentUser, StoreName.My, X509FindType.FindByThumbprint,
            "67 87 ba 28 80 a6 27 f8 01 a6 53 2f 4a 43 3b 47 3e 88 5a c1");
        var projects = proxy.ReadProjects(readProjects);

    This is the .NET client error I get: "Invalid security information." On the Java WS side the log traces: "SEVERE: Encryption is enabled but there is no encrypted key in the request." I traced the SOAP headers and payload and confirmed that the encrypted key is indeed missing.

    Headers:

        {expect=[100-continue], content-type=[text/xml; charset=utf-8], connection=[Keep-Alive],
         host=[localhost:8443], Content-Length=[731],
         vsdebuggercausalitydata=[uIDPo6hC1kng3ehImoceZNpAjXsAAAAAUBpXWdHrtkSTXPWB7oOvGZwi7MLEYUZKuRTz1XkJ3soACQAA],
         SOAPAction=[""], Content-Type=[text/xml; charset=utf-8]}

    Payload:

        <s:Envelope xmlns:s="http://schemas.xmlsoap.org/soap/envelope/" xmlns:u="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-utility-1.0.xsd">
          <s:Header>
            <o:Security s:mustUnderstand="1" xmlns:o="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
              <o:UsernameToken u:Id="uuid-5809743b-d6e1-41a3-bc7c-66eba0a00998-1">
                <o:Username>admin</o:Username><o:Password>admin</o:Password>
              </o:UsernameToken>
            </o:Security>
          </s:Header>
          <s:Body xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema">
            <ReadProjects xmlns="http://xmlns.dev.com/WS/Project/V1">
              <Field>ObjectId</Field><Filter>Id='WS-Demo'</Filter>
            </ReadProjects>
          </s:Body>
        </s:Envelope>

    I have also tried some other bindings, but with no success:

        <basicHttpBinding>
          <binding name="basicHttp">
            <security mode="TransportWithMessageCredential">
              <message clientCredentialType="Certificate"/>
            </security>
          </binding>
        </basicHttpBinding>
        <wsHttpBinding>
          <binding name="wsBinding">
            <security mode="Message">
              <message clientCredentialType="UserName" negotiateServiceCredential="false" />
            </security>
          </binding>
        </wsHttpBinding>

    Your help will be greatly appreciated! Thanks!
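    One observation, offered as a guess from the Java-side log: authenticationMode="UserNameOverTransport" only adds a UsernameToken over HTTPS and never encrypts at the message level, which is consistent with the missing EncryptedKey. Below is a sketch of a code-built binding that does produce an encrypted key by encrypting against the service's certificate; the exact MessageSecurityVersion and the certificate lookup are assumptions that would need to match the Java policy:

        // Requires: System.ServiceModel, System.ServiceModel.Channels,
        // System.Security.Cryptography.X509Certificates.
        // Sketch: username token for authentication, message-level encryption and
        // signing with the service's public certificate (this is what yields an
        // EncryptedKey element in the Security header).
        var security = SecurityBindingElement.CreateUserNameForCertificateBindingElement();
        security.MessageSecurityVersion = MessageSecurityVersion
            .WSSecurity10WSTrustFebruary2005WSSecureConversationFebruary2005WSSecurityPolicy11BasicSecurityProfile10;

        var binding = new CustomBinding(
            security,
            new TextMessageEncodingBindingElement(),
            new HttpsTransportBindingElement());

        // The client has to be told which certificate to encrypt with
        // (thumbprint below is a placeholder):
        proxy.ClientCredentials.ServiceCertificate.SetDefaultCertificate(
            StoreLocation.CurrentUser, StoreName.My,
            X509FindType.FindByThumbprint, "<service certificate thumbprint>");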

    Read the article

  • WCF: Configuring Known Types

    - by jerbersoft
    I want to know how to configure known types in WCF. For example, I have a Person class and an Employee class. The Employee class is a subclass of the Person class. Both classes are marked with a [DataContract] attribute. I don't want to hardcode the known type of a class, like putting [ServiceKnownType(typeof(Employee))] on the Person class so that WCF will know that Employee is a subclass of Person. Instead, I added the following XML configuration to the host's App.config:

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
          <system.runtime.serialization>
            <dataContractSerializer>
              <declaredTypes>
                <add type="Person, WCFWithNoLibrary, Version=1.0.0.0,Culture=neutral,PublicKeyToken=null">
                  <knownType type="Employee, WCFWithNoLibrary, Version=1.0.0.0,Culture=neutral, PublicKeyToken=null" />
                </add>
              </declaredTypes>
            </dataContractSerializer>
          </system.runtime.serialization>
          <system.serviceModel>
            .......
          </system.serviceModel>
        </configuration>

    I compiled it, ran the host, added a service reference at the client, added some code and ran the client. But an error occurred:

        The formatter threw an exception while trying to deserialize the message: There was an error while trying to deserialize parameter http://www.herbertsabanal.net:person. The InnerException message was 'Error in line 1 position 247. Element 'http://www.herbertsabanal.net:person' contains data of the 'http://www.herbertsabanal.net/Data:Employee' data contract. The deserializer has no knowledge of any type that maps to this contract. Add the type corresponding to 'Employee' to the list of known types - for example, by using the KnownTypeAttribute attribute or by adding it to the list of known types passed to DataContractSerializer.'. Please see InnerException for more details.

    Below are the data contracts:

        [DataContract(Namespace="http://www.herbertsabanal.net/Data", Name="Person")]
        class Person
        {
            string _name;
            int _age;

            [DataMember(Name="Name", Order=0)]
            public string Name
            {
                get { return _name; }
                set { _name = value; }
            }

            [DataMember(Name="Age", Order=1)]
            public int Age
            {
                get { return _age; }
                set { _age = value; }
            }
        }

        [DataContract(Namespace="http://www.herbertsabanal.net/Data", Name="Employee")]
        class Employee : Person
        {
            string _id;

            [DataMember]
            public string ID
            {
                get { return _id; }
                set { _id = value; }
            }
        }

    Btw, I didn't use class libraries (WCF class libraries or non-WCF class libraries) for my service; I just coded it plainly in the host project. I guess there must be a problem in the config file (please see the config file above), or I must be missing something. Any help would be much appreciated.
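    A guess at the cause, for what it's worth: the type attribute in <declaredTypes> must be the namespace-qualified CLR type name, so "Person, WCFWithNoLibrary, ..." only resolves if the classes live in the global namespace. Assuming the classes are declared inside a WCFWithNoLibrary namespace (an assumption; substitute the real one), the entry would look like this:

        <add type="WCFWithNoLibrary.Person, WCFWithNoLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null">
          <knownType type="WCFWithNoLibrary.Employee, WCFWithNoLibrary, Version=1.0.0.0, Culture=neutral, PublicKeyToken=null" />
        </add>

    If the name in config cannot be resolved via Type.GetType semantics, the serializer silently behaves as if no known types were declared, which matches the error above.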

    Read the article

  • RIA Services vs WCF Data Services

    - by NPehrsson
    My team is evaluating a bigger business portal (invoicing, bookkeeping, salaries...). We are all used to working with DDD and O/R mappers, with NHibernate as our first choice. We have chosen to work with CompositeWPF to keep modularity between all the modules and subsystems in the business portal. We have now evaluated RIA Services and are kind of disappointed with how data-oriented it is. Data-oriented can be good in a service-oriented scenario, but we feel we can achieve that with an object-oriented approach too, and that the OO approach gives us an application with less complexity than the DO approach. For example, it doesn't allow value objects or many-to-many relations, everything needs to have keys, and so on. We haven't looked at WCF Data Services yet, so our question is: is WCF Data Services our answer? Does it integrate well with Silverlight 4? Can we work with it in an OO manner?

    Read the article

  • Migrated SCOM 2007 R2 Reporting Services but reports are gone

    - by Gabriel Guimarães
    I've migrated Reporting Services on a SCOM 2007 R2 install, and noticed that the reports have not been copied. I can create a new report, but the ones I had from the management packs are gone. I've tried re-applying the management packs, but that doesn't re-deploy the reports. When I try to access, for example: Monitoring - Microsoft Windows Print Server - Microsoft Windows Server 2000 and 2003 Print Services - State View - select any item and click Alerts on the right menu, I get the following error:

        Date: 12/24/2010 12:40:35 PM
        Application: System Center Operations Manager 2007 R2
        Application Version: 6.1.7221.0
        Severity: Error
        Message: Cannot initialize report.

        Microsoft.Reporting.WinForms.ReportServerException: The item '/Microsoft.SystemCenter.DataWarehouse.Report.Library/Microsoft.SystemCenter.DataWarehouse.Report.Alert' cannot be found. (rsItemNotFound)
           at Microsoft.Reporting.WinForms.ServerReport.GetExecutionInfo()
           at Microsoft.Reporting.WinForms.ServerReport.GetParameters()
           at Microsoft.EnterpriseManagement.Mom.Internal.UI.Reporting.Parameters.ReportParameterBlock.Initialize(ServerReport serverReport)
           at Microsoft.EnterpriseManagement.Mom.Internal.UI.Console.ReportForm.SetReportJob(Object sender, ConsoleJobEventArgs args)

    The report doesn't exist on the Reporting Services side. How do I re-deploy these reports? Any help is appreciated. Thanks in advance.

    Read the article

  • Should I swap from WCF to NServiceBus

    - by Matt Roberts
    We have a central server that sends and receives messages from a number of PCs located on client networks in various locations. To facilitate this, I'm currently using WCF with TCPNetBindings, using duplex communication secured with certificates. Now, we have a number of issues with this - mainly that we are being asked to support "disconnected mode" (we need to be fault tolerant). From what I know, there is no simple way to do this using the WCF stack - we'd need to implement something ourselves and perhaps use MSMQ. I've been looking at NServiceBus lately, and from what I can see it seems to fit the bill well - fault tolerance, messages can be sent over the internet via a simple http gateway, etc. I know it's well respected in the community, and I can see why from looking into it. So, my question is: does employing NServiceBus sound like a sensible idea, or does anyone have any other suggestions / real-world experience that relate to this? I guess I'm worried about introducing a new tech that I know relatively little about and facing problems with things like securing it, setting everything up in a reliable way, and gotchas along the way. I'm also wary of "gold-plating" the architecture and choosing something shiny that will end up bogging me down in implementation, versus sticking with WCF and just making it work for me. Thanks!

    Read the article

  • Fast Data - Big Data's Achilles heel

    - by thegreeneman
    At OOW 2013, in Mark Hurd and Thomas Kurian's keynote, they discussed Oracle's Fast Data software solution stack and a number of customers deploying Oracle's Big Data / Fast Data solutions, in particular Oracle's NoSQL Database. Since that time, there have been a large number of requests seeking clarification on how the Fast Data software stack works together to deliver on the promise of real-time Big Data solutions.

    Fast Data is a software solution stack that deals with one aspect of Big Data: high velocity. The software in the Fast Data solution stack involves three key pieces and their integration: Oracle Event Processing, Oracle Coherence, and Oracle NoSQL Database. All three of these technologies address a high-throughput, low-latency data management requirement.

    Oracle Event Processing enables continuous query to filter the Big Data fire hose, enables intelligent chained events for real-time service invocation, and augments the data stream to provide Big Data enrichment. Extended SQL syntax allows the definition of sliding windows of time, so SQL statements can look for triggers on events like a breach of a weighted moving average on a real-time data stream.

    Oracle Coherence is a distributed grid caching solution which is used to provide very low latency access to cached data when the data is too big to fit into a single process, so it is spread around in a grid architecture to provide memory-latency-speed access. It also has some special capabilities to deploy remote behavioral execution for "near data" processing.

    The Oracle NoSQL Database is designed to ingest simple key-value data at a controlled throughput rate while providing data redundancy in a cluster to facilitate highly concurrent, low latency reads: for example, when large sensor networks are generating data that needs to be captured while analysts are simultaneously extracting the data using range-based queries for upstream analytics. Another example might be storing cookies from user web sessions for ultra low latency user profile management, also leveraging that data using holistic MapReduce operations with your Hadoop cluster to do segmented site analysis. Understand how NoSQL plays a critical role in Big Data capture and enrichment while simultaneously providing a low latency and scalable data management infrastructure through clustered, always-on, parallel processing in a shared-nothing architecture. Learn how easily a NoSQL cluster can be deployed to provide essential services in industry-specific Fast Data solutions. See these technologies work together in a demonstration highlighting the salient features of these Fast Data enabling technologies in a location-based personalization service.

    The question then becomes how these things work together to deliver an end-to-end Fast Data solution. The answer is that while different applications will exhibit unique requirements that may drive the need for one or the other of these technologies, when it comes to Big Data you often need to use them together. You may need the memory latencies of the Coherence cache but have too much data to cache, so you use a combination of Coherence and Oracle NoSQL to handle extreme-speed cache overflow and retrieval. Here is a great reference on how these two technologies are integrated and work together: Coherence & Oracle NoSQL Database.

    On the stream processing side, the situation is similar to the Coherence case.
    As your sliding windows get larger, holding all the data in the stream can become difficult, and out-of-band data may need to be offloaded into persistent storage. OEP needs an extreme-speed database like Oracle NoSQL Database to help it keep performing in the real-time loop while dealing with persistent spill in the data stream. Here is a great resource to learn more about how OEP and Oracle NoSQL Database are integrated and work together: OEP & Oracle NoSQL Database.

    Read the article

  • Oracle Announces Oracle Big Data Appliance X3-2 and Enhanced Oracle Big Data Connectors

    - by jgelhaus
    Enables Customers to Easily Harness the Business Value of Big Data at Lower Cost

    Engineered System Simplifies Big Data for the Enterprise

    Oracle Big Data Appliance X3-2 hardware features the latest 8-core Intel® Xeon E5-2600 series processors and, compared with the previous generation, the 18 compute and storage servers with 648 TB of raw storage now offer:
    33 percent more processing power with 288 CPU cores;
    33 percent more memory per node with 1.1 TB of main memory;
    and up to a 30 percent reduction in power and cooling

    Oracle Big Data Appliance X3-2 further simplifies implementation and management of big data by integrating all the hardware and software required to acquire, organize and analyze big data. It includes:
    Support for CDH4.1, including software upgrades developed collaboratively with Cloudera to simplify NameNode High Availability in Hadoop, eliminating the single point of failure in a Hadoop cluster;
    Oracle NoSQL Database Community Edition 2.0, the latest version, which brings better Hadoop integration, elastic scaling and new APIs, including JSON and C support;
    The Oracle Enterprise Manager plug-in for Big Data Appliance, which complements Cloudera Manager to enable users to more easily manage a Hadoop cluster;
    Updated distributions of Oracle Linux and the Oracle Java Development Kit;
    An updated distribution of open source R, optimized to work with high-performance multi-threaded math libraries

    Read More
    Data sheet: Oracle Big Data Appliance X3-2
    Oracle Big Data Appliance: Datacenter Network Integration
    Big Data and Natural Language: Extracting Insight From Text
    Thomson Reuters Discusses Oracle's Big Data Platform

    Connectors Integrate Hadoop with Oracle Big Data Ecosystem

    Oracle Big Data Connectors is a suite of software built by Oracle to integrate Apache Hadoop with Oracle Database, Oracle Data Integrator, and Oracle R Distribution. Enhancements to Oracle Big Data Connectors extend these data integration capabilities. With updates to every connector, this release includes:
    Oracle SQL Connector for Hadoop Distributed File System, for high-performance SQL queries on Hadoop data from Oracle Database, enhanced with increased automation and querying of Hive tables, and now supported within the Oracle Data Integrator Application Adapter for Hadoop;
    Transparent access to the Hive Query Language from R and the introduction of new analytic techniques executing natively in Hadoop, enabling R developers to be more productive by increasing access to Hadoop in the R environment.

    Read More
    Data sheet: Oracle Big Data Connectors
    High Performance Connectors for Load and Access of Data from Hadoop to Oracle Database

    Read the article

  • WSDL-world vs CLR-world – some differences

    - by nmarun
    A change in mindset is required when switching between a typical CLR application and a web service application. There are some things in a CLR environment that just don't add up in a WSDL arena (and vice-versa). I'm listing some of them here. When I say WSDL-world, I'm mostly talking with respect to a WCF Service and / or a Web Service.

    No (direct) method overloading: You definitely can have overloaded methods in, say, a console application, but when it comes to a WCF / Web Services application, you need to adorn these overloaded methods with a special attribute so the service knows which specific method to invoke. When you're working with WCF, use the Name property of the OperationContract attribute to provide unique names:

        [OperationContract(Name = "AddInt")]
        int Add(int arg1, int arg2);

        [OperationContract(Name = "AddDouble")]
        double Add(double arg1, double arg2);

    By default, the proxy generates the code for this as:

        [System.ServiceModel.OperationContractAttribute(
            Action="http://tempuri.org/ILearnWcfService/AddInt",
            ReplyAction="http://tempuri.org/ILearnWcfService/AddIntResponse")]
        int AddInt(int arg1, int arg2);

        [System.ServiceModel.OperationContractAttribute(
            Action="http://tempuri.org/ILearnWcfServiceExtend/AddDouble",
            ReplyAction="http://tempuri.org/ILearnWcfServiceExtend/AddDoubleResponse")]
        double AddDouble(double arg1, double arg2);

    With Web Services, though, the story is slightly different. Even after setting the MessageName property of the WebMethod attribute, the proxy does not change the name of the method; only the underlying soap message changes.

        [WebMethod]
        public string HelloGalaxy()
        {
            return "Hello Milky Way!";
        }

        [WebMethod(MessageName = "HelloAnyGalaxy")]
        public string HelloGalaxy(string galaxyName)
        {
            return string.Format("Hello {0}!", galaxyName);
        }

    The one thing you need to remember is to set the WebServiceBinding accordingly:

        [WebServiceBinding(ConformsTo = WsiProfiles.None)]

    The proxy is:

        [System.Web.Services.Protocols.SoapDocumentMethodAttribute("http://tempuri.org/HelloGalaxy",
            RequestNamespace="http://tempuri.org/",
            ResponseNamespace="http://tempuri.org/",
            Use=System.Web.Services.Description.SoapBindingUse.Literal,
            ParameterStyle=System.Web.Services.Protocols.SoapParameterStyle.Wrapped)]
        public string HelloGalaxy()

        [System.Web.Services.WebMethodAttribute(MessageName="HelloGalaxy1")]
        [System.Web.Services.Protocols.SoapDocumentMethodAttribute("http://tempuri.org/HelloAnyGalaxy",
            RequestElementName="HelloAnyGalaxy",
            RequestNamespace="http://tempuri.org/",
            ResponseElementName="HelloAnyGalaxyResponse",
            ResponseNamespace="http://tempuri.org/",
            Use=System.Web.Services.Description.SoapBindingUse.Literal,
            ParameterStyle=System.Web.Services.Protocols.SoapParameterStyle.Wrapped)]
        [return: System.Xml.Serialization.XmlElementAttribute("HelloAnyGalaxyResult")]
        public string HelloGalaxy(string galaxyName)

    You see the calling method name is the same in the proxy; however, the soap message that gets generated is different.

    Using interchangeable data types: See details on this here.

    Type visibility: In a CLR-based application, if you mark a field as private, well, we all know it's 'private'. Coming to the WSDL side of things, in a Web Service, private fields and web methods will not get generated in the proxy. In WCF, however, all your operation contracts will be public as they get implemented from an interface.
    Even if your ServiceContract interface is declared internal/private, you will see it as a public interface in the proxy. This is because type visibility is a CLR concept and has no bearing on WCF. Also, if a private field has the [DataMember] attribute in a data contract, it will get emitted in the proxy class as a public property for the very same reason.

        [DataContract]
        public struct Person
        {
            [DataMember]
            private int _x;

            [DataMember]
            public int Id { get; set; }

            [DataMember]
            public string FirstName { get; set; }

            [DataMember]
            public string Header { get; set; }
        }

    See how the '_x' field is a private member with the [DataMember] attribute, yet the proxy class shows it as below:

        [System.Runtime.Serialization.DataMemberAttribute()]
        public int _x {
            get {
                return this._xField;
            }
            set {
                if ((this._xField.Equals(value) != true)) {
                    this._xField = value;
                    this.RaisePropertyChanged("_x");
                }
            }
        }

    Passing derived types to web methods / operation contracts: Once again, in a CLR application, I can have a derived class be passed as a parameter where a base class is expected. I have the following set up for my WCF service:

        [DataContract]
        public class Employee
        {
            [DataMember(Name = "Id")]
            public int EmployeeId { get; set; }

            [DataMember(Name="FirstName")]
            public string FName { get; set; }

            [DataMember]
            public string Header { get; set; }
        }

        [DataContract]
        public class Manager : Employee
        {
            [DataMember]
            private int _x;
        }

        // service contract
        [OperationContract]
        Manager SaveManager(Employee employee);

        // in my calling code
        Manager manager = new Manager { _x = 1, FirstName = "abc" };
        manager = LearnWcfServiceClient.SaveManager(manager);

    The above will throw an exception saying, in short, that a Manager type was found where an Employee type was expected!

    Hierarchy flattening of interfaces in WCF: See details on this here. In the CLR world, you'll see the entire hierarchy as is. That's another difference.

    Using ref parameters: You can use ref for parameters, but the operation contract should not be one-way (you get an error when you do an update service reference). Arguably this is bad programming anyway: create a return object that is composed of everything you need! This one kind of stumped me. Not sure why I tried this, but you can pass parameters prefixed with the ref keyword (terms and conditions apply). The main issue is this: how would we know the changes that were made to a 'ref' input parameter are returned back from the service and updated in the local variable? It turns out both Web Services and WCF make this tracking happen by passing the input parameter back in the response soap. This way, when the deserializer does its magic, it maps all the elements of the response xml, thereby updating our local variable. Here's what I'm talking about:

        [WebMethod(MessageName = "HelloAnyGalaxy")]
        public string HelloGalaxy(ref string galaxyName)
        {
            string output = string.Format("Hello {0}", galaxyName);
            if (galaxyName == "Andromeda")
            {
                galaxyName = string.Format("{0} (2.5 million light-years away)", galaxyName);
            }
            return output;
        }

    This is how the request and response look in soapUI. As I said above, the behavior is quite similar for WCF as well. But the catch comes when you have one-way web methods / operation contracts.
    If you have an operation contract whose return type is void, that is marked one-way, and that has ref parameters, then you'll get an error message when you try to reference such a service:

        [OperationContract(Name = "Sum", IsOneWay = true)]
        void Sum(ref double arg1, ref double arg2);

        public void Sum(ref double arg1, ref double arg2)
        {
            arg1 += arg2;
        }

    This is what I got when I did an update to my service reference. Makes sense, because a OneWay operation is… one-way: there's no returning from this operation. You can also have a one-way web method:

        [SoapDocumentMethod(OneWay = true)]
        [WebMethod(MessageName = "HelloAnyGalaxy")]
        public void HelloGalaxy(ref string galaxyName)

    This will throw an exception message similar to the one above when you try to update your web service reference. In the CLR space, there's no such concept of a 'one-way' street! Yes, there's void, but you very well can have ref parameters returned through such a method. Just a point here: although the ref/out concept sounds cool, it generally is a code smell. The better approach is to always return an object that is composed of everything you need returned from a method.

    These are some of the differences that we need to bear in mind when dealing with services, which differ from our daily 'CLR' life.

    Read the article

  • Best approach to accessing multiple data source in a web application

    - by ced
    I have a base web application developed with .NET technologies (ASP.NET), used on our LAN by 30 users simultaneously. From this web application I've developed two verticalizations used by online users. In the future I expect hundreds of simultaneous users. Our company has different locations, and each site uses its own database. The web application needs to retrieve information from all existing databases. Currently there are 3 databases, but future expansion to new offices is not excluded. My question then is: what is the best strategy for a web application to retrieve information from different databases (which have the same schema), where the main objectives are data-access performance and high fault tolerance? Are there case studies in the literature that I can take as an example? Do you know some good documents to study? Do you have any tips for implementing this task efficiently? Intuitively I would say that two possible strategies are:
    perform queries against the different sources in real time and aggregate the data on the fly;
    create a repository that contains the union of the entities of interest and perform queries directly against the repository;
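    A minimal sketch of the first strategy in C#, assuming SQL Server at every site, identical schemas, and a placeholder Customer table; each site is queried in parallel and the results are merged in memory:

        using System.Collections.Generic;
        using System.Data.SqlClient;
        using System.Linq;
        using System.Threading.Tasks;

        public class Customer { public int Id; public string Name; }

        public static class MultiSiteQuery
        {
            // Fan the same query out to every site and concatenate the results.
            public static List<Customer> GetCustomersFromAllSites(IEnumerable<string> connectionStrings)
            {
                var tasks = connectionStrings.Select(cs => Task.Run(() =>
                {
                    var results = new List<Customer>();
                    using (var conn = new SqlConnection(cs))
                    using (var cmd = new SqlCommand("SELECT Id, Name FROM Customers", conn))
                    {
                        conn.Open();
                        using (var reader = cmd.ExecuteReader())
                            while (reader.Read())
                                results.Add(new Customer { Id = reader.GetInt32(0), Name = reader.GetString(1) });
                    }
                    return results;
                })).ToArray();

                // A failed site surfaces here; catch per task instead if one
                // unreachable office should not fail the whole query.
                Task.WaitAll(tasks);
                return tasks.SelectMany(t => t.Result).ToList();
            }
        }

    The second strategy (a consolidated repository fed by replication or scheduled ETL) trades freshness for faster, fault-isolated reads; which one fits depends on how stale the aggregated data may be.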

    Read the article

  • WCF Error - Security processor was unable to find a security header in the message

    - by quinntheeskimo
    Hi, I'm getting what now appears to be a security error in my WCF service. Originally the error was about a faulted state (I removed the using block around the client proxy to clear that), but I have found more information by enabling tracing. I have been unable to get my solution running after encountering this error, and even my backup copy now gets the same error. I'm not sure what has caused this to happen; I undid the changes I made (nothing relating to WCF) and still get the same error. The error from the trace is:

        System.ServiceModel.Security.MessageSecurityException: Security processor was unable to find a security header in the message. This might be because the message is an unsecured fault or because there is a binding mismatch between the communicating parties. This can occur if the service is configured for security and the client is not using security.

    I'm not really sure what I need to do to fix this; any help would be useful. The application was previously working.
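    For context, the usual culprit behind this message is that the service and client binding configurations have drifted apart, e.g. message security enabled on one side only. A hedged illustration of what has to match in both config files (the binding name is a placeholder):

        <!-- The SAME security mode and credential type must appear in both the
             service's and the client's binding configuration. If the service
             says mode="Message" and the client says mode="None", the service
             finds no security header and throws MessageSecurityException. -->
        <wsHttpBinding>
          <binding name="matchedBinding">
            <security mode="Message">
              <message clientCredentialType="Windows" />
            </security>
          </binding>
        </wsHttpBinding>

    Re-running "Update Service Reference" after any service-side binding change regenerates the client config and is a quick way to resynchronize the two.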

    Read the article

  • How to achieve maximum callback throughput with WCF duplex channels

    - by Schneider
    I have set up a basic WCF client/server pair communicating via named pipes. It is a duplex contract with a callback. After the client "subscribes", a thread on the server just invokes the callback as quickly as possible. The problem is I am only getting a throughput of 1,000 callbacks per second, and the payload is only an integer! I need to get closer to 10,000. Everything is essentially running with default settings. What can I look at to improve things, or should I just drop WCF for some other technology? Thanks
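    One knob worth checking, offered as a sketch rather than a guaranteed fix: if the callback operations are request-reply (the WCF default), the publishing thread blocks on an empty reply message for every invocation. Marking the callback one-way removes that round trip, and raising the default throttles avoids queuing (contract and numbers below are illustrative):

        using System.ServiceModel;

        public interface IValueCallback
        {
            // One-way: the server's publishing thread no longer waits for a
            // reply per callback, which is often the difference between
            // ~1,000 and several thousand calls per second.
            [OperationContract(IsOneWay = true)]
            void OnValue(int value);
        }

        // Hypothetical host-side throttle increase:
        // var throttle = new System.ServiceModel.Description.ServiceThrottlingBehavior
        // {
        //     MaxConcurrentCalls = 256,
        //     MaxConcurrentSessions = 256
        // };
        // host.Description.Behaviors.Add(throttle);

    Batching several integers into one callback message is another cheap way to amortize the per-message overhead before abandoning WCF.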

    Read the article

  • How does LinqPad support WCF Data Services?

    - by user341127
    LINQPad supports WCF Data Services. If you give it a URL such as http://services.odata.org/Northwind/Northwind.svc/, it will list all available data objects and let you query them. I guess LINQPad generates all the available data classes at run time with Reflection.Emit. I am wondering whether someone can show me how to do the same, or maybe someone has done it before. Any feedback is appreciated. Ying
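    I can only guess at LINQPad's internals, but one workable approach mirrors what DataSvcUtil.exe does: fetch the service's $metadata, feed it to the WCF Data Services client code generator, and compile the generated source in memory (CodeDom compilation rather than Reflection.Emit). A sketch, with error handling omitted:

        using System.CodeDom.Compiler;
        using System.Data.Services.Design;   // the generator DataSvcUtil.exe uses
        using System.IO;
        using System.Xml;
        using Microsoft.CSharp;

        // Pull the metadata document and generate typed client classes as C#.
        string metadataUrl = "http://services.odata.org/Northwind/Northwind.svc/$metadata";
        var generator = new EntityClassGenerator(LanguageOption.GenerateCSharpCode);
        var source = new StringWriter();
        using (var reader = XmlReader.Create(metadataUrl))
            generator.GenerateCode(reader, source, "Northwind");

        // Compile the generated source into an in-memory assembly.
        var compiler = new CSharpCodeProvider();
        var parameters = new CompilerParameters(new[] {
            "System.dll", "System.Core.dll", "System.Data.Services.Client.dll" });
        parameters.GenerateInMemory = true;
        CompilerResults results = compiler.CompileAssemblyFromSource(parameters, source.ToString());
        // results.CompiledAssembly now contains the typed DataServiceContext
        // and entity types, which can be instantiated via reflection.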

    Read the article

  • WCF Security Transport Security Questions

    - by shyneman
    I'm writing a set of WCF services that rely on transport security with Windows Authentication, using the trusted subsystem model. However, I want to perform authorization based on the original client user that initiated the request (e.g. a user from a website with a username/password). I'm planning to achieve this by adding the original user's credentials to the header before the client sends the message; the service will then use the supplied credentials to authorize the user. So I have a few questions about this implementation: 1) using transport security with Windows auth, I do NOT need to worry about additionally encrypting the passed credentials to ensure their validity... WCF automatically takes care of this - is this correct? 2) how does this implementation prevent a malicious service, running under some Windows account within the domain, from sending a message tagged with spoofed credentials, e.g. replacing the credentials with an admin user's to do something bad? Thanks for any help.
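    For reference, a rough sketch of the header mechanics, with the header name, namespace, value, and proxy invented for the example; because transport security encrypts the whole stream, the header travels inside the protected channel, which speaks to question 1:

        using System.ServiceModel;
        using System.ServiceModel.Channels;

        // Client side: attach the original end user's name as a custom SOAP header.
        using (new OperationContextScope(proxy.InnerChannel))
        {
            var header = MessageHeader.CreateHeader(
                "OriginalUser", "http://example.com/headers", "jsmith");
            OperationContext.Current.OutgoingMessageHeaders.Add(header);
            proxy.DoWork();
        }

        // Service side: read the header back and authorize against AD or the
        // membership provider before doing any work.
        string originalUser = OperationContext.Current.IncomingMessageHeaders
            .GetHeader<string>("OriginalUser", "http://example.com/headers");

    On question 2, the header alone does not prevent spoofing: any caller that can open a channel can write the same header. The service still has to restrict which Windows accounts may call it at all (e.g. by checking ServiceSecurityContext.Current.WindowsIdentity against a whitelist of trusted front-end accounts), or require a signed/issued token rather than a bare name.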

    Read the article

  • WCF data service security configuration

    - by Daniel Pratt
    I'm in the process of setting up a WCF Data Services web service and I'm trying to sort out the security configuration. Although there's quite a lot of documentation out there for configuring WCF security, a lot of it seems to be outmoded or does not apply to my scenario. Ultimately, I am planning on managing authorization of operations via change interceptors. Thus, all I really need is the simplest way to permit a client to pass credentials along with a request and to be able to authenticate those credentials against either AD or an ASP.NET membership provider (I'd much prefer the latter unless it makes things much more complicated). I'm intending to manage encryption at the transport level (i.e. HTTPS). I'm hoping that the eventual solution does not involve a huge web.config. Likewise, I'd much prefer to avoid writing custom code for the purpose of authentication.
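    Since the plan is to authorize in interceptors, here is a minimal sketch of what that looks like against a hypothetical Orders entity set; the entity names, the role, and the membership check are all placeholders:

        using System;
        using System.Data.Services;
        using System.Linq.Expressions;
        using System.Web;

        public class OrdersDataService : DataService<OrdersContext>   // context type assumed
        {
            // Reads: filter every query on Orders down to the caller's own rows.
            [QueryInterceptor("Orders")]
            public Expression<Func<Order, bool>> OnQueryOrders()
            {
                string user = HttpContext.Current.User.Identity.Name;
                return o => o.Owner == user;
            }

            // Writes: veto changes the caller is not entitled to make.
            [ChangeInterceptor("Orders")]
            public void OnChangeOrders(Order order, UpdateOperations operations)
            {
                if (!HttpContext.Current.User.IsInRole("OrderEditors"))
                    throw new DataServiceException(401, "Not authorized to modify orders.");
            }
        }

    With HTTPS at the transport level and an ASP.NET membership provider wired into the site via forms or basic authentication, HttpContext.Current.User is populated before the interceptors run, so the web.config stays small.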

    Read the article

  • WCF Service Error

    - by Sanjeev K B
    Hi, I have a WCF service deployed on a Windows 2003 server. We are using a WPF application to consume this service. The trouble is, if we deploy a new version of the WCF service, or leave IIS and the WPF application idle for some time and then try to execute a function, we get the following exception:

        The content type text/html of the response message does not match the content type of the binding (text/xml; charset=utf-8). If using a custom encoder, be sure that the IsContentTypeSupported method is implemented properly. The first 119 bytes of the response were: '<HEAD><TITLE>500: Server Error [20-0004]</TITLE></HEAD> <BODY> <H1>500: Server Error [20-0004]<H1> </BODY> </HTML>'

    Thanks and Regards,

    Read the article

  • practical security ramifications of increasing WCF clock skew to more than an hour

    - by Andrew Patterson
    I have written a WCF service that returns 'semi-private' data concerning people's names, addresses and phone numbers. By semi-private, I mean that there is a username and password to access the data, and the data is meant to be secured in transit. However, IMHO no one is going to expend any energy trying to obtain the data, as it is mostly available in the public phone book anyway. At some level, the security is a bit of security 'theatre' to tick some boxes imposed on us by government entities.

    The client end of the service is an application which is given out to registered 'users' to run within their own IT setups. We have no control over the IT of the users - and in fact they often tell us to 'go jump' if we put too many requirements on their systems. One problem we have been encountering is numerous users with system clocks that are not accurate. This can be caused either by genuinely slow or fast clocks, or, more than likely, by a timezone or daylight savings zone error (putting their machine an hour off the 'real' time). A feature of the WCF bindings we are using is that they rely on the notion of time to detect replay attacks etc.

        <wsHttpBinding>
          <binding name="normalWsBinding" maxBufferPoolSize="524288" maxReceivedMessageSize="655360">
            <reliableSession enabled="false" />
            <security mode="Message">
              <message clientCredentialType="UserName" negotiateServiceCredential="false"
                       algorithmSuite="Default" establishSecurityContext="false" />
            </security>
          </binding>
        </wsHttpBinding>

    The inaccurate client clocks cause security exceptions to be thrown and unhappy users. Other than suggesting users correct their clocks, we know that we can increase the clock skew of the security bindings: http://www.danrigsby.com/blog/index.php/2008/08/26/changing-the-default-clock-skew-in-wcf/

    My question is: what are the real practical security ramifications of increasing the skew to, say, 2 hours? If an attacker can perform some sort of replay attack, why would a clock skew window of 5 minutes necessarily be safer than 2 hours? I presume performing any attack with a security mode of 'message' requires more than just capturing some data at a proxy and sending the data back in again to 'replay' the call? In a situation like mine, where data is only 'read' by the users, are there indeed any security ramifications at all to allowing 'replay' attacks?
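    For reference, a sketch of the skew change itself, following the approach in the linked article; the two-hour value is the one being debated, and the config name must match the wsHttpBinding above:

        using System;
        using System.ServiceModel;
        using System.ServiceModel.Channels;

        // Rebuild the configured wsHttpBinding as a CustomBinding so the
        // security binding element's clock-skew limits can be widened.
        var wsBinding = new WSHttpBinding("normalWsBinding");
        var custom = new CustomBinding(wsBinding);
        var security = custom.Elements.Find<SymmetricSecurityBindingElement>();
        security.LocalClientSettings.MaxClockSkew = TimeSpan.FromHours(2);
        security.LocalServiceSettings.MaxClockSkew = TimeSpan.FromHours(2);
        // Use 'custom' in place of the original binding on both client and
        // service; both sides must agree or the skew change has no effect.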

    Read the article

  • Silverlight WCF with two-way SSL security certificates

    - by dlang
    Dear All! I would like to implement server-client software with the following security requirements:
    WCF services need to be secured with SSL and certificates for both the server and the client
    Client certificates need to be generated programmatically upon user registration
    Client certificates are deployed via an automatically generated installer package
    Although the client certificates are self-signed (no authorized CA for the generating server), the end user must not have to add the server certificate to the trusted certificates in the local certificate store
    My problem: I cannot find any information regarding establishing such two-way SSL security for WCF when the server certificate is not signed by an authorized CA and is instead created programmatically with "makecert"...
    My question: Is it technically possible to implement these requirements? If yes, could you provide some hints on how to get started? Thank you!
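    On the generation side, a sketch of the classic makecert pattern, as one possible starting point: a self-signed root created once on the server, then a per-user certificate chained to it at registration time (names and file paths are placeholders):

        rem One-off: self-signed root the server keeps for issuing client certs.
        makecert -n "CN=MyCompanyDevRoot" -r -sv RootCA.pvk RootCA.cer

        rem Per registration: a client certificate signed by that root.
        makecert -sv user42.pvk -iv RootCA.pvk -n "CN=user42" -ic RootCA.cer -sky exchange -pe user42.cer

    For the last requirement, one option is to have the generated installer place the server certificate in the client machine's Trusted People store and set ClientCredentials.ServiceCertificate.Authentication.CertificateValidationMode to PeerTrust on the client binding, so the end user never trusts anything manually.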

    Read the article

  • How to get a PerSession context with WCF?

    - by christophe31
    Hi, I have a WCF service running with a custom binding; for now it uses httpTransport.

        <customBinding>
          <binding name="myHttpBindingConf">
            <context contextManagementEnabled="true" protectionLevel="None"
                     contextExchangeMechanism="ContextSoapHeader" />
            <textMessageEncoding/>
            <httpTransport useDefaultWebProxy="false" />
          </binding>
        </customBinding>

    I've made a custom IExtension<OperationContext> to store my data in a specific context, following these instructions: http://hyperthink.net/blog/a-simple-ish-approach-to-custom-context-in-wcf/

    I would like to use a ContextMode.PerSession context. Which transport should I choose to get session management? How do I set the new transport in place while keeping object discovery enabled? How do I force a PerSession context?
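    A sketch of the session pieces, with names invented; the key point is that plain httpTransport cannot carry a WCF session, so the transport has to change (tcpTransport, or httpTransport wrapped with reliableSession) before per-session instancing will work:

        using System.ServiceModel;

        // The contract refuses to run over a sessionless binding.
        [ServiceContract(SessionMode = SessionMode.Required)]
        public interface IMyContextService
        {
            [OperationContract]
            void DoWork();
        }

        // One service instance (and so one home for the custom
        // IExtension<OperationContext> state) per client session.
        [ServiceBehavior(InstanceContextMode = InstanceContextMode.PerSession)]
        public class MyContextService : IMyContextService
        {
            private int callsThisSession; // survives across calls from one client

            public void DoWork() { callsThisSession++; }
        }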

    Read the article

< Previous Page | 5 6 7 8 9 10 11 12 13 14 15 16  | Next Page >