Search Results

Search found 5454 results on 219 pages for 'soa purge instances dehyd'.

Page 23 of 219

  • links for 2010-04-28

    - by Bob Rhubart
    - Guido Schmutz, "Oracle BPM11g available!": Oracle ACE Director Guido Schmutz shares his impressions after attending a hands-on workshop conducted by Masons of SOA member Clemens Utschig-Utschig. (tags: oracle otn oracleace bpm soa soasuite)
    - Elena Zannoni, "2010 Collaboration Summit Impressions": Elena Zannoni has collected her thoughts on #C10 and shares them in this great blog post. (tags: oracle otn linux architecture collaborate2010)
    - Hajo Normann, "BPMN 2.0 in Oracle BPM Suite: The future of BPM starts now": "The BPM Studio sets itself apart from pure play BPMN 2.0 tools by being seamlessly integrated inside a holistic SOA / BPM toolset: BPMN models are placed in SCA-Composites in SOA Suite 11g. This allows to abstract away the complexities of SOA integration aspects from business process aspects. For UIs in BPMN tasks, you have the richness of ADF 11g based Frontends." -- Oracle ACE Director and Masons of SOA member Hajo Normann (tags: oracle otn oracleace bpm soa sca)
    - Brian Dirking, "AIIM Best Practice Awards to Two Oracle Customers": Brian Dirking's great write-up of the AIIM Awards Banquet, at which the Bureau of Indian Affairs and the Charles Town Police Department were among the winners of the 2010 Carl E. Nelson Best Practices Awards. (tags: oracle otn aiim bpm ecm enterprise2.0)
    - Mark Wilcox, "Upcoming Directory Services Live Webcast - Improve Time-to-Market and Reduce Cost with Oracle Directory Services": Live webcast on Thursday, May 27, 2010, at 10:00 AM Pacific / 1:00 PM Eastern. (tags: oracle otn webcast security identitymanagement)
    - Celine Beck, "Introducing AutoVue Document Print Service": Celine Beck offers a detailed overview of Oracle AutoVue. (tags: oracle otn entarch visualization printing)
    - Vikas Jain, "What's new in OWSM 11gR1 PS2 (11.1.1.3.0)?": Vikas Jain shares links to resources relevant to the recently released patch set for Oracle Web Services Manager 11gR1. (tags: oracle otn soa webservices owsm)
    - @theovanarem, "Oracle SOA Suite 11g Release 1 Patch Set 2": Theo Van Arem shares links to several resources relevant to the release of the latest patch set for Oracle SOA Suite 11g. (tags: oracle otn soa soasuite middleware)
    - @vambenepe, "Analyzing the VMforce announcement": "The new thing is that force.com now supports an additional runtime, in addition to Apex. That new runtime uses the Java language, with the constraint that it is used via the Spring framework. Which is familiar territory to many developers. That’s it." -- William Vambenepe (tags: oracle otn cloud paas)


  • Tellago 2011: Dwight, Chris and Don are MVPs

    - by gsusx
    It’s been a great start to 2011. Tellago’s Dwight Goins has been named a Microsoft BizTalk Server MVP for 2011. I’ve always said that Dwight should have been an MVP a long time ago. His contributions to the BizTalk Server community are nothing short of remarkable. In addition to Dwight, my colleagues Don Demsak and Chris Love also had their respective MVP awards renewed. A few others of us are up for renewal later in the year. In recognition of Dwight’s award, we have made him the designated doorman...(read more)


  • Tellago speaks about Business Intelligence with SQL Server 2008 R2

    - by gsusx
    At Tellago, we always try to stay on the front lines of technology that can enhance our solution development practices. This year we are putting a lot of emphasis on business intelligence, and in particular the new set of BI technologies such as Microsoft's PowerPivot, Master Data Services and StreamInsight that are scheduled to be released with SQL Server 2008 R2. In the last few weeks we have been working closely with different Microsoft field offices to coordinate a series of customer events that...(read more)


  • My Speaking Engagements in the Last Two Months

    - by gsusx
    I’ve been so busy lately with the activities around Moesion that I haven’t had time to blog about a couple of great conferences I had the opportunity to speak at in the last two months. Software Architect Conference, UK ( http://www.software-architect.co.uk/ ) This conference is becoming one of my favorite events of the year. As always, Nick Payne and his team did a remarkable job lining up an all-star group of speakers who covered some of the hottest topics in today’s software industry. The first...(read more)


  • Tellago announces SQL Server 2008 R2 BI quick adoption programs

    - by gsusx
    During the last year, we (Tellago) have been involved in various business intelligence initiatives that leverage some emerging BI techniques such as self-service BI or complex event processing (CEP). Specifically, in the last few months, we have partnered with Microsoft to deliver a series of events across the country where we present the different technologies of the SQL Server 2008 R2 BI stack such as PowerPivot, StreamInsight, Ad-Hoc Reporting and Master Data Services. As part of those events...(read more)


  • Moesion Webinar: Managing BizTalk Server from your Smartphone or Tablet Without Upsetting your Boss

    - by gsusx
    BizTalkers, this Thursday we will be hosting a webinar to highlight how to use Moesion to manage your BizTalk Server environment from your mobile device. We will walk through the complete feature set of the Moesion HTML5 BizTalk management console, as well as complementary features of the Moesion platform that can be used to manage your BizTalk environment from your mobile device. More importantly, if you are a BizTalk developer or IT Pro, we REALLY REALLY REALLY would love to get your feedback about the...(read more)


  • SO-Aware sessions in Dallas and Houston

    - by gsusx
    Our WCF registry, SO-Aware, keeps being evangelized throughout the world. This week Tellago Studios' Dwight Goins will be speaking at Microsoft events in Dallas and Houston ( https://msevents.microsoft.com/cui/EventDetail.aspx?culture=en-US&EventID=1032469800&IO=ycqB%2bGJQr78fJBMJTye1oA%3d%3d ) about WCF management best practices using SO-Aware. If you are in the area and passionate about WCF, you should definitely swing by and give Dwight a hard time ;)...(read more)


  • Running SSIS packages from C#

    - by Piotr Rodak
    Most developers and DBAs know about two ways of deploying packages: you can deploy them to the database server and run them using a SQL Server Agent job, or you can deploy the packages to the file system and run them using the dtexec.exe utility. Both approaches have their pros and cons. However, I would like to show you a third way (sort of) that is often overlooked, and it can give you capabilities the ‘traditional’ approaches can’t. I have been working for a few years with applications that run SSIS packages from host applications implemented in .NET. As you know, SSIS provides a programming model that you can use to implement more flexible solutions. SSIS applications are usually thought of as batch oriented, with a fairly rigid architecture and processing model, and with fixed timeframes when the packages are executed to process data. It doesn’t have to be the case; you don’t have to limit yourself to a batch-oriented architecture. I have had very good experiences with service-oriented architectures processing large amounts of data. These applications are more complex than what I would like to show here, but the principle stays the same: you can execute packages as a service, on an ad-hoc basis. You can also implement and schedule various triggers to run the packages: HTTP calls, file drops, time schedules, Tibco messages and others. You can implement an event handler that will trigger execution of an SSIS package when a certain event occurs in a StreamInsight stream. This post is just a small example of how you can use the API and other features to create a service that runs SSIS packages on demand.

    I thought it might be a good idea to implement a RESTful service that listens to requests and executes the appropriate actions. As it turns out, this is trivial in C#. The application is implemented as a console application for ease of debugging and running; in reality, you might want to implement it as a Windows service. To begin, you have to reference the System.ServiceModel.Web namespace and then add a few lines of code:

        Uri baseAddress = new Uri("http://localhost:8011/");

        WebServiceHost svcHost = new WebServiceHost(typeof(PackRunner), baseAddress);

        try
        {
            svcHost.Open();

            Console.WriteLine("Service is running");
            Console.WriteLine("Press enter to stop the service.");
            Console.ReadLine();

            svcHost.Close();
        }
        catch (CommunicationException cex)
        {
            Console.WriteLine("An exception occurred: {0}", cex.Message);
            svcHost.Abort();
        }

    The interesting lines are 3, 7 and 13. In line 3 you create a WebServiceHost object. In line 7 you start listening on the defined URL, and then in line 13 you shut down the service. As you may have noticed, the WebServiceHost constructor accepts the type of an object (here: PackRunner) that will be instantiated as a singleton and subsequently used to process the requests. This is the class where you put your logic, but to tell WebServiceHost how to use it, the class must implement an interface which declares the methods to be used by the host. The interface itself must be decorated with the ServiceContract attribute.

        [ServiceContract]
        public interface IPackRunner
        {
            [OperationContract]
            [WebGet(UriTemplate = "runpack?package={name}")]
            string RunPackage1(string name);

            [OperationContract]
            [WebGet(UriTemplate = "runpackwithparams?package={name}&rows={rows}")]
            string RunPackage2(string name, int rows);
        }

    Each method that is going to be used by WebServiceHost has to have the OperationContract attribute, as well as a WebGet or WebInvoke attribute. A detailed discussion of the available options is outside the scope of this post. I also recommend using more descriptive method names. Then, you have to provide the implementation of the interface:

        public class PackRunner : IPackRunner
        {
            ...

    There are two methods defined in this class. Since the full code is attached to the post, I will show only the more interesting method, RunPackage2:

        /// <summary>
        /// Runs package and sets some of its variables.
        /// </summary>
        /// <param name="name">Name of the package</param>
        /// <param name="rows">Number of rows to export</param>
        /// <returns></returns>
        public string RunPackage2(string name, int rows)
        {
            try
            {
                string pkgLocation = ConfigurationManager.AppSettings["PackagePath"];

                pkgLocation = Path.Combine(pkgLocation, name.Replace("\"", ""));

                Console.WriteLine();
                Console.WriteLine("Calling package {0} with parameter {1}.", name, rows);

                Application app = new Application();
                Package pkg = app.LoadPackage(pkgLocation, null);

                pkg.Variables["User::ExportRows"].Value = rows;
                DTSExecResult pkgResults = pkg.Execute();
                Console.WriteLine();
                Console.WriteLine(pkgResults.ToString());
                if (pkgResults == DTSExecResult.Failure)
                {
                    Console.WriteLine();
                    Console.WriteLine("Errors occurred during execution of the package:");
                    foreach (DtsError er in pkg.Errors)
                        Console.WriteLine("{0}: {1}", er.ErrorCode, er.Description);
                    Console.WriteLine();
                    return "Errors occurred during execution. Contact your support.";
                }

                Console.WriteLine();
                Console.WriteLine();
                return "OK";
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex);
                return ex.ToString();
            }
        }

    The method accepts the package name and the number of rows to export. The packages are deployed to the file system, and the path to the packages is configured in the application configuration file. This way you can implement multiple services on the same machine, provided you also configure the URL for each instance appropriately. To run a package, you have to reference the Microsoft.SqlServer.Dts.Runtime namespace. This namespace is implemented in Microsoft.SQLServer.ManagedDTS.dll, which in my case was installed in the folder “C:\Program Files (x86)\Microsoft SQL Server\100\SDK\Assemblies”. Once you have done that, you can create an instance of Microsoft.SqlServer.Dts.Runtime.Application, as in line 18 in the above snippet. It may be a good idea to create the Application object in the constructor of the PackRunner class, to avoid the need to recreate it each time the service is invoked. Then, in line 19, an instance of Microsoft.SqlServer.Dts.Runtime.Package is created. The LoadPackage method in its simplest form just takes the package file name as the first parameter. Before you run the package, you can set its variables to certain values.

    This is a great way of configuring your packages without all the hassle of dtsConfig files. In the above code sample, the variable “User::ExportRows” is set to the value of the method's “rows” parameter. Eventually, you execute the package. The Execute method doesn’t throw exceptions; you have to test the result of the execution yourself. If the execution wasn’t successful, you can examine the collection of errors exposed by the package. These are the familiar errors you often see during development and debugging of a package. If you run the package from code, you have the opportunity to persist them or log them using your favourite logging framework. The package itself is very simple; it connects to my AdventureWorks database and saves the number of rows specified in the variable “User::ExportRows” to a file. You should know that before you run the package, you can also change its connection strings, logging, events and much more.

    I attach a solution with the test service, as well as a project with two test packages. To test the service, you have to run it and wait for the message saying that the host is started. Then just type (or copy and paste) the command below into your browser: http://localhost:8011/runpackwithparams?package=%22ExportEmployees.dtsx%22&rows=12 When everything works fine, and you have modified the package to point to your AdventureWorks database, you should see "OK” wrapped in XML. I stopped the database service to simulate an invalid connection string; the request then returns the error message instead, and the service console window shows more information.

    As you see, implementing a service-oriented ETL framework is not a very difficult task. You have the ability to configure the packages before you run them, and you can implement logging that is consistent with the rest of your system. In applications I have worked with, we also have resource monitoring and execution control: we don't allow more than a certain number of packages to run simultaneously, which ensures we don't strain the server and that we use memory and CPUs efficiently. The attached zip file contains two projects. One is the package runner; it has to be executed with administrative privileges, as it registers an HTTP namespace. The other project contains two simple packages. This is really a cool thing, you should check it out!
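    As an aside, here is what the constructor idea mentioned above could look like. This is just a minimal sketch of mine, not the code attached to the post; the shared Run helper and its return strings are my own simplification:

        using System.Configuration;
        using System.IO;
        using Microsoft.SqlServer.Dts.Runtime;

        public class PackRunner : IPackRunner
        {
            // One Application instance for the lifetime of the singleton service,
            // instead of recreating it on every request.
            private readonly Application _app = new Application();

            public string RunPackage1(string name)
            {
                return Run(name, null);
            }

            public string RunPackage2(string name, int rows)
            {
                return Run(name, rows);
            }

            private string Run(string name, int? rows)
            {
                string pkgLocation = Path.Combine(
                    ConfigurationManager.AppSettings["PackagePath"],
                    name.Replace("\"", ""));

                Package pkg = _app.LoadPackage(pkgLocation, null);
                if (rows.HasValue)
                    pkg.Variables["User::ExportRows"].Value = rows.Value;

                return pkg.Execute() == DTSExecResult.Failure ? "Failed" : "OK";
            }
        }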


  • Integrating BizTalk Server and StreamInsight paper

    - by gsusx
    With all the holiday madness I didn't realize that my "Integrating BizTalk Server and StreamInsight" paper is now available on MSDN. This paper was originally an idea of the BizTalk product team and intends to present some fundamental scenarios that can be enabled by the combination of BizTalk Server and StreamInsight. Thanks to everybody who, directly or indirectly, provided feedback about this paper: Syed Rasheed, Mark Simms, Richard Seroter, Roman Schindlauer and Torsten Grabs from the StreamInsight...(read more)


  • Announcing SO-Aware Test Workbench

    - by gsusx
    Yesterday was a big day for Tellago Studios. After a few months of heads-down work, we announced the release of the SO-Aware Test Workbench, a tool which brings sophisticated performance testing and test visualization capabilities to the WCF world. This work is the result of feedback received from many of our SO-Aware and Tellago customers about how to improve WCF testing. More importantly, with the SO-Aware Test Workbench we are trying to address what has been one of the biggest challenges...(read more)


  • IT Optimization Plan Pays Off For UK Retailer

    - by Brian Dayton
    I caught this article in ComputerworldUK yesterday. The headline notes that UK-based supermarket chain Morrisons is increasing its IT spend...OK, sounds good. Even nicer that Oracle is a big part of that. But what caught my eye were three things: 1) Morrisons truly has a long-term strategy for IT - in this case, modernizing and optimizing how they use IT for business advantage. 2) Even in a tough economic climate, Morrisons views IT investments as contributing to and improving the bottom line. Specifically, "The investment in IT contributed to a 21 percent increase in Morrison's underlying profit...". 3) The phased, 3-year "Optimization Plan" took a holistic approach to their business - from CRM and Supply Chain systems to the underlying application infrastructure. On the infrastructure front, adopting a more flexible Service-Oriented Architecture enabled them to be more agile and adapt their business, and Identity Management helped with sometimes mundane (but costly) issues like lost passwords and being able to document who has access to what. Things don't always turn out so rosy, and I know it was a long and difficult process...but it's nice to see a happy ending every once in a while.


  • Tellago Devlabs: A RESTful API for BizTalk Server Business Rules

    - by gsusx
    Tellago DevLabs keeps growing as the primary example of our commitment to open source! Today, we are very happy to announce the availability of the BizTalk Business Rules Data Service API, which extends our existing BizTalk Data Services solution with an OData API for the BizTalk Server Business Rules engine. Tellago’s Vishal Mody led the implementation of this version of the API with some input from other members of our technical staff. The motivation: the fundamental motivation behind the BRE Data...(read more)


  • Bringing true agility to enterprise .NET: Tellago Studios announces TeleSharp

    - by gsusx
    We are happy to announce the latest addition to Tellago Studios’ product family: TeleSharp. After the success of SO-Aware and the SO-Aware Test Workbench, we decided to take on a bigger challenge by taking the initial steps towards simplifying enterprise .NET application development. After months of discussion with customers, we decided to focus on the following challenges: Cataloging applications: what if you could keep a central catalog of the .NET applications that exist in your enterprise? What...(read more)


  • Difference between $ and # in ADF/JSF/JSP

    - by pavan.pvj
    Found this one interesting, so I picked it up from one of the books and am posting it here. JSP 2.1 and JSF 1.2 both use the unified Expression Language, and one major and very visible difference between them is $ versus #: JSP 2.1 uses $ and JSF 1.2 uses # in an EL expression.

    $ - immediate evaluation: the expression is executed eagerly, meaning the result is returned immediately when the page renders.
    # - deferred evaluation: the expression evaluation is deferred to a point defined by the implementing technology.

    In general, JSF uses deferred evaluation because of its multiple lifecycle phases in which events are handled. To ensure the model is prepared before the values are accessed by EL, it must defer EL evaluation until the appropriate point in the life cycle. Note: this is taken from the Oracle Fusion Developer Guide (ISBN: 9780071622547). There is also a very good article here: http://java.sun.com/products/jsp/reference/techart/unifiedEL.html


  • REST or Non-REST on Internal Services

    - by tyndall
    I'm curious whether others have chosen to implement some services internally at their companies as non-REST (SOAP, Thrift, Protocol Buffers, etc.) as a way to auto-generate client libraries/wrappers. I'm on a two-year project, over which my team and I will be writing maybe 40 services. 10% of those services definitely make sense as REST services, but the other 90% feel like they could be done in either REST or RPC style. Of those 90%, 100% will be .NET talking to .NET. When I think about all the effort to have my devs develop client "wrappers" for REST services, I cringe. WADL and RSDL don't seem to have enough mindshare. Thoughts? Any good discussions of this "internal service" issue online? If you have struggled with this, what general rules for determining REST or non-REST have you used?
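    For context, the kind of hand-written wrapper in question looks roughly like this; a minimal sketch, where OrderDto, the URL shape and the serializer choice are all illustrative assumptions rather than anything from the project:

        using System.Net;
        using System.Web.Script.Serialization;

        public class OrderDto
        {
            public int Id { get; set; }
            public string Status { get; set; }
        }

        public class OrderServiceClient
        {
            private readonly string _baseUrl;

            public OrderServiceClient(string baseUrl)
            {
                _baseUrl = baseUrl.TrimEnd('/');
            }

            // Every operation needs this hand-rolled URL building, HTTP call and
            // deserialization - the boilerplate a SOAP/Thrift/protobuf toolchain
            // would generate from the contract.
            public OrderDto GetOrder(int id)
            {
                using (var client = new WebClient())
                {
                    string json = client.DownloadString(_baseUrl + "/orders/" + id);
                    return new JavaScriptSerializer().Deserialize<OrderDto>(json);
                }
            }
        }

    Multiplied across 40 services and all their operations, this is the effort being weighed against generated proxies.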


  • Welcome

    - by Jiandong Guo
    In this blog, I plan to provide you with information about OWSM, Oracle Web Services Manager. I joined the Platform Security and OWSM team in Oracle's identity management organization in February 2010. Before that I had been working on Metro, an open source Web services project, in Sun's GlassFish organization for five years, as one of the architects for security. I am continuing that work here at Oracle on OWSM, focusing on developing and evangelizing our enterprise Web services security, identity and policy management offerings. To start with, I plan to write a series of posts on some of the new features for OWSM in Oracle Fusion Middleware 11g R1 PS3. Thank you all for your interest.


  • Versioning Strategy for Service Interfaces JAR

    - by Colin Morelli
    I'm building a service-oriented architecture composed (mostly) of Java-based services, each of which is a Maven project (in an individual repository) with two submodules: common and server. The common module contains the service's interfaces, which clients can include in their project to make service calls. The server submodule contains the code that actually powers the service. I'm now trying to figure out an appropriate versioning strategy for the interfaces, such that each interface change results in a new common jar, while changes to the server (so long as they don't affect the contract of the interfaces) keep the same common jar. I know this is pretty simple to do manually (simply increment the server version and don't touch the common one), but this project will be built and deployed by a CI server, and I'd like to come up with a strategy for versioning these automatically. The only thing I have been able to come up with so far is to have the CI server md5 the service interfaces.
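    To illustrate the md5 idea (in C# for consistency with the other snippets on this page; the CI server would run the equivalent step in its own tooling), here is a sketch that fingerprints the common module's interface sources, with all paths and names hypothetical:

        using System;
        using System.IO;
        using System.Linq;
        using System.Security.Cryptography;

        public static class InterfaceFingerprint
        {
            // Hash every interface source in the common submodule in a stable
            // order, so identical sources always yield the same fingerprint.
            public static string Compute(string commonSrcDir)
            {
                byte[] allBytes = Directory
                    .GetFiles(commonSrcDir, "*.java", SearchOption.AllDirectories)
                    .OrderBy(path => path, StringComparer.Ordinal)
                    .SelectMany(path => File.ReadAllBytes(path))
                    .ToArray();

                using (MD5 md5 = MD5.Create())
                    return BitConverter
                        .ToString(md5.ComputeHash(allBytes))
                        .Replace("-", "");
            }
        }

    The build would compare this fingerprint against the one stored with the last published common jar and bump the common version only on a mismatch.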


  • Oracle ADF Framework for 4GL Developers Workshop (15-17/Jun/10)

    - by Claudia Costa
    This 3-day workshop is targeted at Oracle Forms professionals interested in developing JEE applications based on Oracle ADF (Application Development Framework). The workshop highlights the similarities between the two development paradigms, while also discussing the crucial differences and components such as ADF BC and ADF Faces. The goal is to lower the learning curve and enable attendees to leverage ADF technology immediately, either in developing new applications or in rewriting existing Forms applications. During the event the attendees will rewrite a sample Oracle Forms application using the above technology.

    Prerequisites:
    - Basic knowledge of the Oracle database
    - Basic knowledge of the Java programming language
    - Basic knowledge of Oracle JDeveloper or another Java IDE

    Hardware/software requirements: this workshop requires attendees to provide their own laptops, which must meet the following minimum requirements:
    - Laptop/PC (3 GB RAM recommended)
    - Oracle Database 10g
    - Internet Explorer 7
    - The version of Oracle JDeveloper 11g to be used will be provided

    To view the full agenda and register, please click here. Time and venue: 9:30 - 18:00, Oracle Lagoas Park - Edf. 8, Porto Salvo. For more information, please contact: [email protected]


  • How to use the unit of work and repository patterns in a service-oriented environment

    - by A. Karimi
    I've created an application framework using the unit of work and repository patterns for its data layer. Data consumer layers such as presentation depend on the data layer design; for example, an abstract CRUD form has a dependency on a repository (IRepository). This architecture works like a charm in client/server environments (e.g. a WPF application and a SQL Server). But I'm looking for a good pattern to change or reuse this architecture in a service-oriented environment. Of course I have some ideas:

    Idea 1: The "Adapter" design pattern. Keep the current architecture and create a new unit of work and repository implementation that works against a service instead of the ORM. Data layer consumers are loosely coupled to the data layer, so this is possible, but the problem is the unit of work: I have to create a context which tracks the objects' state on the client side and sends the changes to the server side when "Commit" is called (something I think WCF RIA Services has done for Silverlight). Here is the diagram:

    ----------- CLIENT ----------- | ------------------ SERVER ----------------------
    [ UI ] -> [ UoW/Repository ] ---> [ Web Services ] -> [ UoW/Repository ] -> [DB]

    Idea 2: Add another layer. Add another layer (let's call it "local services" or "data provider"), then put it between the data layer (unit of work and repository) and the data consumer layers (like the UI). I then have to rewrite the consumer classes (CRUD and the other classes that depend on IRepository) to depend on another interface. The diagram:

    ----------------- CLIENT ------------------ | ------------------- SERVER ---------------------
    [ UI ] -> [ Local Services/Data Provider ] ---> [ Web Services ] -> [ UoW/Repository ] -> [DB]

    Please note that I already have a local services layer in the current architecture, but it doesn't expose the data layer functionality. In other words, the UI layer can communicate with both the data and local services layers, whereas the local services layer also uses the data layer:

    [ UI ] ---> [ Local Services ] ---> [ Data ]
    [ UI ] ---------------------------> [ Data ]
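    To make Idea 1 concrete, here is a rough sketch of a client-side unit of work that tracks object state and ships it to the service on Commit. IUnitOfWork, ChangeSet and IChangeSetService are hypothetical names, not taken from the framework in question:

        using System.Collections.Generic;

        public interface IUnitOfWork
        {
            void RegisterAdded(object entity);
            void RegisterModified(object entity);
            void RegisterRemoved(object entity);
            void Commit();
        }

        public class ChangeSet
        {
            public List<object> Added = new List<object>();
            public List<object> Modified = new List<object>();
            public List<object> Removed = new List<object>();
        }

        public interface IChangeSetService // e.g. implemented by a WCF proxy
        {
            void ApplyChanges(ChangeSet changes);
        }

        // Same contract the CRUD forms already consume, but Commit sends the
        // tracked changes across the wire instead of flushing an ORM context.
        public class ServiceUnitOfWork : IUnitOfWork
        {
            private readonly ChangeSet _changes = new ChangeSet();
            private readonly IChangeSetService _service;

            public ServiceUnitOfWork(IChangeSetService service)
            {
                _service = service;
            }

            public void RegisterAdded(object entity)    { _changes.Added.Add(entity); }
            public void RegisterModified(object entity) { _changes.Modified.Add(entity); }
            public void RegisterRemoved(object entity)  { _changes.Removed.Add(entity); }

            public void Commit()
            {
                // The server-side unit of work replays the change set against
                // the ORM and the database.
                _service.ApplyChanges(_changes);
                _changes.Added.Clear();
                _changes.Modified.Clear();
                _changes.Removed.Clear();
            }
        }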


  • Is there a clean separation of my layers with this attempt at Domain Driven Design in XAML and C#

    - by Buddy James
    I'm working on an application using a mixture of TDD and DDD. I'm working hard to separate the layers of my application, and that is where my question comes in. My solution is laid out as follows:

    Solution
    - MyApp.Domain (WinRT class library)
      - Entity (folder)
        - Interfaces (folder)
          - IPost.cs (interface)
        - BlogPosts.cs (implementation of IPost)
      - Service (folder)
        - Interfaces (folder)
          - IDataService.cs (interface)
        - BlogDataService.cs (implementation of IDataService)
    - MyApp.Presentation (Windows 8 XAML + C# application)
      - ViewModels (folder)
        - BlogViewModel.cs
      - App.xaml
      - MainPage.xaml (contains a property of type BlogViewModel)
    - MyApp.Tests (WinRT unit testing project used for my TDD)

    So I'm planning to use my ViewModel with the XAML UI. I'm writing a test, defining the interfaces in my system, and I have the following code thus far:

        [TestMethod]
        public void Get_Zero_Blog_Posts_From_Presentation_Layer_Returns_Empty_Collection()
        {
            IBlogViewModel viewModel = _container.Resolve<IBlogViewModel>();
            viewModel.LoadBlogPosts(0);
            Assert.AreEqual(0, viewModel.BlogPosts.Count, "There should be 0 blog posts.");
        }

    viewModel.BlogPosts is an ObservableCollection<IPost>. Now, my first thought is that I'd like the LoadBlogPosts method on the ViewModel to call a static method on the BlogPost entity. My problem is I feel like I need to inject the IDataService into the entity object so that it promotes loose coupling. Here are the two options I'm struggling with:

    1. Don't use a static method; use a member method on the BlogPost entity. Have the BlogPost take an IDataService in the constructor and use dependency injection to resolve the BlogPost instance and the IDataService implementation.
    2. Don't use the entity to call the IDataService. Put the IDataService in the constructor of the ViewModel and use my container to resolve the IDataService when the ViewModel is instantiated.

    So with option one the layers will look like this: ViewModel (presentation layer) -> Entity (domain layer) -> IDataService (service layer), and with option two: ViewModel (presentation layer) -> IDataService (service layer).
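    For what it's worth, a minimal sketch of option two, assuming the IDataService and IPost interfaces from the layout above (GetPosts is a guessed method name, not from the solution):

        using System.Collections.ObjectModel;

        public class BlogViewModel : IBlogViewModel
        {
            private readonly IDataService _dataService;

            // The container injects IDataService (BlogDataService in production,
            // a fake in MyApp.Tests), keeping the entity free of service concerns.
            public BlogViewModel(IDataService dataService)
            {
                _dataService = dataService;
                BlogPosts = new ObservableCollection<IPost>();
            }

            public ObservableCollection<IPost> BlogPosts { get; private set; }

            public void LoadBlogPosts(int count)
            {
                BlogPosts.Clear();
                foreach (IPost post in _dataService.GetPosts(count)) // hypothetical
                    BlogPosts.Add(post);
            }
        }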



  • SQL 2008 R2 Named Instance Client Connectivity Issues?

    - by Jerry Dodge
    We're upgrading our software from SQL Server 2000 to 2008 R2. Our customers will be installing an update which uninstalls 2000 and installs 2008 R2 under the same instance, so if no named instance existed, none will be set (default). The problem starts with the customers who have a named SQL instance. Starting in 2008 R2 (I'm not sure about earlier versions), for some reason a client connecting to the server by its instance name is unsuccessful. I'm testing from Management Studio - if I can't connect with this, then nothing can connect. I browse network servers and find the specific server\instance in the list, but upon trying to connect to an instance name like MyServer\INST, I get:

    A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (Microsoft SQL Server, Error: -1)

    I do in fact have the TCP/IP and Named Pipes protocols enabled; that was the first thing I did. When I connect to the server using a comma and a port number, like MyServer,49195, it works just fine. So it appears that client computers are simply unable to resolve the instance names. This has happened on all our installations of SQL 2008 R2 and from all client computers, including Win 7, XP, Vista, Server 2008 and Server 2003. We never experienced such issues on earlier versions of SQL Server. The problem even persists if the firewalls and antiviruses are all disabled. Now, this is a large update which we will be distributing soon to all our customers, and we want to minimize the interaction they need with us to get it installed. We absolutely hate the idea of using a port number, because it will always be different, and we would have to modify each client to point to the right server/port. Some of our customers may have hundreds of client computers. How do I make client connections to a named SQL instance work again? After all, this is the whole purpose of named instances, and if a client can't connect to an instance by its name, then what is it even named for?

    EDIT: It was mentioned to make sure the SQL Browser service is running, so I checked, and it is running. The server is also able to connect to itself (locally); just external connections are refused.

    UPDATE: After more careful checking, I learned the firewall wasn't completely disabled when testing; upon disabling it completely, this works. So it appears that the firewall was blocking SQL Browser from being reached by external clients.
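    For reference, here are the two connection styles from the question side by side, as a small sketch with placeholder names. The instance-name form depends on the SQL Browser service answering on UDP port 1434, which is what the firewall turned out to be blocking:

        using System.Data.SqlClient;

        class ConnectivityCheck
        {
            static void Main()
            {
                // By instance name: the client first asks SQL Browser (UDP 1434)
                // which TCP port the INST instance is listening on.
                using (var byName = new SqlConnection(
                    @"Server=MyServer\INST;Database=master;Integrated Security=true"))
                {
                    byName.Open(); // fails with error 26 when UDP 1434 is blocked
                }

                // By explicit port: no SQL Browser lookup, so only the TCP port
                // needs to be reachable through the firewall.
                using (var byPort = new SqlConnection(
                    "Server=MyServer,49195;Database=master;Integrated Security=true"))
                {
                    byPort.Open();
                }
            }
        }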


  • Reuse Business Logic between Web and API

    - by fesja
    We have a website and two mobile apps that connect through an API. All the platforms do exactly the same things. Right now the structure is the following:

    - Website. It manages the models, controllers and views for the website, and it also executes all background tasks. So if a user creates a place, everything is executed in this code.
    - API. It manages models and controllers and returns JSON. If a user creates a place on the mobile app, the place is created here; afterwards, we add a background task to update other fields, and this background task is executed by the website.

    We are redoing everything, so it's time to improve the approach. What is the best way to reuse the business logic so I only need to code the insert/edit/delete of a place (and other related actions) in just one place? Is a service-oriented approach a good idea? For example:

    - Service. It has the models and gets, adds, updates and deletes info from the DB.
    - Website. It sends info to the service, and it renders HTML.
    - API. It sends info to the service, and it returns JSON.

    Some problems I have found: more initial work? Not sure. It can work slower - any experience? The benefits: we only have the business logic in one place, for both web and API; it's easier to scale; and we can put each piece on different servers. Other solutions: duplicate the code and be careful not to forget anything (do tests!), or duplicate some code but execute background tasks that update the related fields and execute other things (emails, indexing...). A "small" detail: we are 1.3 people in the backend, for now ;)
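    A sketch of the service option in C# (names like PlaceService are made up for illustration; the same shape applies in any stack):

        public class Place
        {
            public int Id { get; set; }
            public string Name { get; set; }
        }

        public interface IPlaceRepository
        {
            void Save(Place place);
        }

        // The business logic lives here once, shared by every front end.
        public class PlaceService
        {
            private readonly IPlaceRepository _repository;

            public PlaceService(IPlaceRepository repository)
            {
                _repository = repository;
            }

            public Place CreatePlace(string name)
            {
                var place = new Place { Name = name };
                _repository.Save(place);
                // enqueue follow-up background work (emails, indexing...) here,
                // so web and API requests behave identically
                return place;
            }
        }

        // Website front end: renders HTML.
        public class PlacesWebController
        {
            private readonly PlaceService _service;
            public PlacesWebController(PlaceService service) { _service = service; }

            public string Create(string name)
            {
                Place place = _service.CreatePlace(name);
                return "<h1>" + place.Name + "</h1>";
            }
        }

        // API front end: returns an object the framework serializes to JSON.
        public class PlacesApiController
        {
            private readonly PlaceService _service;
            public PlacesApiController(PlaceService service) { _service = service; }

            public Place Create(string name)
            {
                return _service.CreatePlace(name);
            }
        }

    Whether the service runs in-process with each front end or behind its own server is then a deployment decision; starting in-process sidesteps the latency concern while keeping the single code path.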

