Search Results

Search found 17222 results on 689 pages for 'integration services'.


  • Branching and CI Builds with Agile

    - by Bob Horn
    We follow many agile processes, including automated tests, continuous integration, sprint reviews, etc. We're currently having a debate about how often we should branch release builds. We've been doing two-week sprints and trying to deploy to production at the end of each sprint. Some of us think we should be branching every sprint. Some of us think that's overkill. If a project encompasses three Visual Studio solutions, and we branch every sprint, then that's three branches, and three CI builds to create every two weeks. If we do this for six months, we'll end up with 36 branches and 36 CI builds. There is overhead involved in that. Those of us who think that branching every sprint is overkill don't have a very good alternative. On my last project, we deployed some solutions from the Main trunk. Yeah, that's not good, but it saved on some of the overhead. What's the right way to manage branching/releasing and CI builds, using agile, when we have such short (two-week) sprint cycles?

    Read the article

  • Windows 7 XP Mode: missing Virtual PC Integration Device driver

    - by Charlie Wu
    I have XP Mode installed fine, but the shipped Windows XP installation is too heavy, so I tried to install a lighter XP on my ASUS Eee PC. After installation I had to install a few missing drivers, and everything worked OK except that I can't copy and paste between the host and guest operating systems. I checked Device Manager and found that the Virtual PC Integration Device is not installed correctly, and I can't find any driver for it. How can I fix this problem?

    Read the article

  • Does RIA Services require an extra install on the server?

    - by jkohlhepp
    If I want to deploy an ASP.NET application that hosts RIA Services endpoints for a Silverlight application, do I have to install anything extra on the web server? Or is it just some extra DLLs that can be deployed to my application's Bin folder? I know that when you are doing RIA Services development there are additional toolkits and whatnot to install, but I'm not sure whether those are needed on the server.

    Read the article

  • ASP.NET MVC as a service host for SOA like architecture

    - by Delucia
    I'm creating a distributed application that includes a lot of services, and I'm looking for a technology that allows me to create and manage them easily; I know managing and deploying Windows services is not fun. I'm thinking of using ASP.NET MVC as the host for my services, where each controller action essentially becomes a service and I can communicate with it via simple HTTP requests and responses, without dealing with the complexity of something like WCF. Services need to be isolated, and ASP.NET requests are isolated as far as I know, i.e. if a request throws an exception it will not affect other running requests. But I still have questions about managing the services: how would I see which services are running, or stop and resume them? Also, ASP.NET MVC applications are passive, i.e. they only do something upon a request. What if I want a service that initiates work by itself?
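
    To make the idea concrete, here is a minimal sketch of what such a service-style controller could look like, assuming ASP.NET MVC 3; the OrdersService name and its actions are hypothetical:

    using System.Web.Mvc;

    // Hypothetical service-style controller: each action is an HTTP endpoint.
    public class OrdersServiceController : Controller
    {
        // POST /OrdersService/Process - the ASP.NET pipeline isolates any
        // exception thrown here to this request; other requests keep running.
        [HttpPost]
        public JsonResult Process(int orderId)
        {
            return Json(new { orderId = orderId, status = "Processed" });
        }

        // GET /OrdersService/Status - a trivial health-check action, one crude
        // substitute for the missing "which services are running" management view.
        [HttpGet]
        public JsonResult Status()
        {
            return Json(new { alive = true }, JsonRequestBehavior.AllowGet);
        }
    }

    Note that this still doesn't solve the self-initiating work problem; something outside the request pipeline (a scheduled task or, ironically, a Windows service) would have to trigger those actions.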

    Read the article

  • What is the security advantage of STS in web services?

    - by Neil McF
    Hello, I've started reading up on security (particularly authentication) with web services, and I see a lot of references to security token services. From what I can tell, they take a username and password (or similar credentials) and, on validation, return a digital token. How is using this token any more secure than just relying on the username and password in the first place?
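
    For a feel of why the token helps, here is a heavily simplified, hypothetical C# sketch of the pattern (a real STS issues signed SAML tokens with far more machinery; SimpleTokenService and the token format below are invented for illustration):

    using System;
    using System.Security.Cryptography;
    using System.Text;

    static class SimpleTokenService
    {
        // Hypothetical server-side signing key; never leaves the server.
        static readonly byte[] Key = Encoding.UTF8.GetBytes("server-side-secret");

        // Called once, only after the username/password has been validated.
        public static string Issue(string user)
        {
            string payload = user + "|" + DateTime.UtcNow.AddMinutes(30).Ticks; // 30-minute lifetime
            using (var hmac = new HMACSHA256(Key))
                return payload + "|" + Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
        }

        // Every later call presents the token; the password is never re-sent.
        public static bool Validate(string token)
        {
            string[] parts = token.Split('|');
            if (parts.Length != 3) return false;
            string payload = parts[0] + "|" + parts[1];
            using (var hmac = new HMACSHA256(Key))
            {
                string expected = Convert.ToBase64String(hmac.ComputeHash(Encoding.UTF8.GetBytes(payload)));
                return expected == parts[2] && new DateTime(long.Parse(parts[1])) > DateTime.UtcNow;
            }
        }
    }

    The security win is that the long-lived secret (the password) crosses the wire once, while everything afterwards carries a short-lived, signed, revocable token - and downstream services can trust the token's signature without ever seeing the user's credentials.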

    Read the article

  • Why would one use REST instead of Web services?

    - by AngryHacker
    I attended an interesting demo on REST today; however, I couldn't think of a single reason (nor was one presented) why REST is in any way better or simpler to use and implement than a Web Services stack. What are some of the reasons why anyone in the "real world" would use REST instead of Web Services?
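
    One commonly cited reason is sheer reach and simplicity on the client side. As a hedged illustration (the URL is made up), consuming a REST endpoint from C# needs nothing beyond an HTTP client - no WSDL, no generated proxy, no SOAP envelope:

    using System;
    using System.Net;

    class RestDemo
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // One GET returns the resource directly, typically as JSON or XML.
                string json = client.DownloadString("http://example.com/api/customers/42");
                Console.WriteLine(json);
            }
        }
    }

    Any device or language with an HTTP stack can make that same call, which is much of REST's appeal; a SOAP-based Web Services stack buys you formal contracts and WS-* features at the cost of that tooling overhead.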

    Read the article

  • Integration features enabled but drives not available

    - by dsjbirch
    Frustratingly, after a recent update to Windows XP Mode integration features, the availability of shared disks from the host has been impaired. Does anyone know of any kind of workaround or fix (excluding Dropbox et al.)? I have tried completely uninstalling and reinstalling as per http://www.sevenforums.com/virtualization/63710-refreshing-xp-mode.html#post568715 At one point restarting the machine appeared to have worked, but today I am again without access to my host. Interestingly, audio and copy-and-paste to and from the machine are working.

    Read the article

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year's conference. Some of these items I have known since the Microsoft MVP Summit which occurred in Redmond in late February (but due to NDA restrictions I could not share them with the developer community at large) and some of them are a result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference.

    First, let's travel back in time four years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable for ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources in such a relative edge case, but my guess is that they felt it was an important edge case at the time, as some of the more vocal .NET evangelists as well as some very high-profile start-ups (e.g. Twitter) had publicly announced their intent to use Rails.

    Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially, by focusing so much marketing effort on MVC, Microsoft actually created a larger market demand for it. This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers: if Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer.

    In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third-party extensions which were dependent on the WebForms model. If we migrated the core to MVC it would mean that all of the third-party extensions would no longer be compatible, which would be an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing to occur in our ecosystem. Clearly some developers had drunk Microsoft's Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, "If it's not MVC, it's crap". Now, this is a rather ignorant position to take, as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model - it just requires diligence and discipline.

    So over the past few years some horror stories have begun to bubble to the surface: software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC. These large-scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision: that Microsoft was promoting MVC as "the future". These ill-fated rewrites offered no benefit to end users or customers, and in fact resulted in less stable, less scalable, and more complicated systems - basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on their new MVC product, which has been under active development for more than 18 months, and revert back to WebForms.

    The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# debate was ultimately more of a religious war, because at least the two dominant programming languages were compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging, because the messaging from Microsoft has positioned the two solutions as incompatible with one another, and as a result web developers feel they are forced to choose one path or the other. Yes, it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel "dirty". The fragmentation has also made it difficult to attract newcomers, as the perceived barrier to entry for learning ASP.NET has become higher. As a result, many new software developers entering the market are gravitating to environments where the development model seems more simple and intuitive (e.g. PHP or Ruby).

    At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team was promoting SharePoint as a platform for building internal enterprise web applications. SharePoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, SharePoint leverages a rich third-party ecosystem for both generic web controls and more specialized WebParts - both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand, as strategically speaking there did not seem to be any high-level plan from Microsoft to ensure consistency and continuity across the different product lines.

    The introduction of ASP.NET MVC also made some of the third-party control vendors scratch their heads and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers. And even after MVC was introduced, these vendors continued to improve their products, offering greater productivity and a superior user experience via AJAX compared to what was possible in MVC. And since many developers were comfortable and satisfied with these third-party solutions, demand remained strong and the third-party web control market continued to prosper despite the availability of MVC.

    While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume, whether it's a native application running on a smartphone or tablet, a web browser taking advantage of HTML5 and JavaScript, or a rich desktop application running on a PC. REST-based services, which utilize the less verbose characteristics of JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust JavaScript libraries like jQuery and Knockout for browser-based client-side development techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC.

    In response to these new industry trends, Microsoft did what it always does - it immediately poured resources into developing a solution to ensure it remains relevant and competitive in the web space. This work culminated in a new framework branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. This framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC web services. And since it was developed out of band, with a dependency only on ASP.NET 4.0, it can be used immediately in a variety of production scenarios.

    So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the "Future of ASP.NET". In fact, one Microsoft PM even went as far as to say that if we look 3-4 years into the future, all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on about how to brand the various aspects of ASP.NET going forward - perhaps the moniker of "ASP.NET Web Stack", coined a couple of years ago by Scott Hanselman and utilized as part of the open source release of ASP.NET bits on CodePlex a few months back, will eventually stick. Web API is being positioned as the unification of ASP.NET - the glue that is able to pull this fragmented mess back together again. The "One ASP.NET" strategy will promote the use of all frameworks - WebForms, MVC, and Web API - even within the same web project. Basically the message is: utilize the appropriate aspects of each framework to solve your business problems. Instead of steering developers to a fork in the road, the plan is to educate them that "hybrid" applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with client-side development promoted by Web API can effectively be used in both WebForms and MVC applications. This means it is also relevant to application platforms like DotNetNuke and SharePoint, which means it starts to create a unified development strategy across all ASP.NET product lines once again.

    And so what about MVC? There have actually been rumors floated that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward (MVC4 may in fact be the last significant iteration of this framework). This may sound alarming to some folks who have recently adopted MVC, but it really shouldn't, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft is spending the majority of its R&D resources; that distinction will obviously go to Web API. And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from a higher level to see that it really makes no sense. MVC is a server-side page compositing framework, whereas Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC, and at the end of the day there would be no upgrade path for existing MVC applications. So it really does not make much business sense.

    So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry trends towards web services and client-side development. We decided to utilize a "hybrid" model which would provide compatibility for existing modules while at the same time providing a bridge for developers who wanted to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Services Framework, which is actually built on top of MVC2 (we chose to leverage MVC because it had the most intuitive, lightweight REST implementation in the .NET stack). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and Activity Feed. But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke (which is expected to be released in Q4 2012) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework, but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the "One ASP.NET" strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.
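
    For readers who have not yet seen the Web API programming model, a minimal hypothetical sketch is below (ProductsController and Product are invented for illustration; Web API ships in the MVC4 package as System.Web.Http):

    using System.Collections.Generic;
    using System.Web.Http;

    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    // Convention over configuration: GET /api/products routes to GetAllProducts,
    // GET /api/products/1 routes to GetProduct(1). The response format (JSON/XML)
    // is negotiated from the HTTP Accept header rather than hard-coded.
    public class ProductsController : ApiController
    {
        static readonly List<Product> Products = new List<Product>
        {
            new Product { Id = 1, Name = "Widget" }
        };

        public IEnumerable<Product> GetAllProducts()
        {
            return Products;
        }

        public Product GetProduct(int id)
        {
            return Products.Find(p => p.Id == id);
        }
    }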

    Read the article

  • SQLAuthority News – List of Master Data Services White Papers

    - by pinaldave
    Since my TechEd India 2010 presentation, I have been very excited about SQL Server Master Data Services (MDS). I just came across some very interesting white papers on the Microsoft site related to this subject. Here is the list, along with where you can download them. They are all written by top experts at Microsoft.

    Master Data Management from a Business Perspective - Download a PDF version or an XPS version
    Master Data Management from a Technical Perspective - Download a PDF version or an XPS version
    Bringing Master Data Management to the Stakeholders - Download a PDF version or an XPS version
    Implementing a Phased Approach to Master Data Management - Download a PDF version or an XPS version
    SharePoint Workflow Integration with Master Data Services - Read it here.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQL White Papers, T SQL

    Read the article

  • Amazon Product Advertising API SOAP Namespace Changes

    - by Rick Strahl
    About two months ago (towards the end of February 2012, I think) Amazon decided to change the namespace of the Product Advertising API. The error that would come up was:

    <ItemSearchResponse > was not expected.

    If you've used the Amazon Product Advertising API you probably know that Amazon has made it a habit to break the services every few years or so, and I guess last month was about the time for another one. Basically the service namespace of the document has been changed, and responses from the service just failed outright even though the rest of the schema looks fine. Now I looked around for a while trying to find a recent update to the Product Advertising API - something semi-official looking - but everything is dated around 2009. Really??? And it's not just .NET - the newest thing on the samples/APIs is dated early 2011, plus a handful of 2010 samples. There are newer full APIs for the 'cloud' offerings, but the Product Advertising API apparently isn't part of that. After searching for quite a bit, trying to trace this down myself, and trying some of the newer samples (which also failed), I found an obscure forum post that describes the solution to the namespace issue.

    FWIW, I've been using an old version of the Product Advertising API using the old Microsoft WSE3 services (pre-WCF), which provides some of the WS* security features required by the Amazon service. The fix for this code is to explicitly override the namespace declaration on each of the imported service method signatures. The old service namespace (at least on my build) was:

    http://webservices.amazon.com/AWSECommerceService/2009-03-31

    and it should be changed to:

    http://webservices.amazon.com/AWSECommerceService/2011-08-01

    Change it on the class header:

    [Microsoft.Web.Services3.Messaging.SoapService("http://webservices.amazon.com/AWSECommerceService/2011-08-01")]
    [System.Xml.Serialization.XmlIncludeAttribute(typeof(Property[]))]
    [System.Xml.Serialization.XmlIncludeAttribute(typeof(BrowseNode[]))]
    [System.Xml.Serialization.XmlIncludeAttribute(typeof(TransactionItem[]))]
    public partial class AWSECommerceService : Microsoft.Web.Services3.Messaging.SoapClient {

    and on all method signatures:

    [Microsoft.Web.Services3.Messaging.SoapMethodAttribute("http://soap.amazon.com/ItemSearch")]
    [return: System.Xml.Serialization.XmlElementAttribute("ItemSearchResponse", Namespace="http://webservices.amazon.com/AWSECommerceService/2011-08-01")]
    public ItemSearchResponse ItemSearch(ItemSearch ItemSearch1) {
        Microsoft.Web.Services3.SoapEnvelope results = base.SendRequestResponse("ItemSearch", ItemSearch1);
        return ((ItemSearchResponse)(results.GetBodyObject(typeof(ItemSearchResponse), this.SoapServiceAttribute.TargetNamespace)));
    }

    It's easy to do with a Search and Replace on the above strings.

    Amazon Services <rant>

    FWIW, I've not been impressed by Amazon's service offerings. While the services work well, their documentation and tool support is absolutely horrendous. I was recently working with a customer on an old AWS application, and their old API had been completely removed and replaced with a new API that wasn't even a close match. One old API call resulted in requiring three different APIs to perform the same functionality. We had to rewrite the entire piece from scratch, essentially. The documentation was downright wrong, incomplete, and so scattered it was next to impossible to follow.
    The examples weren't examples at all - they were mockups of real service calls with fake data that didn't even provide everything required to make the same service calls work. Additionally, there appears to be just about no public support from Amazon, only peer support, which is sparse at best - and getting hold of somebody at Amazon, even for pay, seems to be a mythical task. It's a terrible business model they have going. I can't see why anybody would put themselves through this sort of customer and development experience. Sad really, but an experience we see more and more these days. Nobody puts in the time to document anything anymore, leaving it to devs to figure this stuff out over and over again… </rant>

    © Rick Strahl, West Wind Technologies, 2005-2012. Posted in CSharp, Web Services.

    Read the article

  • What is a good design pattern / lib for iOS 5 to synchronize with a web service?

    - by Junto
    We are developing an iOS application that needs to synchronize with a remote server using web services. The existing web services have an "operations" style rather than REST (implemented in WCF but exposing JSON HTTP endpoints). We are unsure how to structure the web services to best fit with iOS and would love some advice. We are also interested in how to manage the synchronization process within iOS. Without going into detailed specifics, the application allows the user to estimate repair costs at a remote site. These costs are broken down by room and item. If the user has an internet connection this data can be sent back to the server. Multiple photographs can be taken of each item, but they will be held in a separate queue, which sends when the connection is optimal (ideally Wi-Fi). Our backend application controls the unique ids for each room and item. Thus, each time we send these costs to the server, the server echoes the central database ids back so that they can be synchronized in the mobile app. I have simplified this a little, since the operations contract is actually much larger, but I just want to illustrate the basic requirements without complicating matters.

    Firstly, the web service architecture: we currently have two operations, GetCosts and UpdateCosts. My assumption is that if we used a strict REST architecture we would need to break our single web service operations into multiple smaller services. This would make the services much more chatty, and we would also have to guarantee a delivery order from the app. For example, we need to make sure that containing rooms are added before the items within them. Although this seems much more RESTful, our perception is that these extra calls are expensive connections (security checks, database calls, etc.). Does the type of web API (operation style versus resource style) determine chunky vs. chatty? Since this is mobile (3G), are we better off handling lots of smaller messages, or a few large ones?

    Secondly, the iOS side: what is the current advice on how to manage data synchronization within the iOS (5) app itself? We need multiple queues and we need to guarantee delivery order in each queue (and, technically, ordering between queues). The server needs to control unique ids and other properties and echo them back to the application. The application then needs to update an internal database and, when re-sending updates, make sure the correct ids are available in the update message (essentially multiple inserts and updates in one call). Our backend has a ton of business logic operating on these cost estimates. We don't want any of this in the app itself. Currently the iOS app sends the cost data, and then the server echoes that data back with populated ids (and other data). The existing cost data is deleted and the echoed response data is added to the client database on the device. This is causing us problems, because any photos might not have been sent yet, but the original entity tree has been removed and replaced. Obviously updating the costs tree rather than replacing it would remove this problem, but I'm not sure if there are any nice Xcode libraries out there to do such things. I welcome any advice you might have.
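
    Not an authoritative answer, but one shape that has worked for the id-echo part is a batch contract where every item carries both the client's temporary id and the (initially null) server id, so the app can update local rows in place instead of deleting and replacing the tree - which is exactly what orphans the queued photos. A hypothetical WCF-style sketch (all names invented):

    using System.Collections.Generic;
    using System.Runtime.Serialization;

    [DataContract]
    public class CostItemDto
    {
        [DataMember] public string ClientTempId { get; set; } // id assigned on the device
        [DataMember] public int? ServerId { get; set; }       // null until the server assigns one
        [DataMember] public string Room { get; set; }
        [DataMember] public decimal Cost { get; set; }
    }

    [DataContract]
    public class UpdateCostsResponse
    {
        // The server echoes every item back with ServerId populated. The app
        // matches on ClientTempId and updates rows in its local store, so the
        // photo queue keeps pointing at records that still exist.
        [DataMember] public List<CostItemDto> Items { get; set; }
    }

    Keeping the operation chunky (one batch per room or per estimate) also sidesteps most of the ordering problem, since parents and children travel in the same message.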

    Read the article

  • How to integrate a PHP CMS with paypal so that only users who completed a payment can register and authenticate?

    - by ibiza
    I am currently using a PHP CMS - CMS Made Simple - in order to create a website where services will be sold. I intend to use PayPal 'Buy Now' buttons in order to offer a few packages, renewable every 1 month or every 3 months, that grant access to the secure content of the website for a given period of time. Everything is going well so far, but I am somewhat at a loss for the user registration process, as I have a few constraints I would like to enforce and it would be nice to automate the process if possible. Here are the constraints:

    - Users should be able to register on my website and choose a password themselves
    - Only users that paid should be able to register
    - Access permissions should be disabled automatically after the service period if the package is not renewed

    And here is the process I am thinking of:

    - User clicks 'buy' on my website
    - User is redirected to PayPal and completes the payment
    - The PayPal email used to pay should be returned to my server and somehow stored
    - If it is a new email, the user needs to register on my website (if it is a returning customer, the deactivation flag set when payment stopped should be removed to restore access)
    - If a user does not renew his subscription, a deactivation flag should automatically be set on the email used, locking access until the next payment

    Ideally, no human intervention is needed. What is the best way to implement all this? I am a bit at a loss. I found this article that explained a few things and even has a nice code snippet, except that I'm not sure where to plug it in. Thanks all
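
    The usual plug-in point is PayPal's IPN (Instant Payment Notification): PayPal POSTs the transaction details (including payer_email) to a listener URL on your server, and your listener confirms authenticity by posting the message back with cmd=_notify-validate. A hedged sketch of that verification step, shown in C# for brevity (the same postback works identically from PHP):

    using System.IO;
    using System.Net;
    using System.Text;

    static class IpnListener
    {
        // rawPostBody must be the exact form body PayPal posted to us.
        public static bool VerifyWithPaypal(string rawPostBody)
        {
            // Echo the message back with cmd=_notify-validate prepended;
            // PayPal replies VERIFIED for genuine notifications, INVALID otherwise.
            var request = (HttpWebRequest)WebRequest.Create("https://www.paypal.com/cgi-bin/webscr");
            request.Method = "POST";
            request.ContentType = "application/x-www-form-urlencoded";

            byte[] body = Encoding.ASCII.GetBytes("cmd=_notify-validate&" + rawPostBody);
            request.ContentLength = body.Length;
            using (Stream s = request.GetRequestStream())
                s.Write(body, 0, body.Length);

            using (var reader = new StreamReader(request.GetResponse().GetResponseStream()))
                return reader.ReadToEnd() == "VERIFIED";
        }
    }

    On VERIFIED, store the payer email and set or clear your activation flag; subscription payments also send IPN messages on renewal and cancellation, which is what lets the deactivation step run without human intervention.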

    Read the article

  • Managing Operational Risk of Financial Services Processes – part 2/2

    - by Sanjeev Sharma
    In my earlier blog post, I described the factors that lead to compliance complexity in financial services processes. In this post, I will outline the business implications of increasing process compliance complexity and the specific role of BPM in addressing the operational risk reduction objectives of regulatory compliance.

    First, let's look at the business implications of the increasing complexity of process compliance for financial institutions:

    · Increased time and cost of compliance, due to duplication of effort in conforming to regulatory requirements as processes change in response to evolving regulatory mandates, shifting business priorities, or internal/external audit requirements

    · Delays in audit reporting, due to quality issues in reconciling non-standard process KPIs and integrity concerns arising from the need to rely on multiple data sources for a given process

    Next, let's consider some approaches to managing the operational risk of business processes. Financial institutions considering reducing the operational risk of their processes, generally speaking, have two choices:

    · Rip and replace existing applications with new off-the-shelf applications

    · Extend the capabilities of existing applications by modeling their data and process interactions with other applications or user channels outside of the application boundary, using BPM

    The benefit of the first approach is that compliance with new regulatory requirements is embedded within the boundaries of these applications. However, the pre-built compliance of any packaged or custom-built application should not be mistaken for a one-shot fix for future compliance needs, because business needs and regulatory requirements inevitably outgrow the end-to-end capabilities of even the most comprehensive business application. Thus, processes that originally resided within the application will eventually spill outside the application boundary. It is precisely at such hand-offs between applications, or between overlapping processes, that vulnerabilities to unknown and accidental faults arise, potentially resulting in errors that lead to partial or total failure.

    The gist of the above argument is that processes which reside outside application boundaries - in other words, span multiple applications - constitute a latent operational risk that spans the end-to-end value chain. For instance, distortion of data flowing from an account-opening application to a credit-rating system, if left unchecked, renders compliance with "KYC" policies void even when the "KYC" checklist was enforced at the time of data capture by the account-opening application. Oracle Business Process Management is enabling financial institutions to lower the operational risk of such process "gaps" for financial services processes including "Customer On-boarding", "Quote-to-Contract", "Deposit/Loan Origination", "Trade Exceptions", "Interest Claim Tracking", etc. If you are faced with a similar challenge and need any guidance, feel free to drop me a note.

    Read the article

  • Extensible Metadata in Oracle IRM 11g

    - by martin.abrahams
    Another significant change in Oracle IRM 11g is that we now use XML to create the tamperproof header for each sealed document. This article explains what this means and what benefit it offers. So, every sealed file has a metadata header that contains information about the document - its classification, its format, the user who sealed it, the name and URL of the IRM Server, and much more. The IRM Desktop and other IRM applications use this information to formulate the request for rights, as well as to enhance the user experience by exposing some of the metadata in the user interface. For example, in Windows Explorer you can see some metadata exposed as properties of a sealed file and in the mouse-over tooltip. The following image shows 10g and 11g metadata side by side. As you can see, the 11g metadata is written as XML, as opposed to the simple delimited text format used in 10g. So why does this matter? The key benefit of using XML is that it creates the opportunity for sealing applications to use custom metadata. This in turn creates the opportunity for custom classification models to be defined and enforced. Out of the box, the solution uses the context classification model, in which two particular pieces of metadata form the basis of rights evaluation - the context name and the document's item code. But a custom sealing application could use some other model entirely, enabling rights decisions to be evaluated on some other basis. The integration with Oracle Beehive is a great example of this. When a user adds a document to a Beehive workspace, that document can be automatically sealed with metadata that represents the Beehive security model rather than the context model. As a consequence, IRM can enforce the Beehive security model precisely, and all rights configuration can actually be managed through the Beehive UI rather than the IRM UI. In this scenario, IRM simply supports the Beehive application, seamlessly extending Beehive security to all copies of workspace documents without any additional administration. Finally, I mentioned that the metadata header is tamperproof. This is obviously to stop a rogue user modifying the metadata with a view to gaining unauthorised access - reclassifying a board document to a less sensitive classification, for example. To prevent this, the header is digitally signed and can only be manipulated by a suitably authorised sealing application.

    Read the article

  • Export all SSIS packages from msdb using Powershell

    - by jamiet
    Have you ever wanted to dump all the SSIS packages stored in msdb out to files? Of course you have, who wouldn't? Right? Well, at least one person does, because this was the subject of a thread (save all ssis packages to file) on the SSIS forum earlier today. Some of you may have already figured out a way of doing this, but for those that haven't, here is a nifty little script that will do it for you, and it uses our favourite jack-of-all tools … Powershell!!

    Imagine I have the following package folder structure on my Integration Services server (i.e. in [msdb]): there are two packages in there called "20110111 Chaining Expression components" & "Package", and I want to export those two packages into a folder structure that mirrors that in [msdb]. Here is the Powershell script that will do that:

    Param($SQLInstance = "localhost")

    #####Add all the SQL goodies (including Invoke-Sqlcmd)#####
    add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
    add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
    cls

    $Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query "
    WITH cte AS (
        SELECT cast(foldername as varchar(max)) as folderpath, folderid
        FROM msdb..sysssispackagefolders
        WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
        UNION ALL
        SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
        FROM msdb..sysssispackagefolders f
        INNER JOIN cte c ON c.folderid = f.parentfolderid
    )
    SELECT c.folderpath, p.name, CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
    FROM cte c
    INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
    WHERE c.folderpath NOT LIKE 'Data Collector%'"

    Foreach ($pkg in $Packages)
    {
        $pkgName = $Pkg.name
        $folderPath = $Pkg.folderpath
        $fullfolderPath = "c:\temp\$folderPath\"
        if(!(test-path -path $fullfolderPath))
        {
            mkdir $fullfolderPath | Out-Null
        }
        $pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
    }

    To run it, simply change the "localhost" parameter to the server you want to connect to, either by editing the script or passing it in when the script is executed. It will create the folder structure in C:\Temp (which you can also easily change if you so wish - just edit the script accordingly). The folder structure it creates is a mirror of the folder structure in [msdb]. Hope this is useful!

    @Jamiet

    UPDATE: This post prompted Chad Miller to write a post describing his Powershell add-in that utilises a SSIS API to do exporting of packages. Go take a read here: http://sev17.com/2011/02/importing-and-exporting-ssis-packages-using-powershell/

    Read the article

  • DTLoggedExec 1.1.2008.4 Released!

    - by Davide Mauri
    Today I've released the latest version of my DTExec replacement tool, DTLoggedExec. The main changes are the following:

    - Used a new strategy for version numbers. It now follows the pattern Major.Minor.TargetSQLServerVersion.Revision
    - Added support for Auto Configurations
    - Fixed a bug that reported an incorrect number of errors and warnings to Log Providers
    - Fixed a bug that prevented correct casting of values when using the /Set and /Param options
    - Errors and Warnings are now counted more precisely
    - Updated database and log import scripts to categorize logs by projects and sections. E.g.: Project: MyBIProject; Sections: Staging, Datawarehouse
    - Removed unused report stored procedures from the database
    - Updated samples: 12 samples are now available to show ALL DTLoggedExec features
    - From this version, only SSIS 2008 will be supported

    http://dtloggedexec.codeplex.com/releases/view/62218

    It is useful to say something more on a couple of specific points:

    From this version only SSIS 2008 will be supported. Yes, Integration Services 2005 is not supported anymore. The latest version capable of running SSIS 2005 packages is 1.0.0.2.

    Updated database and log import scripts to categorize logs by projects and sections. When you import a log file, you can now assign it to a project and to a section of that project. This way it's easier to gather statistical information for an entire project or a subsection of it. This also allows storing logged data of packages belonging to different projects in the same database.

    Updated samples. A complete set of samples that shows how to use all DTLoggedExec features is now shipped with the product. Enjoy!

    Added support for Auto Configurations. This point will have a post of its own, since it's quite important and is by far the biggest new feature introduced in this release. To explain it in a few words: you don't need to waste time with complex DTS configuration files or options, since a package will configure itself automatically. You just need to write a single statement as a parameter for DTLoggedExec. This feature can simplify deployment *a lot* :)

    In the next days I'll write the mentioned post on Auto Configurations and I'll update the documentation available on the DTLoggedExec website:

    http://dtloggedexec.davidemauri.it/MainPage.ashx

    Read the article

  • BackupExec errors when starting services

    - by blade
    Hi, I've installed the BackupExec 2010 trial on my server, with an appropriately-privileged AD account, but get errors when starting the required services from the login page:

    Processing services
    Start services on server: WIN-HQ7JSCRTTSQ
    Starting Enterprise Vault Admin Service on WIN-HQ7JSCRTTSQ.
    The service Enterprise Vault Admin Service is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Remote Agent for Windows Systems on WIN-HQ7JSCRTTSQ.
    The service Backup Exec Remote Agent for Windows Systems is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Device & Media Service on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Device & Media Service on WIN-HQ7JSCRTTSQ.
    Service-specific error code returned: 0x2000e2d3 (536928979)
    Starting Backup Exec Server on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Server on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec Job Engine on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Job Engine on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec Agent Browser on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec Agent Browser on WIN-HQ7JSCRTTSQ.
    The dependency service or group failed to start.
    Starting Backup Exec DLO Administration Service on WIN-HQ7JSCRTTSQ.
    Error starting the service Backup Exec DLO Administration Service on WIN-HQ7JSCRTTSQ.
    Error code returned:
    Starting Backup Exec DLO Maintenance Service on WIN-HQ7JSCRTTSQ.
    The service Backup Exec DLO Maintenance Service is already running on WIN-HQ7JSCRTTSQ.
    Starting Backup Exec Web Service on WIN-HQ7JSCRTTSQ.
    The service Backup Exec Web Service is already running on WIN-HQ7JSCRTTSQ.
    Start services on server WIN-HQ7JSCRTTSQ completed.
    Processing services completed!

    How can I resolve this?

    Read the article

  • Repeated Reporting Services Login issue when deploying through BIDS to a remote server

    - by Richard Edwards
    We are having a problem deploying a reporting services report to a sql reporting services computer that is configured in SharePoint Integrated mode. I can successfully deploy to the SharePoint document libraries set up for reports and data connections if I do it locally from the box that SharePoint and Reporting Services are deployed on. If I try and do the same thing with the exact same deployment properties from a remote box, I constantly get a Reporting Services Login dialog popping up and no combination of domain\username and password will work. I've even tried the machines local admin account and still nothing. Any ideas where to start looking?

    Read the article

  • EC2 EBS AMI Instance stopping/restarting doesn't start services

    - by tgm
    I've recently been moving our instances to EBS-backed instances (CentOS) and still have a bit of confusion about what's happening when I "stop" an instance. I have some of my services enabled for runlevels 3, 4, and 5, but when I start a stopped instance the services don't start. What's actually happening when I issue a stop command to the instance, and how do I get my services to start automatically when I start the instance up again?

    Read the article

  • Can't find Role Services in Windows 7

    - by Pete Alvin
    I'm trying to install PHP on Windows 7 Home Premium using these instructions: Using FastCGI to Host PHP Applications on IIS 7. About a third of the way down the page I have to do this: Server Manager -> Roles -> Add Role Services, and I just cannot find Roles or Role Services. Any ideas? PHP is giving me a 500 error and I'm wondering if it's because I haven't done this role services step.

    Read the article

  • Website and Web Services Monitoring

    - by JohnyD
    I'm looking for a good product (either stand-alone or remote) that will monitor our online services. We're in a transition period between new ASP.NET and old FoxPro/CGI, and because of this we experience errors that can make our services unresponsive. Can anyone suggest internal/external tools, subscription-based solutions, or anything else that I can put in place to monitor all aspects of our web services? Thank you.
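
    If nothing off-the-shelf fits, even a small scheduled-task pinger covers the basics while you evaluate products. A minimal hedged C# sketch (the URLs and the alert action are placeholders):

    using System;
    using System.Net;

    class UptimeCheck
    {
        // Hypothetical endpoints covering both sides of the transition.
        static readonly string[] Urls =
        {
            "http://example.com/health.aspx", // new ASP.NET side
            "http://example.com/cgi/status"   // legacy FoxPro/CGI side
        };

        static void Main()
        {
            foreach (string url in Urls)
            {
                try
                {
                    var request = (HttpWebRequest)WebRequest.Create(url);
                    request.Timeout = 10000; // fail fast on a hung service
                    using (var response = (HttpWebResponse)request.GetResponse())
                    {
                        if (response.StatusCode != HttpStatusCode.OK)
                            Alert(url, response.StatusCode.ToString());
                    }
                }
                catch (WebException ex)
                {
                    Alert(url, ex.Message);
                }
            }
        }

        static void Alert(string url, string reason)
        {
            // Placeholder: swap in System.Net.Mail, an event-log write, or an SMS gateway.
            Console.WriteLine("DOWN: " + url + " (" + reason + ")");
        }
    }

    Run it from Task Scheduler every few minutes; external subscription services add the one piece this can't give you, a vantage point outside your own network.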

    Read the article

  • Exchange 2003 to 2007: Recipient Update Services

    - by Jeff
    Going to try my best to explain: our Exchange 2003 was migrated to Exchange 2007 (I was not here for the migration). Looking in Active Directory Sites and Services -> Services -> Microsoft Exchange -> company -> Address List Container -> Recipient Update Services, I have two recipient update services: 1) Recipient Update Service (Enterprise Configuration) 2) Recipient Update Service (my-domain) My guess is one of these is left over from the old Exchange 2003. (I'm guessing this because the migration was not done properly; the Exchange server was never actually uninstalled, it was just taken offline and later disposed of.) Could having two of these cause me issues, and if so, how do I know which one I can get rid of? I'm asking because I honestly have no idea. It might even be nothing. Thanks

    Read the article
