Search Results

Search found 4276 results on 172 pages for 'integration'.

  • Is QTWebKit still being actively developed? [closed]

    - by Brian M. Hunt
    I am considering recommending PhantomJS/CasperJS for a project, to provide continuous integration testing. Unfortunately those testing frameworks are based on QTWebKit, and there does not appear to have been much activity on QTWebKit since September of 2011, seemingly because of Nokia's financial troubles. QT has since been sold to Digia in August of this year, and can be found on qt.digia.com. It is not apparent whether QTWebKit will continue to be actively developed. Before putting the effort into developing a PhantomJS/CasperJS testing framework, I would like to know whether the underlying QTWebKit framework is likely to remain under active development (or, alternatively, could be easily substituted with an alternative). I suspect that since Digia only just acquired QT, it is a little too soon to tell what direction they will take the project. I would be interested in thoughts and comments on this issue.

    Read the article

  • Looking for a very subtle unit testing example

    - by Stéphane Bruckert
    In the context of Continuous Integration, I need to teach unit testing to an audience of 20 programmers. Rather than having them write tests mechanically, I want to show that unit testing can help prevent very subtle errors, so I am still looking for the perfect example. I am imagining the following scenario for a live TDD demo: the test cases would already be written; we would write the methods together; most of us would naturally forget to handle a specific case in one method; everyone would then be surprised to see that not all tests pass; and the failing test would make us think harder and realize what we forgot. My question will probably end up closed as "too broad" or "not clear what you are asking", but you never know - one of you might have a great idea. Your answer can use Java and JUnit, though any other language is fine since only the idea matters.
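
    For what it's worth, here is one candidate for such a demo - a minimal JUnit 4 sketch (all names are hypothetical, not from the original question) built around the classic midpoint-overflow bug from binary search. The obvious implementation passes every small-value test; only a test with large inputs exposes the failure, which produces exactly the kind of surprise described above:

        // Hypothetical demo code: the naive midpoint passes small-value tests
        // but fails on large inputs because (low + high) overflows int.
        import org.junit.Test;
        import static org.junit.Assert.assertEquals;

        public class MidpointTest {

            // The version most of the audience will write first.
            static int naiveMidpoint(int low, int high) {
                return (low + high) / 2; // overflows when low + high > Integer.MAX_VALUE
            }

            // The corrected version avoids the overflow.
            static int safeMidpoint(int low, int high) {
                return low + (high - low) / 2;
            }

            @Test
            public void midpointOfSmallValues() {
                assertEquals(2, naiveMidpoint(1, 3)); // passes; everything looks fine
            }

            @Test
            public void midpointOfLargeValuesExposesOverflow() {
                // The test most people forget to write: against naiveMidpoint this
                // fails (the sum wraps around to a negative number); switching the
                // call to safeMidpoint makes it pass.
                assertEquals(1_500_000_000, naiveMidpoint(1_000_000_000, 2_000_000_000));
            }
        }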

    Read the article

  • How to unit test image processing code?

    - by rold2007
    I'm working in image processing (mainly OCR) and I wonder how I should integrate unit tests into my development. I'm already using unit tests for more "common" types of code, but I'm not sure how to approach image processing code: it always needs some image data as input/output, and mocking that is not obvious. For now I'm mostly writing integration tests, but they take a while to run, and I would like some ideas on how to break this kind of code down into unit tests that I can run more quickly.
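
    One common answer, sketched below under the assumption that the pipeline can be decomposed: keep each processing step as a pure function over in-memory pixel arrays, so unit tests can feed it tiny hand-built images and no file I/O needs to be mocked. All names here are hypothetical JUnit 4 illustrations, not from the original question:

        import org.junit.Test;
        import static org.junit.Assert.assertArrayEquals;

        public class ThresholdTest {

            // The algorithm under test operates on raw grayscale pixels rather
            // than on image files, so there is nothing to mock.
            static int[] threshold(int[] pixels, int cutoff) {
                int[] out = new int[pixels.length];
                for (int i = 0; i < pixels.length; i++) {
                    out[i] = pixels[i] >= cutoff ? 255 : 0;
                }
                return out;
            }

            @Test
            public void pixelsAtOrAboveCutoffBecomeWhite() {
                // A 2x2 "image" expressed inline - small enough to verify by eye.
                int[] input    = {10, 128, 127, 255};
                int[] expected = { 0, 255,   0, 255};
                assertArrayEquals(expected, threshold(input, 128));
            }
        }

    The slow, file-based integration tests then only need to cover the glue that wires such functions together.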

    Read the article

  • migrating product and team from startup race to quality development

    - by thevikas
    This is year three and the product is selling well enough. Now we need to enforce good software development practices. The goal is to monitor incoming bug reports and reduce them, keep delivering new features, and get ready to scale 10x. The phrases "test-driven development" and "continuous integration" are not even understood by the team, because they were all caught up in the first two years' product race. The tech team size is 5. The questions are:
    1. How to sell/convince the team and management on TDD, unit testing, coding standards and documentation - with economics?
    2. How to train the team to do more than just feature coding and start writing unit tests alongside - which looks like more work, and means more time!
    3. How to plan for creating unit tests for all the backlog production code?

    Read the article

  • Project life cycle management - Maven vs 'manual' approach

    - by jb10210
    I have a question concerning the life cycle management of one or more projects, more specifically about the advantages and disadvantages of using technologies such as Maven. Currently we work in a continuous-integration environment, but lots of things still need to be performed manually (dependency management, deploying, setting up documentation, generating stats, ...). My impression is that this approach often leads to errors and miscommunications, or things just get forgotten. I know and have used Maven in the past, albeit in smaller environments, and I was always really enthusiastic about it. I was wondering if someone could share some insights, experiences, pros, cons, ... about the use of Maven (or a similar technology) in larger environments and for multiple projects. I would like to use the suggestions made here to start a debate about moving to the next level in project management!

    Read the article

  • How should I set up UDK with Git and CruiseControl?

    - by Martin Sojka
    For a new project in UDK, I'd like to set up a Git repository for version control and a CruiseControl.NET-based continuous integration solution. The good news is that the first part seems easy enough, and CruiseControl.NET can work off Git repositories. The bad news is that, according to my searches, nobody has ever tried to do this. Ideally, I'm looking for a step-by-step guide on how to set up such a development environment, assuming more than one development computer, one central repository for the "master" branch, and one machine for building and packaging the binaries via CruiseControl.NET. Related: Version control system for game development with UDK?; Options for UDK and version control repositories?; CruiseControl.NET and Git

    Read the article

  • Parsing flat files using SSIS : SSIS Nugget

    - by jamiet
    Often when using SQL Server Integration Services (SSIS) you will find there is more than one way of accomplishing a task, and the most obvious method might not be the optimal one. In the video below I demonstrate this by way of an experiment using SSIS's Flat File Source component; I show different ways that you can pull data from a flat file into the SSIS dataflow, and also how the nature of the data itself can influence your choice as to how the task should be accomplished. If you are having trouble viewing the video in your blog reader then head to http://sqlblog.com/blogs/jamie_thomson/archive/2010/03/25/parsing-flat-files-using-ssis-ssis-nugget.aspx to see it as it is hosted on my blog! The main point I want to get across in this video is that a little creative thinking when building your dataflows can sometimes be very beneficial for performance; quite often the solution that isn't the most obvious might actually turn out to be the best one. You'll notice, if you have watched the video, that my editing skills weren't quite up to snuff and I cut off the final few words; all I was saying was that if you have any feedback on this video then I would love to hear it, either via email or, preferably, the comments section below. I hope this turns out to be useful to some of you. @Jamiet P.S. Incidentally, the parsing that we do using SSIS expressions in the video would be much easier if we had a TOKENISE function in SSIS's expression language, and I have asked for the introduction of such a function on Connect at [SSIS] TOKEN(string, tokeniser_string, occurence) function. Feel free to go and vote that up if you think this feature would be useful!

    Read the article

  • Oracle Internet Directory 11.1.1.4 Certified with E-Business Suite

    - by Steven Chan
    Oracle E-Business Suite comes with native user authentication and management capabilities out-of-the-box. If you need more advanced features, it's also possible to integrate it with Oracle Internet Directory and Oracle Single Sign-On or Oracle Access Manager, which allows you to link the E-Business Suite with third-party tools like Microsoft Active Directory, Windows Kerberos, and CA Netegrity SiteMinder. For details about third-party integration architectures, see either of these articles for EBS 11i and 12: In-Depth: Using Third-Party Identity Managers with E-Business Suite Release 12, and In-Depth: Using Third-Party Identity Managers with the E-Business Suite Release 11i. Oracle Internet Directory 11.1.1.4 is now certified with Oracle E-Business Suite Releases 11i, 12.0 and 12.1. OID 11.1.1.4 is part of Oracle Fusion Middleware 11g Release 1 Version 11.1.1.4.0, also known as FMW 11g Patchset 3. Certified E-Business Suite releases are: EBS Release 11i 11.5.10.2 + ATG RUP 7 and higher; EBS Release 12.0.6 and higher; and EBS Release 12.1.1 and higher. Oracle Internet Directory 11.1.1.4.0 can be integrated with two single sign-on solutions for EBS environments: with Oracle Single Sign-On Server 10g (10.1.4.3.0) and an existing Oracle E-Business Suite system (Release 11i, 12.0.x or 12.1.1), or with Oracle Access Manager 10g (10.1.4.3) and an existing Oracle E-Business Suite system (Release 11i or 12.1.x).

    Read the article

  • Querying the SSIS Catalog? Here’s a handy query!

    - by jamiet
    I’ve been working on a SQL Server Integration Services (SSIS) solution for about 6 months now and I’ve learnt many many things that I intend to share on this blog just as soon as I get the time. Here’s a very short starter-for-ten… I’ve found the following query to be utterly invaluable when interrogating the SSIS Catalog to discover what is going on in my executions:

        SELECT event_message_id, MESSAGE, package_name, event_name, message_source_name,
               package_path, execution_path, message_type, message_source_type
        FROM (
            SELECT em.*
            FROM SSISDB.catalog.event_messages em
            WHERE em.operation_id = (SELECT MAX(execution_id) FROM SSISDB.catalog.executions)
              AND event_name NOT LIKE '%Validate%'
        ) q
        /* Put in whatever WHERE predicates you might like */
        --WHERE event_name = 'OnError'
        --WHERE package_name = 'Package.dtsx'
        --WHERE execution_path LIKE '%<some executable>%'
        ORDER BY message_time DESC

    Know it. Learn it. Love it. @jamiet

    Read the article

  • SSIS Prehistory video

    - by jamiet
    I’m currently spending my Easter bank holiday putting together my presentation "SSIS Dataflow Performance Tuning" for the upcoming SQL Bits conference in London, and in doing so I’m researching some old material about how the dataflow actually works. Boring as that is, I got easily sidetracked and chanced upon an old video on Channel 9 entitled "Euan Garden - Tour of SQL Server Team (part I)". Euan is a former member of the SQL Server team, and in this series of videos he walks the halls of the SQL Server building on Microsoft's Redmond campus talking to various protagonists; in this one he happens upon the SQL Server Integration Services team. The video was shot in 2004, so this is a fascinating (to me anyway) glimpse into the development of SSIS from before it ever shipped, and if you're a geek like me you'll really enjoy this behind-the-scenes look into how and why the product was architected. The video is also notable for the presence of the cameraman - none other than the now-rather-more-famous-than-he-was-then Robert Scoble. See it at http://channel9.msdn.com/posts/TheChannel9Team/Euan-Garden-Tour-of-SQL-Server-Team-part-I/ Enjoy! @Jamiet

    Read the article

  • Security exception in Twitterizer [closed]

    - by Raghu
    Possible Duplicate: Security exception in Twitterizer
    We are using Twitterizer for Twitter integration, to get Tweet details. When making a call to the method OAuthUtility.GetRequestToken, the following exception is thrown:

        System.Security.SecurityException: Request for the permission of type 'System.Net.WebPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.

    The application works fine when hosted on IIS 5; the exception occurs only when it is hosted in IIS 7 on Windows 2008 R2. The issue seems to be something to do with code access security. Please suggest what kind of permissions should be granted to fix the security exception. The application runs under Full Trust, and I have even tried registering the Twitterizer DLL in the GAC, but the same error still occurs. I am not sure what difference between IIS 5 and IIS 7 with regard to code access security causes this exception. Any help would be greatly appreciated. Thanks in advance. Regards, Raghu

    Read the article

  • SSIS Reporting Pack – a performance tip

    - by jamiet
    SSIS Reporting Pack is a suite of open source SQL Server Reporting Services (SSRS) reports that provide additional insight into the SQL Server Integration Services (SSIS) 2012 Catalog. You can read more about SSIS Reporting Pack here on my blog or head over to the home page for the project at http://ssisreportingpack.codeplex.com/. After having used SSIS Reporting Pack on a real project for a few months now, I have come to realise that if you have any sizeable data volumes in [SSISDB] then the reports in SSIS Reporting Pack will suffer from chronic performance problems - I have seen the "execution" report take upwards of 30 minutes to return data. To combat this I highly recommend that you create an index on the [SSISDB].[internal].[event_messages].[operation_id] & [SSISDB].[internal].[operation_messages].[operation_id] fields. Phil Brammer has experienced similar problems himself and has since made it easy for the rest of us by preparing scripts to create the indexes that he recommends; he has shared those scripts via his blog at http://www.ssistalk.com/SSIS_2012_Missing_Indexes.zip. If you are using SSIS Reporting Pack, or even if you are simply querying [SSISDB], I highly recommend that you download Phil's scripts and test them out on your own SSIS Catalog(s). Those indexes will not solve all problems but they will make some of your reports run quicker. I am working on some further enhancements that should improve the performance of the reports still further. Watch this space. @Jamiet

    Read the article

  • Is it bad practice for services to share a database in SOA?

    - by Paul T Davies
    I have recently been reading Hohpe and Woolf's Enterprise Integration Patterns, some of Thomas Erl's books on SOA, and watching various videos and podcasts by Udi Dahan et al. on CQRS and event-driven systems. Systems in my place of work suffer from high coupling. Although each system theoretically has its own database, there is a lot of joining between them; in practice this means there is one huge database that all systems use. For example, there is one table of customer data. Much of what I've read seems to suggest denormalising data so that each system uses only its own database, with any updates to one system propagated to all the others using messaging. I thought this was one of the ways of enforcing the boundaries in SOA - each service should have its own database - but then I read this: http://stackoverflow.com/questions/4019902/soa-joining-data-across-multiple-services, which suggests it is the wrong thing to do. Segregating the databases does seem like a good way of decoupling systems, but now I'm a bit confused. Is this a good route to take? Is it ever recommended that you should segregate a database per SOA service, per DDD bounded context, per application, etc.?
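
    To make the messaging idea in the question concrete, here is a minimal, hypothetical Java sketch (all names invented for illustration; it assumes an in-memory event handoff standing in for a real message bus, and Java 16+ for records). Each service keeps its own denormalised copy of the data it needs and never joins across another service's database; it just updates its local copy when an event arrives:

        import java.util.HashMap;
        import java.util.Map;

        public class CustomerRenamedExample {

            // Event published by the service that owns customer data.
            record CustomerRenamed(String customerId, String newName) {}

            // Another service's subscriber: no cross-database joins, just a
            // local copy kept current by handling published events.
            static class BillingService {
                private final Map<String, String> localCustomerNames = new HashMap<>();

                void handle(CustomerRenamed event) {
                    localCustomerNames.put(event.customerId(), event.newName());
                }

                String customerNameFor(String customerId) {
                    return localCustomerNames.get(customerId);
                }
            }

            public static void main(String[] args) {
                BillingService billing = new BillingService();
                billing.handle(new CustomerRenamed("C42", "Acme Ltd"));
                System.out.println(billing.customerNameFor("C42")); // prints: Acme Ltd
            }
        }

    The trade-off is eventual consistency: the local copy lags the owning service by however long the message takes to arrive.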

    Read the article

  • Should we be able to deploy a single package to the SSIS Catalog?

    - by jamiet
    My buddy Sutha Thiru sent me an email recently asking my opinion on a particular nuance of the project deployment model in SQL Server Integration Services (SSIS) 2012, and I'd like to share my response as I think it warrants a wider discussion. Sutha asked:

        Jamie, what is your take on this? http://www.mattmasson.com/index.php/2012/07/can-i-deploy-a-single-ssis-package-from-my-project-to-the-ssis-catalog/ Overnight I was talking to Matt, who confirmed that they have no plans to change the deployment model. For example, in the following scenario how do we deploy? Sprint 1: Pkg1, 2 & 3 have been developed and deployed to UAT; once signed off they are deployed to Live. Sprint 2: Pkg4 & 5 are being developed. During this time users raise a bug on Pkg2. We want to make the change to Pkg2 and deploy that to UAT, and eventually to LIVE, without releasing Pkg4 & 5. How do we do it? Matt pointed me to his blog entry, which I have seen before: http://www.mattmasson.com/index.php/2012/02/thoughts-on-branching-strategies-for-ssis-projects/ Thanks, Sutha

    My response:

        Personally, even though I've experienced the exact problem you just outlined, I agree with the current approach. I steadfastly believe that there should not be a way for an unscrupulous developer to slide in a new version of a package under the covers. Deploying .ispac files brings a degree of rigour to your operational processes. Yes, that means that we as SSIS developers are going to have to get better at using source control and branching properly, but that is no bad thing in my opinion. Claiming to be proper "developers" is a bit of a cheap claim if we don't even do the fundamentals correctly.

    I would be interested in the thoughts of others who have used the project deployment model. Do you agree with my point of view? @Jamiet

    Read the article

  • Design for XML mapping scenarios between two different systems [on hold]

    - by deepak_prn
    Mapping XML fields between two systems is a mundane routine in integration scenarios. I am trying to make the design documents look better and give developers a clear understanding, especially when we do not use XSLT or an IDE such as JDeveloper or Eclipse plugins. I want it to be a high-level design that nevertheless speaks the developers' language, so that no requirements slip through the cracks. For example, one of the scenarios goes: the store cashier sells an item, and the transaction data is sent to the data management system. I am writing a functional design for this scenario, which deals with mapping XML fields between our system and the data management system. Question: has anyone had to deal with mapping XML fields between two systems (without XSLT being involved)? Did you use a table to represent the field mapping (as in the example below), or some other visualization tool that does not break the bank? I am trying to find out if there is a better way to represent XML mappings in design documents. The widely accepted method seems to be a simple table, such as in the picture, to illustrate the mapping. I am wondering if there are alternative ways or tools to represent it, such as in Altova:

    Read the article

  • Code Measuring and Metrics Tools?

    - by David
    I'm in the process of setting up a build server for personal projects. This server will handle all the normal CI stuff, including running large suites of tests (unit, integration, automated UI). While I'm working out the kinks of including code coverage output with MSTest, it occurs to me that there may be lots of tools out there which can give me metrics beyond just code coverage. FxCop comes to mind as an example, though I'm sure there are others. Anything that can generate useful, reportable data and metrics would be good, whether it's class dependency charts (looking for Law of Demeter violations, for example), analyses of the uses of classes/functions (looking for a function that isn't used anywhere in the system other than the tests, for example), and so on. I'm not sure of the right way to formulate the question, since polling questions or "what's your favorite code analysis tool" aren't very good, but I'm essentially just looking for recommendations on what metrics to gather and the tools that can gather them. The eventual vision is to have the CI server run a bunch of automated tests and analysis tools and track the resulting metrics over time. Imagine a dashboard full of graphs plotting these metrics over time: the lines should all stay in relative equilibrium, and if one starts to stray toward the negative then it's an early indication of problems with the code. In the age-old struggle to quantify code quality with management, this sounds like a potentially helpful means of doing just that.

    Read the article

  • How do I structure code and builds for continuous delivery of multiple applications in a small team?

    - by kingdango
    Background: 3-5 developers supporting (and building new) internal applications for a non-software company. We use TFS, although I don't think that matters much for my question. I want to be able to develop a deployment pipeline and adopt continuous integration/deployment techniques. Here's what our source tree looks like right now (we use a single TFS Team Project):

        $/MAIN/src/
        $/MAIN/src/ApplicationA/VSSOlution.sln
        $/MAIN/src/ApplicationA/ApplicationAProject1.csproj
        $/MAIN/src/ApplicationA/ApplicationAProject2.csproj
        $/MAIN/src/ApplicationB/...
        $/MAIN/src/ApplicationC
        $/MAIN/src/SharedInfrastructureA
        $/MAIN/src/SharedInfrastructureB

    My goal (a pretty typical promotion pipeline):
    1. When a code change is made to a given application, I want to be able to build that application and auto-deploy that change to a DEV server. I may also need to build dependent Shared Infrastructure components, and I often have some database scripts or changes as well.
    2. If developer testing passes, I want a manually triggered but automated deploy of that build to a STAGING server where end users will review new functionality.
    3. Once it's approved by end users, I want a manually triggered auto-deploy to production.
    Question: how can I best adopt continuous deployment techniques in a multi-application environment? A lot of the advice I see is single-application-specific; how is it best applied to multiple applications? For step 1, do I simply set up a separate Team Build for each application? What's the best approach to accomplishing steps 2 and 3, promoting the latest build to new environments? I've seen this work well with web apps, but what about database changes?

    Read the article

  • Branching and CI Builds with Agile

    - by Bob Horn
    We follow many agile processes, including automated tests, continuous integration, sprint reviews, etc... We're currently having a debate about how often we should branch release builds. We've been doing two-week sprints and trying to deploy to production at the end of each sprint. Some of us think we should be branching every sprint. Some of us think that's overkill. If a project encompasses three Visual Studio solutions, and we branch every sprint, then that's three branches, and three CI builds to create every two weeks. If we do this for six months, we'll end up with 36 branches and 36 CI builds. There is overhead involved in that. For those of us that think that branching every sprint is overkill, we don't have a very good alternative. On my last project, we deployed some solutions from the Main trunk. Yeah, that's not good, but it saved on some of the overhead. What's the right way to manage branching/releasing and CI builds, using agile, when we have such short (two-week) sprint cycles?

    Read the article

  • Windows 7 XP mode missing driver virtual pc integration device

    - by Charlie Wu
    I have XP Mode installed fine, but the shipped Windows XP installation is too heavy, so I tried to install a lighter XP using my ASUS Eee PC. After installation I had to install a few missing drivers, and everything worked OK except that I can't copy and paste between the host and guest operating systems. I checked Device Manager and found that the Virtual PC Integration Device is not installed correctly, and I can't find any driver for it. How can I fix this problem?

    Read the article

  • Sharepoint and SSRS Native Integration - Not seeing web parts after cabinet file install

    - by Greg_the_Ant
    I followed the steps here to set up SharePoint integration with Reporting Services (SSRS) using native mode (i.e., to get the Report Explorer and Report Viewer web parts). However, after adding the cabinet file, e.g.

        STSADM.EXE -o addwppack -filename "C:\Program Files\Microsoft SQL Server\100\Tools\Reporting Services\SharePoint\RSWebParts.cab" -globalinstall

    I still don't see Report Explorer in the web parts list. I'm very stuck and confused :-(

    Read the article

  • Integration features enabled but drives not available

    - by dsjbirch
    Frustratingly, after a recent update to the Windows XP Mode integration features, the availability of shared disks from the host has been impaired. Does anyone know of any kind of workaround or fix (excluding Dropbox et al.)? I have tried completely uninstalling and reinstalling as per http://www.sevenforums.com/virtualization/63710-refreshing-xp-mode.html#post568715. At one point restarting the machine appeared to have worked, but today I am again without access to my host. Interestingly, audio and copy-and-paste to and from the machine are working.

    Read the article

  • SSIS Catalog, Windows updates and deployment failures due to System.Core mismatch

    - by jamiet
    This is a heads-up for anyone doing development on SSIS. On my current project, where we are implementing a SQL Server Integration Services (SSIS) 2012 solution, we recently encountered a situation where we were unable to deploy any of our projects even though we had successfully deployed in the past. Any attempt to use the deployment wizard resulted in an error dialog containing the following text (for all you search engine crawlers out there):

        A .NET Framework error occurred during execution of user-defined routine or aggregate "create_key_information": System.IO.FileLoadException: Could not load file or assembly 'System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateSymmetricKey(String algorithm)
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateKeyInformation(SqlString algorithmName, SqlBytes& key, SqlBytes& IV)
        (Microsoft SQL Server, Error: 6522)

    After some investigation and a bit of back and forth with some very helpful members of the SSIS product team (hey Matt, Wee Hyong), it transpired that this was due to a .NET Framework fix that had been delivered via Windows Update; a look at the server update history confirmed that some .NET Framework updates had recently been applied. This fix had (in the words of Matt Masson) "somehow caused a mismatch on System.Core for SQLCLR" and, as you may know, SQLCLR is used heavily within the SSIS Catalog. The fix was pretty simple: restart SQL Server. This causes the assemblies to be upgraded automatically. If you are using Data Quality Services (DQS) you may have experienced similar problems, which are documented at Upgrade SQLCLR Assemblies After .NET Framework Update. I am hoping the SSIS team will follow up with a more thorough explanation on their blog soon. You DBAs out there may be questioning why Windows Update is set to automatically apply updates on our production servers; we're checking that out with our hosting provider right now. You have been warned! @Jamiet

    Read the article

  • Extensible Metadata in Oracle IRM 11g

    - by martin.abrahams
    Another significant change in Oracle IRM 11g is that we now use XML to create the tamperproof header for each sealed document. This article explains what this means, and what benefit it offers. Every sealed file has a metadata header that contains information about the document - its classification, its format, the user who sealed it, the name and URL of the IRM Server, and much more. The IRM Desktop and other IRM applications use this information to formulate the request for rights, as well as to enhance the user experience by exposing some of the metadata in the user interface. For example, in Windows Explorer you can see some metadata exposed as properties of a sealed file and in the mouse-over tooltip. The following image shows 10g and 11g metadata side by side. As you can see, the 11g metadata is written as XML as opposed to the simple delimited text format used in 10g. So why does this matter? The key benefit of using XML is that it creates the opportunity for sealing applications to use custom metadata. This in turn creates the opportunity for custom classification models to be defined and enforced. Out of the box, the solution uses the context classification model, in which two particular pieces of metadata form the basis of rights evaluation - the context name and the document's item code. But a custom sealing application could use some other model entirely, enabling rights decisions to be evaluated on some other basis. The integration with Oracle Beehive is a great example of this. When a user adds a document to a Beehive workspace, that document can be automatically sealed with metadata that represents the Beehive security model rather than the context model. As a consequence, IRM can enforce the Beehive security model precisely, and all rights configuration can be managed through the Beehive UI rather than the IRM UI. In this scenario, IRM simply supports the Beehive application, seamlessly extending Beehive security to all copies of workspace documents without any additional administration. Finally, I mentioned that the metadata header is tamperproof. This is obviously to stop a rogue user modifying the metadata with a view to gaining unauthorised access - reclassifying a board document to a less sensitive classification, for example. To prevent this, the header is digitally signed and can only be manipulated by a suitably authorised sealing application.

    Read the article

  • Export all SSIS packages from msdb using Powershell

    - by jamiet
    Have you ever wanted to dump all the SSIS packages stored in msdb out to files? Of course you have, who wouldn't? Right? Well, at least one person does, because this was the subject of a thread (save all ssis packages to file) on the SSIS forum earlier today. Some of you may have already figured out a way of doing this, but for those that haven't, here is a nifty little script that will do it for you, and it uses our favourite jack-of-all-trades tool... Powershell!! Imagine I have a package folder structure on my Integration Services server (i.e. in [msdb]) containing two packages called "20110111 Chaining Expression components" & "Package"; I want to export those two packages into a folder structure that mirrors the one in [msdb]. Here is the Powershell script that will do that:

        Param($SQLInstance = "localhost")
        #####Add all the SQL goodies (including Invoke-Sqlcmd)#####
        add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
        add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
        cls
        $Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query "
            WITH cte AS (
                SELECT cast(foldername as varchar(max)) as folderpath, folderid
                FROM msdb..sysssispackagefolders
                WHERE parentfolderid = '00000000-0000-0000-0000-000000000000'
                UNION ALL
                SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
                FROM msdb..sysssispackagefolders f
                INNER JOIN cte c ON c.folderid = f.parentfolderid
            )
            SELECT c.folderpath, p.name, CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
            FROM cte c
            INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
            WHERE c.folderpath NOT LIKE 'Data Collector%'"
        Foreach ($pkg in $Packages)
        {
            $pkgName = $Pkg.name
            $folderPath = $Pkg.folderpath
            $fullfolderPath = "c:\temp\$folderPath\"
            if(!(test-path -path $fullfolderPath))
            {
                mkdir $fullfolderPath | Out-Null
            }
            $pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
        }

    To run it, simply change the "localhost" parameter to the server you want to connect to, either by editing the script or passing it in when the script is executed. It will create the folder structure in C:\Temp (which you can also easily change if you so wish - just edit the script accordingly), and that folder structure is a mirror of the one in [msdb]. Hope this is useful! @Jamiet UPDATE: This post prompted Chad Miller to write a post describing his Powershell add-in that utilises an SSIS API to export packages. Go take a read here: http://sev17.com/2011/02/importing-and-exporting-ssis-packages-using-powershell/

    Read the article
