Search Results

Search found 4853 results on 195 pages for 'continuous integration'.

  • Which Continuous Integration framework do you use and why?

    - by Richard Warburton
    There are quite a few different Continuous Integration (CI) frameworks out there and I'm wondering which is the most popular. Which frameworks have you used at the firms where you've worked? Is there any reason one CI framework is more popular than another? Perhaps it is to do with the features it offers, the things that integrate with it, or maybe it's just marketing. It also seems that continuous integration is used more in the Java and .NET worlds than in, say, Ruby or Python. Why is this?

    Read the article

  • Parsing flat files using SSIS : SSIS Nugget

    - by jamiet
    Often when using SQL Server Integration Services (SSIS) you will find there is more than one way of accomplishing a task, and the most obvious method might not be the optimal one. In the video below I demonstrate this by way of an experiment using SSIS’s Flat File Source component; I show different ways that you can pull data from a flat file into the SSIS dataflow, and also how the nature of the data itself can influence your choice as to how the task should be accomplished. If you are having trouble viewing the video in your blog reader then head to http://sqlblog.com/blogs/jamie_thomson/archive/2010/03/25/parsing-flat-files-using-ssis-ssis-nugget.aspx to see it as it is hosted on my blog!

    The main point I want to get across in this video is that a little bit of creative thinking when building your dataflows can sometimes be very beneficial for performance; quite often the solution that isn’t the most obvious turns out to be the best one. You’ll notice, if you have watched the video, that my editing skills weren’t quite up to snuff and I cut off the final few words; all I was saying was that if you have any feedback on this video then I would love to hear it, either via email or preferably the comments section below. I hope this turns out to be useful to some of you.

    @Jamiet

    P.S. Incidentally, the parsing that we do using SSIS expressions in the video would be much easier if we had a TOKENISE function in SSIS’s expression language, and I have asked for the introduction of such a function on Connect at [SSIS] TOKEN(string, tokeniser_string, occurence) function. Feel free to go and vote that up if you think this feature would be useful!
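
    The video itself is not reproduced here, but as a rough illustration of the kind of expression-based parsing being discussed (a sketch only, assuming a hypothetical pipe-delimited input column named [InputLine]), extracting the second token without a TOKEN function means nesting FINDSTRING and SUBSTRING in SSIS's expression language:

        SUBSTRING([InputLine],
                  FINDSTRING([InputLine], "|", 1) + 1,
                  FINDSTRING([InputLine], "|", 2) - FINDSTRING([InputLine], "|", 1) - 1)

    Every additional token requires another pair of FINDSTRING calls, which is exactly why a dedicated TOKEN function would be welcome.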

    Read the article

  • Oracle Internet Directory 11.1.1.4 Certified with E-Business Suite

    - by Steven Chan
    Oracle E-Business Suite comes with native user authentication and management capabilities out of the box. If you need more advanced features, it is also possible to integrate it with Oracle Internet Directory and Oracle Single Sign-On or Oracle Access Manager, which allows you to link the E-Business Suite with third-party tools like Microsoft Active Directory, Windows Kerberos, and CA Netegrity SiteMinder. For details about third-party integration architectures, see either of these articles for EBS 11i and 12:

    - In-Depth: Using Third-Party Identity Managers with E-Business Suite Release 12
    - In-Depth: Using Third-Party Identity Managers with the E-Business Suite Release 11i

    Oracle Internet Directory 11.1.1.4 is now certified with Oracle E-Business Suite Release 11i, 12.0 and 12.1. OID 11.1.1.4 is part of Oracle Fusion Middleware 11g Release 1 Version 11.1.1.4.0, also known as FMW 11g Patchset 3. Certified E-Business Suite releases are:

    - EBS Release 11i 11.5.10.2 + ATG RUP 7 and higher
    - EBS Release 12.0.6 and higher
    - EBS Release 12.1.1 and higher

    Oracle Internet Directory 11.1.1.4.0 can be integrated with two single sign-on solutions for EBS environments:

    - With Oracle Single Sign-On Server 10g (10.1.4.3.0) with an existing Oracle E-Business Suite system (Release 11i, 12.0.x or 12.1.1)
    - With Oracle Access Manager 10g (10.1.4.3) with an existing Oracle E-Business Suite system (Release 11i or 12.1.x)

    Read the article

  • Querying the SSIS Catalog? Here’s a handy query!

    - by jamiet
    I’ve been working on a SQL Server Integration Services (SSIS) solution for about 6 months now and I’ve learnt many many things that I intend to share on this blog just as soon as I get the time. Here’s a very short starter-for-ten… I’ve found the following query to be utterly invaluable when interrogating the SSIS Catalog to discover what is going on in my executions:

        SELECT  event_message_id, message, package_name, event_name, message_source_name,
                package_path, execution_path, message_type, message_source_type
        FROM    (
                SELECT  em.*
                FROM    SSISDB.catalog.event_messages em
                WHERE   em.operation_id = (SELECT MAX(execution_id) FROM SSISDB.catalog.executions)
                    AND event_name NOT LIKE '%Validate%'
                ) q
        /* Put in whatever WHERE predicates you might like */
        --WHERE event_name = 'OnError'
        --WHERE package_name = 'Package.dtsx'
        --WHERE execution_path LIKE '%<some executable>%'
        ORDER BY message_time DESC

    Know it. Learn it. Love it.

    @jamiet
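
    A companion query that pairs naturally with the one above, sketched here against the documented catalog views (the status column is an integer; in the documented mapping 4 means failed and 7 means succeeded), lists the most recent executions so you can pick an operation_id other than the latest:

        SELECT TOP 20
               execution_id, folder_name, project_name, package_name,
               status, start_time, end_time
        FROM   SSISDB.catalog.executions
        ORDER BY execution_id DESC;

    Substitute the execution_id you are interested in for the MAX(execution_id) subquery in the query above.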

    Read the article

  • SSIS Prehistory video

    - by jamiet
    I’m currently spending my Easter bank holiday putting together my presentation SSIS Dataflow Performance Tuning for the upcoming SQL Bits conference in London, and in doing so I’m researching some old material about how the dataflow actually works. Boring as that is, I’ve gotten easily sidetracked and have chanced upon an old video on Channel 9 entitled Euan Garden - Tour of SQL Server Team (part I). Euan is a former member of the SQL Server team, and in this series of videos he walks the halls of the SQL Server building on Microsoft’s Redmond campus talking to some of the various protagonists; in this one he happens upon the SQL Server Integration Services team. The video was shot in 2004, so this is a fascinating (to me anyway) glimpse into the development of SSIS from before it ever shipped, and if you’re a geek like me you’ll really enjoy this behind-the-scenes look into how and why the product was architected. The video is also notable for the presence of the cameraman – none other than the now-rather-more-famous-than-he-was-then Robert Scoble. See it at http://channel9.msdn.com/posts/TheChannel9Team/Euan-Garden-Tour-of-SQL-Server-Team-part-I/

    Enjoy!

    @Jamiet

    Read the article

  • Security exception in Twitterizer [closed]

    - by Raghu
    Possible Duplicate: Security exception in Twitterizer

    Hi, we are using Twitterizer for Twitter integration, to get the details of tweets. When making a call to the method OAuthUtility.GetRequestToken, the following exception is thrown:

        System.Security.SecurityException: Request for the permission of type 'System.Net.WebPermission, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.

    When the application is hosted on IIS 5 it works fine; the above error occurs only when the application is hosted in IIS 7 on Windows 2008 R2, where the method OAuthUtility.GetRequestToken throws the exception. It seems the issue is something to do with code access security. Please suggest what kind of permissions should be given to fix the security exception. The application has Full Trust and I have even tried registering the Twitterizer DLL in the GAC, but the same error still occurs. I am not sure what difference between IIS 5 and IIS 7 with regards to code access security would cause that exception. Any help would be greatly appreciated. Thanks in advance.

    Regards, Raghu

    Read the article

  • SSIS Reporting Pack – a performance tip

    - by jamiet
    SSIS Reporting Pack is a suite of open source SQL Server Reporting Services (SSRS) reports that provide additional insight into the SQL Server Integration Services (SSIS) 2012 Catalog. You can read more about SSIS Reporting Pack here on my blog or head over to the home page for the project at http://ssisreportingpack.codeplex.com/.

    After having used SSIS Reporting Pack on a real project for a few months now, I have come to realise that if you have any sizeable data volumes in [SSISDB] then the reports in SSIS Reporting Pack will suffer from chronic performance problems – I have seen the “execution” report take upwards of 30 minutes to return data. To combat this I highly recommend that you create an index on the [SSISDB].[internal].[event_messages].[operation_id] and [SSISDB].[internal].[operation_messages].[operation_id] fields. Phil Brammer has experienced similar problems himself and has since made it easy for the rest of us by preparing some scripts to create the indexes that he recommends; he has shared those scripts via his blog at http://www.ssistalk.com/SSIS_2012_Missing_Indexes.zip.

    If you are using SSIS Reporting Pack, or even if you are simply querying [SSISDB], I highly recommend that you download Phil’s scripts and test them out on your own SSIS Catalog(s). Those indexes will not solve all problems but they will make some of your reports run quicker. I am working on some further enhancements that should further improve the performance of the reports. Watch this space.

    @Jamiet
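
    For illustration only, the simplest form of the indexes described above would look something like this; Phil’s downloadable scripts are the authoritative version and may well use different names or add included columns:

        USE SSISDB;

        CREATE NONCLUSTERED INDEX IX_event_messages_operation_id
            ON [internal].[event_messages] ([operation_id]);

        CREATE NONCLUSTERED INDEX IX_operation_messages_operation_id
            ON [internal].[operation_messages] ([operation_id]);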

    Read the article

  • Design for XML mapping scenarios between two different systems [on hold]

    - by deepak_prn
    Mapping XML fields between two systems is a mundane routine in integration scenarios. I am trying to make the design documents look better and give a clear understanding to the developers, especially when we do not use XSLT or an IDE such as JDeveloper or the Eclipse plugins. I want it to be a high-level design, but at the same time one that talks in the developers' language, so that no requirement slips through the cracks. For example, one of the scenarios goes: the store cashier sells an item, and the transaction data is sent to the data management system. I am writing a functional design for that scenario, which deals with mapping XML fields between our system and the data management system.

    Question: I was wondering if anyone else has had to deal with mapping XML fields between two systems (without XSLT being involved), and whether you used a table to represent the field mappings (a hypothetical example is sketched below) or some other visualization tool that does not break the bank? I am trying to find out if there is a better way to represent XML mappings in design documents. The widely accepted method seems to be a simple table illustrating the mapping; I am wondering if there are alternative ways or tools to represent it, such as in Altova.
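
    A hypothetical sketch of the kind of mapping table being described (all element names are invented for illustration):

        Source element (point of sale)     Target element (data mgmt system)   Notes
        /Txn/Header/StoreId                /Sale/@storeCode                    mandatory
        /Txn/Header/Timestamp              /Sale/SaleDate                      convert to ISO 8601
        /Txn/Line/ItemCode                 /Sale/Items/Item/@sku               one row per line item
        /Txn/Line/Amount                   /Sale/Items/Item/Price              decimal, 2 places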

    Read the article

  • Is it bad practice for services to share a database in SOA?

    - by Paul T Davies
    I have recently been reading Hohpe and Woolf's Enterprise Integration Patterns, some of Thomas Erl's books on SOA, and watching various videos and podcasts by Udi Dahan et al. on CQRS and event-driven systems. Systems in my place of work suffer from high coupling. Although each system theoretically has its own database, there is a lot of joining between them. In practice this means there is one huge database that all systems use. For example, there is one table of customer data. Much of what I've read seems to suggest denormalising data so that each system uses only its own database, with any updates to one system propagated to all the others using messaging. I thought this was one of the ways of enforcing the boundaries in SOA – each service should have its own database – but then I read this: http://stackoverflow.com/questions/4019902/soa-joining-data-across-multiple-services and it suggests this is the wrong thing to do. Segregating the databases does seem like a good way of decoupling systems, but now I'm a bit confused. Is this a good route to take? Is it ever recommended that you should segregate a database per SOA service, per DDD bounded context, per application, etc.?

    Read the article

  • Should we be able to deploy a single package to the SSIS Catalog?

    - by jamiet
    My buddy Sutha Thiru sent me an email recently asking my opinion on a particular nuance of the project deployment model in SQL Server Integration Services (SSIS) 2012, and I’d like to share my response as I think it warrants a wider discussion. Sutha asked:

    “Jamie, what is your take on this? http://www.mattmasson.com/index.php/2012/07/can-i-deploy-a-single-ssis-package-from-my-project-to-the-ssis-catalog/ Overnight I was talking to Matt, who confirmed that they have no plans to change the deployment model. For example, in the following scenario how do we deploy? Sprint 1: Pkg1, 2 & 3 have been developed and deployed to UAT; once signed off they are deployed to Live. Sprint 2: Pkg4 & 5 are developed. During this time users raise a bug against Pkg2. We want to make the change to Pkg2 and deploy it to UAT, and eventually to Live, without releasing Pkg4 & 5. How do we do it? Matt pointed me to his blog entry, which I have seen before: http://www.mattmasson.com/index.php/2012/02/thoughts-on-branching-strategies-for-ssis-projects/ Thanks, Sutha”

    My response:

    Personally, even though I've experienced the exact problem you just outlined, I agree with the current approach. I steadfastly believe that there should not be a way for an unscrupulous developer to slide in a new version of a package under the covers. Deploying .ispac files brings a degree of rigour to your operational processes. Yes, that means that we as SSIS developers are going to have to get better at using source control and branching properly, but that is no bad thing in my opinion. Claiming to be proper “developers” is a bit of a cheap claim if we don't even do the fundamentals correctly.

    I would be interested in the thoughts of others who have used the project deployment model. Do you agree with my point of view?

    @Jamiet
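
    For context, deployment in the project deployment model is always a whole .ispac at a time; a minimal sketch of what that looks like in T-SQL (folder, project and file names here are hypothetical) is:

        DECLARE @operation_id BIGINT;
        DECLARE @project_stream VARBINARY(MAX) =
            (SELECT BulkColumn
             FROM   OPENROWSET(BULK N'C:\Deploy\MyProject.ispac', SINGLE_BLOB) AS ispac);

        EXEC SSISDB.catalog.deploy_project
            @folder_name    = N'MyFolder',
            @project_name   = N'MyProject',
            @project_stream = @project_stream,
            @operation_id   = @operation_id OUTPUT;

    There is no per-package equivalent, which is exactly the behaviour under discussion.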

    Read the article

  • Windows 7 XP mode missing driver virtual pc integration device

    - by Charlie Wu
    I have XP Mode installed fine, but the Windows XP installation it ships with is too heavy, so I tried to install a lighter XP using my ASUS Eee PC. After installation I had to install a few missing drivers, and everything worked OK except that I cannot copy and paste between the host and guest operating systems. I checked Device Manager and found that the Virtual PC Integration Device is not installed correctly, and I can't find a driver for it. How can I fix this problem?

    Read the article

  • Sharepoint and SSRS Native Integration - Not seeing web parts after cabinet file install

    - by Greg_the_Ant
    I followed the steps here to set up SharePoint integration with Reporting Services (SSRS) using native mode (i.e., to get the Report Explorer and Report Viewer web parts). However, after adding the cabinet file, e.g.:

        STSADM.EXE -o addwppack -filename "C:\Program Files\Microsoft SQL Server\100\Tools\Reporting Services\SharePoint\RSWebParts.cab" -globalinstall

    I still don't see Report Explorer in the web parts list. I'm very stuck and confused :-(

    Read the article

  • Integration features enabled but drives not available

    - by dsjbirch
    Frustratingly, after a recent update to the Windows XP Mode integration features, the availability of shared disks from the host has been impaired. Does anyone know of any kind of workaround or fix (excluding Dropbox et al.)? I have tried completely uninstalling and reinstalling as per http://www.sevenforums.com/virtualization/63710-refreshing-xp-mode.html#post568715. At one point restarting the machine appeared to have worked, but today I am again without access to my host. Interestingly, audio and copy-and-paste to and from the machine are working.

    Read the article

  • SSIS Catalog, Windows updates and deployment failures due to System.Core mismatch

    - by jamiet
    This is a heads-up for anyone doing development on SSIS. On my current project, where we are implementing a SQL Server Integration Services (SSIS) 2012 solution, we recently encountered a situation where we were unable to deploy any of our projects even though we had successfully deployed in the past. Any attempt to use the deployment wizard resulted in an error dialog whose text (for all you search engine crawlers out there) was:

        A .NET Framework error occurred during execution of user-defined routine or aggregate "create_key_information":
        System.IO.FileLoadException: Could not load file or assembly 'System.Core, Version=4.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
        ---> System.IO.FileLoadException: The located assembly's manifest definition does not match the assembly reference. (Exception from HRESULT: 0x80131040)
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateSymmetricKey(String algorithm)
           at Microsoft.SqlServer.IntegrationServices.Server.Security.CryptoGraphy.CreateKeyInformation(SqlString algorithmName, SqlBytes& key, SqlBytes& IV)
        (Microsoft SQL Server, Error: 6522)

    After some investigation, and a bit of back and forth with some very helpful members of the SSIS product team (hey Matt, Wee Hyong), it transpired that this was due to a .NET Framework fix that had been delivered via Windows Update; a look at the server update history confirmed that some .NET Framework updates had recently been applied. This fix had (in the words of Matt Masson) “somehow caused a mismatch on System.Core for SQLCLR” and, as you may know, SQLCLR is used heavily within the SSIS Catalog.

    The fix was pretty simple – restart SQL Server. This causes the assemblies to be upgraded automatically. If you are using Data Quality Services (DQS) you may have experienced similar problems, which are documented at Upgrade SQLCLR Assemblies After .NET Framework Update. I am hoping the SSIS team will follow up with a more thorough explanation on their blog soon.

    You DBAs out there may be questioning why Windows Update is set to automatically apply updates on our production servers. We’re checking that out with our hosting provider right now. You have been warned!

    @Jamiet

    Read the article

  • Extensible Metadata in Oracle IRM 11g

    - by martin.abrahams
    Another significant change in Oracle IRM 11g is that we now use XML to create the tamperproof header for each sealed document. This article explains what this means, and what benefit it offers.

    Every sealed file has a metadata header that contains information about the document – its classification, its format, the user who sealed it, the name and URL of the IRM Server, and much more. The IRM Desktop and other IRM applications use this information to formulate the request for rights, as well as to enhance the user experience by exposing some of the metadata in the user interface. For example, in Windows Explorer you can see some metadata exposed as properties of a sealed file and in the mouse-over tooltip. Comparing the two versions side by side, the 11g metadata is written as XML as opposed to the simple delimited text format used in 10g.

    So why does this matter? The key benefit of using XML is that it creates the opportunity for sealing applications to use custom metadata. This in turn creates the opportunity for custom classification models to be defined and enforced. Out of the box, the solution uses the context classification model, in which two particular pieces of metadata form the basis of rights evaluation – the context name and the document's item code. But a custom sealing application could use some other model entirely, enabling rights decisions to be evaluated on some other basis.

    The integration with Oracle Beehive is a great example of this. When a user adds a document to a Beehive workspace, that document can be automatically sealed with metadata that represents the Beehive security model rather than the context model. As a consequence, IRM can enforce the Beehive security model precisely, and all rights configuration can actually be managed through the Beehive UI rather than the IRM UI. In this scenario, IRM simply supports the Beehive application, seamlessly extending Beehive security to all copies of workspace documents without any additional administration.

    Finally, I mentioned that the metadata header is tamperproof. This is obviously to stop a rogue user modifying the metadata with a view to gaining unauthorised access – reclassifying a board document to a less sensitive classification, for example. To prevent this, the header is digitally signed and can only be manipulated by a suitably authorised sealing application.

    Read the article

  • Export all SSIS packages from msdb using Powershell

    - by jamiet
    Have you ever wanted to dump all the SSIS packages stored in msdb out to files? Of course you have, who wouldn’t? Right? Well, at least one person does, because this was the subject of a thread (save all ssis packages to file) on the SSIS forum earlier today. Some of you may have already figured out a way of doing this, but for those that haven’t, here is a nifty little script that will do it for you and it uses our favourite jack-of-all-trades tool… Powershell!!

    Imagine I have a package folder structure on my Integration Services server (i.e. in [msdb]) containing two packages called “20110111 Chaining Expression components” & “Package”; I want to export those two packages into a folder structure that mirrors that in [msdb]. Here is the Powershell script that will do that:

        Param($SQLInstance = "localhost")

        ##### Add all the SQL goodies (including Invoke-Sqlcmd) #####
        add-pssnapin sqlserverprovidersnapin100 -ErrorAction SilentlyContinue
        add-pssnapin sqlservercmdletsnapin100 -ErrorAction SilentlyContinue
        cls

        $Packages = Invoke-Sqlcmd -MaxCharLength 10000000 -ServerInstance $SQLInstance -Query "
            WITH cte AS (
                SELECT cast(foldername as varchar(max)) as folderpath, folderid
                FROM   msdb..sysssispackagefolders
                WHERE  parentfolderid = '00000000-0000-0000-0000-000000000000'
                UNION ALL
                SELECT cast(c.folderpath + '\' + f.foldername as varchar(max)), f.folderid
                FROM   msdb..sysssispackagefolders f
                INNER JOIN cte c ON c.folderid = f.parentfolderid
            )
            SELECT c.folderpath, p.name,
                   CAST(CAST(packagedata AS VARBINARY(MAX)) AS VARCHAR(MAX)) as pkg
            FROM   cte c
            INNER JOIN msdb..sysssispackages p ON c.folderid = p.folderid
            WHERE  c.folderpath NOT LIKE 'Data Collector%'"

        Foreach ($pkg in $Packages)
        {
            $pkgName = $Pkg.name
            $folderPath = $Pkg.folderpath
            $fullfolderPath = "c:\temp\$folderPath\"
            if(!(test-path -path $fullfolderPath))
            {
                mkdir $fullfolderPath | Out-Null
            }
            $pkg.pkg | Out-File -Force -encoding ascii -FilePath "$fullfolderPath\$pkgName.dtsx"
        }

    To run it, simply change the “localhost” parameter to the server you want to connect to, either by editing the script or passing it in when the script is executed. It will create the folder structure in C:\Temp (which you can also easily change if you so wish – just edit the script accordingly), mirroring the folder structure in [msdb].

    Hope this is useful!

    @Jamiet

    UPDATE: This post prompted Chad Miller to write a post describing his Powershell add-in that utilises a SSIS API to do exporting of packages. Go take a read here: http://sev17.com/2011/02/importing-and-exporting-ssis-packages-using-powershell/

    Read the article

  • DTLoggedExec 1.1.2008.4 Released!

    - by Davide Mauri
    Today I've released the latest version of my DTExec replacement tool, DTLoggedExec. The main changes are the following:

    - Used a new strategy for version numbers; they will now follow the pattern Major.Minor.TargetSQLServerVersion.Revision
    - Added support for Auto Configurations
    - Fixed a bug that reported an incorrect number of errors and warnings to Log Providers
    - Fixed a bug that prevented correct casting of values when using the /Set and /Param options
    - Errors and warnings are now counted more precisely
    - Updated database and log import scripts to categorize logs by projects and sections, e.g. Project: MyBIProject; Sections: Staging, Datawarehouse
    - Removed unused report stored procedures from the database
    - Updated samples: 12 samples are now available to show ALL DTLoggedExec features
    - From this version only SSIS 2008 will be supported

    http://dtloggedexec.codeplex.com/releases/view/62218

    It is useful to say something more on a couple of specific points:

    From this version only SSIS 2008 will be supported. Yes, Integration Services 2005 is not supported anymore; the latest version capable of running SSIS 2005 packages is 1.0.0.2.

    Updated database and log import scripts to categorize logs by projects and sections. When you import a log file, you can now assign it to a project and to a section of that project. This makes it easier to gather statistical information for an entire project or a subsection of it, and it also allows logged data from packages belonging to different projects to be stored in the same database.

    Updated samples. A complete set of samples that show how to use all DTLoggedExec features now ships with the product. Enjoy!

    Added support for Auto Configurations. This point will have a post of its own, since it's quite important and is by far the biggest new feature introduced in this release. To explain it in a few words: you don't need to waste time with complex DTS configuration files or options, since a package will configure itself automatically. You just need to write a single statement as a parameter for DTLoggedExec. This feature can simplify deployment *a lot* :)

    In the next few days I'll write the mentioned post on Auto Configurations and I'll update the documentation available on the DTLoggedExec website: http://dtloggedexec.davidemauri.it/MainPage.ashx

    Read the article

  • Inequality joins, Asynchronous transformations and Lookups : SSIS

    - by jamiet
    It is pretty much accepted by SQL Server Integration Services (SSIS) developers that synchronous transformations are generally quicker than asynchronous transformations (for a description of synchronous and asynchronous transformations go read Asynchronous and synchronous data flow components). Notice I said “generally” and not “always”; there are circumstances where using asynchronous transformations can be beneficial, and in this blog post I’ll demonstrate such a scenario, one that is pretty common when building data warehouses.

    Imagine I have a [Customer] dimension table that manages information about all of my customers as a slowly changing dimension. If that is a type 2 slowly changing dimension then you will likely have multiple rows per customer in that table. Furthermore you might also have datetime fields that indicate the effective time period of each member record; such a table might contain data for four dimension members {Terry, Max, Henry, Horace}, with multiple records per customer where the [SCDStartDate] of a record is equivalent to the [SCDEndDate] of the record that preceded it (if there was one). (Note that I am on record as saying I am not a fan of this technique of storing an [SCDEndDate], but for the purposes of clarity I have included it here.)

    Anyway, the idea is that we will have some incoming data containing [CustomerName] & [EffectiveDate] and we need to use those values to look up [Customer].[CustomerId]. The logic will be:

        Lookup a [CustomerId] WHERE [CustomerName]=[CustomerName] AND [SCDStartDate] <= [EffectiveDate] AND [EffectiveDate] <= [SCDEndDate]

    The conventional approach to this would be to use a fully cached lookup, but that isn’t an option here because we are using inequality conditions. The obvious next step then is to use a non-cached lookup, which enables us to change the SQL statement to use inequality operators. These are all synchronous components and the approach works just fine; however, it has the limitation that it has to issue a SQL statement against the lookup set for every row, so we can expect the execution time of our dataflow to increase linearly in line with the number of rows in it; that’s not good.

    OK, that’s the obvious method. Now consider a different way of achieving this using an asynchronous Merge Join transform coupled with a Conditional Split. In that dataflow there are more rows output from the Merge Join component than arrive on its input, because we are joining on [CustomerName] and, as we know, we have multiple records per [CustomerName] in our lookup set. Notice also that there are two asynchronous components involved (the Sort and the Merge Join). I have embedded a video below that compares the execution times of the two methods; it is just over 8 minutes long. View on Vimeo

    For those that can’t be bothered watching the video, I’ll tell you the results here: the dataflow that used the Lookup transform took 36 seconds, whereas the dataflow that used the Merge Join took less than two seconds. Pretty conclusive proof that in some scenarios it may be quicker to use an asynchronous component than a synchronous one. Your mileage may of course vary.

    The scenario outlined here is analogous to performance tuning procedural SQL that uses cursors. It is common to eliminate cursors by converting them to set-based operations, and that is effectively what we have done here. Our non-cached lookup performs a discrete operation for every single row of data, exactly like a cursor does. By eliminating this cursor-in-disguise we have dramatically sped up our dataflow.

    I hope all of that proves useful. You can download the package that I demonstrated in the video from my SkyDrive at http://cid-550f681dad532637.skydrive.live.com/self.aspx/Public/BlogShare/20100514/20100514%20Lookups%20and%20Merge%20Joins.zip

    Comments are welcome as always.

    @Jamiet
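
    As a sketch of the kind of parameterised statement that sits behind the non-cached Lookup in the first approach (assuming the [Customer] table has the columns described above; the first ? marker maps to the incoming [CustomerName] and the other two both map to [EffectiveDate]):

        SELECT CustomerId
        FROM   dbo.Customer
        WHERE  CustomerName = ?
          AND  SCDStartDate <= ?
          AND  ? <= SCDEndDate

    It is this per-row round trip that the Merge Join approach eliminates.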

    Read the article

  • Required Parameters [SSIS Denali]

    - by jamiet
    SQL Server Integration Services (SSIS) in its 2005 and 2008 incarnations expects you to set property values within your package at runtime using Configurations. SSIS developers tend to have rather a lot of issues with SSIS configurations; in this blog post I am going to highlight one of those problems and how it has been alleviated in SQL Server code-named Denali.

    A configuration is a property path/value pair that exists outside of a package, typically within SQL Server or in a collection of one or more configurations in a file called a .dtsConfig file. Within the package one defines a pointer to a configuration that says to the package “When you execute, go and get a configuration value from this location”, and if all goes well the package will fetch that configuration value as it starts to execute and you will see something like the following in your output log:

        Information: 0x40016041 at Package: The package is attempting to configure from the XML file "C:\Configs\MyConfig.dtsConfig".

    Unfortunately things DON’T always go well. Perhaps the .dtsConfig file is unreachable, or the name of the SQL Server holding the configuration value has been defined incorrectly – any one of a number of things can go wrong. In that circumstance you might see something like the following in your log output instead:

        Warning: 0x80012014 at Package: The configuration file "C:\Configs\MyConfig.dtsConfig" cannot be found. Check the directory and file name.

    The problem that I want to draw attention to here is that the package will ignore the fact that it can’t find the configuration and execute anyway. This is really, really bad because the package will not be doing what it is supposed to do and, worse, if you have not isolated your environments you might not even know about it. Can you imagine a package executing for months and all the while inserting data into the wrong server? Sounds ridiculous, but I have absolutely seen this happen, and the root cause was that no one picked up on configuration warnings like the one above.

    Happily, in SSIS code-named Denali this problem has gone away, as configurations have been replaced with parameters. Each parameter has a property called ‘Required’. Any parameter with Required=True must have a value passed to it when the package executes; any attempt to execute the package without one will result in an error, both when executing from the SSMS UI and when executing using T-SQL. The T-SQL error is:

        Msg 27184, Level 16, State 1, Procedure prepare_execution, Line 112
        In order to execute this package, you need to specify values for the required parameters.

    As you can see, SSIS code-named Denali has mechanisms built in to prevent the problem I described at the top of this blog post. Making a parameter Required means that any packages in that project cannot execute until a value for the parameter has been supplied. This is a very good thing. I am loathe to make recommendations so early in the development cycle, but right now I’m thinking that all Project Parameters should have Required=True; certainly any that are used to define external locations should be.

    @Jamiet
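
    For a flavour of what supplying such a value looks like from T-SQL (folder, project and parameter names below are hypothetical), the catalog stored procedures are used like this:

        DECLARE @execution_id BIGINT;

        EXEC SSISDB.catalog.create_execution
            @folder_name  = N'MyFolder',
            @project_name = N'MyProject',
            @package_name = N'Package.dtsx',
            @execution_id = @execution_id OUTPUT;

        -- Supply the required parameter (object_type 20 = project parameter, 30 = package parameter)
        EXEC SSISDB.catalog.set_execution_parameter_value
            @execution_id    = @execution_id,
            @object_type     = 20,
            @parameter_name  = N'OutputServerName',
            @parameter_value = N'MyServer';

        EXEC SSISDB.catalog.start_execution @execution_id;

    Omit the set_execution_parameter_value call for a Required parameter and start_execution fails with the error shown above.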

    Read the article

  • How to configure CruiseControl.Net for Windows Authentication?

    - by balu
    I am using CruiseControl.NET for continuous integration. Access to the dashboard currently goes through the login plugin, which authenticates and authorizes users after verifying them against a set of users saved as an XML file on the CruiseControl.NET server. Now I need to bring Windows Authentication into the system, whereby the CruiseControl.NET web dashboard, when accessed from a client machine (a local machine associated with a common server), would authenticate users and authorize access to CruiseControl.NET features based on the authority of the logged-in user. Kindly guide me in going ahead with this; I would appreciate any resources that would be helpful for achieving it. Thanks.

    Read the article

  • TFS and Sharepoint integration

    - by pho3nix
    I am using SharePoint 2007 and TFS 2010. I installed everything, including the reports and SharePoint integration, but when I try to create a Team Project Collection it always returns this warning:

        Server was unable to process request. ---> TF250029: No user was found within the context of a Web site. Verify that the site does not allow anonymous access.

    And in the SharePoint Web Applications window of the TFS admin console, when I check my SharePoint portal /sites it returns an error saying access is blocked by a firewall or the server extensions are not installed, but as far as I can tell everything is set up correctly. Can anyone help me, please?

    Read the article

  • Integration of Tomcat 7 with IIS 7

    - by priya
    After following all the steps for integrating Tomcat 7 with IIS 7, I am getting the error below. Any idea what might be the cause? The first time I did all the steps as described in the tutorial my site came up; then it suddenly stopped coming up. I removed my site from IIS and followed the steps again, but now this error comes up every time:

        HTTP Error 500.0 - Internal Server Error
        The page cannot be displayed because an internal server error has occurred.

        Detailed Error Information
        Module:        IsapiFilterModule
        Notification:  AuthenticateRequest
        Handler:       StaticFile
        Error Code:    0x80070001
        Physical Path: D:\New\IISROOT
        Logon Method:  Anonymous
        Logon User:    Anonymous
        Failed Request Tracing Log Directory: D:\New\Tomcat\logs

    I have checked the logs, but the trace log is not created and neither is isapi_redirect.log.

    Read the article

  • Changing LDAP schema renders Confluence AD integration inoperable

    - by Maxim V. Pavlov
    Our instance of Atlassian Confluence is configured to integrate with our Active Directory. In AD, all the users used to be created under the default Users folder in Active Directory Users and Computers. We have now decided to introduce cleaner separation and have created an organizational unit structure in AD: under the root we created a Managed OU, under it a Users OU, and all user accounts were moved into that Users OU. I thought that to let the Confluence AD integration engine "know" where to look for user accounts now, I only needed to adjust the Base DN and prepend it with ou=Managed, so that it looks for cn=Users under ou=Managed. That didn't work. How should I adjust the LDAP root in a client application so that it can look for users in an OU rather than in the default folder?
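
    One hypothetical illustration of the distinction involved (domain names invented): the default Users folder in AD is a container, not an organizational unit, so its RDN uses cn=, whereas the new structure uses ou= throughout. The old and new Base DNs would therefore look like:

        Old: CN=Users,DC=corp,DC=example,DC=com
        New: OU=Users,OU=Managed,DC=corp,DC=example,DC=com

    Prepending ou=Managed to a cn=Users base (giving cn=Users,ou=Managed,...) matches neither location, which would explain why the adjusted configuration finds no users.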

    Read the article
