Search Results

Search found 4309 results on 173 pages for 'continuous integration'.

Page 18 of 173

  • Installing Hyper-V Integration Components on Linux

    - by Lance Fisher
    Some big news this week was that Microsoft released the source code for the Hyper-V integration components for Linux under the GPL v2. I just installed Ubuntu Server 9.04 in a Hyper-V VM with a Legacy Network Adapter. How do I install the integration components? Do I have to wait until they are included in the kernel?

    Read the article

  • And the Winners of Fusion Middleware Innovation Awards in Data Integration are…

    - by Irem Radzik
    At OpenWorld, we announced the winners of the Fusion Middleware Innovation Awards 2012. Raymond James and Morrison Supermarkets were selected in the data integration category for their innovative use of Oracle’s data integration products and the great results they have achieved. In this blog I would like to briefly introduce you to these award-winning projects. Raymond James is a diversified financial services company which provides financial planning, wealth management, investment banking, and asset management. They are using Oracle GoldenGate and Oracle Data Integrator to feed their operational data store (ODS), which supports application services across the enterprise. A major requirement for their project was low data latency, as key decisions are made based on the data in the ODS. They were able to fulfill this requirement thanks to Oracle Data Integrator’s integrated solution with Oracle GoldenGate. Oracle GoldenGate captures changed data from different systems, including Oracle Database, HP NonStop and Microsoft SQL Server, into a single data store on SQL Server 2008, and Oracle Data Integrator provides the data transformations for the ODS. Leveraging ODI’s integration with GoldenGate, Raymond James now sees a 9-second median latency (from source commit to ODS target commit). The ODS solution delivers high-quality, accurate data for consuming applications such as Raymond James’ next-generation client and portfolio management systems as well as real-time operational reporting, and it enables timely information for making better decisions. Raymond James achieved further benefits with this implementation of Oracle’s data integration solution. The software developers and architects of the solution, Tim Garrod and Ryan Fonnett, told us during their OpenWorld presentation that they also reduced application complexity significantly while improving developer productivity through trusted operational services. They were able to use CDC to generate alerts for business users and for applications (for example, for cache hydration mechanisms). One cool innovation among many in this project is that, using ODI's flexible architecture, Tim and Ryan could build 24/7 self-healing processes that have hardly ever failed: the integration processes fix errors themselves. Pretty amazing, and a great solution for environments that need such reliability and availability. (You can see Tim and Ryan’s photo with the Innovation Award above.) The other winner this year in the data integration category, Morrison Supermarkets, is the UK’s fourth-largest grocery retailer. The company has been migrating all of its legacy applications onto a new application set based on Oracle and consolidating all BI onto a single Oracle platform.
    The company recently implemented Oracle Exadata as the data warehouse engine and uses Oracle Business Intelligence EE. Their goal with deploying GoldenGate and ODI was to provide BI data to the enterprise in a way that also supports operational decision-making requirements from a wide range of Oracle-based ERP applications such as E-Business Suite, PeopleSoft and the Oracle Retail Suite. They use GoldenGate’s log-based change data capture capabilities and Oracle Data Integrator to populate the Oracle Retail Data Model. The electronic point of sale (EPOS) integration solution they built processes over 80 million transactions per day at busy periods in near real time (15 minutes). It provides valuable insight to the Retail and Commercial teams for both intra-day and historical trend analysis. As I mentioned in yesterday’s blog, the right data integration platform can transform the business. Here is another example: the point-of-sale integration enabled the grocery chain to optimize its stock management, leading to another award: Morrisons won the Grocer 33 award in 2012, beating all other major UK supermarkets in product availability. Congratulations, Morrisons, on another award! Celebrating the innovation and the success of our customers with Oracle’s data integration products was definitely a highlight of Oracle OpenWorld for me. I look forward to hearing more from Raymond James, Morrisons, and the other customers that presented their data integration projects at OpenWorld on how they are creating more value for their organizations.

    Read the article

  • Continuous Integration with 64-bit SharePoint and TFS 2008?

    - by Hirvox
    I've set up a 64-bit TFS 2008 build server with SharePoint, continuous integration and out-of-the-box MSTest. Unit tests for plain business logic classes run just fine and test results are published into TFS. However, any test that uses SharePoint's API fails horribly, with SPFarm.Local returning null and so on. Is there a way to fix this? The tests run fine in an otherwise identical 32-bit development environment (Windows Server 2008 under Hyper-V, SharePoint patched up to the June 2009 cumulative update) from both Visual Studio and the command line, so the problem is not improper use of SPContext.Current or any other part of the API that needs to run in a web server context. I've ruled out permissions issues, because the build agent account can deploy the solution and create site collections just fine with stsadm. The next likely culprit is that the unit tests are run in a 32-bit process, which cannot access the 64-bit SharePoint API properly. I tried a workaround, but it has the side effect of disabling TFS support in MSTest. Do I have to wait for the 2010 versions of the MS tools (and hope for the best), or is there a third-party test framework available that runs natively in 64-bit and can publish test results into TFS 2008?

    Read the article

  • Can an internally developed fast evolving, agile, short sprint web application lend itself to offshoring?

    - by Gavin Howden
    I have recently been set a target: within 12 months, achieve readiness to successfully manage and deliver results through the use of offshore teams on our mainline development project. Our mainline is a multi-thousand-user, highly available web application, plus various related SaaS components delivered through that web application. We work agile on the mainline with rapid one-week sprints and continuous integration. Our delivery platform is a bespoke PHP framework, although we have some .NET services and components in the mix. My view is that an offshore team could work if we either ship out an entire isolated project for offshore development, or specify a component of our system in huge detail up front. But we don't currently work like that, it would conflict with the in-house method, and unless the offshore team is working within our team and within our development/deployment chain, it could be an integration nightmare. So my question is: given that we have a closed-source bespoke framework (private IP) which we train our developers to use, and we work agile, minimising documentation, maximising communication and responding to rapidly changing requirements, and much of the quality control is via team skills building and peer review, how can I make off-shoring work on our mainline development?

    Read the article

  • Performance issues with JMS and Spring Integration. What is wrong with the following configuration?

    - by user358448
    I have a JMS producer which generates many messages per second; they are sent to a persistent ActiveMQ queue and consumed by a single consumer, which needs to process them sequentially. But it seems that the producer is much faster than the consumer, and I am having performance and memory problems. Messages are fetched very, very slowly, and the consuming seems to happen in intervals (the consumer "asks" for messages in a polling fashion, which is strange?!). Basically everything happens with Spring Integration. Here is the configuration on the producer side. Stake messages first come in on stakesInMemoryChannel; from there they are filtered through filteredStakesChannel, and from there they go into the JMS queue (using an executor so the sending happens in a separate thread):
      <bean id="stakesQueue" class="org.apache.activemq.command.ActiveMQQueue">
        <constructor-arg name="name" value="${jms.stakes.queue.name}" />
      </bean>
      <int:channel id="stakesInMemoryChannel" />
      <int:channel id="filteredStakesChannel">
        <int:dispatcher task-executor="taskExecutor"/>
      </int:channel>
      <bean id="stakeFilterService" class="cayetano.games.stake.StakeFilterService"/>
      <int:filter input-channel="stakesInMemoryChannel" output-channel="filteredStakesChannel" throw-exception-on-rejection="false" expression="true"/>
      <jms:outbound-channel-adapter channel="filteredStakesChannel" destination="stakesQueue" delivery-persistent="true" explicit-qos-enabled="true" />
      <task:executor id="taskExecutor" pool-size="100" />
    The other application consumes the messages like this: the messages come in on stakesInputChannel from the JMS stakesQueue, and after that they are routed to two separate channels, one that persists the message and one that does some other work, let's call it "processing":
      <bean id="stakesQueue" class="org.apache.activemq.command.ActiveMQQueue">
        <constructor-arg name="name" value="${jms.stakes.queue.name}" />
      </bean>
      <jms:message-driven-channel-adapter channel="stakesInputChannel" destination="stakesQueue" acknowledge="auto" concurrent-consumers="1" max-concurrent-consumers="1" />
      <int:publish-subscribe-channel id="stakesInputChannel" />
      <int:channel id="persistStakesChannel" />
      <int:channel id="processStakesChannel" />
      <int:recipient-list-router id="customRouter" input-channel="stakesInputChannel" timeout="3000" ignore-send-failures="true" apply-sequence="true">
        <int:recipient channel="persistStakesChannel"/>
        <int:recipient channel="processStakesChannel"/>
      </int:recipient-list-router>
      <bean id="prefetchPolicy" class="org.apache.activemq.ActiveMQPrefetchPolicy">
        <property name="queuePrefetch" value="${jms.broker.prefetch.policy}" />
      </bean>
      <bean id="connectionFactory" class="org.springframework.jms.connection.CachingConnectionFactory">
        <property name="targetConnectionFactory">
          <bean class="org.apache.activemq.ActiveMQConnectionFactory">
            <property name="brokerURL" value="${jms.broker.url}" />
            <property name="prefetchPolicy" ref="prefetchPolicy" />
            <property name="optimizeAcknowledge" value="true" />
            <property name="useAsyncSend" value="true" />
          </bean>
        </property>
        <property name="sessionCacheSize" value="10"/>
        <property name="cacheProducers" value="false"/>
      </bean>
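    One thing worth checking (an assumption on my part, since the actual value of ${jms.broker.prefetch.policy} is not shown above): if that property resolves to 0, an ActiveMQ consumer stops receiving pushed batches of messages and instead pulls them from the broker one at a time, which matches the polling behaviour described. A minimal sketch of the same bean with an explicitly larger prefetch:
      <!-- Sketch only: give the single consumer a larger prefetch so the broker streams
           messages to it instead of being polled for each one. 1000 is ActiveMQ's default
           queue prefetch and is just an illustrative starting point, not a recommendation. -->
      <bean id="prefetchPolicy" class="org.apache.activemq.ActiveMQPrefetchPolicy">
        <property name="queuePrefetch" value="1000" />
      </bean>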

    Read the article

  • What's wrong with performing unit tests against a concrete implementation if your frameworks are not going to change?

    - by palm snow
    First a bit of background: we are re-architecting our product suite, which was written 10 years ago and has served its purpose. One thing that we cannot change is the database schema, as we have a 500+ client base using this system. Our db schema has over 150 tables. We have decided on using Entity Framework 4.1 as the DAL and are still evaluating various frameworks for storing our business logic. I am investigating bringing unit testing into the mix, but I am also confused as to how far I need to go in setting up a full-blown TDD environment. One aspect of setting up unit testing is getting into implementing the Repository and Unit of Work patterns, mocking frameworks, etc. This means there will be cost and investment in the code bloat associated with all these frameworks. I understand some of this could be auto-generated, but when it comes to things like behaviors, that will be mostly hand-written. Just to be clear, I am not questioning the importance of unit testing your code. I am just not sure we need all its components (like repositories, mocking, etc.) when we are fairly certain of our storage mechanism/framework (SQL Server/Entity Framework). All that code bloat with generic repositories makes sense when you need a generic layer with the ability to change it whenever you like; however, it is very likely a YAGNI in our case. What we need is more of an integration test, where we can test our code with concrete repository objects and test data in the database. In this scenario, just running integration tests seems to be more beneficial in our case. Any thoughts on whether I am missing anything here?

    Read the article

  • Autofac WCF Integration Security Problem

    - by ecoffey
    I've created a WCF service to back an Ajax page (.NET 3.5). It's hosted in IIS 6.1 Integrated Pipeline. (The rest of Autofac is set up correctly for Web Forms integration.) Everything works fine and dandy with the normal WCF pipeline. However, when I plug in the Autofac WCF integration (as per the Autofac wiki) I get this delightful exception:
      [SecurityException: That assembly does not allow partially trusted callers.]
      Autofac.Integration.Wcf.AutofacHostFactory.CreateServiceHost(String constructorString, Uri[] baseAddresses) in c:\Working\Autofac\src\Source\Autofac.Integration.Wcf\AutofacHostFactory.cs:78
      System.ServiceModel.HostingManager.CreateService(String normalizedVirtualPath) +604
      System.ServiceModel.HostingManager.ActivateService(String normalizedVirtualPath) +46
      System.ServiceModel.HostingManager.EnsureServiceAvailable(String normalizedVirtualPath) +654
    My Google-fu has failed me on finding a solution to this problem. Any insights or workarounds would be appreciated.
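    A common first workaround for "does not allow partially trusted callers" errors in ASP.NET-hosted services is to run the application under full trust, so that the Autofac.Integration.Wcf assembly is never invoked from a partially trusted context. This is only a sketch of that idea, assuming the hosting environment allows the trust level to be changed; it is not confirmed as the fix for this particular case:
      <!-- web.config sketch (assumption): raise the application trust level to Full so that
           assemblies without AllowPartiallyTrustedCallers can be called. Check your security
           policy before doing this on a shared or hardened server. -->
      <configuration>
        <system.web>
          <trust level="Full" />
        </system.web>
      </configuration>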

    Read the article

  • Cloud Apps and Single Sign-On (AD integration)

    - by Pablo Alvim
    I've been investigating some cloud vendors and the ability to implement single sign-on with them, especially when it comes to AD (Active Directory) integration. So far I've learned that with Azure this is possible through ADFS and the AppFabric Access Control offering. In AWS, since it is possible to create a VPN and see EC2 instances as a natural extension of a private datacenter, I believe implementing SSO would be rather simple (not sure if I'm right on this one... please correct me if I'm wrong). With App Engine, though, even though there is some documentation on AD synchronization (not full integration) for Google Apps, I'm struggling to find out whether AD integration would be possible... Is there any strategy for that? Any bit of information on cloud apps and AD integration will be appreciated!

    Read the article

  • Oracle Financial Analytics for SAP Certified with Oracle Data Integrator EE

    - by denis.gray
    Two days ago Oracle announced the release of Oracle Financial Analytics for SAP. With the amount of press this has garnered in the past two days, there's a key detail that can't be missed: this release is certified with Oracle Data Integrator EE, making the combination of data integration and business intelligence a force to contend with. Within the Oracle press release there were two important bullets:
      • Oracle Financial Analytics for SAP includes a pre-packaged, ABAP-code-compliant adapter and is certified with Oracle Data Integrator Enterprise Edition to integrate SAP Financial Accounting data directly with the analytic application.
      • Helping to integrate SAP financial data and disparate third-party data sources is Oracle Data Integrator Enterprise Edition, which delivers fast, efficient loading and transformation of timely data into a data warehouse environment through its high-performance Extract, Load and Transform (E-LT) technology.
    This is very exciting news, demonstrating Oracle's overall commitment to Oracle Data Integrator EE. This is a great way to start off the new year, and we look forward to building on this momentum throughout 2011. The following links contain additional information and media responses about the Oracle Financial Analytics for SAP release.
    IDG News Service (also appeared in PC World, Computerworld, CIO): "Oracle is moving further into rival SAP's turf with Oracle Financial Analytics for SAP, a new BI (business intelligence) application that can crunch ERP (enterprise resource planning) system financial data for insights."
    InformationWeek: "Oracle talks a good game about the appeal of an optimized, all-Oracle stack. But the company also recognizes that we live in a predominantly heterogeneous IT world."
    CRN: "While some businesses with SAP Financial Accounting already use Oracle BI, those integrations had to be custom developed. The new offering provides pre-built integration capabilities."
    ECRM Guide: "Among other features, Oracle Financial Analytics for SAP helps front-line managers improve financial performance and decision-making with what the company says is comprehensive, timely and role-based information on their departments' expenses and revenue contributions."
    SAP Getting Started Guide for ODI on OTN: http://www.oracle.com/technetwork/middleware/data-integrator/learnmore/index.html
    For more information on ODI and its SAP connectivity, please review the Oracle® Fusion Middleware Application Adapters Guide for Oracle Data Integrator 11g Release 1 (11.1.1).

    Read the article

  • AutoVue at the Oracle Asset Lifecycle Management Summit

    - by celine.beck
    I recently had the opportunity to attend and present the integration between AutoVue and Primavera P6 during the Oracle ALM Summit, which was held in March at Redwood Shores, on Oracle headquarters grounds. The ALM Summit brought together over 300 Oracle maintenance practitioners who endured the foggy and rainy San Francisco weather to attend the 4th edition of this Oracle-driven conference. Attendees have roles in maintenance management and IT. Following a general session, Ralph Rio from ARC Advisory Group provided a very interesting keynote discussing Asset Management directions, both in the short and long run. An interesting point that Ralph raised is that most organizations have done a good job of improving performance in the design/build, operate-and-maintain, and portfolio management phases by leveraging solutions like Asset Lifecycle Management and Project & Portfolio Management; however, there seems to be room for improvement in between those phases, when information flows from one group to the other, during the data handover phase or when the time comes to update or modify drawings to reflect the reality of physical assets. This is where AutoVue comes into play. By integrating with enterprise applications like content management systems, asset lifecycle management applications and project management solutions, AutoVue can be a real process enabler, streamlining information flows from concept/design to decommissioning and ensuring that all project stakeholders have access to asset information and engineering data throughout the asset lifecycle. AutoVue's built-in digital annotation capabilities allow maintenance workers and technicians to report changes in configuration and visually capture the delta between as-built and as-maintained versions of asset documents. This information can then be easily handed over to engineers, who can identify the changes and incorporate these modifications into the drawings during the next round of document revisions. PPL Power Generation, an electric utility headquartered in Allentown, Pennsylvania, discussed this usage of AutoVue during an interesting webcast about AutoVue's role in the utilities space. After the keynote sessions, participants broke off into product-centric tracks around Oracle's Asset Lifecycle Management solutions (E-Business Suite, PeopleSoft, and JD Edwards). The second day of the conference was the occasion for us to present the integration between AutoVue and Primavera P6 to the Maintenance Summit audience. The presentation was a great success and generated much discussion with partners and customers during breaks. People seemed highly interested in learning more about our plans for integrating AutoVue and Primavera P6 with Oracle's ALM solutions... stay tuned for further information on the subject!

    Read the article

  • What guidelines are best suited for leveraging automatic deployments?

    - by Scott
    We are hoping to leverage a static code analysis tool (Sonar) as part of our continuous integration server, and we would like to determine some useful guidelines to serve as a baseline for allowing the deployment to continue. What conditions should we make mandatory before allowing a build to proceed to the next set of testing? The obvious answers include that it compiles and that the unit tests pass. But what other things should we require before allowing a build to proceed rather than be rolled back?

    Read the article

  • SSIS Dashboard 0.5.2 and Live Demo Website

    - by Davide Mauri
    In the last days I’ve worked again on the SQL Server Integration Service Dashboard and I did some updates: Beta Added support for "*" wildcard in project names. Now you can filter a specific project name using an url like: http://<yourserver>/project/MyPro* Added initial support for Package Execution History. Just click on a package name and you'll see its latest 15 executions and I’ve also created a live demo website for all those who want to give it a try before downloading and using it: http://ssis-dashboard.azurewebsites.net/

    Read the article

  • Communicator Messages not being saved to Outlook due to Outlook Integration error

    - by Mark Rogers
    For the most part my Office Communicator appears to be configured correctly. I can log in to my work account and see the work contact list. Outlook is working perfectly; it had a weird profile problem initially, but that was fixed. Unfortunately, even though I have enabled the setting that says "Save my instant message conversations in the Outlook Conversation History folder", my conversations have stopped saving to the Outlook Conversation History folder. I also have a yellow warning message on top of the server icon next to the status field. When I hover over or click on the message, it says there is an Outlook Integration Error. The administrator is having trouble figuring out what is causing it. What can cause Outlook Integration Errors in Communicator, and how do I go about troubleshooting them?

    Read the article

  • Tuesday at Oracle OpenWorld 2012 - Must See Session: “Oracle Fusion Applications: Best Practices in Integration Design Patterns”

    - by Lionel Dubreuil
    Don’t miss the “CON8685 - Oracle Fusion Applications: Best Practices in Integration Design Patterns” session:
    Speakers: Rajesh Raheja - Senior Director, Development, Oracle; Ravi Sankaran - Director, Applications Development, Oracle
    Date: Tuesday, Oct 2
    Time: 1:15 PM - 2:15 PM
    Location: Palace Hotel - Telegraph
    Oracle Fusion Applications provide various ways to integrate their functional capabilities with other Oracle applications as well as third-party and legacy applications. In this session, you will learn the patterns used when communicating with Oracle Fusion Applications using a SOA approach. It addresses how to identify the integration artifacts (also known as assets) available in Oracle Enterprise Repository, how to invoke synchronous and asynchronous Web services, how to import and export bulk data, and which integration issues to look out for. The patterns are applicable to on-premises and SaaS/cloud deployment modes and are indicated as such. The objectives of this session are to highlight the various ways to integrate with Oracle Fusion Applications, showcase the use of Oracle Fusion Middleware technologies for integration, and describe best practices and design patterns for integration.

    Read the article

  • Getting MSDeploy working on our build/integration server - Is an MSBuild upgrade necessary?

    - by Jeff D
    We have what I think is a fairly standard build process:
    1. Developer: checks in code.
    2. Build server: polls the repo, sees the change, and kicks off a build.
    3. Build: updates from the repo, builds with MSBuild, runs unit tests with NUnit.
    4. Build: creates an installer package.
    Our security team allows us to pull from the build server, but does not allow the build server to push. So we generally RDP in, download the installers, and run them, which rules out the slick deployment services, so I would need to generate packages instead. I'd like to use MSDeploy, except that we have the following issues: We're on .NET 3.5, and the MSBuild target (Package) that uses MSDeploy requires 4.0. Is there anything I'd need to install other than the .NET 4.0 RC for this? (Would MSBuild be part of that upgrade?) When I generate packages with MSDeploy, I see that I don't have just one file: there's a zip, deploy.cmd, SourceManifest.xml, and SetParameters.xml. What are all the other files for, and why wouldn't they all be in the 'package'? It sounds as if you can create packages by telling the system to look at a working IIS site, but if the packages are built from a CI environment, aren't you basically out of luck there? It feels like they designed some of this for small-scale developers deploying from their dev environment. That's a fine use case, but I'm interested to see what everyone's enterprise experience is with the tool. Any suggestions?
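    For reference, here is a minimal sketch of driving the web publishing Package target from a standalone MSBuild project file on a build server, which is one way to produce the MSDeploy package (the zip plus its deploy.cmd and SetParameters.xml companions) without pointing at a live IIS site. It assumes the .NET 4.0 web publishing targets are installed on the build machine; the project and output names below are hypothetical:
      <!-- package.proj (sketch): ask MSBuild to run the web project's Package target so the
           build produces the MSDeploy zip and its companion deploy.cmd/SetParameters.xml files. -->
      <Project DefaultTargets="CreatePackage" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <Target Name="CreatePackage">
          <MSBuild Projects="MyWebApp.csproj"
                   Targets="Package"
                   Properties="Configuration=Release;PackageLocation=artifacts\MyWebApp.zip" />
        </Target>
      </Project>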

    Read the article

  • Does ActiveCollab subversion integration work with subversion over ssh?

    - by executor21
    I'm trying to set up a repository in an ActiveCollab project. During setup, it reports that the connection tests successfully. However, when I try to actually update the repository, I get the following message: "Could not obtain the highest revision number for the given repository." If I try to browse the repository, the following error comes up: Fatal error: Call to a member function getRevision() on a non-object in /u/sites/activecollab/webroot/shared/activecollab/activecollab/application/modules/source/controllers/RepositoryController.class.php on line 357. Is this because I am accessing the repository via svn+ssh rather than http? Or did something happen on the ActiveCollab end? The repository is accessed fine via other means -- only ActiveCollab has the problem.

    Read the article

  • Bulletproof way to DROP and CREATE a database under Continuous Integration.

    - by H. Abraham Chavez
    I am attempting to drop and recreate a database from my CI setup, but I'm finding it difficult to automate the dropping and creation of the database, which is to be expected given the complexities of the db being in use. Sometimes the process hangs, errors out with "db is currently in use", or just takes too long. I don't care if the db is in use; I want to kill it and create it again. Does someone have a straight-shot method to do this? Alternatively, does anyone have experience dropping all objects in the db instead of dropping the db itself?
      USE master
      --Create a database
      IF EXISTS(SELECT name FROM sys.databases WHERE name = 'mydb')
      BEGIN
          ALTER DATABASE mydb SET SINGLE_USER --or RESTRICTED_USER
          --WITH ROLLBACK IMMEDIATE
          DROP DATABASE mydb
      END
      CREATE DATABASE mydb;

    Read the article

  • Typical SVN repo structure seems to be sub-optimal for continuous integration...

    - by Dave
    I've set up our SVN repository as the Subversion book suggests, and this is also how my previous companies have done it. It looks something like this:
      /trunk
      /branches
      /tags
      /extlibs
      /docs
    where the first three are pretty obvious, and extlibs is for 3rd-party assemblies that we wouldn't typically recompile ourselves. All of this works great for the daily development work. Now I've installed TeamCity and have builds, unit tests, code coverage, and code analysis running. Everything is great, except that this structure results in too much code getting downloaded. So here's the catch-22, in my opinion: it's silly to download all of the aforementioned folders from the SVN repo when I only need /trunk and /extlibs, but I can only specify one repo folder to download in the TeamCity VCS settings. The other possibility is to put the /extlibs folder into /trunk, but in order to compile branches, /extlibs would have to go into all of those as well (since I usually branch the trunk, and not individual subfolders)... and this would seem infinitely more evil, since /extlibs could actually be larger than /trunk and /branches, with all of the binaries stored there. Do you guys have any suggestions for me? Thanks!

    Read the article

  • How to pass an integration property to a batch file with CruiseControl.NET?

    - by TridenT
    In the build log of my project, I can see these properties:
      <integrationProperties>
        <CCNetProject>Gdet_T</CCNetProject>
        ...
        <LastChangeNumber>0</LastChangeNumber>
        <LastIntegrationStatus>Success</LastIntegrationStatus>
        <LastSuccessfulIntegrationLabel>25</LastSuccessfulIntegrationLabel>
        <LastModificationDate>4/6/2010 1:29:04 PM</LastModificationDate>
        <LastChangeNumber>10841</LastChangeNumber>
      </integrationProperties>
    I want to pass the CCNetProject and LastChangeNumber properties to a batch file. It works well with CCNetProject, as it can be used in the batch as the environment variable %CCNetProject%. But it doesn't work with the other properties (those not starting with the CCNet prefix), such as LastChangeNumber or LastModificationDate. I tried to pass one as a build argument, but it fails:
      <exec>
        <executable>$(WorkingFolderBase)\MyBatch.bat</executable>
        <baseDirectory>$(WorkingFolderBase)\</baseDirectory>
        <buildArgs>$(LastModificationDate)</buildArgs>
      </exec>
    I also tried to pass it as an environment variable, but that fails too:
      <exec>
        <executable>$(WorkingFolderBase)\MyBatch.bat</executable>
        <baseDirectory>$(WorkingFolderBase)\</baseDirectory>
        <environment>
          <variable>
            <name>svn_label</name>
            <value>"${LastModificationDate}"</value>
          </variable>
        </environment>
      </exec>
    The result is always the same when I display the parameter or variable: an empty string, or the variable name $(svn_label). I'm sure it is simple, but... I can't find it! Any ideas?

    Read the article

  • Is there a pre-made Continuous Integration solution for .NET applications?

    - by Brett Rigby
    From my perspective, we're constructing our own 'flavour' of NAnt/Ivy/CruiseControl.NET in-house, and I can't help but get the feeling that other dev shops are doing exactly the same work and then finding the same problems and pitfalls. I'm not complaining about NAnt, Ivy or CruiseControl at all, as they've been brilliant in helping our team of developers become more sure of the quality of their code, but it just seems strange that these tools are very popular, yet we're all re-inventing the CI wheel. Is there a pre-made solution for building .NET applications using the tools mentioned above?

    Read the article

  • Want a headless build server for SSDT without installing Visual Studio? You’re out of luck!

    - by jamiet
    An issue that regularly seems to rear its head on my travels is that of headless build servers for SSDT. What does that mean exactly? Let me give you my interpretation of it. A SQL Server Data Tools (SSDT) project incorporates a build process that will basically parse all of the files within the project and spit out a .dacpac file. Where an organisation employs a Continuous Integration process they will likely want to automate the building of that dacpac whenever someone commits a change to the source control repository. In order to do that the organisation will use a build server (e.g. TFS, TeamCity, Jenkins) and hence that build server requires all the pre-requisite software that understands how to build an SSDT project. The simplest way to install all of those pre-requisites is to install SSDT itself, however a lot of folks don't like that approach because it installs a lot of unnecessary components, not least Visual Studio itself. Those folks (of which I am one) are of the opinion that it should be unnecessary to install a heavyweight GUI in order to simply get a few software components required to do something that inherently doesn't even need a GUI. The phrase "headless build server" is often used to describe a build server that doesn't contain any heavyweight GUI tools such as Visual Studio, and it is a desirable state for a build server. In his blog post Headless MSBuild Support for SSDT (*.sqlproj) Projects, Gert Drapers outlines the steps necessary to obtain a headless build server for SSDT: "This article describes how to install the required components to build and publish SQL Server Data Tools projects (*.sqlproj) using MSBuild without installing the full SQL Server Data Tool hosted inside the Visual Studio IDE." http://sqlproj.com/index.php/2012/03/headless-msbuild-support-for-ssdt-sqlproj-projects/ Frankly, however, going through these steps is a royal PITA, and folks like myself have longed for Microsoft to support headless builds for SSDT by providing a distributable installer that installs only the pre-requisites for building SSDT projects. Yesterday, in the MSDN forum thread Building a VS2013 headless build server - it's sooo hard, Mike Hingley complained about this very thing, and it prompted a response from Kevin Cunnane of the SSDT product team: "The official recommendation from the TFS / Visual Studio team is to install the version of Visual Studio you use on the build machine." I, like many others, would rather not have to install full-blown Visual Studio, and so I asked: is there any chance you'll ever support any of these scenarios?
    1. Installation of all build/deploy pre-requisites without installing the VS shell?
    2. TFS shipping with all of the pre-requisites for doing SSDT project build/deploys?
    3. 3rd-party build servers (e.g. TeamCity) shipping with all of the pre-requisites for doing SSDT project build/deploys?
    I have to say that the lack of a single installer containing all the pre-requisites for SSDT build/deploy puzzles me. Surely the DacFX installer would be a perfect vehicle for that? Kevin replied again: "The answer is no for all 3 scenarios. We looked into this issue, discussed it with the Visual Studio / TFS team, and in the end agreed to go with their latest guidance which is to install Visual Studio (e.g. VS2013 Express for Web) on the build machine. This is how Visual Studio Online is doing it and it's the approach recommended for customers setting up their own TFS build servers.
    I would hope this is compatible with 3rd party build servers but have not verified whether this works with TeamCity etc. Note that DacFx MSI isn't a suitable release vehicle for this as we don't want to include Visual Studio/MSBuild dependencies in that package. It's meant to just include the core DacFx DLLs used by SSMS, SqlPackage.exe on the command line, etc. What this means is we won't be providing a separate MSI installer or nuget package with just the necessary build DLLs you need to run your build and tests. If someone wanted to create a script that generated a nuget package based on our DLLs and targets files, then release that somewhere on the web for easier integration with 3rd party build servers we've no problem with that." Again, here's the link to the thread, and it's worth reading in its entirety if this is something that interests you. So there you have it. Microsoft will not be providing support for headless build servers for SSDT, but if someone in the community wants to go ahead and roll their own, go right ahead. @Jamiet
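    For anyone unfamiliar with what "building an SSDT project using MSBuild" actually involves, here is a minimal sketch of the kind of invocation a build server performs. The project and configuration names are hypothetical, and it only works once the SSDT build targets are present on the machine, which is precisely the pre-requisite problem discussed above:
      <!-- build.proj (sketch): build a .sqlproj so MSBuild emits the .dacpac. This assumes the
           SSDT build targets are installed on the build server, e.g. via a Visual Studio install. -->
      <Project DefaultTargets="BuildDacpac" xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
        <Target Name="BuildDacpac">
          <MSBuild Projects="MyDatabase.sqlproj" Targets="Build" Properties="Configuration=Release" />
        </Target>
      </Project>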

    Read the article

  • CRM@Oracle Series: Showcasing Innovation with Oracle Customer Hub

    - by tony.berk
    When is having too many customers a challenge? It is not something too many people would complain about. But from a data perspective, one challenge is to keep each customer's data consistent across multiple enterprise systems such as CRM, ERP, and all of your other related applications. Buckle your seat belts, we are getting a bit technical today... If you have ever tried it, you know it isn't easy. If you haven't, don't go there alone! Customer data integration projects are challenging and, depending on the environment, require sharp, innovative people to succeed. Want to hear from some guys who have done it and succeeded? Here is an interview with Dan Lanir and Afzal Asif from Oracle's Applications IT CRM Systems group on implementing Oracle Customer Hub and on innovation. For more interesting discussions on innovation, check out the Oracle Innovation Showcase.

    Read the article
