Search Results

Search found 4276 results on 172 pages for 'integration'.

Page 17/172 | < Previous Page | 13 14 15 16 17 18 19 20 21 22 23 24  | Next Page >

  • SQL Server 2012 Integration Services - Using PowerShell to Configure Project Environments

    Continuing our discussion on how to leverage the capabilities of PowerShell to automate the most basic SSIS management tasks, this article will explore more complex topics by demonstrating the use of PowerShell in implementing and utilizing project environments.

    Read the article

  • Azure Blob and Entity Table Integration, extending the Thumbnail sample

    This article describes the concepts for doing CRUD (Create, Read, Update, Delete) operations on Windows Azure Tables and how table data can interact with Blobs.

    Read the article

  • Continuous integration for Ubuntu Phone?

    - by Klax
    Is there any kind of framework that lets a controlling PC automate the flashing of a connected phone, waiting for the phone to boot and then tell it to download and execute tests from a repository? I know about Autopilot for applications, but I'm more interested in CI of boot loaders, drivers and platform stuff. A related question: Is there a global repository of tests for Ubuntu Phone? Best regards

    Read the article

  • SQL Server 2012 Integration Services - Unattended Execution of SSIS Packages

    Quite often, tasks accomplished via SSIS are a part of procedures that run unattended, either scheduled to launch at a particular date and time or triggered by some arbitrarily chosen event. Marcin Policht shares a typical approach to implementing such a scenario.

    Read the article

  • UIDocumentInteractionController in iOS for Whatsapp Integration

    - by Smitha
    I was looking for a way to share my application's files through WhatsApp. I found we can chat using the custom URL scheme provided by WhatsApp: https://www.whatsapp.com/faq/iphone/23559013. But to share my app's files, I read that I have to use UIDocumentInteractionController. So, is there any sample available? Also, the Apple docs say, "Create an instance of the UIDocumentInteractionController class for each file you want to interact with" (https://developer.apple.com/library/ios/documentation/FileManagement/Conceptual/DocumentInteraction_TopicsForIOS/Articles/PreviewingandOpeningItems.html#//apple_ref/doc/uid/TP40010410-SW1). So is it necessary to create an instance for each file? I just want to share the files as and when I receive them through my service to my app. Thanks for the help in advance.

    Read the article

  • Twitter integration

    - by qaisjp
    My computer game is built with Love2D in Lua. There is dead space in the menu of my game and I'd like to fill it up with something, so I'd like to put a Twitter feed there. How can I receive all the Twitter posts created by AND mentioning @stickydestroyer, how can I make it look good, and how can I code the actual thing? I know I have to use some sort of cURL module, but how can I get the feed AND make it look nice?

    Read the article

  • Libreoffice theme integration not working (Xubuntu)

    - by Treepata
    I am running Xubuntu 11.10 and everything works nicely. I have selected a nice Xfce theme and everything looks beautiful, except Libreoffice. While other programmes (Gimp, Inkscape, Thunar etc.) integrate with the selected Appearance & Window Manager theme, Libreoffice looks like a Windows 98 programme. Does anyone know how to fix this? I have already tried this solution, without success: http://ubuntuforums.org/showthread.php?t=1584010&page=2 Thanks for your help!

    Read the article

  • How to manage end user documentation for a project under continuous integration?

    - by mcdon
    I have a project under continuous integration and would like to add end user documentation to the project. The end user documentation is a user manual, not API documentation. In our environment we use Windows, C#, MSBuild, CruiseControl.NET and Subversion. We are currently using DocToHelp to create our help file, which is based on an MS Word document. I'm looking for some guidance on how to manage the end user documentation. What documentation tools should I use? Should any of the documentation tools be part of the build script? Should the output files from the documentation tool be stored in Subversion? What type of help files would be best to use?
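
    As one illustrative way to fold the manual into the automated build (a sketch, not from the original question; the script name and paths below are hypothetical), the documentation step can be added as a plain MSBuild target that shells out to whatever tool produces the help file:

        <!-- Hypothetical target: run a doc-build script after the main build
             and copy the compiled help file into the build output. -->
        <Target Name="BuildUserManual" AfterTargets="Build">
          <Exec Command="build-docs.cmd"
                WorkingDirectory="$(MSBuildProjectDirectory)\docs" />
          <Copy SourceFiles="$(MSBuildProjectDirectory)\docs\UserManual.chm"
                DestinationFolder="$(OutputPath)" />
        </Target>

    Under this arrangement, only the source document lives in Subversion and the compiled help file is treated as a build artifact, which keeps the repository free of generated binaries.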

    Read the article

  • How to leverage Spring Integration in a real-world JMS distributed architecture?

    - by ngeek
    For the following scenario I am looking for your advice and tips on best practices. In a distributed (mainly Java-based) system with: many (different) client applications (web app, command-line tools, REST API); a central JMS message broker (currently in favor of using ActiveMQ); and multiple stand-alone processing nodes (running on multiple remote machines, computing expensive operations of different types as specified by the JMS message payload). How would one best apply the JMS support provided by the Spring Integration framework to decouple the clients from the worker nodes? When reading through the reference documentation and running some initial experiments, it looks like the configuration of a JMS inbound adapter inherently requires a subscriber, which in a decoupled scenario does not exist. Small side note: communication should happen via JMS text messages (using a JSON data structure for future extensibility).
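
    As a rough sketch of one decoupled arrangement (channel, queue, and bean names here are invented for illustration), point-to-point channel adapters avoid any subscriber requirement: clients drop work onto a shared queue, and each worker node simply competes for messages from it:

        <!-- Client side (sketch): send work items to a shared queue. -->
        <int:channel id="jobRequests"/>
        <jms:outbound-channel-adapter channel="jobRequests"
                                      destination-name="jobs.queue"/>

        <!-- Worker node side (sketch): consume from the same queue and hand
             each JSON text message to a processing bean. -->
        <int:channel id="jobsIn"/>
        <jms:message-driven-channel-adapter channel="jobsIn"
                                            destination-name="jobs.queue"/>
        <int:service-activator input-channel="jobsIn"
                               ref="jobProcessor" method="handle"/>

    Because the queue is point-to-point, neither side needs to know the other exists; adding worker nodes just adds competing consumers.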

    Read the article

  • Why hasn't anybody started a hosted continuous integration service?

    - by Teflon Ted
    There are a dozen services that provide hosted version control, hosted ticket tracking, hosted project management, and combinations of all of the above; there are even hosted web-based IDEs. But nobody has yet offered a hosted continuous integration service, at least not that I can find. The concept seems simple enough: I register and provide the URL to my source code repository, it grabs my code and builds it via ant/rake/whatever, then runs the suite of tests and some metrics (code coverage, performance, etc.). Is there some prohibitive barrier to entry I'm not considering?

    Read the article

  • Installing Hyper-V Integration Components on Linux

    - by Lance Fisher
    Some big news this week was that Microsoft released the source code of the Hyper-V integration components for Linux under the GPL v2. I just installed Ubuntu Server 9.04 in a Hyper-V VM with a Legacy Network Adapter. How do I install the integration components? Do I have to wait until they are included in the kernel?

    Read the article

  • And the Winners of Fusion Middleware Innovation Awards in Data Integration are…

    - by Irem Radzik
    At OpenWorld, we announced the winners of the Fusion Middleware Innovation Awards 2012. Raymond James and Morrison Supermarkets were selected in the data integration category for their innovative use of Oracle's data integration products and the great results they have achieved. In this blog I would like to briefly introduce you to these award-winning projects.

    Raymond James is a diversified financial services company which provides financial planning, wealth management, investment banking, and asset management. They are using Oracle GoldenGate and Oracle Data Integrator to feed their operational data store (ODS), which supports application services across the enterprise. A major requirement for their project was low data latency, as key decisions are made based on the data in the ODS. They were able to fulfill this requirement thanks to Oracle Data Integrator's integrated solution with Oracle GoldenGate. Oracle GoldenGate captures changed data from different systems, including Oracle Database, HP NonStop, and Microsoft SQL Server, into a single data store on SQL Server 2008, while Oracle Data Integrator provides data transformations for the ODS. Leveraging ODI's integration with GoldenGate, Raymond James now sees a 9-second median latency (from source commit to ODS target commit). The ODS solution delivers high-quality, accurate data for consuming applications such as Raymond James' next-generation client and portfolio management systems as well as real-time operational reporting, enabling timely information for making better decisions.

    There are more benefits Raymond James achieved with this implementation of Oracle's data integration solution. The software developers and architects of this solution, Tim Garrod and Ryan Fonnett, told us during their presentation at OpenWorld that they also reduced application complexity significantly while improving developer productivity through trusted operational services. They were able to use CDC to generate alerts for business users and for applications (for example, for cache hydration mechanisms). One cool innovation example among many in this project is that, using ODI's flexible architecture, Tim and Ryan could build 24/7 self-healing processes that fix errors themselves, and these processes have hardly ever failed. Pretty amazing, and a great solution for environments that need such reliability and availability.

    The other winner of this year in the data integration category, Morrison Supermarkets, is the UK's 4th largest grocery retailer. The company has been migrating all their legacy applications onto a new-world application set based on Oracle and consolidating all BI onto a single Oracle platform. The company recently implemented Oracle Exadata as the data warehouse engine and uses Oracle Business Intelligence EE. Their goal in deploying GoldenGate and ODI was to provide BI data to the enterprise in a way that also supports operational decision-making requirements from a wide range of Oracle-based ERP applications such as E-Business Suite, PeopleSoft, and Oracle Retail Suite. They use GoldenGate's log-based change data capture capabilities and Oracle Data Integrator to populate the Oracle Retail Data Model. The electronic point of sale (EPOS) integration solution they built processes over 80 million transactions per day at busy periods in near real time (15 minutes). It provides valuable insight to Retail and Commercial teams for both intra-day and historical trend analysis.

    As I mentioned in yesterday's blog, the right data integration platform can transform the business. Here is another example: the point-of-sale integration enabled the grocery chain to optimize its stock management, leading to another award. Morrisons won the Grocer 33 award in 2012, beating all other major UK supermarkets in product availability. Congratulations, Morrisons, on another award!

    Celebrating the innovation and the success of our customers with Oracle's data integration products was definitely a highlight of Oracle OpenWorld for me. I look forward to hearing more from Raymond James, Morrisons, and the other customers that presented their data integration projects at OpenWorld on how they are creating more value for their organizations.

    Read the article

  • Continuous Integration with 64-bit Sharepoint and TFS 2008?

    - by Hirvox
    I've set up a 64-bit TFS 2008 build server with Sharepoint, continuous integration and out-of-the-box MSTest. Unit tests for plain business logic classes run just fine and test results are published into TFS. However, any test that uses Sharepoint's API fails horribly, SPFarm.Local returning null and so on. Is there a way to fix this? The tests run fine in an otherwise identical 32-bit development environment (Windows Server 2008 under Hyper-V, Sharepoint patched up to June 2009 cumulative update) from both Visual Studio and command line, so the problem is not about improper use of SPContext.Current or any other part of the API that needs to be run in a web server context. I've ruled out permissions issues, because the build agent account can deploy the solution and create site collections just fine with stsadm. The next culprit could be that the unit tests were being run with a 32-bit process, which couldn't access the 64-bit Sharepoint API properly. I tried a workaround, but it has the side effect of disabling TFS support in MSTest. Do I have to wait for 2010 versions of MS tools (and hope for the best) or is there a third-party test framework available that runs natively in 64 bit and can publish test results into TFS 2008?

    Read the article

  • Can an internally developed, fast-evolving, agile, short-sprint web application lend itself to offshoring?

    - by Gavin Howden
    I have recently been set a target: within 12 months, achieve readiness to successfully manage and deliver results through the use of offshore teams on our mainline development project. Our mainline is a multi-thousand-user, highly available web application, plus various related SAAS components delivered through that web application. We work agile on the mainline with a rapid one-week sprint, using continuous integration. Our delivery platform is a bespoke PHP framework, although we have some .NET services and components in the mix. My view is that an offshore team could work if we either ship out an entire isolated project for offshore development, or specify a component of our system in huge detail up front. But we don't currently work like that; it would conflict with the in-house method, and unless the offshore team is working within our team, with our development/deployment chain, it could be an integration nightmare. So my question is: given that we have a closed-source bespoke framework (private IP) which we train our developers to use, and that we work agile, minimising documentation, maximising communication and responding to rapidly changing requirements, with much of the quality control done via team skills building and peer review, how can I make offshoring work on our mainline development?

    Read the article

  • Performance issues with JMS and Spring Integration. What is wrong with the following configuration?

    - by user358448
    I have a JMS producer which generates many messages per second; these are sent to an AMQ persistent queue and are consumed by a single consumer, which needs to process them sequentially. But it seems that the producer is much faster than the consumer, and I am having performance and memory problems. Messages are fetched very, very slowly, and the consuming seems to happen at intervals (the consumer "asks" for messages in a polling fashion, which is strange?!) Basically everything happens with Spring Integration. Here is the configuration on the producer side. First, stake messages come into stakesInMemoryChannel; from there they are filtered through the filteredStakesChannel, and from there they go into the JMS queue (using an executor so the sending happens in a separate thread):

        <bean id="stakesQueue" class="org.apache.activemq.command.ActiveMQQueue">
            <constructor-arg name="name" value="${jms.stakes.queue.name}" />
        </bean>

        <int:channel id="stakesInMemoryChannel" />

        <int:channel id="filteredStakesChannel">
            <int:dispatcher task-executor="taskExecutor"/>
        </int:channel>

        <bean id="stakeFilterService" class="cayetano.games.stake.StakeFilterService"/>

        <int:filter input-channel="stakesInMemoryChannel"
                    output-channel="filteredStakesChannel"
                    throw-exception-on-rejection="false"
                    expression="true"/>

        <jms:outbound-channel-adapter channel="filteredStakesChannel"
                                      destination="stakesQueue"
                                      delivery-persistent="true"
                                      explicit-qos-enabled="true" />

        <task:executor id="taskExecutor" pool-size="100" />

    The other application consumes the messages like this: the messages come into stakesInputChannel from the JMS stakesQueue, after which they are routed to two separate channels; one persists the message and the other does some other stuff, let's call it "processing".

        <bean id="stakesQueue" class="org.apache.activemq.command.ActiveMQQueue">
            <constructor-arg name="name" value="${jms.stakes.queue.name}" />
        </bean>

        <jms:message-driven-channel-adapter channel="stakesInputChannel"
                                            destination="stakesQueue"
                                            acknowledge="auto"
                                            concurrent-consumers="1"
                                            max-concurrent-consumers="1" />

        <int:publish-subscribe-channel id="stakesInputChannel" />

        <int:channel id="persistStakesChannel" />
        <int:channel id="processStakesChannel" />

        <int:recipient-list-router id="customRouter"
                                   input-channel="stakesInputChannel"
                                   timeout="3000"
                                   ignore-send-failures="true"
                                   apply-sequence="true">
            <int:recipient channel="persistStakesChannel"/>
            <int:recipient channel="processStakesChannel"/>
        </int:recipient-list-router>

        <bean id="prefetchPolicy" class="org.apache.activemq.ActiveMQPrefetchPolicy">
            <property name="queuePrefetch" value="${jms.broker.prefetch.policy}" />
        </bean>

        <bean id="connectionFactory" class="org.springframework.jms.connection.CachingConnectionFactory">
            <property name="targetConnectionFactory">
                <bean class="org.apache.activemq.ActiveMQConnectionFactory">
                    <property name="brokerURL" value="${jms.broker.url}" />
                    <property name="prefetchPolicy" ref="prefetchPolicy" />
                    <property name="optimizeAcknowledge" value="true" />
                    <property name="useAsyncSend" value="true" />
                </bean>
            </property>
            <property name="sessionCacheSize" value="10"/>
            <property name="cacheProducers" value="false"/>
        </bean>
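
    One knob worth checking in a setup like this (an illustrative suggestion, not part of the original question) is the ActiveMQ prefetch. If ${jms.broker.prefetch.policy} resolves to a very small value such as 0 or 1, the broker hands the single consumer one message at a time, which can look exactly like polling. A larger prefetch lets the broker stream messages ahead of the consumer while preserving sequential processing:

        <!-- Sketch: raise the queue prefetch so the single consumer is fed
             ahead of time; 1000 is ActiveMQ's default and only an example. -->
        <bean id="prefetchPolicy" class="org.apache.activemq.ActiveMQPrefetchPolicy">
            <property name="queuePrefetch" value="1000" />
        </bean>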

    Read the article

  • What's wrong with performing unit tests against concrete implementations if your frameworks are not going to change?

    - by palm snow
    First, a bit of background: we are re-architecting our product suite that was written 10 years ago and served its purpose. One thing that we cannot change is the database schema, as we have a 500+ client base using this system. Our DB schema has over 150 tables. We have decided on using Entity Framework 4.1 as the DAL and are still evaluating various frameworks for storing our business logic. I am investigating bringing unit testing into the mix, but I am also confused as to how far I need to go in setting up a full-blown TDD environment. One aspect of setting up unit testing is implementing the Repository and Unit of Work patterns, mocking frameworks, etc. This means there will be cost and investment in the code bloat associated with all these frameworks. I understand some of this could be auto-generated, but when it comes to things like behaviors, that will be mostly hand-written. Just to be clear, I am not questioning the importance of unit testing your code. I am just not sure we need all its components (like repositories, mocking, etc.) when we are fairly certain of the storage mechanism/framework (SQL Server/Entity Framework). All that code bloat with generic repositories makes sense when you need a generic layer with the ability to change it whenever you like; however, that is very likely a YAGNI in our case. What we need is more like integration testing, where we can test our code with concrete repository objects and test data in the database. In this scenario, just running integration tests seems to be more beneficial in our case. Any thoughts on whether I am missing anything here?

    Read the article

  • CI - How long is continuous?

    - by Andy
    We currently use CCNet as our continuous integration server. Most projects check for changes every 30 seconds (the default) and, if needed, perform a build (unit tests, StyleCop, FxCop, etc.). We've got quite a few projects now, and the server spends most of its time near 100% CPU utilization. This has alarmed some of the development team, even though the server is responsive and builds still take about the same length of time they always have. It's been suggested that we lower the check interval to about five minutes. To me that seems too long: we risk people committing code and then going home for the weekend, and now there's a broken build possibly holding up others. In response, the suggestion is that if someone needs to know the results they can force the build. But that seems to defeat the purpose of CI, as I thought it was supposed to be automated. My proposed solution is just to get another build server and split the builds amongst the servers. Am I thinking about this the wrong way, or is there a point where, if integration isn't often enough, you're not really doing CI anymore?
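
    For reference, the interval under discussion is a per-project setting in ccnet.config; here is a sketch with an invented project name, showing the proposed five-minute poll:

        <!-- Sketch: change the source-control poll from the 30-second default
             to five minutes for one project. -->
        <project name="ExampleProject">
          <triggers>
            <intervalTrigger name="continuous" seconds="300"
                             buildCondition="IfModificationExists" />
          </triggers>
        </project>

    With IfModificationExists, a longer interval never builds without changes; its only cost is the feedback latency between a commit and the build that catches it.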

    Read the article
