Search Results

Search found 71449 results on 2858 pages for 'oracle data integration'.


  • Continuous Integration or Publishing System

    - by Chris M
    I've been testing Hudson for a few days now, and for PHP projects it seems OK. What I need, though, is a system that will let me publish an SVN tag to an FTP folder without adding a pile of rubbish on the end; Hudson gets overexcited and adds a pile of folders to the export. Are there any other decent systems? It needs to be simple to use, as it will eventually fall under the control of 'admins' in order to meet SOX compliance (i.e. the man with the gun can't pull the trigger). Basic requirements:
    - Must be free and downloadable to a server (FTPs are internal-only here)
    - Needs to be able to interact with SVN
    - Needs to be able to publish to FTP
    - Needs matrix login permissions (or AD, if it's something that can run on IIS)
    - Needs to be auditable (logging)
    Tell me if I'm not being clear enough here; thanks. Chris

    Read the article

  • Strange error inserting new relationship data in core-data

    - by michael
    My app will allow users to create a personalised list of events from a large list of events. I have a table view which simply displays these events; tapping on one of them takes the user to the event details view, which has a button "add to my events". In this detailed view I own the original event object, retrieved via an NSFetchedResultsController and passed to the detailed view (via a table cell, the same as the Core Data Recipes sample). I have no trouble retrieving/displaying information from this "event". I am then trying to add it to the list of MyEvents, represented by a one-to-many (inverse) relationship. This code:

        NSManagedObjectContext *context = [event managedObjectContext];
        MyEvents *myEvents = (MyEvents *)[NSEntityDescription insertNewObjectForEntityForName:@"MyEvents" inManagedObjectContext:context];
        [myEvents addEventObject:event]; // ERROR

    and this code (suggested below):

        // would this add to or overwrite the "list" I am attempting to maintain?
        NSManagedObjectContext *context = [event managedObjectContext];
        MyEvents *myEvents = (MyEvents *)[NSEntityDescription insertNewObjectForEntityForName:@"MyEvents" inManagedObjectContext:context];
        NSMutableSet *myEvent = [myEvents mutableSetValueForKey:@"event"];
        [myEvent addObject:event]; // ERROR

    both produce, at the line indicated by // ERROR:

        *** -[NSComparisonPredicate evaluateWithObject:]: message sent to deallocated instance

    It seems I may have missed something fundamental, and I can't glean any more information through the use of debugging tools, with my knowledge of them.
    1) Is this a valid way to compile and store an editable list like this?
    2) Is there a better way?
    3) What could possibly be the deallocated instance in the error?
    -- I have now modified the Event entity to have a to-many relationship called "myEvents" which refers to itself. I can add Events to this fine, and logging the object shows the correct memory addresses appearing for the relationship after [event addMyEventObject:event];. The same failure happens right after this, however. I am still at a loss to understand what is going wrong. This is the backtrace:

        #0 0x01f753a7 in ___forwarding___ ()
        #1 0x01f516c2 in __forwarding_prep_0___ ()
        #2 0x01c5aa8f in -[NSFetchedResultsController(PrivateMethods) _preprocessUpdatedObjects:insertsInfo:deletesInfo:updatesInfo:sectionsWithDeletes:newSectionNames:treatAsRefreshes:] ()
        #3 0x01c5d63b in -[NSFetchedResultsController(PrivateMethods) _managedObjectContextDidChange:] ()
        #4 0x0002e63a in _nsnote_callback ()
        #5 0x01f40005 in _CFXNotificationPostNotification ()
        #6 0x0002bef0 in -[NSNotificationCenter postNotificationName:object:userInfo:] ()
        #7 0x01bbe17d in -[NSManagedObjectContext(_NSInternalNotificationHandling) _postObjectsDidChangeNotificationWithUserInfo:] ()
        #8 0x01c1d763 in -[NSManagedObjectContext(_NSInternalChangeProcessing) _createAndPostChangeNotification:withDeletions:withUpdates:withRefreshes:] ()
        #9 0x01ba25ea in -[NSManagedObjectContext(_NSInternalChangeProcessing) _processRecentChanges:] ()
        #10 0x01bdfb3a in -[NSManagedObjectContext processPendingChanges] ()
        #11 0x01bd0957 in _performRunLoopAction ()
        #12 0x01f4d252 in __CFRunLoopDoObservers ()
        #13 0x01f4c65f in CFRunLoopRunSpecific ()
        #14 0x01f4bc48 in CFRunLoopRunInMode ()
        #15 0x0273878d in GSEventRunModal ()
        #16 0x02738852 in GSEventRun ()
        #17 0x002ba003 in UIApplicationMain ()

    Cheers.

    Read the article

  • symfony/zend integration - blank screen

    - by user142176
    Hi, I need to use ZendAMF on a symfony project and I'm currently working on integrating the two. I have a frontend app with two modules, one of which is 'gateway' - the AMF gateway. In my frontend app config, I have the following in the configure function:

        // load symfony autoloading first
        parent::initialize();
        // Integrate Zend Framework
        require_once('[MY PATH TO ZEND]\Loader.php');
        spl_autoload_register(array('Zend_Loader', 'autoload'));

    The executeIndex function in the gateway's actions.class.php looks like this:

        // No Layout
        $this->setLayout(false);
        // Set MIME Type
        $this->getResponse()->setContentType('application/x-amf; charset='.sfConfig::get('sf_charset'));
        // Disable because this is a non-HTML page
        sfConfig::set('sf_web_debug', false);
        // Create AMF Server
        $server = new Zend_Amf_Server();
        $server->setClass('MYCLASS');
        echo $server->handle();
        return sfView::NONE;

    Now when I try to visit the URL for the gateway module, or even the other module, which was working perfectly fine until this attempt, I only see a blank screen, with not even the symfony dev bar loaded. Oddly enough, my symfony logs are not being updated either, which suggests that symfony is not even being 'reached'. So presumably the error has something to do with Zend, but I have no idea how to figure out what the error could be. One thing I do know for sure is that this is not a file path error, because if I change the path in the following line (a part of frontendConfiguration, as shown above), I get a Zend_Amf_Server not found error. So the path must be correct. Also, if I comment out this very same line, the second module returns to normal, and my gateway broadcasts a blank x-amf stream:

        spl_autoload_register(array('Zend_Loader', 'autoload'));

    Does anyone have any tips on how I could attack this problem? Thanks. P.S. I'm currently running an older version of Zend, which is why I am using Zend_Loader instead of Zend_autoLoader (I think). But I've tried switching to the new lib and the error still remains, so it's not a version problem either.

    Read the article

  • Build Pipelining and Continuous Integration with Maven and Hudson

    - by Brandon
    My team is currently considering splitting our single CI build process into a more streamlined multi-stage process, to speed up basic build feedback and isolate different CI concerns. The idea we had was to have each stage exist in Hudson as a different build with the correct Maven goal or Maven plugin execution, then chain them together using the post-build hooks of Hudson. However, to my knowledge Maven mandates that executing any lifecycle phase automatically runs every preceding lifecycle phase. This presents a number of problems, the most significant of which is that Maven recreates the build resources with each distinct call rather than reusing those of the previous stage. This not only breaks the consistency of the build lifecycle but adds a lot of unnecessary processing overhead. Is there a way to accomplish pipelining with CI using Maven? Assuming there is, is there a way to let Hudson know to use the resources built in the previous stage in the next one?

    Read the article

  • Continuous Integration of Git on Windows

    - by Duncan
    Hey all, assuming I'm running a small shop (3 devs) and using a Windows 7 machine as a centralised Git and IIS server, what is the easiest way to get CI up and running? This must be locally hosted CI (no GitHub, no remote servers). I'm doing C#/.NET development with Visual Studio 2008. Any help on getting this running with the minimum of effort and the nicest possible UI would be extremely helpful. Thanks!

    Read the article

  • Continuous integration with multiple branch development

    - by ryanprayogo
    In the project that I'm working on, we are using SVN with a 'stable trunk' strategy. What that means is that for each bug that is found, QA opens a bug ticket and assigns it to a developer. The developer fixes that bug and checks the fix in on a branch off trunk (let's call this the bug branch); that branch will only contain fixes for that particular bug ticket. When we decide to do a release, for each bug fix that we want to ship to the customer, a developer merges the fixes from the relevant bug branches to trunk and we proceed with the normal QA cycle. The problem is that we use trunk as the codebase for our CI job (Hudson, specifically); therefore commits to the bug branches miss the daily build until they get merged to trunk when we decide to release the new version of the software. Obviously, that defeats the purpose of having CI. What is the proper way to fix this issue?

    Read the article

  • IIS Integration testing: remove X-Powered-By: ASP.NET header

    - by stacker
    I want a test, using NAnt and NUnit, that verifies these HTTP headers are absent:

        X-AspNet-Version: 4.0.30319
        X-Powered-By: ASP.NET

    Edit: I'm asking how to actually test this rule ("don't emit ASP.NET headers"), so I can drop the test into each new website I build and make this simple step harder to forget.
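    One way to encode the rule is an NUnit test that requests a page from the deployed site and asserts the headers are absent: a minimal sketch, assuming the site under test is reachable at a placeholder URL (class and constant names here are illustrative):

        using System.Net;
        using NUnit.Framework;

        [TestFixture]
        public class AspNetHeaderTests
        {
            // Placeholder: point this at the site under test.
            private const string SiteUrl = "http://localhost/";

            [Test]
            public void ResponseDoesNotExposeAspNetHeaders()
            {
                var request = (HttpWebRequest)WebRequest.Create(SiteUrl);
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    // The header-collection indexer returns null for headers the server never sent.
                    Assert.IsNull(response.Headers["X-AspNet-Version"]);
                    Assert.IsNull(response.Headers["X-Powered-By"]);
                }
            }
        }

    As for suppressing the headers themselves: X-AspNet-Version is typically disabled with <httpRuntime enableVersionHeader="false" /> in web.config, and X-Powered-By is removed under HTTP Response Headers in IIS Manager.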

    Read the article

  • troubles with integration on matlab

    - by user648666
    I'd like some help, please; I really need to solve this problem. Before anything, thank you for your time. My problem: I have a matrix (826x826 double) and I want to integrate this matrix with respect to a vector (826x1 double). I don't have analytic functions for either of them. Is there a command or an algorithm to take the integral of a matrix with respect to a vector? I really need help; I'm such a newbie at MATLAB. Sincerely, George
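    With only sampled data (no symbolic functions), the usual tool is the trapezoidal rule applied down each column, and MATLAB's built-in trapz(x, A) does exactly that for a vector x and matrix A. As a sketch of what that computation does under the hood (written in C# here, with illustrative names, since it is just a weighted sum per column):

        using System;

        class Trapz
        {
            // Trapezoidal rule: integrate each column of A (n x m) with respect
            // to the sample points x (length n). Returns one integral per column.
            static double[] TrapzColumns(double[] x, double[,] A)
            {
                int n = A.GetLength(0), m = A.GetLength(1);
                var integrals = new double[m];
                for (int j = 0; j < m; j++)
                    for (int i = 0; i < n - 1; i++)
                        // Area of one trapezoid: step width times the average
                        // of the two neighbouring samples.
                        integrals[j] += (x[i + 1] - x[i]) * (A[i, j] + A[i + 1, j]) / 2.0;
                return integrals;
            }

            static void Main()
            {
                double[] x = { 0.0, 0.5, 1.0 };
                double[,] A = { { 0.0, 1.0 }, { 0.5, 1.0 }, { 1.0, 1.0 } };
                // Column 0 integrates y = x over [0,1] (0.5); column 1 is the constant 1 (1.0).
                foreach (double v in TrapzColumns(x, A))
                    Console.WriteLine(v);
            }
        }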

    Read the article

  • Build model with nested model in rspec integration test

    - by user1116573
    I understand that I can do something like this in rspec:

        let(:project) { Project.new }

    but in my app a project accepts_nested_attributes_for tasks, and when I generate the Project form I build a task along with it using:

        @project = Project.new
        @project.tasks.build

    I need something like:

        let(:project) { Project.new.tasks.build }

    but that doesn't seem to work. How can I do this as a let in my rspec test?

    Read the article

  • Using Microsoft Office 2007 with E-Business Suite Release 12

    - by Steven Chan
    Many products in the Oracle E-Business Suite offer optional integrations with Microsoft Office and Microsoft Project.  For example, some EBS products can export tabular reports to Microsoft Excel.  Some EBS products integrate directly with Microsoft products, and others work through the Applications Desktop Integrator (WebADI and ADI) as an intermediary. These EBS integrations have historically been documented in their respective product-specific documentation.  In other words, if an EBS product in the Oracle Financials family supported an integration with, say, Microsoft Excel, it was up to the product team to document that in the Oracle Financials documentation. Some EBS systems administrators have found the process of hunting through the various product-specific documents for Office-related information to be a bit difficult.  In response to your Service Requests and emails, we've released a new document that consolidates and summarises all patching and configuration requirements for EBS products with MS Office integration points in a single place: Using Microsoft Office 2007 with Oracle E-Business Suite 11i and R12 (Note 1072807.1)

    Read the article

  • Sensitive Data Storage - Best Practices

    - by Kenneth
    I recently started working on a personal project where I was connecting to a database using Java. This got me thinking. I have to provide the login information for a database account on the DB server in order to access the database. But if I hard-code it, then it would be possible for someone to decompile the program and extract that login info. If I store it in an external setup file, the same problem exists, only it would be even easier for them to get it. I could encrypt the data before storing it in either place, but that doesn't seem fail-safe either, and I'm no encryption expert by any means. So what are some best practices for storing sensitive setup data for a program?
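    One widely used pattern, whatever the language: don't ship the secret with the program at all; read it at runtime from the environment (or from a file locked down to the service account), so decompiling the binary yields nothing. A hedged sketch of that pattern, in C# with illustrative variable names:

        using System;

        class DbConfig
        {
            // Credentials come from environment variables set on the server
            // (the variable names here are illustrative), so nothing sensitive
            // is compiled into the program or checked into source control.
            public static string GetConnectionString()
            {
                string user = Environment.GetEnvironmentVariable("APP_DB_USER");
                string pass = Environment.GetEnvironmentVariable("APP_DB_PASSWORD");
                if (user == null || pass == null)
                    throw new InvalidOperationException("Database credentials are not configured.");
                return string.Format("Server=dbhost;Database=appdb;User Id={0};Password={1};", user, pass);
            }
        }

    The complementary practice is to give that database account only the minimum permissions the application actually needs, so a leaked credential limits the damage.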

    Read the article

  • SSIS Lookup component tuning tips

    - by jamiet
    Yesterday evening I attended a London meeting of the UK SQL Server User Group at Microsoft's offices in London Victoria. As usual it was both a fun and informative evening, and in particular there seemed to be a few questions arising about tuning the SSIS Lookup component; I rattled off some comments and figured it would be prudent to drop some of them into a dedicated blog post, hence the one you are reading right now.

    Scene setting. A popular pattern in SSIS is to use a Lookup component to determine whether a record in the pipeline already exists in the intended destination table or not, and I cover this pattern in my 2006 blog post Checking if a row exists and if it does, has it changed? (note to self: must rewrite that blog post for SSIS2008). Fundamentally the SSIS Lookup component (when using the FullCache option) sucks some data out of a database and holds it in memory so that it can be compared to data in the pipeline. One of the big benefits of using SSIS dataflows is that they process data one buffer at a time; that means that not all of the data from your source exists in the dataflow at the same time, and is why a SSIS dataflow can process data volumes that far exceed the available memory. However, that only applies to data in the pipeline; for reasons that are hopefully obvious, ALL of the data in the lookup set must exist in the memory cache for the duration of the dataflow's execution, which means that any memory used by the lookup cache will not be available to be used as a pipeline buffer. Moreover, there's an obvious correlation between the amount of data in the lookup cache and the time it takes to charge that cache; the more data you have, the longer it will take to charge and the longer you have to wait until the dataflow actually starts to do anything. For these reasons your goal is simple: ensure that the lookup cache contains as little data as possible.

    General tips. Here is a simple tick list you can follow in order to tune your lookups (an illustrative SQL sketch of these points appears at the end of this post):
    - Use a SQL statement to charge your cache; don't just pick a table from the dropdown list made available to you. (Read why in SELECT *... or select from a dropdown in an OLE DB Source component?)
    - Only pick the columns that you need; ignore everything else.
    - Make the database columns that your cache is populated from as narrow as possible. If a column is defined as VARCHAR(20) then SSIS will allocate 20 bytes for every value in that column – that is a big waste if the actual values are significantly less than 20 characters in length.
    - Do you need DT_WSTR typed columns or will DT_STR suffice? DT_WSTR uses twice the amount of space to hold values that can be stored using a DT_STR, so if you can use DT_STR, consider doing so. The same principle goes for the numerical datatypes DT_I2/DT_I4/DT_I8.
    - Only populate the cache with data that you KNOW you will need. In other words, think about your WHERE clause!

    Thinking outside the box. It is tempting to build a large monolithic dataflow that does many things, one of which is a Lookup. Often though you can make better use of your available resources by, well, mixing things up a little, and here are a few ideas to get your creative juices flowing:
    - There is no rule that says everything has to happen in a single dataflow. If you have some particularly resource-intensive lookups then consider putting that lookup into a dataflow all of its own and using raw files to pass the pipeline data in and out of that dataflow.
    - Know your data. If you think, for example, that the majority of your incoming rows will match with only a small subset of your lookup data then consider chaining multiple lookup components together: the first would use a FullCache containing that data subset, and the remaining data that doesn't find a match could be passed to a second lookup that perhaps uses a NoCache lookup, thus negating the need to pull all of that least-used lookup data into memory.
    - Do you need to process all of your incoming data all at once? If you can process different partitions of your data separately then you can partition your lookup cache as well. For example, if you are using a lookup to convert a location into a [LocationId], then why not process your data one region at a time? This will mean your lookup cache only has to contain data for the location that you are currently processing, and with the ability of the Lookup in SSIS2008 and beyond to charge the cache using a dynamically built SQL statement, you'll be able to achieve it using the same dataflow and simply loop over it using a ForEach loop.
    - Taking the previous data-partitioning idea further: a dataflow can contain more than one data path, so why not split your data using a Conditional Split component and, again, charge your lookup caches with only the data that they need for that partition.
    - Lookups have two uses: to (1) find a matching row from the lookup set and (2) put attributes from that matching row into the pipeline. Ask yourself, do you need to do these two things at the same time? After all, once you have the key column(s) from your lookup set then you can use that key to get the rest of the attributes further downstream, perhaps even in another dataflow.
    - Are you using the same lookup data set multiple times? If so, consider the file caching option in SSIS 2008 and beyond.
    - Above all, experiment and be creative with different combinations. You may be surprised at what works.

    Final thoughts. If you want to know more about how the Lookup component differs in SSIS2008 from SSIS2005 then I have a dedicated blog post about that at Lookup component gets a makeover. I am on a mini-crusade at the moment to get a BULK MERGE feature into the database engine, the thinking being that if the database engine can quickly merge massive amounts of data in a similar manner to how it can insert massive amounts using BULK INSERT, then that's a lot of work that wouldn't have to be done in the SSIS pipeline. If you think that is a good idea then go and vote for BULK MERGE on Connect. If you have any other tips to share then please stick them in the comments. Hope this helps! @Jamiet
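    As an illustration of the tick-list items above (charge the cache with a SQL statement, pick only the columns you need, keep them narrow, and filter with a WHERE clause), here is a sketch of a cache-charging query; the table, column, and parameter names are hypothetical:

        -- Charge the Lookup cache with only what this dataflow pass needs:
        -- the surrogate key plus the business key used for matching, cast as
        -- narrow as the data allows, filtered to the partition being processed.
        SELECT  LocationId,
                CAST(LocationName AS VARCHAR(30)) AS LocationName
        FROM    dbo.Location
        WHERE   Region = ?   -- parameterised per-region charge, per the partitioning tip above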

    Read the article

  • Upcoming UPK Events

    - by kathryn.lustenberger(at)oracle.com
    February 15th: UPK: Follow Panduit's Lead and Leverage Oracle's User Productivity Kit To Achieve Your Goals - Join us for a live webcast to learn how Oracle's User Productivity Kit can help you meet and exceed your goals. The webcast will feature Jim Boss, from the Panduit Corporation, who will share how Oracle's User Productivity Kit was used with both Oracle and non-Oracle applications to help Panduit meet their goals.
    Date: February 15th, 2011 at 12:00 PST / 3:00 EST
    Evite: http://www.oracle.com/us/dm/65630-naod10046029mpp005c010-se-300908.html

    March 2nd: Synaptis teams with Oracle to deliver a UPK customer success story - webinar offering The Value of UPK (Customer Success Story): how to leverage the value of UPK to streamline processes and maximize end-user adoption for a global implementation. Join us to learn how the power of UPK can be leveraged to train end users globally in a successful and cost-effective manner. A valued Oracle UPK customer will share experiences, successes, challenges, and strategies. The webinar will also include a question-and-answer session to give the attendees an opportunity to interact directly with the Oracle UPK customer, Synaptis, and the Oracle UPK team.
    Date: March 2, 2011
    Time: 11:00am - 12:00pm EST
    Register for this webinar

    March 27th - 30th: The Alliance 2011 conference is an annual event for all higher education, government, and public sector users of Oracle applications. The Alliance conference is organized and managed by the Higher Education User Group (www.heug.org); this is the 14th annual event for the HEUG. This is your opportunity to join with over 3200 other higher education, federal, state and local government users to network, learn and share in our amazing combined experiences. The Alliance conference team is hard at work putting together the best conference ever for 2011 - so don't delay, make your plans now to be part of Alliance 2011!
    When: Sunday, March 27th, 2011 - Wednesday, March 30th, 2011
    Where: The Colorado Convention Center (Denver, Colorado)
    Registration for Alliance 2011 is now open! UPK will be represented at this event, offering:
    - Pre-conference training: Learn the Basics of Oracle User Productivity Kit (UPK); Taking Your UPKs to a Whole New Level, Advanced Use of UPK
    - Demo pod staff
    - Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle; Beyond Basic UPK - User Tracking and SmartHelp; Leveraging Oracle and User Productivity Kit (UPK) to Develop a Comprehensive Training Program; Oracle User Productivity Kit Strategy and Roadmap - Key to User Adoption

    April 10th - 14th: Registration for COLLABORATE 11 has begun - don't miss the most comprehensive, user-driven conference devoted to Oracle applications and technology. Collaborate with a global network of more than 5,000 peers and experts to share real-world experiences, solve your challenges and gain insights to validate your technology plans. Read below to discover which group to register with for the best value. UPK will be represented at this event, offering:
    - Demo pod staff
    - Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle; Centralize All Project Team Assets, AND, Deploy Fully Measurable Training with UPK Pro; Oracle User Productivity Kit Strategy and Roadmap - Key to User Adoption
    Registration is now open!

    Read the article

  • The Best Ways to Lock Down Your Multi-User Computer

    - by Lori Kaufman
    Whether you’re sharing a computer with other family members or friends at home, or securing computers in a corporate environment, there may be many reasons why you need to protect the programs, data, and settings on the computers. This article presents multiple ways of locking down a Windows 7 computer, depending on the type of usage being employed by the users. You may need to use a combination of several of the following methods to protect your programs, data, and settings.

    Read the article

  • Passing data from one database to another database table (Access) (C#)

    - by SAMIR BHOGAYTA
    using System;
    using System.Data;
    using System.Data.OleDb;

    // Copy every row from the [Master] table in Data.mdb into the [Master]
    // table of the password-protected Backup.mdb, using Jet's cross-database
    // SELECT syntax so the data never round-trips through a DataSet.
    string conString = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=Backup.mdb;Jet OLEDB:Database Password=12345";
    try
    {
        using (OleDbConnection dbconn = new OleDbConnection(conString))
        {
            dbconn.Open();
            string selQuery = "INSERT INTO [Master] SELECT * FROM [MS Access;DATABASE=" + "\\Data.mdb" + ";].[Master]";
            using (OleDbCommand dbcommand = new OleDbCommand(selQuery, dbconn))
            {
                int result = dbcommand.ExecuteNonQuery(); // number of rows copied
            }
        }
    }
    catch (Exception ex)
    {
        // Surface the failure instead of silently ignoring it.
        Console.WriteLine(ex.Message);
    }

    Read the article

  • WCF/ADO.NET Data Services - Could not load type 'System.Data.Services.Providers.IDataServiceUpdatePr

    - by Sahil Malik
    When you try accessing ListData.svc, do you get the following error?

        Could not load type 'System.Data.Services.Providers.IDataServiceUpdateProvider' from assembly 'System.Data.Services, Version=3.5.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089'.

    Well, if you followed the instructions in Chapter 1 of my book to build your VM, you wouldn't run into the above issue. But if you do, you need to install:
    - For Windows Vista and Windows 2008: http://www.microsoft.com/downloads/details.aspx?familyid=4B710B89-8576-46CF-A4BF-331A9306D555&displaylang=en
    - For Windows 7 and Windows 2008 R2: http://www.microsoft.com/downloads/details.aspx?familyid=79d7f6f8-d6e9-4b8c-8640-17f89452148e&displaylang=en
    Remember to: a) install the x64 version, and b) do an IISReset before trying again.

    Read the article

  • Yes, you can benefit from both data and backup compression

    - by AaronBertrand
    Earlier today, MSSQLTips posted a backup compression tip by Thomas LaRock ( blog | twitter ). In that article, Tom states: "If you are already compressing data then you will not see much benefit from backup compression." I don't want to argue with a rock star, and I will concede that he may be right in some scenarios. Nonetheless, I tweeted that "it depends;" Thomas then asked for "an example where you have data comp and you also see a large benefit from backup comp?" My initial reaction came about...(read more)

    Read the article

  • Protect Data and Save Money? Learn How Best-in-Class Organizations do Both

    - by roxana.bradescu
    Databases contain nearly two-thirds of the sensitive information that must be protected as part of any organization's overall approach to security, risk management, and compliance. Solutions for protecting data housed in databases vary from encrypting data at the application level to defense-in-depth protection of the database itself. So is there a difference? Absolutely! According to new research from the Aberdeen Group, Best-in-Class organizations experience fewer data breaches and audit deficiencies - at lower cost -- by deploying database security solutions. And the results are dramatic: Aberdeen found that organizations encrypting data within their databases achieved 30% fewer data breaches and 15% greater audit efficiency with 34% less total cost when compared to organizations encrypting data within applications. Join us for a live webcast with Derek Brink, Vice President and Research Fellow at the Aberdeen Group, next week to learn how your organization can become Best-in-Class.

    Read the article

  • links for 2010-12-08

    - by Bob Rhubart
    Rittman Mead Consulting Blog Archive: Oracle BI EE 11g – Managing Host Name Changes. Rittman Mead's Venkatakrishnan J (an Oracle ACE) looks at "how we can go about getting BI EE 11g to work when the Host Name of the machine changes post install and configuration." (tags: oracle businessintelligence obiee)
    Evident Software's CTO and co-founder discusses Oracle Coherence (Application Grid) (tags: oracle successcast podcast coherence grid)
    Choice Hotels' Rain Fletcher talks WebLogic Server (Application Grid) (tags: oracle successcast weblogic podcast)
    Baz Khuti, CTO of Avocent, discusses their next-generation application and WebLogic (Application Grid) (tags: oracle successcast podcast)
    Oracle Counts Clouds | JAVA Developer's Journal: "The Independent Oracle Users Group (IOUG), which has 20,000 members, has run up an Oracle-sponsored survey that found significant cloud adoption by Oracle users." (tags: oracle cloud survey ioug)
    Oracle Customers Warm to the Cloud | CTO Edge: "The Independent Oracle Users Group (IOUG) has published the results of a study on cloud adoption and, to no surprise, the enterprise loves the cloud." (tags: oracle cloud ioug survey)

    Read the article

  • Manufacturing Leadership 2011 Awards Event, Oracle Winners!

    - by stephen.slade(at)oracle.com
    Ready to Revolutionize Manufacturing? During the annual Manufacturing Leadership 2011 awards event, over 20 Oracle customers will receive leadership awards in various categories for mastery in applications, technology and/or innovation. Held at the Breakers Hotel in Palm Beach, May 6-9, Oracle will sponsor exhibits and speaking engagements as well as hosting customers and prospects at the event. If you are in manufacturing and want to reach the upper quartile of performance, alongside the best-in-class performers, this is the event you should consider attending. Four-time Super Bowl champion quarterback Terry Bradshaw will be the keynote speaker.
    Event link: http://blog.managingautomation.com/summit/

    Read the article

  • Oracle Lean Supply Chain Newsletter

    - by [email protected]
    Ready to ride the cutting edge? Leader or laggard? There's plenty of new material and exciting articles on Oracle Supply Chain products in the quarterly newsletter. The February '10 issue contained some interesting articles on:
    - Supply chains in the new 'Abnormal'
    - Manufacturers go paperless to boost Lean
    - Five good reasons to go to Release 12.1
    - Software and Hardware, Complete, with the Sun acquisition
    See details at: http://www.oracle.com/newsletters/samples/supply-chain-management.html
    Stay tuned for the May '10 issue and some great articles worth reviewing.

    Read the article

  • Customer Insight: Trends, Models and Technologies for Success in Next-Generation CRM

    - by antonella.buonagurio(at)oracle.com
    On January 27th, the third stop of the CRM On Demand Roadshow was held in Rome. The initiative was an occasion for Marketing Directors, CRM experts and Sales Directors to meet and compare notes on the new trends in relationship marketing. Thanks to contributions from ItalTBS, Bricofer, Renault Italia, Avis and IRCCS San Raffaele, and with Prof. Maurizio Mesenzani moderating, the participants shared ideas, experiences and reflections on the tools that have so far proven most effective at identifying customer needs, turning prospects into satisfied customers, and creating engagement. Read on to see the presentations.

    Read the article
