Search Results

Search found 26214 results on 1049 pages for 'farm solution'.


  • Work Item Traceability in TFS 2010

    - by Sam Patrick
    I have created a Windows Forms project (VS solution) under a TFS 2010 project. I may eventually add more solutions to the TFS project. My question: can we create a Use Case WIT for a specific solution within a TFS project? Furthermore, is it possible to create a "traceability matrix" that starts at the Use Case level and goes down to the code level (at least the namespace level) of that particular VS solution?

    Read the article

  • A Multi-Channel Contact Center Can Reduce Total Cost of Ownership

    - by Tom Floodeen
    In order to remain competitive in today's market, CRM customers need to provide a feature-rich, superior call center experience to their customers across all communication channels while improving their service agent productivity. They also require their call center to be deeply integrated with their CRM system, and they need to implement all this quickly, seamlessly, and without breaking the bank. Oracle's Siebel Customer Relationship Management (CRM) is the world's leading application suite for automated customer-facing operations for Sales and Marketing and for managing all aspects of providing service to customers. Oracle's Contact On Demand (COD) is a world-class, carrier-grade, hosted, multi-channel contact center solution that can be deployed in days without up-front capital expenditures or integration costs. Agents can work efficiently from anywhere in the world with 360-degree views into customer interactions and real-time business intelligence. Customers gain from rapid and personalized sales and service, while organizations can dramatically reduce costs and increase revenues. Oracle's latest update of Siebel CRM now comes pre-integrated with Oracle's Contact On Demand. This solution seamlessly runs a fully functional contact center provided by a single vendor, significantly reducing your total cost of ownership. It supports Siebel 7.8 and higher for Voice, and Siebel 8.1 and higher for Voice and Siebel CRM Chat. The impressive feature list of Oracle's COD solution includes a full-control CTI toolbar with Voice, Chat, and Click-to-Dial features. It also includes context-sensitive screens, automated desktops, built-in IVR, multidimensional routing, supervisor and quality monitoring, and instant provisioning. The solution also ships with an Extensible Web Services interface for implementing more complex business processes. Click here to learn how to reduce the complexity and total cost of ownership of your contact center. Contact Ann Singh at [email protected] for additional information.

    Read the article

  • Imaging: Paper Paper Everywhere, but None Should be in Sight

    - by Kellsey Ruppel
    Author: Vikrant Korde, Technical Architect, Aurionpro's Oracle Implementation Services team

    My wedding photos are stored in several shoeboxes. Yes...I got married before digital photography was mainstream...which means I'm old. But my parents are really old. They have shoeboxes filled with vacation photos on slides (I doubt many of you have even seen a home slide projector...and I hope you never do!). Neither I nor my parents should have shoeboxes filled with any form of photographs whatsoever. They should obviously live in the digital world...with no physical versions in sight (other than a few framed on our walls).

    Businesses grapple with similar challenges. But instead of shoeboxes, they have file cabinets and warehouses jam-packed with paper invoices, legal documents, human resource files, material safety data sheets, incident reports, and the list goes on and on. In fact, regulatory and compliance rules govern many industries, requiring that this paperwork be available for any number of years. It's a real challenge...especially trying to find archived documents quickly, and many times with no backup.

    Which brings us to a set of technologies called Image Process Management (or simply Imaging or Image Processing) that are transforming these antiquated, paper-based processes. Oracle's WebCenter Content Imaging solution is a combination of its WebCenter suite, which offers a robust set of content and document management features, and its Business Process Management (BPM) suite, which helps to automate business processes through the definition of workflows and business rules. Overall, the solution provides an enterprise-class platform for end-to-end management of document images within transactional business processes. It's a solution that provides all of the capabilities needed - from document capture and recognition, to imaging and workflow - to effectively transform your 'shoeboxes' of files into digitally managed assets that comply with strict industry regulations.

    The terminology can be quite overwhelming if you're new to the space, so we've provided a summary of the primary components of the solution below, along with a short description of the two paths that can be executed to load images of scanned documents into Oracle's WebCenter suite.

      • WebCenter Imaging (WCI): the electronic document repository that provides security, annotations, and search capabilities, and is the primary user interface for managing work items in the imaging solution
      • SOA & BPM Suites (workflow): provide business process management capabilities, including human tasks, workflow management, service integration, and all other standard SOA features. It's interesting to note that there are a number of 'jumpstart' processes available to help accelerate the integration of business applications, such as the accounts payable invoice processing solution for E-Business Suite that facilitates the processing of large volumes of invoices
      • WebCenter Enterprise Capture (WEC): expedites the capture process of paper documents to digital images, offering high-volume scanning and importing from email, and allows for flexible indexing options
      • WebCenter Forms Recognition (WFR): automatically recognizes, categorizes, and extracts information from paper documents with greatly reduced human intervention
      • WebCenter Content: the backend content server that provides versioning, security, and content storage

    There are two paths that can be executed to send data from WebCenter Capture to WebCenter Imaging, both of which are described below:

    1. Direct Flow - This is the simplest and quickest way to push an image scanned from WebCenter Enterprise Capture (WEC) to WebCenter Imaging (WCI), using the bare minimum of metadata. The WEC activities are defined below:
      • The paper document is scanned (or imported from email).
      • The scanned image is indexed using a predefined indexing profile.
      • The image is committed directly into the process flow.

    2. WFR (WebCenter Forms Recognition) Flow - This is the more complex process, during which data is extracted from the image using a series of operations including Optical Character Recognition (OCR), Classification, Extraction, and Export. This process creates three files (TIFF, XML, and TXT), which are fed to the WCI Input Agent (the high-speed import/filing module). The WCI Input Agent directory is a standard ingestion method for adding content to WebCenter Imaging; the process for doing so is described below:
      • WEC commits the batch using the respective commit profile. A TIFF file is created, passing data through the file name by including values separated by "_" (underscores).
      • WFR completes OCR, classification, extraction, and export, and pulls the data from the image. In addition to the TIFF file, which contains the document image, an XML file containing the extracted data and a TXT file containing the metadata that will be filled in WCI are also created.
      • All three files are exported to WCI's Input Agent directory.
      • Based on previously defined "input masks", the WCI Input Agent picks up the seeding file (often the TXT file).
      • Finally, the TIFF file is pushed into UCM and a unique web-viewable URL is created. Based on the mapping data read from the TXT file, a new record is created in the WCI application.

    Although these processes may seem complex, each Oracle component works seamlessly together to achieve a high-performing and scalable platform. The solution has been field-tested at some of the largest enterprises in the world and has transformed millions and millions of paper-based documents into more easily manageable digital assets. For more information on how an Imaging solution can help your business, please contact [email protected] (for U.S. West inquiries) or [email protected] (for U.S. East inquiries).

    About the Author: Vikrant is a Technical Architect in Aurionpro's Oracle Implementation Services team, where he delivers WebCenter-based Content and Imaging solutions to Fortune 1000 clients. With more than twelve years of experience designing, developing, and implementing Java-based software solutions, Vikrant was one of the founding members of Aurionpro's WebCenter-based offshore delivery team. He can be reached at [email protected].

    Read the article

  • Do you want to be an ALM Consultant?

    - by Martin Hinshelwood
    Northwest Cadence is looking for our next great consultant! At Northwest Cadence, we have created a work environment that emphasizes excellence, integrity, and out-of-the-box thinking. Our customers have high expectations (rightfully so) and we wouldn't have it any other way!

    Northwest Cadence has some of the most exciting customers I have ever worked with, and even though I have only been here just over a month I have already:
      • Provided training/consulting for 3 government departments
      • Created and taught courseware for delivering Scrum to teams within a high-profile multinational company
      • Started presenting Microsoft's ALM Engagement Program

    So if you are interested in helping companies build better software more efficiently, enquire at [email protected].

    Application Lifecycle Management (ALM) Consultant

    An ALM Consultant with a minimum of 8 years of relevant experience with Application Lifecycle Management, Visual Studio (including Visual Studio Team System), and software design is needed. The candidate must provide thought leadership on best practices for enterprise architecture, understand the Microsoft technology solution stack, and have a thorough understanding of enterprise application integration. The ALM Practice Lead will play a central role in designing and implementing the overall ALM Practice strategy, including creating, updating, and delivering ALM courseware and consultancy engagements. This person will also provide project support, deliverables, and quality solutions on Visual Studio Team System that exceed client expectations. Engagements will vary and will involve providing expert training, consulting, mentoring, formulating technical strategies and policies, and acting as a "trusted advisor" to customers and internal teams. A sound sense of business and technical strategy is required. Strong interpersonal skills as well as solid strategic thinking are key. The ideal candidate will be capable of envisioning the solution based on the early client requirements, communicating the vision to both technical and business stakeholders, and leading teams through implementation, as well as training, mentoring, and hands-on software development. The ideal candidate will demonstrate successful use of both agile and formal software development methods, enterprise application patterns, and effective leadership on prior projects.

    Job Requirements
      • Minimum Education: Bachelor's Degree (computer science, engineering, or math preferred).
      • Locale / Travel: The Practice Lead position requires an estimated 50% travel, most of which will be in the Continental US (a valid national passport must be maintained). This is a full-time position and will be based in the Kirkland office.
      • Preferred Education: Master's Degree in Information Technology or Software Engineering; premium Microsoft certifications on .NET (MCSD) or MCPD, or relevant experience; Microsoft Certified Trainer (MCT) or relevant experience.
      • Minimum Experience and Skills: 7+ years of experience with business information systems integration or custom business application design and development in a professional technology consulting, corporate MIS, or software development environment.

    Essential Duties & Responsibilities
      • Provide training, consulting, and mentoring to organizations on topics that include Visual Studio Team System and ALM.
      • Create content, including labs and demonstrations, to be delivered as training classes by Northwest Cadence employees.
      • Lead development teams through the complete ALM and/or Visual Studio Team System solution.
      • Communicate in detail how a solution will integrate into the larger technical problem space for large, complex enterprises.
      • Define technical solution requirements.
      • Provide guidance to the customer and project team with respect to technical feasibility, complexity, and level of effort required to deliver a custom solution.
      • Ensure that the solution is designed, developed, and deployed in accordance with the agreed-upon development work plan.
      • Create and deliver weekly status reports of training and/or consulting progress.

    Engagement Responsibilities
      • Have a strong desire to provide thought leadership related to technology and to help grow the business.
      • Work effectively and professionally with employees at all levels of a customer's organization.
      • Have strong verbal and written communication skills.
      • Have effective presentation, organizational, and planning skills.
      • Have effective interpersonal skills and the ability to work in a team environment.

    Enquire at [email protected]

    Read the article

  • Declarative Architectures in Infrastructure as a Service (IaaS)

    - by BuckWoody
    I deal with computing architectures by first laying out requirements, and then laying in any constraints for its success. Only then do I bring in computing elements to apply to the system. As an example, a requirement might be "world-wide availability" and a constraint might be "with less than 80ms response time and full HA", or something similar. Then I can choose from the best fit of technologies, which range from full-up on-premises computing to IaaS, PaaS, or SaaS.

    I also deal in abstraction layers: on-premises systems are fully under your control; in IaaS the hardware is abstracted (but not the OS, scale, runtimes, and so on); in PaaS the hardware and the OS are abstracted and you focus on code and data only; and in SaaS everything is abstracted - you merely purchase the function you want (like an e-mail server or some such) and simply use it. When you think about solutions this way, the architecture becomes the primary factor in your decision. It's problem-first architecting: lay out the problem, then lay in whatever technology or vendor best fixes it. To that end, most architects design a solution using a graphical tool (I use Visio) and then create documents that let the rest of the team (and business) know what is required. It's the template, or recipe, for the solution.

    This is extremely easy to do for SaaS - you merely point out what the needs are, research the vendor, and present the findings (and bill) to the business. IT might not even be involved there. In PaaS it's not much more complicated - you use the same Application Lifecycle Management and design tools you always have for code, such as Visual Studio or some other process and toolset, and you can "stamp out" the application in multiple locations, update it, and so on.

    IaaS is another story. Here you have multiple machines, operating systems, patches, virus scanning, runtimes, scale patterns, tools, and much more that you have to deal with, since essentially it's just an in-house system being hosted by someone else. You can certainly automate builds of servers - we do this as technical professionals every day. From Windows to Linux, it's simple enough to create a "build script" that makes a system just like the one we made yesterday. What is more problematic is being able to tie those systems together in a coherent way (as a solution) and then stamp that out repeatedly, especially when you might want to deploy that solution on-premises, or in one cloud vendor or another.

    Lately I've been working with a company called RightScale that does exactly this. I'll point you to their site for more info, but the general idea is that you document your intent for a set of servers, and it will deploy them to on-premises clouds, Windows Azure, and other cloud providers, all from the same script. In other words, it doesn't contain the images or anything like that - it contains the scripts to build them on-premises or on a cloud vendor like Microsoft. Using a tool like this, you combine the steps of designing a system (all the way down to passwords and accounts if you wish), and then the document drives the distribution and implementation of that intent. As time goes on, and more and more companies implement solutions on various providers (perhaps for HA and DR), this becomes a compelling investigation. The RightScale information is here, if you want to investigate it further. Yes, there are other methods I've found, but most are tied to a single kind of cloud, and I'm not into vendor lock-in.
    Poppa Bear Level - Hands-on: Evaluate RightScale at no cost. Just bring your Windows Azure credentials and follow these tutorials:
      • Sign Up for Windows Azure
      • Add Windows Azure to a RightScale Account
      • Windows Azure Virtual Machines 3-tier Deployment

    Momma Bear Level - Just the right level... ;0)
      • Windows Azure Evaluation Guide - if you are new to Windows Azure Virtual Machines and new to RightScale, we recommend that you read the entire evaluation guide to gain a more complete understanding of the Windows Azure + RightScale solution.
      • Windows Azure Support Page @ support.rightscale.com - FAQs, tutorials, etc. for Windows Azure Virtual Machines (work in progress)

    Baby Bear Level - Marketing
      • Windows Azure Page @ www.rightscale.com - find overview information including solution briefs and presentation & demonstration videos
      • Scale and Automate Applications on Windows Azure Solution Brief - how RightScale makes Windows Azure Virtual Machines even better
      • SQL Server on Windows Azure Solution Brief - run highly available SQL Server on Windows Azure Virtual Machines

    Read the article

  • WebCenter Customer Spotlight: Institute of Financing for Agriculture and Fisheries

    - by kellsey.ruppel
    Author: Peter Reiser - Social Business Evangelist, Oracle WebCenter

    Solution Summary
    The Institute of Financing for Agriculture and Fisheries (IFAP) provides access to, processes payments for, and oversees the application of EU and domestic funds distributed to individuals and companies. IFAP's business objectives were to establish electronic processing of EU funds and to improve relations between government agencies and the public, in compliance with International Organization for Standardization (ISO) requirements for information management and security. It implemented a complete solution for managing the entire document content life cycle through the use of Oracle WebCenter Content and Oracle WebCenter Capture. IFAP improved relationships with the public by accelerating payments electronically to individuals and organizations engaged in agriculture and fisheries, which is much easier, faster, and more secure than paper-based payments, and the solution complies with ISO information and security requirements.

    Company Overview
    As part of the Ministry of Agriculture, Rural Development, and Fisheries, the mission of the Institute of Financing for Agriculture and Fisheries (IFAP) is to provide access to, process payments for, and oversee the application of European Union (EU) and domestic funds distributed to individuals and companies engaged in the agriculture, rural development, and fisheries industries.

    Business Challenges
    IFAP's main business objectives were to establish electronic processing of EU funds invested in agriculture and fisheries, improve relations between government agencies and the public, and comply with International Organization for Standardization (ISO) requirements for information management and security systems regarding access to stored documents.

    Solution Deployed
    IFAP implemented a complete solution for managing the entire document content life cycle through the use of Oracle WebCenter Content and Oracle WebCenter Capture. The use of paper was replaced with digital formats, accelerating internal processes and ensuring compliance with ISO requirements.

    Business Results
      • Scalability: The number of documents included and managed in the document system, called iDOC, increased to a total of 490,847, of which 103,298 are internally generated, 113,824 are digitized correspondence, and 264,870 are forms that have been digitized or received via the institute's Web site.
      • Efficiency: IFAP improved relationships with the public by accelerating payments electronically to individuals and organizations engaged in agriculture and fisheries, which is much easier, faster, and more secure than paper-based payments. Overall productivity increased through the use of digital formats and citizens' ID cards as digital signatures.
      • Compliance: The implemented solution complies with International Organization for Standardization (ISO) requirements for information management and security systems regarding access to stored documents.

    Oracle Products and Services
      • IFAP Customer Snapshot
      • Oracle WebCenter Content
      • Oracle WebCenter Capture
      • Oracle Application Server
      • Oracle Forms
      • Oracle Reports

    Read the article

  • Data Aggregation of CSV Files in Java

    - by royB
    I have k CSV files (5 CSV files, for example); each file has m fields which produce a key and n values. I need to produce a single CSV file with aggregated data. I'm looking for the most efficient solution to this problem, mainly in terms of speed. I don't think, by the way, that we will have memory issues. Also, I would like to know if hashing is really a good solution, because we would have to use a 64-bit hash to reduce the chance of a collision to less than 1% (we are handling around 30,000,000 rows per aggregation). For example:

        file 1:
        f1,f2,f3,v1,v2,v3,v4
        a1,b1,c1,50,60,70,80
        a3,b2,c4,60,60,80,90

        file 2:
        f1,f2,f3,v1,v2,v3,v4
        a1,b1,c1,30,50,90,40
        a3,b2,c4,30,70,50,90

        result:
        f1,f2,f3,v1,v2,v3,v4
        a1,b1,c1,80,110,160,120
        a3,b2,c4,90,130,130,180

    Algorithms we have thought of so far: hashing (using a ConcurrentHashMap), merge-sorting the files, or a DB (MySQL, Hadoop, or Redis). The solution needs to be able to handle a huge amount of data (each file more than two million rows). A better example:

        file 1:
        country,city,peopleNum
        england,london,1000000
        england,coventry,500000

        file 2:
        country,city,peopleNum
        england,london,500000
        england,coventry,500000
        england,manchester,500000

        merged file:
        country,city,peopleNum
        england,london,1500000
        england,coventry,1000000
        england,manchester,500000

    The key is: country,city. This is just an example; my real key is of size 6 and the data columns are of size 8 - a total of 14 columns. We would like the solution to be the fastest possible in terms of data processing.
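
    A minimal sketch of the hash-based route, in Java since that is the asker's stack (the file names, the two-column key, and the single value column in main are illustrative assumptions). Keeping the raw, concatenated key fields as the map key sidesteps the 64-bit-hash collision worry entirely: the map hashes internally but compares full keys, so collisions cannot merge distinct rows.

        import java.io.BufferedReader;
        import java.io.IOException;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import java.util.Arrays;
        import java.util.LinkedHashMap;
        import java.util.Map;

        public class CsvAggregator {

            // Sums the numeric columns of every row sharing the same key,
            // where the key is the first keyCols comma-separated fields.
            public static Map<String, long[]> aggregate(int keyCols, Path... files) throws IOException {
                Map<String, long[]> totals = new LinkedHashMap<>();
                for (Path file : files) {
                    try (BufferedReader in = Files.newBufferedReader(file)) {
                        in.readLine(); // skip the header row
                        String line;
                        while ((line = in.readLine()) != null) {
                            String[] cols = line.split(",");
                            String key = String.join(",", Arrays.copyOfRange(cols, 0, keyCols));
                            long[] sums = totals.computeIfAbsent(key, k -> new long[cols.length - keyCols]);
                            for (int i = keyCols; i < cols.length; i++) {
                                sums[i - keyCols] += Long.parseLong(cols[i]);
                            }
                        }
                    }
                }
                return totals;
            }

            public static void main(String[] args) throws IOException {
                // Hypothetical inputs; key is (country, city), one value column.
                Map<String, long[]> merged = aggregate(2, Path.of("file1.csv"), Path.of("file2.csv"));
                System.out.println("country,city,peopleNum");
                merged.forEach((key, sums) -> System.out.println(key + "," + sums[0]));
            }
        }

    If memory ever did become a concern at 30,000,000 rows, an external merge-sort on the key columns followed by a single sequential merge pass would be the natural fallback.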

    Read the article

  • How to avoid big and clumsy UITableViewController on iOS?

    - by Johan Karlsson
    I have a problem when implementing the MVC pattern on iOS. I have searched the Internet but cannot seem to find any nice solution to this problem. Many UITableViewController implementations seem to be rather big. Most examples I have seen let the UITableViewController implement <UITableViewDelegate> and <UITableViewDataSource>. These implementations are a big reason why UITableViewController is getting big. One solution would be to create separate classes that implement <UITableViewDelegate> and <UITableViewDataSource>. Of course these classes would have to have a reference to the UITableViewController. Are there any drawbacks to using this solution? In general I think you should delegate the functionality to other "helper" classes or similar, using the delegate pattern. Are there any well-established ways of solving this problem? I do not want the model to contain too much functionality, nor the view. I believe that the logic should really be in the controller class, since this is one of the cornerstones of the MVC pattern. But the big question is: how should you divide the controller of an MVC implementation into smaller manageable pieces? (Applies to MVC in iOS in this case.) There might be a general pattern for solving this, although I am specifically looking for a solution for iOS. Please give an example of a good pattern for solving this issue, and provide an argument why your solution is awesome.

    Read the article

  • How do I turn on wireless adapter on HP Envy dv6 7200 under Ubuntu (any version)?

    - by Dave B.
    I have a new HP Envy dv6 7200 with dual-boot Windows 8 / Ubuntu 12.04. In Windows, the F12 key activates the "airplane mode" switch, which enables/disables both the on-board (mini PCIe) and USB wireless adapters. In Ubuntu, however, the wireless adapter is turned off by default and cannot be turned back on via the F12 key (or any combination of F12 and Ctrl, Fn, Shift, etc.). Let me explain the "fixes" I've seen in various forums and what did or did not happen. These are listed in no particular order. (Spoiler alert: wireless is still broken.)

    Solution 1? Use HP's "Wireless Assistant" utility to permanently activate the wireless card in Windows, then boot into Ubuntu to happily find it working. Unfortunately, this utility works in Windows 7 but not Windows 8. On the other hand, hardware drivers from HP are only available for Windows 8 for this model. Catch-22 (I could not find a comparable utility for Windows 8).

    Solution 2? Use a USB wireless adapter to sidestep the on-board device. I purchased such a device from thinkpenguin.com to be sure that it would be Linux-friendly. However, the wireless switch enables/disables all wireless devices, including USB. So, there's my $50 donation to the nice folks at thinkpenguin.com, but still no solution.

    Solution 3? Following the Think Penguin folks' suggestion, modify the mini PCI Express adapter following the instructions here: http://www.notebookforums.com/t/225429/broken-wireless-hardware-switch-fix Tempting, but I would then violate the terms of my warranty mere days after opening the box. This might be a good solution for an older machine that you want to get your geek on with, but not for a new box.

    Solution 4? rfkill unblock all. No effect whatsoever:

        ubuntu@ubuntu-hp-evny:~$ rfkill unblock all
        ubuntu@ubuntu-hp-evny:~$ rfkill list all
        0: hp-wifi: Wireless LAN
            Soft blocked: no
            Hard blocked: yes

    Solution 5? Re-install drivers. Done and done. Ubuntu recognizes the device - perhaps even without re-installing the drivers? - but cannot turn it on. How do I know this? In the Network Manager drop-down menu, the wireless option is blacked out, and a message reads something like: "wireless network is disabled by a hardware switch".

    Solution 6? Identify a physical switch on the laptop and flip it. There is no such switch on this machine. In fact, walking through Best Buy yesterday, I checked, and not a single new laptop PC had a physical switch on it. All of the wireless switches are either the F2 or F12 key ... I wonder if askubuntu will be plagued by this exact issue in the near future?
    Additional info - lspci:

        ubuntu@ubuntu-hp-evny:~$ lspci
        00:00.0 Host bridge: Intel Corporation Ivy Bridge DRAM Controller (rev 09)
        00:01.0 PCI bridge: Intel Corporation Ivy Bridge PCI Express Root Port (rev 09)
        00:02.0 VGA compatible controller: Intel Corporation Ivy Bridge Graphics Controller (rev 09)
        00:14.0 USB controller: Intel Corporation Panther Point USB xHCI Host Controller (rev 04)
        00:16.0 Communication controller: Intel Corporation Panther Point MEI Controller #1 (rev 04)
        00:1a.0 USB controller: Intel Corporation Panther Point USB Enhanced Host Controller #2 (rev 04)
        00:1b.0 Audio device: Intel Corporation Panther Point High Definition Audio Controller (rev 04)
        00:1c.0 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 1 (rev c4)
        00:1c.2 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 3 (rev c4)
        00:1c.3 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 4 (rev c4)
        00:1c.5 PCI bridge: Intel Corporation Panther Point PCI Express Root Port 6 (rev c4)
        00:1d.0 USB controller: Intel Corporation Panther Point USB Enhanced Host Controller #1 (rev 04)
        00:1f.0 ISA bridge: Intel Corporation Panther Point LPC Controller (rev 04)
        00:1f.2 RAID bus controller: Intel Corporation 82801 Mobile SATA Controller [RAID mode] (rev 04)
        00:1f.3 SMBus: Intel Corporation Panther Point SMBus Controller (rev 04)
        01:00.0 VGA compatible controller: NVIDIA Corporation Device 0de9 (rev a1)
        08:00.0 Unassigned class [ff00]: Realtek Semiconductor Co., Ltd. Device 5229 (rev 01)
        0a:00.0 Network controller: Ralink corp. Device 539b
        0b:00.0 Ethernet controller: Realtek Semiconductor Co., Ltd. RTL8111/8168B PCI Express Gigabit Ethernet controller (rev 07)

    Any suggestions would be much appreciated!

    Read the article

  • Best thing to do about projects supporting multiple versions of Visual Studio?

    - by Earlz
    I have an open source project that works on .NET 2.0 and up. The thing is, though, that I prefer to use Visual Studio 2012, which forces the solution and project files to work only with VS2010/2012. What exactly should I do? I don't want my users to have to create a solution from scratch if they don't have access to VS2010, but I also don't want to attempt to keep three different sets of project files in sync (VS2005, VS2008, and VS2010/2012). What is the usual solution for this?

    Read the article

  • Three Master Data Management Deployment Tips

    - by david.butler(at)oracle.com
    MDM is all about data quality and data governance. We now know that improved data quality raises all operational and analytical boats. But it's not just about deploying data quality tools. It's about deploying data quality tools within and across the IT landscape - from a thousand points of data entry to a single version of the truth. Here are three tips for deploying MDM across your applications and enterprise.

    #1: Identify a tactical, high-value business problem where MDM can materially help.
      • Support a customer acquisition and retention program with a 'customer' master data solution.
      • Accelerate new products and services to market with a 'product' master data solution.
      • Reduce supplier exceptions or support spend control initiatives with a 'supplier' master data solution.
      • Support new store (branch, campus, restaurant, hospital, office, well head) location analysis with a 'site' master data solution.
      • Fix long-standing Chart of Accounts and Cost Center problems with a 'financial' master data solution.
      • Support M&A activity, application upgrades, an SOA initiative, a cloud computing program, or a new business intelligence deployment by implementing a mix of master data solutions.

    #2: Incrementally expand to a full information architecture. Quite often, the measurable return on investment from tactical MDM initiatives will fund future deployments. Over time, the MDM solution expands into its full architecture to cover the entire IT landscape. Operations and analytics are united, IT flexibility is restored, and sustainable competitive advantage is achieved.

    #3: Bring business into every MDM deployment. To be successful, MDM must work hand in hand with data governance. In fact, Oracle MDM incorporates data governance tools for business users. IT can ensure data quality, but only after the business side has defined what quality means. The business establishes the rules for governing the master data, and then IT enforces the rules via the MDM applications. Without this business/IT collaboration, MDM initiatives seldom achieve their full potential.

    It is not very often that a technology comes along that can measurably assist organizations across a wide variety of top IT initiatives. Reducing costs, increasing flexibility, getting more out of existing assets, and aligning business and IT are not easy tasks for any CIO. But with MDM, success is achievable. IT can regain its place as a center for innovation.

    For more information on this topic, take a look at my article Master Data Management Deployment Tips in the Opinion Section of Oracle's Profit Online magazine.

    Read the article

  • Webcast Q&A: Hitachi Data Systems Improves Global Web Experiences with Oracle WebCenter

    - by kellsey.ruppel
    Last Thursday we had the third webcast in our WebCenter in Action webcast series, "Hitachi Data Systems Improves Global Web Experiences with Oracle WebCenter", where customer Sean Mattson from HDS and Rob Vandenberg from Oracle Partner Lingotek shared how Oracle WebCenter is powering Hitachi Data Systems' externally facing website and providing a seamless experience for their customers. In case you missed it, here's a recap of the Q&A.

    Sean Mattson, Hitachi Data Systems

    Q: Did you run into any issues in the deployment of the platform?
    A: There were some challenges; we were one of the first enterprise 'on premise' installations for Lingotek, and our WebCenter platform also has a lot of custom features. There were a lot of iterations and back and forth working with Lingotek at first. We both helped each other, learned a lot, and in the end managed to resolve all issues and roll out a very compelling solution for HDS.

    Q: What has been the biggest benefit your end users have seen?
    A: Being able to manage and govern the content lifecycle globally and centrally, while at the same time enabling the field to update, review, and publish incremental content changes without a lot of touchpoints, has helped us streamline and simplify the entire publishing process.

    Q: Was there any resistance internally when implementing the solution? If so, how did you overcome that?
    A: I wouldn't say resistance as much as skepticism that we could actually deploy an automated and self-publishing solution. Even if a solution is great, adoption of a new process can be a challenge, and we are still pursuing our adoption targets. One of the most important aspects is to include lots of training and support materials and offer as much helpdesk-type support as needed to get the field self-sufficient and confident in the capabilities of the system.

    Rob Vandenberg, Lingotek

    Q: Are there any limitations regarding supported languages, such as support for French Canadian and Indian languages?
    A: Lingotek supports all language pairs, including right-to-left languages and double-byte languages such as Chinese, Japanese, and Korean.

    Q: Is the Lingotek solution integrated with the new 11g release of WebCenter Sites?
    A: Yes! In fact, Lingotek is the first OVI partner for Oracle WebCenter Sites.

    Q: Can translation memories help to improve the accuracy of machine translation?
    A: One of the greatest long-term strategic benefits of using Lingotek is the accumulation of translation memories, or past human translations. These TMs can be used to "train" statistical machine translation engines to have higher and higher quality. This virtuous cycle is ongoing and will consistently improve both machine and human translations.

    Q: We have existing translation memories from previous work with our translation service provider. Can they be easily imported into the Lingotek solution for re-use?
    A: Yes, Lingotek is standards compliant. We support TM import in both the TMX and XLIFF formats.

    Q: If we use Lingotek as a service to do our professional translation and also use the Lingotek software solution, do we get the translation memories to give us a means of just translating future adds and changes ourselves?
    A: Yes, all the data is yours, always. Lingotek can provide both the integrated translation software as well as the professional translation services. All the content and translation memories are yours.

    Q: Can you give us an example of where community translation has proved to be successful?
    A: The key word here is community. If you have a community that cares about you, your content, and the rest of the community, then community translation can work for you. We've seen effective use cases in Product User Groups content, Support Communities, and other types of User Generated content, like wikis and blogs.

    If you missed the webcast, be sure to catch the replay to see a live demonstration of WebCenter in action!

    Hitachi Data Systems Improves Global Web Experiences with Oracle WebCenter from Oracle WebCenter

    Read the article

  • How to maintain symlinks in a Linux file manager?

    - by MountainX
    I want to use symlinks extensively. However, if I move the target file, the symlink becomes broken (unlike on Windows). That's not acceptable to me, so I either need a solution or I won't be able to use symlinks the way I wish to. Is there a solution that will work with the Dolphin file manager? A command-line solution is described on commandlinefu. In summary, it is something like one of these:

        lmv(){ for a in "${@:1:$(expr $# - 1)}"; do [ -e "$a" -a -e "${@:$#:1}" ] && mv "$a" "${@:$#:1}" && ln -s "${@:$#:1}/$(basename "$a")" "$(dirname "$a")"; done; }

        lmv(){ for a in "${@:1:$(expr $# - 1)}"; do [ -e "$a" -a -e "${@:$#}" ] && mv "$a" "${@:$#}" && ln -s "${@:$#}/$(basename "$a")" "$(dirname "$a")"; done; }

    But about half the time I'm using a file manager (Dolphin), so I need a complete solution to this problem. Is a solution available for a GUI file manager?

    EDIT: The context of this question is that I'm searching for an alternative to hardlinks. I previously asked this question about the pitfalls of hardlinks.

    Read the article

  • How to create a SharePoint 2013 workflow using Visual Studio

    - by ybbest
    If you would like to use Visual Studio to create a workflow in SharePoint 2013, here are the steps to get started.
    1. Create a SharePoint sandbox solution.
    2. Add a list workflow.
    3. Add a WriteToHistory activity to the workflow.
    4. Here is what the final solution looks like:
    5. Deploy the sandbox solution to your Office 365 Preview and activate the site collection feature first.
    6. Then activate the site features in the following order.
    7. Run your workflow as shown below.
    8. Navigate to your workflow history list, and you will see that the workflow completed successfully.
    You can download the solution here.

    Read the article

  • Creating practically solvable 15 puzzle inputs

    - by Ashwin
    I am developing a 15-puzzle game. I know the method to detect unsolvable puzzles. But unlike the 8-puzzle, the 15-puzzle takes quite a long time to solve for some input states, while other input states can be solved within 5 seconds. Now the problem is that I cannot give the user (the player) a problem for which the solution takes more than 10 seconds (if he/she chooses to see the solution). So when I initially shuffle the puzzle, I want to present only those puzzles which can be solved within 10 seconds. There must be some way to determine the hardness of a puzzle, but I tried searching the net and could not find one. Does anyone know a way of determining the hardness of a puzzle? NOTE: I am using the A* algorithm to find the solution, on a computer with 3 GB of RAM and a 2.27 GHz processor.
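
    One cheap, admissible proxy for hardness is the summed Manhattan distance of the tiles from their goal cells - it is exactly the lower bound that A* starts from, and instances with a high value tend (though are not guaranteed) to take longer to solve. A sketch, assuming a row-major int[16] board with 0 as the blank; the acceptance threshold is something you would tune empirically against your 10-second budget:

        public class PuzzleDifficulty {

            // Sum of Manhattan distances of all tiles from their goal cells
            // (blank excluded): the admissible lower bound A* expands from,
            // and a cheap proxy for how hard the instance will be.
            static int manhattan(int[] board) {
                int sum = 0;
                for (int pos = 0; pos < 16; pos++) {
                    int tile = board[pos];
                    if (tile == 0) continue;             // skip the blank
                    int goal = tile - 1;                 // tile t belongs at index t-1
                    sum += Math.abs(pos / 4 - goal / 4)  // row distance
                         + Math.abs(pos % 4 - goal % 4); // column distance
                }
                return sum;
            }

            // Accept a shuffled board only if its estimate is under the
            // threshold; otherwise re-shuffle and try again.
            static boolean easyEnough(int[] board, int threshold) {
                return manhattan(board) <= threshold;
            }
        }

    A more reliable (if costlier) filter is to run the solver itself at shuffle time with a node or time cap and discard any instance that exceeds it; alternatively, generating boards by applying a bounded number of random legal moves to the solved state caps the optimal solution length by construction.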

    Read the article

  • Tools for managing eCommerce backend

    - by rboarman
    I am working with an eCommerce company that has outgrown its hacked-together backend for managing inventory, pricing, and feeds to various shopping engines (Yahoo, 3d cart, Amazon, etc.). They currently manage about 12,000 SKUs and are doing $40M in revenue. Their internal people are working on a new Magento solution, but that is six months away, and they need to replace/improve their current solution in order to hold them over. Their current solution was developed by two people who have left the company. What tools/architecture do other eCommerce sites use to manage their inventory, pricing, product descriptions, and feed generation for the shopping engines?

    The current solution looks like this:
    1) Inventory, pricing, and product descriptions are maintained in a database and in NetSuite by employees
    2) New products are added to the database via import
    3) Twice a week, data is extracted into a giant Excel spreadsheet
    4) The Excel file adjusts pricing based on some simple algorithms
    5) The Excel file exports about six different CSV feeds, which are manually uploaded to Amazon, 3d cart, Yahoo, Google, and Merchant Advantage
       a. Each feed is a variant of the product data with different field names and formatting
       b. Pricing levels differ between feeds
       c. Some products are not sent to all feeds
    6) Orders are manually parsed and the inventory is adjusted as needed once product is sold

    The new solution should:
    1) Import data from ODBC, CSV, and NetSuite (CSV via FTP)
    2) Apply pricing changes via simple algorithms (< $80 add $10, < $200 add $25)
    3) Ensure margins are being met
    4) Format and generate a bunch of CSV and XML feeds
    5) Perhaps upload feeds to shopping engines automatically

    What I need to do is replace the Excel file with something that is maintainable and automated. Something in the .NET stack is preferable but not mandatory. I've been looking at BizTalk, but it may take too long to develop and deploy. Any suggestions?
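
    For requirement 2 alone, the spreadsheet's threshold rules are trivial to encode in whatever stack replaces Excel. A sketch in Java (the tier boundaries, default markup, and 15% margin floor are illustrative stand-ins, not the company's real rules):

        import java.math.BigDecimal;

        public class PricingRules {

            // Tiered markup: cost under $80 adds $10, under $200 adds $25, etc.
            // These numbers are placeholders for the actual business rules.
            private static final BigDecimal[][] TIERS = {
                { new BigDecimal("80"),  new BigDecimal("10") },
                { new BigDecimal("200"), new BigDecimal("25") },
            };
            private static final BigDecimal DEFAULT_MARKUP = new BigDecimal("40");
            private static final BigDecimal MIN_MARGIN = new BigDecimal("0.15"); // 15% floor

            static BigDecimal price(BigDecimal cost) {
                BigDecimal markup = DEFAULT_MARKUP;
                for (BigDecimal[] tier : TIERS) {
                    if (cost.compareTo(tier[0]) < 0) { markup = tier[1]; break; }
                }
                BigDecimal price = cost.add(markup);
                // Enforce the margin check: never price below cost * (1 + MIN_MARGIN).
                BigDecimal floor = cost.multiply(BigDecimal.ONE.add(MIN_MARGIN));
                return price.max(floor);
            }
        }

    Keeping rules like these in code (or a rules table) rather than buried in a spreadsheet is what makes the downstream feed-generation step automatable.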

    Read the article

  • Migrating an Existing ASP.NET App to run on Windows Azure

    - by kaleidoscope
    Converting an existing ASP.NET application to Windows Azure:

    Method 1:
    1. Add a Windows Azure Cloud Service to the existing solution.
    2. Add a Web Role project to the solution and select the existing ASP.NET project in it.

    Method 2: The other option is to create a new Cloud Service project and add the ASP.NET project (which you want to deploy on Azure) to it using:
    1. Solution | Add | Existing Project
    2. Add | Web Role Project in solution

    Converting a SQL Server database to SQL Azure:
    Step 1: Migrate the <Existing Application>.MDF
    Step 2: Migrate the ASP.NET providers
    Step 3: Change the connection strings

    More details can be found at http://blogs.msdn.com/jnak/archive/2010/02/08/migrating-an-existing-asp-net-app-to-run-on-windows-azure.aspx

    Ritesh, D

    Read the article

  • Is there any way to distribute x264 encoding jobs across multiple computers (to increase the encoding FPS)?

    - by Breakthrough
    Does anyone know of a current, active solution for encoding x264 videos across many computers (via the network) to increase the encoding FPS? In the past I heard of the project x264farm; unfortunately, it is not actively developed anymore and isn't compatible with newer x264 builds. I'm looking for a current solution which is compatible with newer builds of x264. Just to note: I'm a Windows user, so I'm only looking for Windows solutions (or, at the very least, Linux). I've also seen the ELDER distributed encoder, but the quality varies depending on the settings you use - I'd prefer a solution similar to x264farm as noted above (the documentation outlines the encoding process), but one that is compatible with new(er) x264 builds and is preferably actively developed. Final Edit: Unfortunately, the bounty for this question has expired, and I haven't found a decent solution for this. So if at any time someone finds a new distributed encoding solution for x264 (or any H.264 codec, for that matter) - please answer this question! I'd love to discover an ideal method to make this work!

    Read the article

  • Tip: File Download in ASP.NET and Tracking the Success/Failure of a Download

    While working on an e-commerce project, I had a requirement to implement functionality whereby the success or failure of a file download could be tracked. After searching, I found no existing articles addressing this particular problem. I then came up with this solution after reading an article on transferring a file in small packets. I hope this solution will help others struggling with a similar problem.
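
    The underlying technique - stream the file in small packets and treat a client abort mid-stream as a failed download - is language-agnostic. Here is a hypothetical sketch of the same idea as a Java servlet (the original tip targets ASP.NET; the file path and the println logging are placeholders), where a broken connection surfaces as an IOException during the write:

        import java.io.IOException;
        import java.io.InputStream;
        import java.io.OutputStream;
        import java.nio.file.Files;
        import java.nio.file.Path;
        import javax.servlet.http.HttpServlet;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        public class DownloadServlet extends HttpServlet {

            private static final int PACKET_SIZE = 8 * 1024; // send in small packets

            @Override
            protected void doGet(HttpServletRequest req, HttpServletResponse resp) throws IOException {
                Path file = Path.of("files/report.pdf"); // placeholder path
                resp.setContentType("application/octet-stream");
                resp.setContentLengthLong(Files.size(file));

                long sent = 0;
                boolean completed = false;
                try (InputStream in = Files.newInputStream(file);
                     OutputStream out = resp.getOutputStream()) {
                    byte[] buf = new byte[PACKET_SIZE];
                    int n;
                    while ((n = in.read(buf)) != -1) {
                        out.write(buf, 0, n); // a client abort raises IOException here
                        out.flush();
                        sent += n;
                    }
                    completed = true;
                } catch (IOException clientAborted) {
                    // fall through: the finally block records the failure
                } finally {
                    // Placeholder for real persistence of the download status.
                    System.out.printf("download %s: %s (%d bytes)%n",
                            file, completed ? "SUCCESS" : "FAILED", sent);
                }
            }
        }

    Flushing after each packet is what makes an abort detectable promptly; with one large buffered write, the failure may only surface after the whole response has been queued.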

    Read the article

  • PeopleSoft's Enterprise Financial Management 8.9

    Fred interviews Annette Melatti, Senior Director of Financials Product Marketing, and discusses the latest release and the value it offers to customers, including compliance, a superior ownership experience, industry-specific solution extensions, enhancements to the enterprise service automation solution, and the introduction of the new asset lifecycle management solution.

    Read the article
