Search Results

Search found 19649 results on 786 pages for 'visual studio integration'.


  • autoexp.dat does not seem to take effect in Visual Studio C++ 2005 debugger.

    - by Pradyot
    autoexp.dat does not seem to take effect in the Visual Studio C++ 2005 debugger. I am not trying to add any custom rules; I just want commonly used types like std::string to display in a friendlier manner. Does anyone know how I can accomplish this? Is this just a question of specifying a path to the autoexp.dat file somewhere? The file is available under the Visual Studio installation directory.

    Read the article

  • Will Visual Studio 2010 support HTML 5?

    - by Chris
    Since Visual Studio 2010 is slated for release in March of 2010, and HTML 5 is now starting to be used more widely, I would like to know if Visual Studio will ship with HTML 5 templates, standard controls and support for the more common markup. A definition of support for HTML 5 would be that any new version of Visual Studio should have code-completion, validation and markup support similar to what is currently provided for HTML 4.01 and XHTML 1.0 and 1.1. Update: From the Visual Web Developer Team Blog: the HTML 5 intellisense and validation schema for Visual Studio 2008 and Visual Web Developer is available for download. Follow the instructions posted on the page to install the new schema. It seems the Visual Studio team will be supporting HTML 5 after all.

    Read the article

  • Why does Visual Studio think a .js file is a .cs file?

    - by divitiae
    I have an ASP.NET solution in Visual Studio 2008, and I added a file identical to http://plugins.jquery.com/files/jquery.cookie.js.txt named jquery.cookie.js in a subfolder of my project containing other JavaScript files. Visual Studio is treating it as a C# file, giving me errors like "CS1012: Too many characters in character literal" and "Semicolon after method or accessor block is not valid". Why?

    Read the article

  • "Pretty" Continuous Integration for Python

    - by dbr
    This is a slightly vain question, but BuildBot's output isn't particularly nice to look at. Compared to phpUnderControl, Hudson, CruiseControl.rb and others, BuildBot looks rather archaic. I'm currently playing with Hudson, but it is very Java-centric (although with this guide, I found it easier to set up than BuildBot, and it produced more info). Basically: are there any continuous integration systems aimed at Python that produce lots of shiny graphs and the like?

    Update: After trying a few alternatives, I think I'll stick with Hudson. Integrity was nice and simple, but quite limited. I think Buildbot is better suited to having numerous build slaves rather than everything running on a single machine, which is how I was using it. Setting Hudson up for a Python project was pretty simple:

    1. Download Hudson from https://hudson.dev.java.net/
    2. Run it with java -jar hudson.war
    3. Open the web interface on the default address of http://localhost:8080
    4. Go to Manage Hudson, then Plugins, and click "Update" or similar
    5. Install the Git plugin (I had to set the git path in the Hudson global preferences)
    6. Create a new project, and enter the repository, SCM polling intervals and so on
    7. Install nosetests via easy_install if it isn't installed already
    8. In a build step, add nosetests --with-xunit --verbose
    9. Check "Publish JUnit test result report" and set "Test report XMLs" to **/nosetests.xml

    That's all that's required. You can set up email notifications, and the plugins are worth a look. A few I'm currently using for Python projects:

    - SLOCCount plugin to count lines of code (and graph it!) - you need to install sloccount separately
    - Violations to parse the PyLint output (you can set up warning thresholds and graph the number of violations over each build)
    - Cobertura to parse the coverage.py output. Nosetests can gather coverage while running your tests, using nosetests --with-coverage (this writes the output to **/coverage.xml)

    Read the article

  • Facebooker Causing Problems with Rails Integration Testing

    - by Eric Lubow
    I am (finally) attempting to write some integration tests for my application (because every deploy is getting scarier). Since testing is a horribly documented feature of Rails, this was the best I could get going with shoulda:

        class DeleteBusinessTest < ActionController::IntegrationTest
          context "log skydiver in and" do
            setup do
              @skydiver = Factory(:skydiver)
              @skydiver_session = SkydiverSession.create(@skydiver)
              @biz = Factory(:business, :ownership => Factory(:ownership, :skydiver => @skydiver))
            end

            context "delete business" do
              setup do
                @skydiver_session = SkydiverSession.find
                post '/businesses/destroy', :id => @biz.id
              end
              should_redirect_to('businesses_path()'){ businesses_path() }
            end
          end
        end

    In theory, this test seems like it should pass. My factories seem to be pushing the right data in:

        Factory.define :skydiver do |s|
          s.sequence(:login) { |n| "test#{n}" }
          s.sequence(:email) { |n| "test#{n}@example.com" }
          s.crypted_password '1805986f044ced38691118acfb26a6d6d49be0d0'
          s.password 'secret'
          s.password_confirmation { |u| u.password }
          s.salt 'aowgeUne1R4-F6FFC1ad'
          s.firstname 'Test'
          s.lastname 'Salt'
          s.nickname 'Mr. Password Testy'
          s.facebook_user_id '507743444'
        end

    The problem seems to come from Facebooker, and only seems to happen on login attempts. When the test runs, I get the error: The error occurred while evaluating nil.set_facebook_session. I believe that error is to be expected in a certain sense, since I am not using Facebook for this session. Can anyone provide any insight as to how to get around this, or at least help me out with what is going wrong?

    Read the article

  • Google App Engine local datastore integration testing with Spring

    - by mirror303
    Hi all, I want to write some integration tests to see how my Spring-managed DAOs behave when talking to the App Engine datastore. Following the Spring manual, I will be providing my test classes with the proper annotations:

        @RunWith(SpringJUnit4ClassRunner.class)
        @ContextConfiguration(locations = { "classpath:applicationContext.xml" })

    After a lot of browsing, I found this blog post, dating back to August '09, from somebody doing exactly what I want to achieve. It involves writing a TestEnvironment class that implements ApiProxy.Environment, plus talking to ApiProxyLocalImpl. However, if I look at the current docs (for version 1.3.1), it seems that this has been replaced by creating an instance of the framework-provided LocalDatastoreServiceTestConfig, which is passed to a LocalServiceTestHelper. It is too bad that the App Engine docs don't show an example of how to do this with JPA, because then the Spring wiring would be trivial. Trying to follow the route outlined in the blog post has me running into compiler messages telling me that classes such as ApiProxyLocalImpl are not visible. Hence, there must be a new way of doing it, which probably involves the LocalServiceTestHelper. My question: does anybody know how? I know I will need to configure an EntityManagerFactory and provide it with the datastore connection somehow... but how? :)
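    For reference, the LocalServiceTestHelper route that the 1.3.1 docs describe looks roughly like the sketch below. This is a minimal sketch, not the poster's solution: it assumes the appengine-testing jar is on the test classpath and that persistence.xml defines the "transactions-optional" persistence unit from the App Engine JPA docs (both assumptions); wiring the same setUp/tearDown into a Spring TestExecutionListener is left as the remaining step.

        import javax.persistence.EntityManagerFactory;
        import javax.persistence.Persistence;

        import org.junit.After;
        import org.junit.Before;
        import org.junit.Test;

        import com.google.appengine.tools.development.testing.LocalDatastoreServiceTestConfig;
        import com.google.appengine.tools.development.testing.LocalServiceTestHelper;

        public class DatastoreDaoTest {

            // Replaces the hand-rolled ApiProxy.Environment/ApiProxyLocalImpl setup
            // from the old blog post: spins up an in-memory datastore per test.
            private final LocalServiceTestHelper helper =
                    new LocalServiceTestHelper(new LocalDatastoreServiceTestConfig());

            private EntityManagerFactory emf;

            @Before
            public void setUp() {
                helper.setUp(); // must run before any datastore access
                emf = Persistence.createEntityManagerFactory("transactions-optional");
            }

            @After
            public void tearDown() {
                emf.close();
                helper.tearDown(); // discards the local datastore between tests
            }

            @Test
            public void daoCanTalkToTheLocalDatastore() {
                // exercise the Spring-managed DAO against emf here
            }
        }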

    Read the article

  • Web framework with JasperReports integration?

    - by Dave Jarvis
    What web development frameworks natively support JasperReports? Consider the following form as an example:

        <form name="report" method="post">
          <input type="hidden" name="REPORT_PATH" value="reports/Names" />
          <input type="hidden" name="REPORT_FILE" value="List" />
          <input type="hidden" name="REPORT_FORMAT" value="pdf" />
          <input type="hidden" name="REPORT_EMBED" value="false" />
          Name: <input type="text" name="report_Name" value="" /><br />
          Date: <input type="text" name="report_Date" value="" /><br />
          <input type="submit" name="View" value="View" />
        </form>

    The framework would pass the report_ parameters to JasperReports, which in turn runs reports/Names/List.jasper and then sends a PDF attachment to the browser. In general:

    - Ability to configure the report (i.e., the hidden REPORT_ variables)
    - Web form for setting report parameters (i.e., the report_ variables)
    - Framework handles configuring the database connection, report execution, etc.

    I don't care about the technical minutiae of how the integration works. The example above is just one possibility of many.
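    For context, whichever framework handles that POST, the server side boils down to a handful of JasperReports engine calls. Below is a minimal sketch assuming a servlet-style handler and an already-opened JDBC Connection; the class and helper names are illustrative, not from any particular framework:

        import java.sql.Connection;
        import java.util.HashMap;
        import java.util.Map;

        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;

        import net.sf.jasperreports.engine.JasperExportManager;
        import net.sf.jasperreports.engine.JasperFillManager;
        import net.sf.jasperreports.engine.JasperPrint;

        public class ReportHandler {

            // Fills the compiled .jasper template named by the hidden fields and
            // streams the result back to the browser as a PDF attachment.
            public void handle(HttpServletRequest req, HttpServletResponse resp,
                               Connection conn) throws Exception {
                String path = req.getParameter("REPORT_PATH"); // e.g. "reports/Names"
                String file = req.getParameter("REPORT_FILE"); // e.g. "List"

                // Copy the report_ form fields into Jasper report parameters.
                Map<String, Object> params = new HashMap<String, Object>();
                params.put("Name", req.getParameter("report_Name"));
                params.put("Date", req.getParameter("report_Date"));

                JasperPrint print = JasperFillManager.fillReport(
                        path + "/" + file + ".jasper", params, conn);
                byte[] pdf = JasperExportManager.exportReportToPdf(print);

                resp.setContentType("application/pdf");
                resp.setHeader("Content-Disposition",
                        "attachment; filename=" + file + ".pdf");
                resp.getOutputStream().write(pdf);
            }
        }

    A framework with native JasperReports support would essentially generate this plumbing from the REPORT_ configuration fields.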

    Read the article

  • Best practices for testing a web application, regarding domain names and integration with external services

    - by ycseattle
    I have run into these problems several times and was never able to find a comfortable solution. Let's say my website has the domain name MyDomain.com. When I run the tests on the test machine (a continuous integration server), I modify the HOSTS file on that machine so that MyDomain.com is mapped to the local machine instead of the real production server. This doesn't work very well in many situations. For example, my application dynamically creates subdomain names like user1.MyDomain.com, which makes it difficult to keep the testing flexible. Another problem is that my web application interacts with Amazon S3, and sometimes with other services like Amazon Simple Queue Service. I want to include these interactions in my tests, but I am never happy with my solution for mixing testing and production on Amazon services. Could somebody offer some tips on these issues? I would like to make my testing framework clean and flexible. I am sure this is a common question for all web applications, and there must be a mature way to deal with it. Thanks!

    Read the article

  • Recommendations for Continuous integration for Mercurial/Kiln + MSBuild + MSTest

    - by TDD
    We have our source code stored in Kiln/Mercurial repositories; we use MSBuild to build our product, and we have unit tests that utilize MSTest (Visual Studio Unit Tests). What solutions exist to implement a continuous integration machine (i.e. build machine)? The requirements are:

    - A build should be kicked off when necessary (i.e. code has changed in the repositories we care about)
    - Before the actual build, the latest version of the source code must be acquired from the repository we are building from
    - The build must build the entire product
    - The build must build all unit tests
    - The build must execute all unit tests
    - A summary of success/failure must be sent out after the build has finished; this must include information about the build itself, but also about which unit tests failed and which ones succeeded
    - The summary must contain which changesets were in this build that were not yet in the previous successful (!) build
    - The system must be configurable so that it can build from multiple branches (/repositories)

    Ideally, this system would run on a single box (our product isn't that big) without any server components. What solutions are currently available? What are their pros/cons? From the list above, what can be done and what cannot be done? Thanks

    Read the article

  • How to change Ubuntu Studio from Xfce to GNOME?

    - by Starx
    I have Ubuntu Studio 12.04 installed at the moment. While removing an application called Terminal Emulator, I accidentally removed Xfce too. As I am not very fond of Xfce, I am OK with that, since I can change the session to GNOME 3 upon login, but the session runs with occasional glitches. My question is how to remove Xfce completely and turn this into a pure GNOME system. I am aware of the option to do apt-get install ubuntu-desktop, but that is going to change Ubuntu Studio itself, and I don't want that. I just want to change my Ubuntu Studio to pure GNOME without pulling in ubuntu-desktop, i.e. something like an ubuntustudio-12.04-gnome build. I've looked at this Q&A: How to remove xubuntu? The accepted answer removes some vital software from Ubuntu Studio, such as GIMP, and removes ubuntustudio-desktop at the end.

    Read the article

  • At which point is a continuous integration server interesting?

    - by Cedric Martin
    I've been reading a bit about CI servers like Jenkins, and I'm wondering: at what point are they useful? Surely for a tiny project with only 5 classes and 10 unit tests, there's no real need. Here we've got about 1500 unit tests, and they pass (on old Core 2 Duo workstations) in about 90 seconds (because they're really testing "units" and hence are very fast). The rule we have is that we cannot commit code when a test fails, so each developer launches all of his tests to prevent regressions. Obviously, because all the developers always launch all the tests, we catch errors due to conflicting changes as soon as one developer pulls the changes of another (when there are any). It's still not very clear to me: should I set up a CI server like Jenkins? What would it bring? Is it just useful for the speed gain (not an issue in our case)? Is it useful because old builds can be recreated (but we can do this too with Mercurial, by checking out old revisions)? Basically, I understand it can be useful, but I fail to see exactly why. Any explanation taking into account the points I raised above would be most welcome.

    Read the article

  • Today VS 2010 SP1 comes out, any news on the roadmap for Visual Studio 2012?

    - by Abel
    Today Visual Studio 2010 SP1 comes out as a general availability release, which made me wonder about the upcoming release of Visual Studio 2012: What are Microsoft's plans for Visual Studio 2012? I heard they'll be coming out with a new version every two years. Are there any open fora or discussions? When will a preview be publicly available? But most importantly: what are the new highlights and improvements in .NET and C#/F#/VB (and C++ of course, request from Stijn)?

    Read the article

  • Summary of usage policies for website integration of various social media networks?

    - by Dallas
    To cut to the chase... I look at Twitter's usage policy and see limitations on what can and can't be done with their logo. I also see examples of websites that use icons that have been integrated with the look and feel of their own site. Given Twitter's policy, for example, it would appear that legal conversations/agreements would need to take place to do this, especially on a commercial site. I believe it is perfectly acceptable to have a plain text button that simply has the word "Tweet" on it and offers the same functionality. My question is whether anyone can provide online (or other) references that attempt to summarize what can and can't be done when integrating various social networks into your own work. The answer I will mark as correct will be the one that provides the best resources summarizing what can and can't be done with specific logos/icons, with a secondary factor being the variety of social networking sites addressed. Before people point to specific questions, I am looking for a well-rounded approach that considers a breadth of networks and considerations. Background: I would like to incorporate social media icons and functionality, but would like to consider what type of modifications can be done without needing to involve lawyers. For example, can I bring in a standard Facebook logo but incorporate my site's color into the logo? Would the answer differ if I maintained their color but added in a few pixels of another color as a transition? I am not saying I want to do this, but rather am using it as an example.

    Read the article

  • Window extends off the screen when maximized using the WPF shell integration library

    - by Eduardo Molteni
    I'm using the WPF Shell Integration Library to create a custom chrome for my WPF app. All is good, but when maximizing the app, 6 or 7 pixels end up off the screen. This is the code I'm using:

        <Style TargetType="{x:Type local:MainWindow}">
          <Setter Property="shell:WindowChrome.WindowChrome">
            <Setter.Value>
              <shell:WindowChrome ResizeBorderThickness="6" CaptionHeight="10" CornerRadius="0" GlassFrameThickness="1"/>
            </Setter.Value>
          </Setter>
          <Setter Property="Template">
            <Setter.Value>
              <ControlTemplate TargetType="{x:Type local:MainWindow}">
                <Grid>
                  <Border BorderThickness="1" BorderBrush="#389FD1" Background="#389FD1">
                    <ContentPresenter Margin="0,22,0,0" Content="{TemplateBinding Content}"/>
                  </Border>
                  <StackPanel Orientation="Horizontal" HorizontalAlignment="Right" VerticalAlignment="Top">
                    <TextBlock Text="{Binding NombreUsuario}" Foreground="White" Margin="5,5,20,5" Opacity=".8" />
                    <Button Style="{StaticResource ImageButton}" Height="20" Width="20" Margin="0" Click="WindowMinimize" shell:WindowChrome.IsHitTestVisibleInChrome="True">
                      <Image Height="10" Width="10" Source="/Resources/Images/minimize.png" />
                    </Button>
                    <Button Style="{StaticResource ImageButton}" Height="20" Width="20" Margin="0" Click="WindowMaximizeRestore" shell:WindowChrome.IsHitTestVisibleInChrome="True">
                      <Image Height="10" Width="10" Source="/Resources/Images/maximize.png" />
                    </Button>
                    <Button Style="{StaticResource ImageButton}" Height="20" Width="20" Margin="0" Click="WindowClose" shell:WindowChrome.IsHitTestVisibleInChrome="True">
                      <Image Height="10" Width="10" Source="/Resources/Images/close.png" />
                    </Button>
                  </StackPanel>
                </Grid>
              </ControlTemplate>
            </Setter.Value>
          </Setter>
        </Style>

    Read the article

  • How to ignore a test within the JUnit test method itself

    - by Benju
    We have a number of integration tests that fail when our staging server goes down for weekly maintenance. When the staging server is down, we send a specific response that I could detect in my integration tests. When I get this response, instead of failing the test, I'm wondering if it is possible to skip/ignore it even though it has started running. This would keep our test reports a bit cleaner. Does anybody have suggestions?
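    One common way to get this behavior in JUnit 4 (a sketch of the usual approach, assuming JUnit 4.4 or later, not necessarily what the poster adopted) is the Assume API: a failed assumption aborts the test, and the runner reports it as ignored rather than failed. The maintenance-detection logic below is hypothetical:

        import static org.junit.Assume.assumeTrue;

        import org.junit.Test;

        public class StagingIntegrationTest {

            // Hypothetical check: in the real suite this would inspect the specific
            // "down for maintenance" response returned by the staging server.
            private boolean inMaintenance(String response) {
                return response.contains("scheduled maintenance");
            }

            @Test
            public void ordersEndpointRoundTrip() {
                String response = "..."; // in reality, fetched from the staging server
                // A failed assumption raises AssumptionViolatedException, which
                // JUnit reports as a skipped/ignored test, not a failure.
                assumeTrue(!inMaintenance(response));

                // ... the rest of the integration test runs only when staging is up ...
            }
        }

    TestNG offers the equivalent by throwing SkipException from within the test method.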

    Read the article

  • Why isn't SQL Management Studio integrated into Visual Studio?

    - by Rob Packwood
    I have both SQL Server 2005 and Visual Studio 2008 installed and think it would be really nice to have SQL Management Studio integrated directly within Visual Studio. Is there a way to make that happen? What about in VS 2010 with SQL Server 2008? I also find the Visual Studio Server Explorer window to be much slower than the Object Browser in SQL Server's Management Studio... it would be nice to never really need to use the Server Explorer.

    Read the article

  • Maximize Performance and Availability with Oracle Data Integration

    - by Tanu Sood
    Alert: Oracle is hosting the 12c Launch Webcast for Oracle Data Integration and Oracle GoldenGate on Tuesday, November 12 (tomorrow) to discuss the new capabilities in detail and share customer perspectives. Hear directly from customer experts and executives from SolarWorld Industries America, British Telecom and Rittman Mead, and get your questions answered live by product experts. Register for this complimentary webcast today and join in the discussion tomorrow.

    Author: Irem Radzik, Senior Principal Product Director, Oracle

    Organizations that want to use IT as a strategic point of differentiation prefer Oracle's complete application offering to drive better business performance and optimize their IT investments. These enterprise applications are at the center of business operations, and they contain critical data that needs to be accessed continuously, as well as analyzed and acted upon in a timely manner. These systems also need to operate with high performance and availability, which means analytical functions should not degrade application performance, and even system maintenance and upgrades should not interrupt availability.

    Oracle's data integration products, Oracle Data Integrator, Oracle GoldenGate, and Oracle Enterprise Data Quality, provide the core foundation for bringing data from various business-critical systems together to gain a broader, unified view. Going beyond 3rd-party products, Oracle's data integration products facilitate real-time reporting for Oracle Applications without impacting application performance, and provide the ability to upgrade and maintain the system without taking downtime.

    Oracle GoldenGate is certified for Oracle Applications, including E-Business Suite, Siebel CRM, PeopleSoft, and JD Edwards, for moving transactional data in real time to a dedicated operational reporting environment. This solution allows app users to offload resource-heavy queries to the reporting instance(s), reducing CPU utilization, improving OLTP performance, and extending the lifetime of existing IT assets. In addition, having a dedicated reporting instance with up-to-the-second transactional data allows optimizing the reporting environment and even decreasing costs, as GoldenGate can move only the required data from expensive mainframe environments to cost-efficient open system platforms.

    With its real-time data replication capabilities, GoldenGate is also certified to enable application upgrades and database/hardware/OS migrations without impacting business operations. GoldenGate is certified for Siebel CRM, Communications Billing and Revenue Management, and JD Edwards for supporting zero-downtime upgrades to the latest app version. GoldenGate synchronizes a parallel, upgraded system with the old version in real time, thus enabling continuous operations during the process. Oracle GoldenGate is also certified for minimal-downtime database migrations for Oracle E-Business Suite and other key applications, and minimizes risk by offering a failback option after the switchover to the new environment. Furthermore, Oracle GoldenGate's bidirectional active-active data replication is certified for Oracle ATG Web Commerce to enable geographic load balancing and high availability for ATG customers.

    For enabling better business insight, Oracle Data Integration products power Oracle BI Applications with high-performance bulk and real-time data integration. Oracle Data Integrator (ODI) is embedded in Oracle BI Applications version 11.1.1.7.1 and helps to integrate data end-to-end across the full BI Applications architecture, supporting capabilities such as data lineage, which helps business users trace reports back to their sources. ODI is integrated with Oracle GoldenGate and gives Oracle BI Applications customers the option to use real-time transactional data in analytics, non-intrusively. By using Oracle GoldenGate with the latest release of Oracle BI Applications, organizations not only leverage fresh data in analytics, but also eliminate the need for an ETL batch window and minimize the impact on OLTP systems. You can learn more about the latest 12c version of the Oracle Data Integration products in our upcoming launch webcast, and access the app-specific free resources in the new Data Integration for Oracle Applications Resource Center.

    Read the article

  • Installing ASP.NET MVC 2 RTM on Visual Studio 2010 RC

    - by shiju
    Visual Studio 2010 RC is built against the ASP.NET MVC 2 RC version, but you can easily install ASP.NET MVC 2 RTM on Visual Studio 2010 RC. To install ASP.NET MVC 2 RTM, do the following steps:

    1. Uninstall "ASP.NET MVC 2".
    2. Uninstall "Microsoft ASP.NET MVC 2 – Visual Studio 2008 Tools".
    3. Install the new ASP.NET MVC 2 RTM version for Visual Studio 2008 SP1.

    The above steps will enable you to use the ASP.NET MVC 2 RTM version on Visual Studio 2010 RC. Note: Don't uninstall "Microsoft ASP.NET MVC 2 – Visual Studio 2010 Tools".

    Read the article

  • SQL SERVER – List of Articles on Expressor Data Integration Platform

    - by pinaldave
    The ability to transform data into meaningful and actionable information is the most important capability in the current business world. With fast-growing and changing business needs, effective data integration is the single most important element of proper decision making. I have been following expressor software since November 2010, when I met the expressor team in Seattle. Here are my posts on their innovative data integration platform and expressor Studio, a free desktop ETL tool:

    - 4 Tips for ETL Software IDE Developers
    - Introduction to Adaptive ETL Tool – How adaptive is your ETL?
    - Sharing your ETL Resources Across Applications with Ease
    - expressor Studio Includes Powerful Scripting Capabilities
    - expressor 3.2 Release Review
    - 5 Tips for Improving Your Data with expressor Studio

    As I mentioned in some of my blog posts on them, I encourage you to download and test-drive their Studio product – it's free.

    Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: SSIS

    Read the article

  • How Visual WebGui helps ASP.NET Cloud-based apps

    - by Visual WebGui
    Everyone is talking about Cloud computing and moving to the cloud (public or private), but very few have actually done it so far. The reason is that the process of migrating existing applications to the cloud is a lot more complicated than one might think, which is exactly where the Visual WebGui technology comes to the rescue. In the past year the Visual WebGui R&D team has been intensively working on a tool-based solution that gives Microsoft application developers and enterprises a simpler...(read more)

    Read the article

  • Visual Studio 11, not 2011

    - by Daniel Moth
    A little pet peeve of mine is when people incorrectly refer to the Developer Preview (or the upcoming Beta) as Visual Studio 2011 instead of the correct Visual Studio 11. The "11" refers to the version number (internally we call it Dev11). What the product will be called when it is released is anyone's guess (it could keep the name, or it could have a year appended to it, or it could be something else, who knows). Even if it does have a year appended to the name, I think it is a safe bet it won't be last year! For reference, version 10 was the previous version of Visual Studio, which happened to be released in 2010, hence it got the name Visual Studio 2010. That is what confuses people new to this product, I guess... they think that the two-digit number matches the year, just because the two coincided last time. (btw, internally we called it Dev10). For further reference, older releases were: Visual Studio 2008 (v9) aka "Orcas", Visual Studio 2005 (v8) aka "Whidbey", Visual Studio .NET 2003 (v7.1) aka "Everett", and Visual Studio .NET 2002 (v7) aka "Rainier". Before that, we were in the pre-.NET era with Visual Studio 6 (where the version and the product name matched, without the year appended to the name). So next time you hear someone saying "Visual Studio 2011", point them to this post for some mini-education... thanks. Comments about this post by Daniel Moth welcome at the original blog.

    Read the article

  • Oracle’s AutoVue Enables Visual Decision Making

    - by Pam Petropoulos
    That old saying about a picture being worth a thousand words has never been truer. Check out the latest reports from IDC Manufacturing Insights, which highlight the importance of incorporating visual information in all facets of decision making and the role that Oracle's AutoVue Enterprise Visualization solutions can play. Take a look at the excerpts below and be sure to click on the titles to read the full reports.

    Technology Spotlight: Optimizing the Product Life Cycle Through Visual Decision Making, August 2012

    Manufacturers find it increasingly challenging to make effective product-related decisions as the result of expanded technical complexities, elongated supply chains, and a shortage of experienced workers. These factors challenge the traditional methodologies companies use to make critical decisions. However, companies can improve decision making by the use of visual decision making, which synthesizes information from multiple sources into highly usable visual context and integrates it with existing enterprise applications such as PLM and ERP systems. Product-related information presented in a visual form and shared across communities of practice with diverse roles, backgrounds, and job skills helps level the playing field for collaboration across business functions, technologies, and enterprises. Visual decision making can contribute to manufacturers making more effective product-related decisions throughout the complete product life cycle. This Technology Spotlight examines these trends and the role that Oracle's AutoVue and its Augmented Business Visualization (ABV) solution play in this strategic market.

    Analyst Connection: Using Visual Decision Making to Optimize Manufacturing Design and Development, September 2012

    In today's environments, global manufacturers are managing a broad range of information. Data is often scattered across countless files throughout the product life cycle, generated by different applications and platforms. Organizations are struggling to utilize these multidisciplinary sources in an optimal way. Visual decision making is a strategy and technology that can address this challenge by integrating and widening access to digital information assets. Integrating with PLM and ERP tools across engineering, manufacturing, sales, and marketing, visual decision making makes digital content more accessible to employees and partners in the supply chain. The use of visual decision-making information rendered in the appropriate business context and shared across functional teams contributes to more effective product-related decision making and positively impacts business performance.

    Read the article
