Search Results

Search found 8264 results on 331 pages for 'agile platform'.

Page 18/331 | < Previous Page | 14 15 16 17 18 19 20 21 22 23 24 25  | Next Page >

  • Google+ Platform Office Hours for February 29th 2012

    Google+ Platform Office Hours for February 29th 2012. We hold weekly Google+ Platform Office Hours using Hangouts On Air most Wednesdays from 11:30 am until 12:15 pm PST. This week students from the University of Washington are going to show us the karaoke Hangout App that they created during a recent hackathon. Then we'll field your questions about developing on the Google+ Platform. Discuss this video on Google+: goo.gl Learn more about our Office Hours: developers.google.com From: GoogleDevelopers | Views: 1926 | 30 ratings | Time: 30:28 | More in Science & Technology

    Read the article

  • Easy to use cross-platform 3D engines for C++ game development?

    - by davr
    I want to try my hand at writing a 3D game. However, I don't want to start at such a low level as drawing individual triangles and writing my own 3D object loader and so on. I've heard of engines like Irrlicht, Crystal Space 3D, and Cafu, but I don't have any experience with any of them. I'm looking for suggestions from people who have experience with these or other engines: which ones are well written and easy to get started with, without having to learn a ton of 3D math theory or how GPUs work internally.
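
    As a rough illustration of the level these engines work at (scene graph and mesh loading rather than hand-written triangle submission), here is a minimal Irrlicht-style setup sketched from memory; the asset path is hypothetical and exact signatures may differ between engine versions:

        #include <irrlicht.h>
        using namespace irr;

        int main() {
            // Driver type, window size, color depth, fullscreen, stencil buffer, vsync, event receiver.
            IrrlichtDevice *device = createDevice(video::EDT_OPENGL,
                                                  core::dimension2d<u32>(800, 600), 32,
                                                  false, false, false, 0);
            if (!device) return 1;

            video::IVideoDriver  *driver = device->getVideoDriver();
            scene::ISceneManager *smgr   = device->getSceneManager();

            // Load a mesh and add it to the scene graph; the engine owns the triangles.
            scene::IAnimatedMesh *mesh = smgr->getMesh("media/model.md2");   // hypothetical asset
            if (mesh) smgr->addAnimatedMeshSceneNode(mesh);
            smgr->addCameraSceneNode(0, core::vector3df(0, 10, -40), core::vector3df(0, 5, 0));

            while (device->run()) {
                driver->beginScene(true, true, video::SColor(255, 90, 101, 140));
                smgr->drawAll();        // the engine walks the scene graph and renders it
                driver->endScene();
            }
            device->drop();
            return 0;
        }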

    Read the article

  • eFX on NetBeans Platform at Silicon Valley JavaFX User Group

    - by Geertjan
    Below you can watch (in addition to seeing Steve Chin and Ben Evans) Sven Reimers presenting eFX, a JavaFX application framework on the NetBeans Platform, yesterday at the Silicon Valley JavaFX User Group. While watching, you'll learn quite a few things about the NetBeans Platform at the same time. In the end, you see a VisualVM clone written in JavaFX on the NetBeans Platform. Sven will also talk on this topic at NetBeans Day and during his sessions at JavaOne.

    Read the article

  • Tackling Security and Compliance Barriers with a Platform Approach to IDM: Featuring SuperValu

    - by Darin Pendergraft
    On October 25, 2012 ISACA and Oracle sponsored a webcast discussing how SUPERVALU has embraced the platform approach to IDM. Scott Bonnell, Sr. Director of Product Management at Oracle, and Phil Black, Security Director for IAM at SUPERVALU, discussed how a platform strategy can be used to formulate an upgrade plan for a large Sun IDM installation. See the webcast replay here: ISACA Webcast Replay (requires Internet Explorer or Chrome). Some of the main points discussed in the webcast include: getting support for an upgrade project by aligning with corporate initiatives; how to leverage an existing IDM investment while planning for future growth; how Sun and Oracle IDM architectures can be used in a coexistence strategy; and the advantages of a rationalized, modern IDM platform architecture. ISACA Webcast Featuring SuperValu - Tackling Security and Compliance Barriers with a Platform Approach to Identity Management from OracleIDM

    Read the article

  • Free E-Book from APress - Platform Embedded Security Technology Revealed

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2014/08/23/free-e-book-from-apress---platform-embedded-security-technology-revealed.aspx At http://www.apress.com/9781430265719, Apress is providing a free e-book, Platform Embedded Security Technology Revealed. “Platform Embedded Security Technology Revealed is an in-depth introduction to Intel’s security and management engine, with details on the security features and the steps for configuring and invoking them. It's written for security professionals and researchers; embedded-system engineers; and software engineers and vendors.”

    Read the article

  • Musical Movements on the NetBeans Platform

    - by Geertjan
    I came across VirtMus recently, the "modern music stand", on the NetBeans Platform. Its intentions remind me a LOT of Mike Kelly's Chord Maestro, which is also on the NetBeans Platform. Maybe the two should integrate? Speaking of music, I've been in touch with Winston Dehaney, who is creating score notation software named "Acapella Score", also on the NetBeans Platform. That's an app that could be integrated with the JFugue Music NotePad at some stage!

    Read the article

  • handling the holding of money on a platform

    - by user1716672
    We are building a platform for a client, developed in Yii, where users can top up their account on the platform with money from PayPal. Users can upload files and buy access to each other's files. Users can also gift other users money. I was thinking that when users top up their account, the money goes from their PayPal account to the merchant account of the website, so all users' money goes into one merchant account. Any transactions on the platform are then simply recorded on the platform, and each user's balance is the maximum amount they can withdraw from the merchant account. Is this the right approach? Legally, are there any problems?

    Read the article

  • What could be a reason for a cross-platform server application developer to make his app work in multiple processes?

    - by Kabumbus
    We are considering the development of a server app that is heavily loaded and deals with big data streams. The app will run on one powerful server. The server app will be developed as a cross-platform application, working on Windows, Mac OS X and Linux: the same code on many platforms for a standalone server architecture. We wonder what benefits distributing the application not only over threads but over processes as well would bring to programmers and to server end users, and why. Some people have told me that even with 48 cores, the 4 threads of a process would be scheduled by the OS across all the cores; is that true, by the way?
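
    On the scheduling question, a quick sketch of the arithmetic: a process with only 4 runnable threads can occupy at most 4 cores at any instant, no matter how many cores the machine has, so using all 48 cores requires more threads, more processes, or both. A minimal illustration in standard C++ (no platform-specific calls assumed):

        #include <iostream>
        #include <thread>
        #include <vector>

        static void worker(unsigned id) {
            // Placeholder for stream-processing work; each running thread occupies at most one core at a time.
            std::cout << "worker " << id << " running\n";
        }

        int main() {
            // Number of hardware threads the OS can schedule onto, e.g. 48 on the server described.
            unsigned cores = std::thread::hardware_concurrency();
            std::cout << "hardware threads: " << cores << "\n";

            // With only 4 threads, at most 4 of those cores can be busy with this process at once.
            // Scaling out further means roughly one runnable thread per core, or several processes.
            unsigned nthreads = 4;
            std::vector<std::thread> pool;
            for (unsigned i = 0; i < nthreads; ++i) pool.emplace_back(worker, i);
            for (auto &t : pool) t.join();
            return 0;
        }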

    Read the article

  • Deploy and Test an Azure App with Platform Ready

    Microsoft Platform Ready provides technical and marketing resources for companies building applications for the Microsoft platform. Currently they are working with The Code Project on a promotion that will pay $250 USD to companies for their first Windows Azure application that is verified compatible using the Microsoft Platform Ready testing tools. The contest runs only through 21 June 2011 12:00 PST and is limited to the US, but the walkthrough I'm about to show will work for any company that wishes to confirm and verify to customers that its application runs correctly on Windows Azure.

    Read the article

  • Beyond Cloud Technology, Enabling A More Agile and Responsive Organization

    - by sxkumar
    This is the second part of the blog “Clouds, Clouds Everywhere But not a Drop of Rain”. In the first part, I shared with you how a broad-based transformation makes cloud more than a technology initiative. In this section I will describe how it requires people (organizational) and process changes as well, and why these changes are as critical as the choice of the right tools and technology.

    People: Most IT organizations have a fairly complex organizational structure. There are different groups managing different pieces of the puzzle, and yet they don't always work together. Provisioning a new application may therefore require a request to float endlessly through the system administrator, DBA and middleware admin worlds, resulting in long delays and constant finger pointing. Cloud users expect end-to-end automation, which requires these silos to be greatly simplified, if not completely eliminated. Most customers I talk to acknowledge this problem but are quick to admit that such a transformation is hard. As hard as it may be, I am afraid that the status quo is no longer an option. Sticking to an organizational structure that was created ages ago will not only impede cloud adoption, it also risks making IT skills increasingly irrelevant in a world that is rapidly moving towards converged applications and infrastructure.

    Process: Most IT organizations today operate with a mindset that they must fully "control" access to any and all types of IT services. This in turn leads to people clinging to outdated manual approval processes. While requiring approvals for scarce resources makes sense, insisting that every single request be manually approved defeats the very purpose of cloud. Not only does this cause delays, at least partially negating the agility benefits, it also results in gross inefficiency. In a cloud environment, self-service access should be governed by policies and quotas that administrators define upfront. For a cloud initiative to be successful, IT organizations must be ready to empower users by giving them real control rather than insisting on brokering every single interaction between users and the cloud resources.

    Technology: From a technology perspective, cloud is about consolidation, standardization and automation. A consolidated and standardized infrastructure helps increase utilization and reduces cost. Additionally, it enables a much higher degree of automation, thereby providing users the required agility while minimizing operational costs. Obviously, automation is the key to cloud. Unfortunately, it hasn't received as much attention within enterprises as it should have. Many organizations are just now waking up to the criticality of automation, and it still often gets relegated to the back burner in favor of other "high priority" projects. However, it is important to understand that without the right type and level of automation, cloud will remain a distant dream for most enterprises. This in turn makes the choice of cloud management software extremely critical. For a cloud management software to be effective in an enterprise environment, it must meet the following qualifications:

    Broad and Deep Solution: It should offer a broad and deep solution to enable the kind of broad-based transformation we are talking about. Its footprint must cover physical and virtual systems, as well as the infrastructure, database and application tiers. Too many enterprises choose to equate cloud with virtualization. While virtualization is a critical component of a cloud solution, it is just a component and not the whole solution. Similarly, too many people tend to equate cloud with Infrastructure-as-a-Service (IaaS). While it is perfectly reasonable to treat IaaS as a starting point, it is important to realize that it is just the first stepping stone; on its own it can only provide limited business benefits. It is actually the higher level services, such as (application) platform and business applications, that will bring about a more meaningful transformation to your enterprise.

    Run and Manage Your Mission Critical Applications Efficiently: It should not only be able to run your mission critical applications, it should do so better than before. For enterprises, applications and data are the critical business assets. As such, if you are building a cloud platform that cannot run your ERP application, it isn't truly an "enterprise cloud". Also, be wary of vendors who try to sell you the idea that your applications must be written in a certain way to be able to run on the cloud. That is nothing but a bogus, self-serving argument. For the cloud to be meaningful to enterprises, it should adapt to your applications, and not the other way around.

    Automated, Integrated Set of Cloud Management Capabilities: At the root of many of the problems plaguing enterprise IT today is complexity. A complex maze of tools and technology, coupled with archaic processes, results in an environment which is inflexible, inefficient and simply too hard to manage. Management tool consolidation, therefore, is key to the success of your cloud, as tool proliferation adds to complexity, encourages compartmentalization and defeats the very purpose that you are building the cloud for. Decision makers ought to be extra cautious about vendors trying to sell them a "suite" of disparate and loosely integrated products as a cloud solution. An effective enterprise cloud management solution needs to provide a tightly integrated set of capabilities for all aspects of cloud lifecycle management. A simple question to ask: will your environment be more or less complex after you implement your cloud? More often than not, the answer will surprise you.

    At Oracle, we have understood these challenges and have been working hard to create cloud solutions that are relevant and meaningful for enterprises. And we have been doing it for much longer than you may think. Oracle was one of the very first enterprise software companies to make our products available on the Amazon Cloud. As far back as 2007, we created new cloud solutions such as Cloud Database Backup that are helping customers like Amazon save millions every year. Our cloud solution portfolio is also the broadest and deepest in the industry, covering public, private, hybrid, infrastructure, platform and application clouds. It is no coincidence, therefore, that the Oracle Cloud today offers the most comprehensive set of public cloud services in the industry. And to a large part, this has been made possible thanks to our years of investment in creating cloud enabling technologies. I will dedicate the third and final part of the blog “Clouds, Clouds Everywhere But not a Drop of Rain” to Oracle cloud technology building blocks and how they map into our vision of the enterprise cloud. Stay tuned.

    Read the article

  • Rewritten NetBeans Platform Feed Reader Tutorial

    - by Geertjan
    The next tutorial that has been thoroughly restructured and rewritten is: NetBeans Platform Feed Reader Tutorial. Originally written by Rich Unger, it was one of the very first NetBeans Platform tutorials that ever existed. In this particular rewrite, the entire structure of the tutorial has changed, in an attempt to make the flow more "bite size" rather than a big lump. Also, thanks to recent NetBeans Platform changes, there are no Bundle files anymore; all Strings are declared via @Messages annotations. Theoretically, the browser in the application could be a JavaFX WebView, though the browser part of the application isn't a central theme of the tutorial, hence only a reference is made to the JavaFX alternative. Comments on the NetBeans Platform Feed Reader Tutorial are, as always, very welcome.

    Read the article

  • Cross-platform game development: ease of development vs security

    - by alcuadrado
    Hi, I'm a member of and contributor to the Argentum Online (AO) community, the first MMORPG from Argentina, which is Free Software. Although it's not 3D, it's really addictive and has some tens of thousands of users. Unfortunately, AO was developed in Visual Basic (yes, you can laugh) by the former community, so as you can imagine, the code not only sucks, it has zero portability. I'm planning, with some friends, to rewrite the client, and as a GNU/Linux fanatic I want to do it cross-platform. Some other people are doing the same with the server in Java. My biggest problem is that we would like to use a rapid development language (like Java, Ruby or Python), but the client would be pretty insecure. A Ruby/Python version would have all its code available, and the Java one would be easily decompilable (yes, we have some crackers in the community). We have considered the option of implementing the security module in C/C++ as a dynamic library, but it can be replaced with a custom one, so it's not really secure. We are also considering the option of doing the core application in C++ and the GUI in Ruby/Python, but we haven't analysed all its implications yet. We really don't want to code the entire game in C/C++, as it doesn't need that much performance (the game runs at 18 fps on average) and we want to develop it as fast as possible. So what would you choose in my case? Thank you!
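
    For the "core application in C++ with the GUI/logic in a scripting language" option, one common shape is to embed the interpreter in the compiled core. A rough sketch using CPython's standard embedding API; the script file name is a hypothetical placeholder:

        // Requires the CPython development headers; link against libpython (python3-config can supply the flags).
        #include <Python.h>
        #include <cstdio>

        int main() {
            Py_Initialize();                        // start the embedded interpreter inside the C++ core

            // Run the scripted layer (GUI, game logic). "client_gui.py" is a hypothetical script name.
            FILE *script = std::fopen("client_gui.py", "r");
            if (script) {
                PyRun_SimpleFile(script, "client_gui.py");
                std::fclose(script);
            }

            // Performance- or security-sensitive parts stay in the compiled core,
            // exposed to the script through an extension module if needed.
            Py_FinalizeEx();
            return 0;
        }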

    Read the article

  • Removing a platform from Configuration Manager

    - by demoncodemonkey
    I have a solution containing C# and C++/CLI projects. There are 3 platforms in my solution: Any CPU, Win32 and Mixed Platforms. I never want to "just build the C# ones" or "just build the C++ ones"; I always want to build all projects. So the platforms metaphor is meaningless to me, and I'll leave it on Mixed Platforms or whatever, as long as they all build. Now VS sometimes automatically switches the current platform to Any CPU (I'm not sure when or why). This means that pressing F7 will only try to build the C# projects, which is obviously no good, so I have to switch back to Mixed Platforms and try again. How do I work around this irritating problem? I have tried 2 ways. First: in Configuration Manager, remove the Any CPU and Win32 platforms. This worked until I added a new project and Visual Studio very kindly added them back in... :/ Second: in Configuration Manager, check all checkboxes for all projects in all configurations in all platforms. This becomes a nightmare to manage with many projects in the solution. Any other ideas?

    Read the article

  • Web Start Application built on NetBeans Platform doesn't create desktop shortcut & start menu item

    - by rudolfv
    I've created a NetBeans Platform application that is launched using Java Web Start. I built the WAR file using the 'Build JNLP Application' command in NetBeans. I've added a desktop shortcut and menu item to the JNLP file, but for some reason these are not created when the application is launched. However, when I go to Control Panel > Java > Temporary Internet Files > View, select my application and click 'Install shortcuts to the selected application', the desktop and menu shortcuts are created correctly. Also, in the Java Console, the Shortcut Creation option is set to the following (the default, I presume): "Prompt user if hinted". Below is a snippet of my JNLP file:

        <jnlp spec="6.0+" codebase="$$codebase">
          <information>
            <title>${app.title}</title>
            <vendor>SomeVendor (Pty) Ltd</vendor>
            <description>Some description</description>
            <icon href="${app.icon}"/>
            <shortcut online="true">
              <desktop/>
              <menu submenu="MyApp"/>
            </shortcut>
          </information>
          ...

    I'm stumped. Does anybody have an explanation for this? Thanks. PS: This is on both Windows XP and Windows 7. NetBeans version: 6.8.

    Read the article

  • Cross-Platform Camera API

    - by Karim
    Hi, I'm building a video transform filter that has to transform video frames in real time. One of the key requirements of the filter is high performance, to minimize the number of dropped frames during the transform. Another requirement of lower priority, but nice to have, is to make it cross-platform (both PCs and mobile devices). The application is built in C++. Now my question is: is there any API that is more portable and has similar or better performance characteristics than DirectShow? DirectShow's portability is limited to Windows-based devices (PCs and Windows Mobile/CE platforms). I've also noticed that, for example, using HTC's custom camera API gives far better performance than what DirectShow offers. If you want to check this, try to build a filter in DirectShow that multiplies each color by 2 and renders it in real time from the camera to the screen, then do the same with HTC's API. There is almost a 4-5x performance boost with the vendor-specific API. So it would be very nice if the library used the device-specific implementation of the driver, as performance is critical when doing these transforms on a mobile device (which is about ~500 MHz).
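
    For reference, the benchmark transform described above is just a saturating per-channel multiply over each frame. A minimal sketch of that inner loop, assuming packed 8-bit RGB frames; the capture and render plumbing would come from whatever camera API is chosen:

        #include <algorithm>
        #include <cstddef>
        #include <cstdint>

        // Multiply every color channel by 2, clamping to 255.
        // 'frame' points to width * height * 3 bytes of packed RGB data.
        void brighten_x2(uint8_t *frame, std::size_t width, std::size_t height) {
            const std::size_t n = width * height * 3;
            for (std::size_t i = 0; i < n; ++i) {
                frame[i] = static_cast<uint8_t>(std::min<unsigned>(frame[i] * 2u, 255u));
            }
        }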

    Read the article

  • Cross-platform iteration of Unicode string

    - by kizzx2
    I want to iterate over each character of a Unicode string, treating each surrogate pair and combining character sequence as a single unit (one grapheme). Example: the text "नमस्ते" is comprised of the code points U+0928, U+092E, U+0938, U+094D, U+0924, U+0947, of which U+094D and U+0947 are combining marks.

        static void Main(string[] args)
        {
            const string s = "नमस्ते";
            Console.WriteLine(s.Length); // Outputs "6"
            var l = 0;
            var e = System.Globalization.StringInfo.GetTextElementEnumerator(s);
            while (e.MoveNext()) l++;
            Console.WriteLine(l); // Outputs "4"
        }

    So there we have it in .NET. We also have Win32's CharNextW():

        #include <Windows.h>
        #include <iostream>
        #include <string>

        int main()
        {
            const wchar_t *s = L"नमस्ते";
            std::cout << std::wstring(s).length() << std::endl; // Gives "6"
            int l = 0;
            while (CharNextW(s) != s)
            {
                s = CharNextW(s);
                ++l;
            }
            std::cout << l << std::endl; // Gives "4"
            return 0;
        }

    Question: both ways I know of are specific to Microsoft. Are there portable ways to do it? I heard about ICU, but I couldn't quickly find anything related (UnicodeString(s).length() still gives 6). Pointing to the relevant function/module in ICU would be an acceptable answer. C++ doesn't have a notion of Unicode, so a lightweight cross-platform library for dealing with these issues would also make an acceptable answer.
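
    One portable route is ICU's BreakIterator over character (grapheme cluster) boundaries. A minimal sketch against ICU4C, using the sample string from the question; note that the exact cluster count can vary slightly with the Unicode version a given ICU release implements:

        #include <unicode/brkiter.h>
        #include <unicode/unistr.h>
        #include <iostream>
        #include <memory>

        int main() {
            // The six code points of the sample text (U+0928 U+092E U+0938 U+094D U+0924 U+0947).
            icu::UnicodeString s(u"\u0928\u092E\u0938\u094D\u0924\u0947");

            UErrorCode status = U_ZERO_ERROR;
            std::unique_ptr<icu::BreakIterator> it(
                icu::BreakIterator::createCharacterInstance(icu::Locale::getDefault(), status));
            if (U_FAILURE(status)) return 1;

            it->setText(s);
            it->first();
            int32_t count = 0;
            while (it->next() != icu::BreakIterator::DONE) {
                ++count;    // each step advances one grapheme cluster
            }
            std::cout << count << std::endl;   // 4 under the clustering rules assumed by the question
            return 0;
        }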

    Read the article

  • Cross-platform general purpose C++ RPC library

    - by iUm
    Here's the task: imagine we have an application and a plug-in for it (a dynamic library). The interface between the application and the plug-in is completely defined. Now I need to run the application and the plug-in on different computers. I wrote a stub for the plug-in on the computer where the real application is running; the application loads it and calls its functions as if it were a native plug-in. On the other computer there's a stub instead of the real application, which loads the native plug-in. Now I need to organize RPCs between my stubs over the network, regardless of the actual transport. Usually this is not difficult, but there are some restrictions: application/plug-in interaction can be reentrant (e.g. the application calls f1() from the plug-in, in f1() the plug-in calls g1() from the application, in g1() the application calls f2() from the plug-in, and so on), and any such reentrant sequence should be executed by exactly the same thread that started it. Where can I find a cross-platform C++ RPC library with such features?
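
    The threading requirement essentially means the stub's blocking call must itself pump incoming requests, so nested calls run on the caller's thread. A rough sketch of that dispatch loop; the Transport and Message types here are hypothetical placeholders for whatever wire layer is chosen:

        #include <functional>
        #include <map>
        #include <string>

        // Hypothetical wire-level types; any socket or pipe transport could sit behind this interface.
        struct Message { bool is_reply; std::string name; std::string payload; };
        struct Transport {
            virtual ~Transport() = default;
            virtual void send(const Message&) = 0;
            virtual Message receive() = 0;          // blocks until the peer sends something
        };

        class RpcEndpoint {
        public:
            using Handler = std::function<std::string(const std::string&)>;

            void register_handler(std::string name, Handler h) { handlers_[std::move(name)] = std::move(h); }

            // A remote call: send the request, then keep serving incoming calls on THIS thread
            // until the matching reply arrives. Nested callbacks therefore reenter on the same
            // thread that started the sequence, as the question requires.
            std::string call(Transport& t, const std::string& name, const std::string& args) {
                t.send(Message{false, name, args});
                for (;;) {
                    Message m = t.receive();
                    if (m.is_reply) return m.payload;                 // our call completed
                    auto it = handlers_.find(m.name);                 // the peer called back into us
                    std::string result = (it != handlers_.end()) ? it->second(m.payload) : std::string{};
                    t.send(Message{true, m.name, result});            // reply, then keep waiting
                }
            }

        private:
            std::map<std::string, Handler> handlers_;
        };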

    Read the article

  • Create cross platform Java SWT Application

    - by mchr
    I have written a Java GUI using SWT. I package the application using an ANT script (fragment below).

        <jar destfile="./build/jars/swtgui.jar" filesetmanifest="mergewithoutmain">
          <manifest>
            <attribute name="Main-Class" value="org.swtgui.MainGui" />
            <attribute name="Class-Path" value="." />
          </manifest>
          <fileset dir="./build/classes" includes="**/*.class" />
          <zipfileset excludes="META-INF/*.SF" src="lib/org.eclipse.swt.win32.win32.x86_3.5.2.v3557f.jar" />
        </jar>

    This produces a single jar which, on Windows, I can just double-click to run my GUI. The downside is that I have had to explicitly package the Windows SWT jar into my jar. I would like to be able to run my application on other platforms (primarily Linux and OS X). The simplest way would be to create platform-specific jars which package the appropriate SWT files separately. Is there a better way to do this? Is it possible to create a single JAR which would run on multiple platforms?

    Read the article

  • Enterprise Platform in Python, Design Advice

    - by Jason Miesionczek
    I am starting the design of a somewhat large enterprise platform in Python, and was wondering if you can give me some advice on how to organize the various components and which packages would help achieve the goals of scalability, maintainability, and reliability. The system is basically a service that collects data from various outside sources, with each outside source having its own separate application. These applications would poll a central database and get any requests that have been submitted to perform against the external source. There will be a main website and a REST/SOAP API that should also have access to the central data service. My initial thought was to use Django for the website, web service and data access layer (using its built-in ORM), and then the outside-source applications can use the web service(s) to get the information they need to process the request and save the results. Using this method would allow me to have multiple instances of the service applications running on the same or different machines to balance out the load. Are there more elegant ways of accomplishing this? I've heard of messaging systems such as MQ; would something like that be beneficial in this scenario? My other thought was to use a completely separate data service not based on Django, and use some kind of remoting or remote objects (if they exist in Python) to interact with the data model. The downside here would be the website, which would become much slower if it had to push all of its data requests through a second layer. I would love to hear what other developers have come up with to achieve these goals in the most flexible way possible.

    Read the article

  • Cross-platform HTML application options

    - by Charles
    I'd like to develop a stand-alone desktop application targeting Windows (XP through 7) and Mac (Tiger through Snow Leopard), and if possible iPhone and Android. In order to make it all work with as much common code as possible (and because it's the only thing I'm good at), I'd like to handle the main logic with HTML and JS. Using Adobe AIR is a possibility. And I think I can do this with various application wrappers, using .NET for Windows XP, Objective-C for iPhone, Java for Android, and native "widget" platform support for Mac and Windows Vista & 7 (though I'd like to keep the widget in the foreground, so the Mac dashboard isn't ideal). Does anyone have any suggestions on where to start? The two sticking points are: I'll certainly need some form of persistent storage (cookies perhaps) to keep state between sessions, and I'll also probably need access to remote data files, so if I use AJAX and the hosting HTML file resides on the device, it will need to be able to do cross-domain requests. I've done this on the iPhone without any problems, but I'd be surprised if this were possible on other platforms. For me, Android and iPhone will be the easiest to handle, and it looks like I can use Adobe AIR to handle the rest, but I wanted to know if there are any other alternatives. Does anyone have any suggestions?

    Read the article

  • Five Reasons to Attend PLM Summit 2013: The Conference Formerly Known as AGILITY

    - by Terri Hiskey
    As we approach the end of 2012, we are also closing in on the last couple of weeks in which Agile customers and prospects can register for the upcoming PLM Summit 2013 at the bargain early bird rate of $195. Register now to secure your spot!

    The Conference Formerly Known as AGILITY... Long-time Agile customers may remember AGILITY, Agile's PLM customer conference that was held annually prior to Oracle's acquisition of Agile in 2007. In February 2012, in response to feedback we received from our Agile PLM community, we successfully resurrected the AGILITY conference and renamed it the PLM Summit. The PLM Summit was so well received and well attended that we are doing it again in 2013. This upcoming PLM Summit is being co-located in San Francisco under the overarching banner of the Oracle Value Chain Summit, and will be held alongside several other Oracle customer conferences that cover a range of value chain solutions, including Value Chain Planning, Value Chain Execution, Procurement, Maintenance and Manufacturing. This setup offers PLM attendees the best of all worlds: the opportunity to participate and learn about PLM in smaller, focused sessions by product and by industry, while also giving attendees the chance to see how PLM works together with other critical enterprise applications that address other important aspects of the value chain.

    Top Five Reasons to Attend the PLM Summit 2013. In the spirit of all of the end-of-the-year lists that are currently popping up, here is a list of the top five reasons to attend the PLM Summit, for anyone out there who needs a little extra encouragement to register:

    1. The Best Opportunities for Customer Networking. The PLM Summit offers attendees numerous opportunities to learn and network with fellow Agile users. Customer stories are featured in keynote and breakout presentations, and the schedule allows for plenty of networking time during breakfasts, lunches, breaks and dinners. Customer networking is the number one reason that Agile users attend the PLM Summit. Read what attendees thought of the most recent PLM Summit: "Hearing about the implementation of Agile products from a customers' perspective is invaluable." (Director of Quality Assurance & Regulatory Affairs, leading medical device manufacturer) "Understanding the scope of other companies' projects and the lessons learned made attending this event well worth my time." (Director of Test Engineering, global industrial manufacturer) "The most beneficial thing about attending this event is the opportunity to network with other customers with similar experiences." (Director of Business Process Improvement, leading high technology company) Come to the PLM Summit and play an active role within the PLM community: swap war stories and business cards, connect on LinkedIn and Facebook, share your stories and discuss the sessions from each day. Register now!

    2. It's Educational! The PLM Summit is the premier educational event for anyone in the Agile PLM community. There are nearly 40 PLM-focused, in-depth educational sessions led by Agile PLM experts, customers and partners that cover a range of specific product- and industry-focused topics. Keynotes will give attendees a broad overview of the entire Agile PLM footprint, while sessions will delve deeply into specific product functionality and customer case studies. There is truly something for everyone. Check out the latest agenda for a view of all the sessions.

    3. Visit with the PLM Partner Community. Our partners play a significant and important role within the Agile PLM community. At the PLM Summit, attendees will be able to meet and mingle with several of the top Oracle Agile PLM partners, including: Deloitte, Domain, GoEngineer, Hitachi Consulting, IBM, Kalypso, KPIT Cummins (CPG Solutions), Perception Software, Verdant, Xavor and ZeroWaitState. Go here for a complete list of all the Value Chain Summit sponsors.

    4. See Agile PLM in Action at our Dedicated PLM Demo Pods. At the PLM Summit, attendees will have the chance to see Agile PLM in action at dedicated PLM demo pods, manned by expert members of our Agile PLM team. If you would like to see specific Agile PLM functionality up close, if you have a question on how to extend the scope of your current implementation, or if you want a better understanding of how to leverage Agile PLM to address specific use cases, stop by one of the Agile PLM demo pods and engage the Agile PLM experts on hand at the PLM Summit.

    5. Spend Some Time in Lovely San Francisco. Still on the fence about the upcoming PLM Summit? Remember that it is being held in San Francisco, which is a fantastic city for a getaway. After spending time learning and networking about PLM, take an extra day or two to escape the dreary winter and enjoy the beautiful scenery and the unique activities offered only by the City by the Bay. You will walk away from the conference not only with renewed excitement about Agile PLM, but feeling rejuvenated in general.

    Read the article

  • How to reconcile OOAD and Database Design?

    - by user1620696
    Recently I've been studying object-oriented analysis and design, and I like it a lot. Everywhere I've read, people say the idea is to start with a minimum set of requirements and improve along the way, revisiting the design each iteration and making it better as we continuously develop and stay in contact with the customer interested in the software. In particular, one course from Lynda.com said a lot of that: we don't want to spend a lot of time planning everything upfront, we just want to have the minimum to get started and then improve it each iteration. Now, I've also seen a course from the same author about database design, and there he says something different: although when working with object orientation he likes the agile, iterative approach, for database design we should really spend a lot of time planning things upfront instead of just going along the way with the minimum. This confuses me a little. The database will persist important data from our domain model, and perhaps configuration for the software, and so on. Now, if I'm going to continuously revisit the analysis and design of the model, it seems the database design should change as well. In the same way, if we plan the whole database upfront it seems we are also planning the whole model upfront, so the two ideas seem incompatible. I really like the agile, iterative approach, but I'm also looking to get a better design for the database. When working with an agile, iterative approach, how should we deal with database design?

    Read the article
