Search Results

Search found 9710 results on 389 pages for 'training resources'.


  • Windows Azure Recipe: Enterprise LOBs

    - by Clint Edmonson
    Enterprises are more dependent on their specialized internal Line of Business (LOB) applications than ever before. Naturally, the more software they leverage on-premises, the more infrastructure they need to manage. It’s frequently the case that our customers simply can’t scale up their hardware purchases and operational staff as fast as internal demand for software requires. The result is that getting new or enhanced applications into the hands of business users becomes slower and more expensive every day. Being able to quickly deliver applications in a rapidly changing business environment while maintaining high standards of corporate security is a challenge that can be met right now by moving enterprise LOBs out into the cloud and leveraging Azure’s Access Control services. In fact, we’re seeing many of our customers (both large and small) realize huge benefits from moving their web-based business applications such as corporate help desks, expense tracking, travel portals, timesheets, and more to Windows Azure.
    Drivers
    - Cost reduction
    - Time to market
    - Security
    Solution
    Here’s a sketch of how many Windows Azure Enterprise LOBs are being architected and deployed:
    Ingredients
    - Web Role – this will host the core of the application. Each web role is a virtual machine hosting an application written in ASP.NET (or optionally PHP or Node.js). The number of web roles can be scaled up or down as needed to handle peak and non-peak traffic loads. Many Java-based applications are also being deployed to Windows Azure with a little more effort.
    - Database – every modern web application needs to store data. SQL Azure databases look and act exactly like their on-premises siblings but are fault tolerant and have data redundancy built in (a short sketch follows at the end of this recipe).
    - Access Control – this service is necessary to establish federated identity between the cloud-hosted application and an enterprise’s corporate network. It works in conjunction with a secure token service (STS) that is hosted on-premises to establish the corporate user’s identity and credentials. The source code for an on-premises STS is provided in the Windows Azure training kit and merely needs to be customized for the corporate environment and published on a publicly accessible corporate web site. Once set up, corporate users see a near-seamless single sign-on experience.
    - Reporting – businesses live and die by their reports, and SQL Azure Reporting, based on SQL Server Reporting 2008 R2, can serve up reports with tables, charts, maps, gauges, and more. These reports can be accessed from the Windows Azure Portal, through a web browser, or directly from applications.
    - Service Bus (optional) – if deep integration with other applications and systems is needed, the Service Bus is the answer. It enables secure service-layer communication between applications hosted behind firewalls in on-premises or partner datacenters and applications hosted inside Windows Azure. The Service Bus provides the ability to securely expose just the information and services that are necessary, creating a simpler, more secure architecture than opening up a full-blown VPN.
    - Data Sync (optional) – in cases where the data stored in the cloud needs to be shared internally, establishing a secure one-way or two-way data-sync connection between the on-premises and off-premises databases is a perfect option. It can be very granular, allowing us to specify exactly which tables and columns to synchronize, set up filters to sync only a subset of rows, set the conflict resolution policy for two-way sync, and specify how frequently data should be synchronized.
    Training Labs
    These links point to online Windows Azure training labs where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs) – As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
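    A minimal sketch of the Database ingredient above – the server name, database, credentials, and query are placeholders rather than anything from the recipe – showing how a web role reads from SQL Azure through plain ADO.NET, exactly as it would from an on-premises SQL Server:
    using System;
    using System.Data.SqlClient;
    class SqlAzureSketch
    {
        static void Main()
        {
            // A SQL Azure connection string typically differs from an on-premises one
            // only in the server address, the user@server login form, and Encrypt=True.
            const string connectionString =
                "Server=tcp:myserver.database.windows.net;Database=LobAppDb;" +
                "User ID=appuser@myserver;Password=placeholder;Encrypt=True;";
            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand("SELECT COUNT(*) FROM Expenses", connection))
            {
                connection.Open();
                int expenseRows = (int)command.ExecuteScalar();
                Console.WriteLine("Expense rows: {0}", expenseRows);
            }
        }
    }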

    Read the article

  • Windows Azure Recipe: Software as a Service (SaaS)

    - by Clint Edmonson
    The cloud was tailor-made for aspiring companies to create innovative internet-based applications and solutions. Whether you’re a garage startup with very little capital or a Fortune 1000 company, the ability to quickly set up, deliver, and iterate on new products is key to capturing market and mind share. And if you can capture that share and go viral, having resiliency and infinite scale at your fingertips is great peace of mind.
    Drivers
    - Cost avoidance
    - Time to market
    - Scalability
    Solution
    Here’s a sketch of how a basic Software as a Service solution might be built out:
    Ingredients
    - Web Role – this hosts the core web application. Each web role will host an instance of the software, and as the user base grows, additional roles can be spun up to meet demand.
    - Access Control – this service is essential to managing user identity. It’s backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it’s also flexible enough to be combined with external identity providers including Windows Live ID, Google, Yahoo!, and Facebook. The provider model provides extensibility to hook into other industry-specific identity providers as well.
    - Databases – nearly every modern SaaS application is backed by a relational database for its core operational data. If the solution is sold to organizations, there’s a good chance multi-tenancy will be needed. An emerging best practice for SaaS applications is to stand up separate SQL Azure database instances for each tenant’s proprietary data to ensure isolation from other tenants (a minimal sketch follows at the end of this recipe).
    - Worker Role – this is the best place to handle autonomous background processing such as data aggregation, billing through external services, and other specialized tasks that can be performed asynchronously. Placing these tasks in a worker role frees the web roles to focus completely on user interaction and data input, and provides finer-grained control over the system’s scalability and throughput.
    - Caching (optional) – as web site traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.
    - Blobs (optional) – depending on the nature of the software, users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users’ data is delivered securely.
    Training & Examples
    These links point to online Windows Azure training labs and examples where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services that can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs) – As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    - Developing Applications for the Cloud, 2nd Edition (eBook) – This book demonstrates how you can create from scratch a multi-tenant, Software as a Service (SaaS) application to run in the cloud using the latest versions of the Windows Azure Platform and tools. The book is intended for any architect, developer, or information technology (IT) professional who designs, builds, or operates applications and services that run on or interact with the cloud.
    - Fabrikam Shipping (SaaS reference application) – This is a full end-to-end sample scenario which demonstrates how to use the Windows Azure platform for exposing an application as a service. We developed this demo just as you would: we had an existing on-premises sample, Fabrikam Shipping, and we wanted to see what it would take to transform it into a full subscription-based solution. The demo you find here is the result of that investigation.
    See my Windows Azure Resource Guide for more guidance on how to get started, including more links to web portals, training kits, samples, and blogs related to Windows Azure.
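    Here is that per-tenant database sketch – purely illustrative and not from the recipe; the tenant names and connection strings are hypothetical – showing one way a web role might resolve a separate SQL Azure database for each tenant:
    using System;
    using System.Collections.Generic;
    using System.Data.SqlClient;
    // Resolves a dedicated SQL Azure database per tenant so tenant data stays isolated.
    public static class TenantDatabases
    {
        private static readonly Dictionary<string, string> ConnectionStrings =
            new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
            {
                { "contoso",  "Server=tcp:myserver.database.windows.net;Database=Saas_Contoso;User ID=app@myserver;Password=placeholder;Encrypt=True;" },
                { "fabrikam", "Server=tcp:myserver.database.windows.net;Database=Saas_Fabrikam;User ID=app@myserver;Password=placeholder;Encrypt=True;" }
            };
        // The tenant id would typically come from the request host name or the caller's security token.
        public static SqlConnection OpenForTenant(string tenantId)
        {
            string connectionString;
            if (!ConnectionStrings.TryGetValue(tenantId, out connectionString))
            {
                throw new ArgumentException("Unknown tenant: " + tenantId);
            }
            var connection = new SqlConnection(connectionString);
            connection.Open();
            return connection;
        }
    }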

    Read the article

  • The Dispose Pattern (and FxCop warnings)

    - by Scott Dorman
    [This is actually a response to Bill’s blog post, but since it isn’t possible to leave this as a comment on his blog it’s a post here.] There are many different ways to implement the Dispose pattern correctly. Some are (in my opinion) better than others. In Bill’s blog post he presents a particular pattern, which is an excerpt from his book (Effective C#). The issue centers around the fact that a reader took the code sample presented in the book and ran FxCop (Code Analysis) on it, which generated a warning: “Ensure that base.Dispose() is always called.” The “lesson learned” that Bill presents is that “tools are there to help us, not control us.” While I completely agree with the belief that tools are there to help us, I think it’s important to understand why FxCop is raising this particular warning. The code presented in Bill’s book looks like:
    // Have its own disposed flag.
    private bool disposed = false;
    protected override void Dispose(bool isDisposing)
    {
        // Don't dispose more than once.
        if (disposed)
            return;
        if (isDisposing)
        {
            // TODO: free managed resources here.
        }
        // TODO: free unmanaged resources here.
        // Let the base class free its resources.
        // Base class is responsible for calling
        // GC.SuppressFinalize( )
        base.Dispose(isDisposing);
        // Set derived class disposed flag:
        disposed = true;
    }
    This code does follow all of the guidelines for implementing the Dispose pattern. In this case, it’s presumably part of a larger example showing how to implement the pattern as part of a base class. The reason FxCop is warning you about this code is the first if statement in the Dispose method, which will cause the method to exit if disposed is true. The problem here is that if the disposed flag is true, the call to base.Dispose() will never be executed. As Bill points out, it is possible for some other code elsewhere in the class to set this flag. He states that this is an “unlikely occurrence.” While that is probably true, it can be a potentially dangerous assumption to make and is one that can be easily corrected. By changing the code slightly you can remove this assumption and correct the FxCop violation.
    private bool disposed = false;
    protected override void Dispose(bool disposing)
    {
        if (!disposed)
        {
            if (disposing)
            {
                // Dispose managed resources.
            }
            // Dispose unmanaged resources.
            disposed = true;
        }
        base.Dispose(disposing);
    }
    Using this implementation allows the call to base.Dispose() to always occur, which ensures that the disposal chain is always properly followed. Technorati Tags: .NET, C#, Dispose Pattern
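    To see the corrected override in context, here is a minimal compilable sketch of a base/derived pair written this way – the class names are invented for illustration; this is not code from the book or from Bill’s post:
    using System;
    public class ResourceHolderBase : IDisposable
    {
        private bool disposed;
        public void Dispose()
        {
            Dispose(true);
            GC.SuppressFinalize(this); // the base class owns the SuppressFinalize call
        }
        protected virtual void Dispose(bool disposing)
        {
            if (!disposed)
            {
                if (disposing)
                {
                    // Free managed resources owned by the base class here.
                }
                // Free unmanaged resources owned by the base class here.
                disposed = true;
            }
        }
    }
    public class DerivedResourceHolder : ResourceHolderBase
    {
        private bool disposed; // derived class keeps its own flag
        protected override void Dispose(bool disposing)
        {
            if (!disposed)
            {
                if (disposing)
                {
                    // Free managed resources owned by the derived class here.
                }
                // Free unmanaged resources owned by the derived class here.
                disposed = true;
            }
            // Always chain to the base class, even if this instance was already disposed.
            base.Dispose(disposing);
        }
    }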

    Read the article

  • Walmart and Fusion Apps

    - by ultan o'broin
    Photograph: Misha Vaughan
    I attended Fusion Apps (yes, I know I am supposed to say "Oracle Fusion Applications", but stuffy old style guides are a turn-off in interwebs conversations) User Experience Advocate (FXA) training in Long Beach, California last week; a suitable location as ODTUG KSCOPE 11 was kicking off and key players were in the area. As a member of Oracle's Apps-UX team I know the Fusion Apps messaging, natch, and have done some other Fusion Apps go-to-market content work too. For the messaging details themselves, see Lonneke Dikmans' (@lonnekedikmans) great blog, by the way. However, I wanted some 'formal' training combined with the opportunity to meet and learn from people already out there delivering those messages. The idea in reaching out to Misha Vaughan, Apps-UX FXA maven, to get me onto this training was that, in addition to my UX knowledge, I could leverage my location in EMEA and hit up customer events more quickly and easily. Those local user groups do like to hear the voice of locals too, you know (so I need to work on that mid-Atlantic accent). I'm looking forward to such opportunities. The training was all smashing stuff, just the right level of detail, delivered professionally and with great style and humor. I was especially honored to be paired off for my, er, coaching with Debra Lilley (@debralilley), who shared with everyone all kinds of tips and insights from her experiences of delivering the message and demo. For me, that was the real power of the FXA event--the communal, conversational aspect--the meeting up with people who had done all this for real, the sharing in their experiences, while learning along with other newbies. Sorry, but that all-important social aspect doesn't work so well with remote meetings. Katie Candland (Apps-UX) gave us a great tour of the Fusion Apps demo and included some useful presentational tips too (any excuse to buy that iPad). It's clear to me that the Fusion Apps messaging and demos really come alive with real-world examples that local application users will recognize, and I picked up some "yes, that's my job made easier" scene-stealers from Debra and Karen Brownfield too, to add to the great ones already provided. This power of examples shouldn't surprise anyone; they've long been a mainstay of applications user assistance, popular with users. We'll offer customers different types of example topics in the Fusion Apps online help too (stay tuned), and we know from research how important those 3S's (stories, scenarios, and simulations) are to users when they consume and apply information. Well, we've got the simulation, now it's time for more stories and scenarios. If you get a chance to participate in an FXA event (whether you are an Oracle employee or otherwise), I'd encourage it. It's committing your time and energy for sure, but I got real bang for the buck from it for my everyday job too. Listening to the room's feedback on the application demo really brought our internal design work to life, and I picked up on some things that I need to follow up on (like how you alphabetically sort stuff in other languages). User experience is, after all, about users. What will I be doing next, and what would I like to see happen? Obviously, I need to develop my story-telling links with the people I met in Long Beach and do some practicing with the materials, and then get out there and deliver them at a suitable location.
    The demo is what it is right now, and that's a super-rich demo that I know everyone will want to see and ask questions about. Then, as mentioned by attendees at the FXA event, follow up on those translated and localized messages for EMEA (and APAC) that deal with the different statutory or reporting requirements of the target markets. Given my background I would say that, wouldn't I? However, language is part of the UX, and international revenue is greater than US-only revenue for Oracle, so yes dear, we all need to get over the fact that enterprise apps users don't all speak, or want to speak, American English. Most importantly perhaps, the continued development of a strong messaging community between Oracle and partners and customers where we can swap and share those FXA messaging stories and scenarios about Fusion Apps in a conversational way. The more the better, a combination of online and face-to-face meetings. I must also mention the great dinner after the event at Parker's Lighthouse, and the fun Andrew Gilmour (Apps-UX) and I had at our end of the table talking about just about everything except Fusion Apps with Ronald Van Luttikhuizen and Ben Prusinski (who now understands the difference between Cork and Dublin people. I hope). Thanks to all the Apps-UXers who helped bring the FXA training to town, and to Debra and all the others that I am too jetlagged to mention right now who were instrumental in making it happen for me. Here's to the next one. And the Walmart angle? That was me doing my Robert Scoble (ScO'bilizer?)-style guerilla smart phone research in Walmart in Long Beach, before the FXA event. It's all about stories for me. You can read more about it on the appslab blog (see the comments).

    Read the article

  • Oracle RightNow CX for Good Customer Experiences

    - by Andreea Vaduva
    Oracle RightNow CX is all about the customer experience: it’s about understanding what drives a good interaction, and it’s about delivering a solution which works for our customers and, by extension, their customers. One of the early guiding principles of Oracle RightNow was an 8-point strategy for providing good customer experiences:
    1. Establish a knowledge foundation
    2. Empower the customer
    3. Empower employees
    4. Offer multi-channel choice
    5. Listen to the customer
    6. Design seamless experiences
    7. Engage proactively
    8. Measure and improve continuously
    The application suite provides all of the tools necessary to deliver a rewarding, repeatable and measurable relationship between business and customer. The Knowledge Authoring tool provides gap analysis, WYSIWYG editing (and includes HTML rich content for non-developers), multi-level categorisation, permission-based publishing and Web self-service publishing. Oracle RightNow Customer Portal is a complete web application framework that enables businesses to control their own end-user page branding experience, which in turn will allow customers to self-serve. The Contact Centre Experience Designer builds a combination of workspaces, agent scripting and guided assistances into a Desktop Workflow. These present an agent with the tools they need, at the time they need them, providing even the newest and least experienced advisors with consistently accurate and efficient information, whilst guiding them through the complexities of internal business processes. Oracle RightNow provides access points for customers to feed back about specific knowledge articles or about the support site in general. The system will generate ‘incidents’ based on the scoring of the comments submitted. This makes it easy to view and respond to customer feedback. It is vital, more now than ever, not to under-estimate the power of the social web – Facebook, Twitter, YouTube – they have the ability to cause untold amounts of damage to businesses with a single post – witness musician Dave Carroll and his protest song on YouTube, posted in response to poor customer service from an American airline. The video drew 150,000 views on its first day and is currently at 12,011,375. The Times reported that within 4 days of the post, the airline’s stock price fell by 10 percent, which represented a cost to shareholders of $180 million. It is a universally acknowledged fact that when customers are unhappy, they will not come back, and, generally speaking, it only takes one bad experience to lose a customer. The idea that customer loyalty can be regained by using social media channels was the subject of a 2011 survey commissioned by RightNow and conducted by Harris Interactive. The survey discovered that 68% of customers who posted a negative review about a holiday on a social networking site received a response from the business. It further found that 33% subsequently posted a positive review and 34% removed the original negative review. Cloud Monitor provides the perfect mechanism for seeing what is being said about a business on public Facebook pages, Twitter or YouTube posts; it allows agents to respond proactively – either by creating an Oracle RightNow incident or by using the same channel as the original post. This leaves step 8 – Measuring and Improving: How does a business know whether it’s doing the right thing? How does it know if its customers are happy? How does it know if its staff are being productive? How does it know if its staff are being effective?
    Cue Oracle RightNow Analytics – fully integrated across the entire platform (Service, Marketing and Sales) – with in excess of 800 standard reports. If this were not enough, a large proportion of the database has been made available via the administration console, allowing users without any prior database experience to write their own reports, format them and schedule them for e-mail delivery to a distribution list. It handles the complexities of table joins, and allows for the manipulation of data with ease. Oracle RightNow believes strongly in the customer owning their solution, and to provide the best foundation for success, Oracle University can give you the RightNow knowledge and skills you need. This is a selection of the courses offered:
    - RightNow Customer Service Administration Rel 12.02 (3 days) – Available as In Class and Live Virtual Class (Release 11.11 is available as In Class, Live Virtual Class and Training On Demand). This course familiarises users with the tasks and concepts needed to configure and maintain their system.
    - RightNow Customer Portal Designer and Contact Center Experience Designer Administration Rel 12.02 (2 days) – Available as In Class and Live Virtual Class (Release 11.11 is available as In Class, Live Virtual Class and Training On Demand). This course introduces basic CP structure and how to make changes to the look, feel and behaviour of self-service pages.
    - RightNow Analytics Rel 12.02 (2 days) – Available as In Class, Live Virtual Class and Training On Demand (Release 11.11 is available as In Class and Live Virtual Class). This course equips users with the skills necessary to understand data supplied by standard reports and to create custom reports.
    - RightNow Integration and Customization For Developers Rel 12.02 (5 days) – Available as In Class and Live Virtual Class (Release 11.11 is available as In Class, Live Virtual Class and Training On Demand). This course is for experienced web developers and offers an introduction to Add-In development using the Desktop Add-In Framework, and introduces the core knowledge that developers need to begin integrating Oracle RightNow CX with other systems.
    A full list of courses offered can be found on the Oracle University website. For more information and course dates please get in contact with your local Oracle University team. On top of the Service components, the suite also provides marketing tools, complex survey creation and tracking, and sales functionality. I’m a fan of the application, and I think I’ve made that clear: it’s completely geared up to providing customers with support at point of need. It can be configured to meet even the most stringent of business requirements. Oracle RightNow is passionate about, and committed to, providing the best customer experience possible. Oracle RightNow CX is the application that makes it possible.
    About the Author:
    Sarah Anderson worked for RightNow for 4 years in both a consulting and a training delivery capacity. She is now a Senior Instructor with Oracle University, delivering the following Oracle RightNow courses:
    - RightNow Customer Service Administration
    - RightNow Analytics
    - RightNow Customer Portal Designer and Contact Center Experience Designer Administration
    - RightNow Marketing and Feedback

    Read the article

  • Controlling text that appears in google search results

    - by Mick
    I have recently made a simple (pure HTML) website. The most important key phrase that I want to capture is "full reserve banking". Currently, if I type "full reserve banking" (without quotes) into google, then my site appears as the 7th item on the first page. I am reasonably happy with this as the site is so new. But one frustration is that the text that google displays in relation to my site is rather misleading. The main message I would like to get across is that my site is "A collection of resources for anyone interested in this alternative monetary system." and I have this as the first line of text on the page. Unfortunately, this important sentence is nowhere to be seen in the google search result. So my question is - is there anything I can do to fix this error? Edit: I noticed that someone edited this question to remove the name of the website. I was very keen to leave it in because being able to look at it makes it far easier to diagnose what I did wrong. Indeed the answer suggested by "Su" clearly shows that they looked at my website and analyzed what it was doing which helped them give a clearer answer. If I am breaking some policy by including the name then please explain what this policy is in a comment. Edit: I have now made a series of changes to my meta descriptions as inspired by the answers given here. On the homepage I now have the text: <META NAME="description" CONTENT="A collection of resources for anyone interested in Full Reserve Banking. What it is, how it works, web resources, organisations, research papers etc."> I am now very excited to see what will happen after the next visit by the google robots. Edit: Result! I just did a google search for "full reserve banking", and the text that appeared was: Full Reserve Banking: The definitive resource. A collection of resources for anyone interested in Full Reserve Banking. What it is, how it works, web resources, organisations, research papers etc. www.fullreservebanking.com/ - Cached By the way, I did originally have a meta description - but it was too short, it just said "full reserve banking". Google obviously assumed this was too little and so chose to use its own algorithms to cook up a different sentence from the main text.

    Read the article

  • Finding Tools Guidance in OUM

    - by user716869
    OUM is not tool-specific. However, it does include tool guidance.  Tool guidance in OUM includes:
    - mentions of tools that could be used to complete specific tasks
    - templates created with a specific tool
    - example work products in a specific tool
    - links to tool resources
    - Tool Supplemental Guides
    So how do you find all this helpful tool information? Start at the lowest level first – the Task Overview.  Even though the task overviews are written to be tool-agnostic, they sometimes mention suggestions or examples of a tool that might be used to complete the task.  More specific tool information can be found in the Task Overview, Templates and Tools section.  In some cases, knowing the tool used to create the template (for example, Microsoft Word, PowerPoint, Project or Visio) is useful. The Templates and Tools section also provides more specific tool guidance, such as links to:
    - White Papers
    - Viewlets
    - Example Work Products
    - Additional Resources
    - Tool Supplemental Guides
    If you’re more interested in seeing what tools might be helpful in general for your project, or want to see if there is any tool guidance for a specific tool that your project is committed to using, go to the Supplemental Guidance page in OUM.  This page is available from the Method Navigation pull-down located in the header of almost every OUM page. When you open the Supplemental Guidance page, the first thing you see is a table index of everything that is included on the page.  At the top of the right column are all the Tool Supplemental Guides available in OUM.  Use the index to navigate to any of the guides. Next in the right column is Discipline/Industry/View Resources and Samples.  Use the index to navigate to any of these topics and see what’s available and, more specifically, whether there is any tool guidance available.  For example, if you navigate to the Cloud Resources, you will find a link to the IT Strategies from Oracle page that provides information for Cloud Practitioner Guides, Cloud Reference Architectures and Cloud White Papers, including the Cloud Candidate Selection Tool and Cloud Computing Maturity Model. The section for Method Tool and Technique Cross References can take you to the Task to Tool Cross Reference.  This page provides a task listing with possibly helpful tools and links to more information regarding the tools.  By no means is this tool guidance all-inclusive; you can use other tools not mentioned in OUM to complete an OUM task. The Method Tool and Technique Cross References can also take you to the various Technique pages (Index and Cross References).  While techniques are not necessarily “tools,” they can certainly provide valuable assistance in completing tasks. In the Other Resources section of the Supplemental Guidance page, you will find links to the viewlets and white papers that are included within OUM.

    Read the article

  • Custom Rails actions: I have issues every time

    - by normalocity
    Every time I go to add a custom action to a controller, I completely screw it up somehow. I'm trying to add a route "listings/buyer_listings" that will display all of my listings where someone is a buyer (rather than a seller). With the routes.rb file below, when I go to "listings/buyer_listings", I get routed instead to "users". WTF? In the past, I've had to define my routes using "map.", but this seems like a very verbose way to do something that should work with the :collection specification. You can see that I've done this with many routes as specified toward the end of the file, such as "edit_my_profile", etc. If I put the ":collection" part last, my browser routes to the "show" action, which is not the correct action, and it doesn't make sense to me why it would even do this. If I do "rake routes", my routes look correctly mapped. If I go into a Ruby console and have it recognize the url, it maps to the correct action, so what am I missing?
    ActionController::Routing::Routes.draw do |map|
      map.resources :locations
      map.resources :browse_boxes
      map.resources :tags
      map.resources :ratings
      map.resources :listings, :collection => { :buyer_listings => :get }, :has_many => :bids, :has_many => :comments
      map.resources :users
      map.resources :invite_requests
      map.resource :user_session
      map.resource :account, :controller => "users"
      map.root :controller => "listings", :action => "index" # optional, this just sets the root route
      map.login "login", :controller => "user_sessions", :action => "new"
      map.logout "logout", :controller => "user_sessions", :action => "destroy"
      map.search "search", :controller => "listings", :action => "search"
      map.edit_my_profile "edit_my_profile", :controller => "users", :action => "edit_my_profile"
      map.all_listings "all_listings", :controller => "listings", :action => "all_listings"
      map.my_listings "my_listings", :controller => "listings", :action => "my_listings"
      map.posting_guidelines "posting_guidelines", :controller => "listings", :action => "posting_guidelines"
      map.filter_on "filter_on", :controller => "listings", :action => "filter_on"
      map.top_25_tags "top_25_tags", :controller => "tagging_search", :action => "top_25_tags"
      map.connect ':controller/:action/:id'
      map.connect ':controller/:action/:id.:format'
    end

    Read the article

  • Why does my Spring Controller direct me to the wrong page?

    - by kc2001
    I am writing my first Spring 3.0.5 MVC app and am confused about why my controller mappings aren't doing what I expect. I have a VerifyPasswordController that is called after a user tries to log in by entering his name and password.
    // Called upon clicking "submit" from /login
    @RequestMapping(value = "/verifyPassword", method = RequestMethod.POST)
    @ModelAttribute("user")
    public String verifyPassword(User user, BindingResult result) {
        String email = user.getEmail();
        String nextPage = CHOOSE_OPERATION_PAGE; // success case
        if (result.hasErrors()) {
            nextPage = LOGIN_PAGE;
        } else if (!passwordMatches(email, user.getPassword())) {
            nextPage = LOGIN_FAILURE_PAGE;
        } else {
            // success
        }
        return nextPage;
    }
    I can verify in the debugger that this method is being called, but afterwards, the verifyPassword page is displayed rather than the chooseOperation page. The console output of WebLogic seems to show that my mappings are correct:
    INFO : org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping - Mapped URL path [/chooseOperation] onto handler 'chooseOperationController'
    INFO : org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping - Mapped URL path [/chooseOperation.*] onto handler 'chooseOperationController'
    INFO : org.springframework.web.servlet.mvc.annotation.DefaultAnnotationHandlerMapping - Mapped URL path [/chooseOperation/] onto handler 'chooseOperationController'
    Here is the ChooseOperationController:
    @Controller
    @SessionAttributes("leaveRequestForm")
    public class ChooseOperationController implements PageIfc, AttributeIfc {
        @RequestMapping(value = "/chooseOperation")
        @ModelAttribute("leaveRequestForm")
        public LeaveRequest setUpLeaveRequestForm(
                @RequestParam(NAME_ATTRIBUTE) String name) {
            LeaveRequest form = populateFormFromDatabase(name);
            return form;
        }
        // helper methods omitted
    }
    I welcome any advice, particularly "generic" techniques for debugging such mapping problems. BTW, I've also tried to "redirect" to the desired page, but got the same result.
    servlet-context.xml:
    <?xml version="1.0" encoding="UTF-8"?>
    <beans:beans xmlns="http://www.springframework.org/schema/mvc"
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:beans="http://www.springframework.org/schema/beans"
        xmlns:context="http://www.springframework.org/schema/context"
        xsi:schemaLocation="
            http://www.springframework.org/schema/mvc http://www.springframework.org/schema/mvc/spring-mvc-3.0.xsd
            http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-3.0.xsd
            http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context-3.0.xsd">
        <!-- DispatcherServlet Context: defines this servlet's request-processing infrastructure -->
        <!-- Enables the Spring MVC @Controller programming model -->
        <annotation-driven />
        <!-- Handles HTTP GET requests for /resources/** by efficiently serving up static resources in the ${webappRoot}/resources directory -->
        <resources mapping="/resources/**" location="/resources/" />
        <!-- Resolves views selected for rendering by @Controllers to .jsp resources in the /WEB-INF/views directory -->
        <beans:bean class="org.springframework.web.servlet.view.InternalResourceViewResolver">
            <beans:property name="prefix" value="/WEB-INF/views/" />
            <beans:property name="suffix" value=".jsp" />
        </beans:bean>
        <context:component-scan base-package="com.engilitycorp.leavetracker" />
        <beans:bean id="leaveRequestForm" class="com.engilitycorp.leavetracker.model.LeaveRequest" />
    </beans:beans>
    The constants:
    String LOGIN_FAILURE_PAGE = "loginFailure";
    String LOGIN_PAGE = "login";
    String CHOOSE_OPERATION_PAGE = "chooseOperation";

    Read the article

  • Developing custom MBeans to manage J2EE Applications (Part III)

    - by philippe Le Mouel
    This is the third and final part in a series of blogs that demonstrate how to add management capability to your own application using JMX MBeans.
    In Part I we saw:
    - How to implement a custom MBean to manage configuration associated with an application.
    - How to package the resulting code and configuration as part of the application's ear file.
    - How to register MBeans upon application startup, and unregister them upon application stop (or undeployment).
    - How to use generic JMX clients such as JConsole to browse and edit our application's MBean.
    In Part II we saw:
    - How to add localized descriptions to our MBean, MBean attributes, MBean operations and MBean operation parameters.
    - How to specify meaningful names for our MBean operation parameters.
    We also touched on future enhancements that will simplify how we can implement localized MBeans. In this third and last part, we will re-write our MBean to simplify how we added localized descriptions. To do so we will take advantage of the functionality we already described in Part II and that is now part of WebLogic 10.3.3.0. We will show how to take advantage of WebLogic's localization support to localize our MBeans based on the client's Locale, independently of the server's Locale. Each client will see MBean descriptions localized based on his/her own Locale. We will show how to achieve this using JConsole, and also using a sample programmatic JMX Java client. The complete code sample and associated build files for part III are available as a zip file. The code has been tested against WebLogic Server 10.3.3.0 and JDK6. To build and deploy our sample application, please follow the instructions provided in Part I, as they also apply to part III's code and associated zip file.
    Providing custom descriptions, take II
    In Part II we localized our MBean descriptions by extending the StandardMBean class and overriding its many getDescription methods. WebLogic 10.3.3.0, similarly to JDK 7, can automatically localize MBean descriptions as long as those are specified according to the following conventions. Descriptions resource bundle keys are named according to:
    - MBean description: <MBeanInterfaceClass>.mbean
    - MBean attribute description: <MBeanInterfaceClass>.attribute.<AttributeName>
    - MBean operation description: <MBeanInterfaceClass>.operation.<OperationName>
    - MBean operation parameter description: <MBeanInterfaceClass>.operation.<OperationName>.<ParameterName>
    - MBean constructor description: <MBeanInterfaceClass>.constructor.<ConstructorName>
    - MBean constructor parameter description: <MBeanInterfaceClass>.constructor.<ConstructorName>.<ParameterName>
    We also purposely named our resource bundle class MBeanDescriptions and included it as part of the same package as our MBean.
We already followed the above conventions when creating our resource bundle in part II, and our default resource bundle class with English descriptions looks like: package blog.wls.jmx.appmbean; import java.util.ListResourceBundle; public class MBeanDescriptions extends ListResourceBundle { protected Object[][] getContents() { return new Object[][] { {"PropertyConfigMXBean.mbean", "MBean used to manage persistent application properties"}, {"PropertyConfigMXBean.attribute.Properties", "Properties associated with the running application"}, {"PropertyConfigMXBean.operation.setProperty", "Create a new property, or change the value of an existing property"}, {"PropertyConfigMXBean.operation.setProperty.key", "Name that identify the property to set."}, {"PropertyConfigMXBean.operation.setProperty.value", "Value for the property being set"}, {"PropertyConfigMXBean.operation.getProperty", "Get the value for an existing property"}, {"PropertyConfigMXBean.operation.getProperty.key", "Name that identify the property to be retrieved"} }; } } We have now also added a resource bundle with French localized descriptions: package blog.wls.jmx.appmbean; import java.util.ListResourceBundle; public class MBeanDescriptions_fr extends ListResourceBundle { protected Object[][] getContents() { return new Object[][] { {"PropertyConfigMXBean.mbean", "Manage proprietes sauvegarde dans un fichier disque."}, {"PropertyConfigMXBean.attribute.Properties", "Proprietes associee avec l'application en cour d'execution"}, {"PropertyConfigMXBean.operation.setProperty", "Construit une nouvelle proprietee, ou change la valeur d'une proprietee existante."}, {"PropertyConfigMXBean.operation.setProperty.key", "Nom de la propriete dont la valeur est change."}, {"PropertyConfigMXBean.operation.setProperty.value", "Nouvelle valeur"}, {"PropertyConfigMXBean.operation.getProperty", "Retourne la valeur d'une propriete existante."}, {"PropertyConfigMXBean.operation.getProperty.key", "Nom de la propriete a retrouver."} }; } } So now we can just remove the many getDescriptions methods from our MBean code, and have a much cleaner: package blog.wls.jmx.appmbean; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.File; import java.net.URL; import java.util.Map; import java.util.HashMap; import java.util.Properties; import javax.management.MBeanServer; import javax.management.ObjectName; import javax.management.MBeanRegistration; import javax.management.StandardMBean; import javax.management.MBeanOperationInfo; import javax.management.MBeanParameterInfo; public class PropertyConfig extends StandardMBean implements PropertyConfigMXBean, MBeanRegistration { private String relativePath_ = null; private Properties props_ = null; private File resource_ = null; private static Map operationsParamNames_ = null; static { operationsParamNames_ = new HashMap(); operationsParamNames_.put("setProperty", new String[] {"key", "value"}); operationsParamNames_.put("getProperty", new String[] {"key"}); } public PropertyConfig(String relativePath) throws Exception { super(PropertyConfigMXBean.class , true); props_ = new Properties(); relativePath_ = relativePath; } public String setProperty(String key, String value) throws IOException { String oldValue = null; if (value == null) { oldValue = String.class.cast(props_.remove(key)); } else { oldValue = String.class.cast(props_.setProperty(key, value)); } save(); return oldValue; } public String 
getProperty(String key) { return props_.getProperty(key); } public Map getProperties() { return (Map) props_; } private void load() throws IOException { InputStream is = new FileInputStream(resource_); try { props_.load(is); } finally { is.close(); } } private void save() throws IOException { OutputStream os = new FileOutputStream(resource_); try { props_.store(os, null); } finally { os.close(); } } public ObjectName preRegister(MBeanServer server, ObjectName name) throws Exception { // MBean must be registered from an application thread // to have access to the application ClassLoader ClassLoader cl = Thread.currentThread().getContextClassLoader(); URL resourceUrl = cl.getResource(relativePath_); resource_ = new File(resourceUrl.toURI()); load(); return name; } public void postRegister(Boolean registrationDone) { } public void preDeregister() throws Exception {} public void postDeregister() {} protected String getParameterName(MBeanOperationInfo op, MBeanParameterInfo param, int sequence) { return operationsParamNames_.get(op.getName())[sequence]; } } The only reason we are still extending the StandardMBean class, is to override the default values for our operations parameters name. If this isn't a concern, then one could just write the following code: package blog.wls.jmx.appmbean; import java.io.IOException; import java.io.InputStream; import java.io.OutputStream; import java.io.FileInputStream; import java.io.FileOutputStream; import java.io.File; import java.net.URL; import java.util.Properties; import javax.management.MBeanServer; import javax.management.ObjectName; import javax.management.MBeanRegistration; import javax.management.StandardMBean; import javax.management.MBeanOperationInfo; import javax.management.MBeanParameterInfo; public class PropertyConfig implements PropertyConfigMXBean, MBeanRegistration { private String relativePath_ = null; private Properties props_ = null; private File resource_ = null; public PropertyConfig(String relativePath) throws Exception { props_ = new Properties(); relativePath_ = relativePath; } public String setProperty(String key, String value) throws IOException { String oldValue = null; if (value == null) { oldValue = String.class.cast(props_.remove(key)); } else { oldValue = String.class.cast(props_.setProperty(key, value)); } save(); return oldValue; } public String getProperty(String key) { return props_.getProperty(key); } public Map getProperties() { return (Map) props_; } private void load() throws IOException { InputStream is = new FileInputStream(resource_); try { props_.load(is); } finally { is.close(); } } private void save() throws IOException { OutputStream os = new FileOutputStream(resource_); try { props_.store(os, null); } finally { os.close(); } } public ObjectName preRegister(MBeanServer server, ObjectName name) throws Exception { // MBean must be registered from an application thread // to have access to the application ClassLoader ClassLoader cl = Thread.currentThread().getContextClassLoader(); URL resourceUrl = cl.getResource(relativePath_); resource_ = new File(resourceUrl.toURI()); load(); return name; } public void postRegister(Boolean registrationDone) { } public void preDeregister() throws Exception {} public void postDeregister() {} } Note: The above would also require changing the operations parameters name in the resource bundle classes. 
For instance: PropertyConfigMXBean.operation.setProperty.key would become: PropertyConfigMXBean.operation.setProperty.p0 Client based localization When accessing our MBean using JConsole started with the following command line: jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar: $WL_HOME/server/lib/wljmxclient.jar -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote -debug We see that our MBean descriptions are localized according to the WebLogic's server Locale. English in this case: Note: Consult Part I for information on how to use JConsole to browse/edit our MBean. Now if we specify the client's Locale as part of the JConsole command line as follow: jconsole -J-Djava.class.path=$JAVA_HOME/lib/jconsole.jar:$JAVA_HOME/lib/tools.jar: $WL_HOME/server/lib/wljmxclient.jar -J-Djmx.remote.protocol.provider.pkgs=weblogic.management.remote -J-Dweblogic.management.remote.locale=fr-FR -debug We see that our MBean descriptions are now localized according to the specified client's Locale. French in this case: We use the weblogic.management.remote.locale system property to specify the Locale that should be associated with the cient's JMX connections. The value is composed of the client's language code and its country code separated by the - character. The country code is not required, and can be omitted. For instance: -Dweblogic.management.remote.locale=fr We can also specify the client's Locale using a programmatic client as demonstrated below: package blog.wls.jmx.appmbean.client; import javax.management.MBeanServerConnection; import javax.management.ObjectName; import javax.management.MBeanInfo; import javax.management.remote.JMXConnector; import javax.management.remote.JMXServiceURL; import javax.management.remote.JMXConnectorFactory; import java.util.Hashtable; import java.util.Set; import java.util.Locale; public class JMXClient { public static void main(String[] args) throws Exception { JMXConnector jmxCon = null; try { JMXServiceURL serviceUrl = new JMXServiceURL( "service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime"); System.out.println("Connecting to: " + serviceUrl); // properties associated with the connection Hashtable env = new Hashtable(); env.put(JMXConnectorFactory.PROTOCOL_PROVIDER_PACKAGES, "weblogic.management.remote"); String[] credentials = new String[2]; credentials[0] = "weblogic"; credentials[1] = "weblogic"; env.put(JMXConnector.CREDENTIALS, credentials); // specifies the client's Locale env.put("weblogic.management.remote.locale", Locale.FRENCH); jmxCon = JMXConnectorFactory.newJMXConnector(serviceUrl, env); jmxCon.connect(); MBeanServerConnection con = jmxCon.getMBeanServerConnection(); Set mbeans = con.queryNames( new ObjectName( "blog.wls.jmx.appmbean:name=myAppProperties,type=PropertyConfig,*"), null); for (ObjectName mbeanName : mbeans) { System.out.println("\n\nMBEAN: " + mbeanName); MBeanInfo minfo = con.getMBeanInfo(mbeanName); System.out.println("MBean Description: "+minfo.getDescription()); System.out.println("\n"); } } finally { // release the connection if (jmxCon != null) jmxCon.close(); } } } The above client code is part of the zip file associated with this blog, and can be run using the provided client.sh script. 
The resulting output is shown below: $ ./client.sh Connecting to: service:jmx:iiop://127.0.0.1:7001/jndi/weblogic.management.mbeanservers.runtime MBEAN: blog.wls.jmx.appmbean:type=PropertyConfig,name=myAppProperties MBean Description: Manage proprietes sauvegarde dans un fichier disque. $ Miscellaneous Using Description annotation to specify MBean descriptions Earlier we have seen how to name our MBean descriptions resource keys, so that WebLogic 10.3.3.0 automatically uses them to localize our MBean. In some cases we might want to implicitly specify the resource key, and resource bundle. For instance when operations are overloaded, and the operation name is no longer sufficient to uniquely identify a single operation. In this case we can use the Description annotation provided by WebLogic as follow: import weblogic.management.utils.Description; @Description(resourceKey="myapp.resources.TestMXBean.description", resourceBundleBaseName="myapp.resources.MBeanResources") public interface TestMXBean { @Description(resourceKey="myapp.resources.TestMXBean.threshold.description", resourceBundleBaseName="myapp.resources.MBeanResources" ) public int getthreshold(); @Description(resourceKey="myapp.resources.TestMXBean.reset.description", resourceBundleBaseName="myapp.resources.MBeanResources") public int reset( @Description(resourceKey="myapp.resources.TestMXBean.reset.id.description", resourceBundleBaseName="myapp.resources.MBeanResources", displayNameKey= "myapp.resources.TestMXBean.reset.id.displayName.description") int id); } The Description annotation should be applied to the MBean interface. It can be used to specify MBean, MBean attributes, MBean operations, and MBean operation parameters descriptions as demonstrated above. Retrieving the Locale associated with a JMX operation from the MBean code There are several cases where it is necessary to retrieve the Locale associated with a JMX call from the MBean implementation. For instance this can be useful when localizing exception messages. This can be done as follow: import weblogic.management.mbeanservers.JMXContextUtil; ...... // some MBean method implementation public String setProperty(String key, String value) throws IOException { Locale callersLocale = JMXContextUtil.getLocale(); // use callersLocale to localize Exception messages or // potentially some return values such a Date .... } Conclusion With this last part we conclude our three part series on how to write MBeans to manage J2EE applications. We are far from having exhausted this particular topic, but we have gone a long way and are now capable to take advantage of the latest functionality provided by WebLogic's application server to write user friendly MBeans.

    Read the article

  • Slackware Linux: Glib can't find libffi.so.6. Where is it trying to look?

    - by Mathmagician
    I have libffi installed and it's in /usr/local/lib, yet the glib make process can't find it /home/mathmagi/src/glib-2.32.4/gio/.libs/lt-glib-compile-resources: error while loading shared libraries: libffi.so.6: cannot open shared object file: No such file or directory /home/mathmagi/src/glib-2.32.4/gio/.libs/lt-glib-compile-resources: error while loading shared libraries: libffi.so.6: cannot open shared object file: No such file or directory /home/mathmagi/src/glib-2.32.4/gio/.libs/lt-glib-compile-resources: error while loading shared libraries: libffi.so.6: cannot open shared object file: No such file or directory /home/mathmagi/src/glib-2.32.4/gio/.libs/lt-glib-compile-resources: error while loading shared libraries: libffi.so.6: cannot open shared object file: No such file or directory make[4]: Entering directory `/home/mathmagi/src/glib-2.32.4/gio/tests' GEN gdbus-test-codegen-generated.c GEN test_resources.c /home/mathmagi/src/glib-2.32.4/gio/.libs/lt-glib-compile-resources: error while loading shared libraries: libffi.so.6: cannot open shared object file: No such file or directory make[4]: *** [test_resources.c] Error 127 make[4]: Leaving directory `/home/mathmagi/src/glib-2.32.4/gio/tests' make[3]: *** [all-recursive] Error 1 make[3]: Leaving directory `/home/mathmagi/src/glib-2.32.4/gio' make[2]: *** [all] Error 2 make[2]: Leaving directory `/home/mathmagi/src/glib-2.32.4/gio' make[1]: *** [all-recursive] Error 1 make[1]: Leaving directory `/home/mathmagi/src/glib-2.32.4' make: *** [all] Error 2 It's definitely in /usr/local/lib! bash-4.1# updatedb bash-4.1# locate libffi.so.6 /usr/local/lib/libffi.so.6 /usr/local/lib/libffi.so.6.0.0 /home/mathmagi/src/libffi-3.0.11/x86_64-unknown-linux-gnu/.libs/libffi.so.6 /home/mathmagi/src/libffi-3.0.11/x86_64-unknown-linux-gnu/.libs/libffi.so.6.0.0 With glib I've tried LDFLAGS=-L/usr/local/lib ./configure Doesn't work. How do I find where glib is looking and change it?

    Read the article

  • Home Sharing and Remote on iTunes causing firewall nags

    - by BoltClock
    It seems that enabling Home Sharing and/or hooking up my iPhone's Remote to iTunes causes Mac OS X Snow Leopard's firewall to freak out and keep nagging every time I launch iTunes to ask if I'd like it to accept incoming connections. If I turn off Home Sharing and forget all Remotes, the nag dialog no longer comes up. I could also disable the firewall, but I think that's a silly thing to do. iTunes is already in the firewall whitelist, so the only thing I know that could cause Mac OS X to nag is a bad application bundle code signature. I checked with this Terminal command: $ codesign -vvv /Applications/iTunes.app/ And sure enough, this is what it outputs: /Applications/iTunes.app/: a sealed resource is missing or invalid /Applications/iTunes.app/Contents/Resources/English.lproj/AutofillSettings.nib/objects.xib: resource added /Applications/iTunes.app/Contents/Resources/English.lproj/iTunesDJSettings.nib/objects.xib: resource added /Applications/iTunes.app/Contents/Resources/English.lproj/MobilePhonePrefs.nib/objects.xib: resource added /Applications/iTunes.app/Contents/Resources/English.lproj/MobilePhoneSetup.nib/objects.xib: resource added /Applications/iTunes.app/Contents/Resources/English.lproj/UniversalAccess.nib/objects.xib: resource added I've tried reinstalling iTunes as suggested by this answer, but Mac OS X still nags about incoming connections and the exact same output is generated when I run the above command again. On my PC, Windows Firewall has never nagged whenever I turn on Home Sharing and hook up Remote on my iPhone. Both computers use iTunes 9.2.1. My Mac runs Mac OS X 10.6.4. Is there anything special I need to do that I might have missed? Or how do I resolve the issue? EDIT: I've updated to iTunes 10, but the nags on my Mac are still there and only go away if I turn off Home Sharing and Remote. EDIT 2: I've updated to Remote 2.0 on my iPhone, but the firewall nags are persisting. Has anyone else had this firewall issue at all?

    Read the article

  • How does CloudFront work?

    - by Dharmik Bhandari
    I'm planning to implement Amazon's CDN (Content Delivery Network), known as CloudFront, in an ASP.NET MVC3 application with C#. I've googled about it, but I'm still a bit confused about a few things:

    1. Do we have to upload all static resources to the CDN first before we can use them, or can Amazon pull the site's static resources from a predefined folder or directory on its own?
    2. Does Amazon automatically update its cached copies when we change a static resource, or do we have to upload the updated resources to the CDN every time?
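    For what it's worth, CloudFront can be pointed at a "custom origin" (the site itself), in which case it pulls each file from the origin on first request and caches it until the TTL expires; cached copies are not refreshed the moment the origin changes, so the usual workaround is either an invalidation request or versioned URLs. Below is a minimal sketch of the versioned-URL approach for an MVC view; the distribution domain and version token are hypothetical placeholders, not values from the question.

        // Hypothetical helper: rewrites static asset paths to a CloudFront distribution
        // and appends a version token so a new deployment forces CloudFront to re-fetch
        // the file from the origin instead of serving a stale cached copy.
        using System;

        public static class CdnHelper
        {
            // Assumed values -- substitute your own distribution domain and build/version number.
            private const string CdnHost = "https://d1234abcdef.cloudfront.net";
            private const string Version = "2012-09-01";

            public static string CdnUrl(string relativePath)
            {
                // CdnUrl("/Content/site.css") -> "https://d1234abcdef.cloudfront.net/Content/site.css?v=2012-09-01"
                return string.Format("{0}{1}?v={2}", CdnHost, relativePath, Uri.EscapeDataString(Version));
            }
        }

    In a Razor view this could be used as <link href="@CdnHelper.CdnUrl("/Content/site.css")" rel="stylesheet" />; the same markup keeps working if the host constant is later moved into configuration.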

    Read the article

  • Add collection or array to wpf resource dictionary

    - by Chris Cap
    I've searched high and low and can't find an answer to this. I have two questions.

    1. How do you create an array or collection in XAML? I've got an array I want to put in there and bind to a combo box. My first idea was to put an ItemsControl in a resource dictionary, but the ItemsSource of a combo box expects IEnumerable, so that didn't work. Here's what I've tried in my resource dictionary, and neither works:

        <ItemsControl x:Key="stateList">
            <sys:String>AL</sys:String>
            <sys:String>CA</sys:String>
            <sys:String>CN</sys:String>
        </ItemsControl>
        <ItemsControl x:Key="stateList2">
            <ComboBoxItem>AL</ComboBoxItem>
            <ComboBoxItem>CA</ComboBoxItem>
            <ComboBoxItem>CN</ComboBoxItem>
        </ItemsControl>

    and here's how I bind to it:

        <ComboBox SelectedValue="{Binding Path=State}"
                  ItemsSource="{Binding Source={StaticResource stateList2}}" >
        </ComboBox>

    EDIT (UPDATED): I got the first part to work this way:

        <col:ArrayList x:Key="stateList3">
            <sys:String>AL</sys:String>
            <sys:String>CA</sys:String>
            <sys:String>CN</sys:String>
        </col:ArrayList>

    However, I'd rather not use an ArrayList; I'd like to use a generic list, so if anyone knows how, please let me know.

    EDIT (UPDATE): I guess XAML has very limited support for generics, so maybe an ArrayList is the best I can do for now, but I would still like help with the second question if anyone has an answer.

    2. I've tried referencing a merged resource dictionary in my XAML and had problems, because under Window.Resources I had more than just the dictionary, so it required me to add x:Key. Once I add the key, the system can no longer find the items in my resource dictionary. I had to move the merged dictionary to Grid.Resources instead. Ideally I'd like to reference the merged dictionary in App.xaml, but I have the same problem there. Here's some sample code. This first part required an x:Key to compile, because I have a converter and the compiler complained that every item must have a key if there is more than one:

        <UserControl.Resources>
            <win:BooleanToVisibilityConverter x:Key="VisibilityConverter" />
            <ResourceDictionary>
                <ResourceDictionary.MergedDictionaries>
                    <ResourceDictionary Source="/ResourcesD.xaml" />
                </ResourceDictionary.MergedDictionaries>
            </ResourceDictionary>
        </UserControl.Resources>

    I had to change it to this:

        <UI:BaseStep.Resources>
            <win:BooleanToVisibilityConverter x:Key="VisibilityConverter" />
        </UI:BaseStep.Resources>
        <Grid>
            <Grid.Resources>
                <ResourceDictionary>
                    <ResourceDictionary.MergedDictionaries>
                        <ResourceDictionary Source="/ResourcesD.xaml" />
                    </ResourceDictionary.MergedDictionaries>
                </ResourceDictionary>
            </Grid.Resources>
        </Grid>

    Thank you
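    One approach commonly suggested for the first question is x:Array, which avoids ArrayList without needing generics support in XAML; for the second, nesting the converter inside the same ResourceDictionary element that hosts the merged dictionaries usually removes the need to fall back to Grid.Resources. A rough sketch under those assumptions (namespace prefixes mirror the ones already used above):

        <!-- In the resource dictionary: a typed array of strings instead of an ArrayList -->
        <x:Array x:Key="stateList" Type="{x:Type sys:String}"
                 xmlns:sys="clr-namespace:System;assembly=mscorlib">
            <sys:String>AL</sys:String>
            <sys:String>CA</sys:String>
            <sys:String>CN</sys:String>
        </x:Array>

        <!-- The binding stays the same -->
        <ComboBox SelectedValue="{Binding Path=State}"
                  ItemsSource="{Binding Source={StaticResource stateList}}" />

        <!-- Merged dictionary plus a keyed converter in a single Resources block:
             the converter goes inside the ResourceDictionary element itself -->
        <UserControl.Resources>
            <ResourceDictionary>
                <ResourceDictionary.MergedDictionaries>
                    <ResourceDictionary Source="/ResourcesD.xaml" />
                </ResourceDictionary.MergedDictionaries>
                <win:BooleanToVisibilityConverter x:Key="VisibilityConverter" />
            </ResourceDictionary>
        </UserControl.Resources>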

    Read the article

  • Visual Studio compiles WPF application twice during build

    - by Brian Ensink
    I have a WPF app in VS2008 that compiles twice during the build. The two CSC command lines are similar, but with some differences: the first CSC command line has no /resource options, while the second has two /resource options:

        /resource:"obj\Debug AutoCAD\VisualApp.g.resources"
        /resource:"obj\Debug AutoCAD\CAP.Visual.Properties.Resources.resources"

    I hate to post such huge, ugly compiler output, but here are both command lines.

    First pass:

        2>c:\WINDOWS\Microsoft.NET\Framework\v3.5\Csc.exe /noconfig /nowarn:1701,1702 /platform:x86 /errorreport:prompt /warn:4 /define:DEBUG;TRACE /reference:..\BIN\RELEASE\FOO.Base.dll /reference:..\BIN\RELEASE\FOO.CAPArchiveHandler.dll /reference:..\BIN\RELEASE\FOO.CAPDOM.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\PresentationCore.dll" /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\PresentationFramework.dll" /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Core.dll" /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Data.DataSetExtensions.dll" /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Data.dll /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\System.Runtime.Serialization.dll" /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Xml.dll /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Xml.Linq.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.Docking.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.Navigation.dll" /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\UIAutomationProvider.dll" /reference:c:\project\FooStudio\BIN\DEBUGCAD\VS-3DEngine-Wrapper.dll /reference:c:\project\FooStudio\BIN\DEBUGCAD\VisualServiceClient.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\WindowsBase.dll" /debug+ /debug:full /filealign:512 /out:"obj\Debug AutoCAD\VisualApp.exe" /target:winexe App.xaml.cs MainWindow.xaml.cs CameraAndLightingControl.xaml.cs CameraAndLightingViewModel.cs MainWindowViewModel.cs Properties\AssemblyInfo.cs Properties\Resources.Designer.cs Properties\Settings.Designer.cs ScenarioToolsWindow.xaml.cs SceneGraph.cs ScenePart.cs ToolWindow.xaml.cs "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\CameraAndLightingControl.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\MainWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\ScenarioToolsWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\ToolWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\App.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\GeneratedInternalTypeHelper.g.cs"
        2>Done building project "0ye0i4wb.tmp_proj".

    Second pass:

        2>c:\WINDOWS\Microsoft.NET\Framework\v3.5\Csc.exe /noconfig /nowarn:1701,1702 /platform:x86 /errorreport:prompt /warn:4 /define:DEBUG;TRACE /reference:..\BIN\RELEASE\FOO.Base.dll /reference:..\BIN\RELEASE\FOO.CAPArchiveHandler.dll /reference:..\BIN\RELEASE\FOO.CAPDOM.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\PresentationCore.dll" /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\PresentationFramework.dll" /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Core.dll" /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Data.DataSetExtensions.dll" /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Data.dll /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\System.Runtime.Serialization.dll" /reference:c:\WINDOWS\Microsoft.NET\Framework\v2.0.50727\System.Xml.dll /reference:"c:\Program Files\Reference Assemblies\Microsoft\Framework\v3.5\System.Xml.Linq.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.Docking.dll" /reference:"C:\Program Files\Telerik\RadControls for WPF Q1 2010\Binaries\WPF\Telerik.Windows.Controls.Navigation.dll" /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\UIAutomationProvider.dll" /reference:c:\project\FooStudio\BIN\DEBUGCAD\VS-3DEngine-Wrapper.dll /reference:c:\project\FooStudio\BIN\DEBUGCAD\VisualServiceClient.dll /reference:"C:\Program Files\Reference Assemblies\Microsoft\Framework\v3.0\WindowsBase.dll" /debug+ /debug:full /filealign:512 /out:"obj\Debug AutoCAD\VisualApp.exe" /resource:"obj\Debug AutoCAD\VisualApp.g.resources" /resource:"obj\Debug AutoCAD\FOO.Visual.Properties.Resources.resources" /target:winexe App.xaml.cs MainWindow.xaml.cs CameraAndLightingControl.xaml.cs CameraAndLightingViewModel.cs MainWindowViewModel.cs Properties\AssemblyInfo.cs Properties\Resources.Designer.cs Properties\Settings.Designer.cs ScenarioToolsWindow.xaml.cs SceneGraph.cs ScenePart.cs ToolWindow.xaml.cs "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\CameraAndLightingControl.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\MainWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\ScenarioToolsWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\ToolWindow.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\App.g.cs" "c:\project\FooStudio\VisualApp\obj\Debug AutoCAD\GeneratedInternalTypeHelper.g.cs"

    Any idea what could possibly cause this? I think this is causing a problem I posted about earlier today.
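    The "0ye0i4wb.tmp_proj" in the first pass suggests WPF markup compilation at work: when XAML files reference types defined in the same assembly, the build generates a temporary project, compiles it once so the markup compiler can resolve those types, and then compiles the real assembly a second time with the generated .g.resources. A small diagnostic sketch to confirm that from the build log; the project file name below is an assumption based on the /out path above.

        rem Build with detailed logging and look for the temporary-assembly pass.
        msbuild VisualApp.csproj /t:Rebuild /v:diagnostic > build.log
        findstr /i "GenerateTemporaryTargetAssembly tmp_proj" build.log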

    Read the article

  • Unresolved compilation problems -- can't use .jar files that I have created

    - by Mike
    I created a few .jar files and am trying to access them in another application. I have tried both Eclipse and IntelliJ and experience the same issue:

        java.lang.Error: Unresolved compilation problems:
            The import com.XXXX.XXXXXXXXX.project2 cannot be resolved
            The import com.XXXX.XXXXXXXXX.project2 cannot be resolved
            BeanFactory cannot be resolved to a type
            Author cannot be resolved to a type
            AuthorFactoryImpl cannot be resolved to a type
            Author cannot be resolved to a type
            Author cannot be resolved to a type

    I have been using Maven during this process and the jars compile fine. I have included them on the build path both through the Maven .pom file and by assigning them directly. I have also unassigned the direct file path and left only the Maven reference, and vice versa -- no difference. See the .jar file class info below.

    File structure: Author.java, BeanWithIdentityInterface, Books, Subject

    Interface:

        package com.XXXX.training;

        /**
         * Created with IntelliJ IDEA.
         * User: kBPersonal
         * Date: 11/5/12
         * Time: 3:16 PM
         */
        public interface BeanWithIdentityInterface<I> {
            I getId();
        }

    Author.java:

        package com.XXXX.training;

        /**
         * Created with IntelliJ IDEA.
         * User: kBPersonal
         * Date: 10/25/12
         * Time: 12:03 PM
         */
        public class Author implements BeanWithIdentityInterface<Integer> {
            private Integer id = null;
            private String name = null;
            private String picture = null;
            private String bio = null;

            public Author(Integer id, String bio, String name, String picture) {
                this.id = id;
                this.bio = bio;
                this.name = name;
                this.picture = picture;
            }

            public Author() {}

            @Override
            public Integer getId() {
                return id;
            }

            public void setId(Integer id) {
                this.id = id;
            }

            public String getName() {
                return name;
            }

            public void setName(String name) {
                this.name = name;
            }

            public String getPicture() {
                return picture;
            }

            public void setPicture(String picture) {
                this.picture = picture;
            }

            public String getBio() {
                return bio;
            }

            public void setBio(String bio) {
                this.bio = bio;
            }

            @Override
            public String toString() {
                return "\n\tAuthor Id: " + this.getId() + " | Bio:" + this.getBio() +
                       " | Name:" + this.getName() + " | Picture: " + this.getPicture();
            }
        }

    Implementing servlet:

        package com.acentia.training.project3.controller;

        import com.acentia.training.*;
        import com.acentia.training.project2.AuthorFactoryImpl;
        import com.acentia.training.project2.BeanFactory;

        import javax.servlet.ServletException;
        import javax.servlet.http.HttpServletRequest;
        import javax.servlet.http.HttpServletResponse;
        import java.io.IOException;
        import java.io.PrintWriter;

        /**
         * Created with IntelliJ IDEA.
         * User: kBPersonal
         * Date: 11/11/12
         * Time: 6:34 PM
         */
        public class ListAuthorServlet extends AbstractBaseServlet {

            private static final long serialVersionUID = -6934109551750492182L;

            public void doProcess(final HttpServletRequest request, final HttpServletResponse response)
                    throws IOException {

                final BeanFactory<Author, Integer> authorFactory = new AuthorFactoryImpl();
                Author author = null;
                if (authorFactory != null) {
                    author = (Author) authorFactory.getMember(5);
                }

    I can't pull in the Author class. Any help would be greatly appreciated.
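    The runtime "java.lang.Error: Unresolved compilation problems" usually means the IDE compiled and ran classes even though their imports could not be resolved, i.e. the jar containing the project2 classes is not actually on the consuming project's compile classpath. A hedged sketch of wiring it in purely through Maven follows; the groupId/artifactId/version coordinates are placeholders, not values from the question.

        # Install the locally built jar into the local Maven repository...
        mvn install:install-file -Dfile=project2.jar \
            -DgroupId=com.xxxx.training -DartifactId=project2 \
            -Dversion=1.0 -Dpackaging=jar

        # ...then declare it in the consuming project's pom.xml:
        #   <dependency>
        #     <groupId>com.xxxx.training</groupId>
        #     <artifactId>project2</artifactId>
        #     <version>1.0</version>
        #   </dependency>

    After re-importing the Maven project in Eclipse or IntelliJ, the imports should resolve at compile time instead of failing at run time. If the jars are already built with Maven, running mvn install on their projects is an even simpler alternative to install-file.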

    Read the article

  • Windows Azure Learning Plan - SQL Azure

    - by BuckWoody
    This is one in a series of posts on a Windows Azure Learning Plan. You can find the main post here. This one deals with SQL Azure.

    Overview and Training
        Overview and general information about SQL Azure: what it is, how it works, and where you can learn more.
        General Overview (sign-in required, but free): http://social.technet.microsoft.com/wiki/contents/articles/inside-sql-azure.aspx
        General Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ee336245.aspx
        Microsoft SQL Azure Documentation: http://msdn.microsoft.com/en-us/windowsazure/sqlazure/default.aspx

    Samples and Learning
        Sources for online and other SQL Azure training.
        Free Online Training: http://blogs.msdn.com/b/sqlazure/archive/2010/05/06/10007449.aspx
        60-minute Overview (webcast): https://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?culture=en-US&EventID=1032458620&CountryCode=US

    Architecture
        SQL Azure internals and architectures for scale-out and other use cases.
        SQL Azure Architecture: http://social.technet.microsoft.com/wiki/contents/articles/inside-sql-azure.aspx
        Scale-out Architectures: http://tinyurl.com/247zm33
        Federation Concepts: http://tinyurl.com/34eew2w
        Use Cases: http://blogical.se/blogs/jahlen/archive/2010/11/23/sql-azure-why-use-it-and-what-makes-it-different-from-sql-server.aspx
        SQL Azure Security Model (video): http://www.msdev.com/Directory/Description.aspx?EventId=1491

    Administration
        Standard administrative tasks and tools.
        Tools Options: http://social.technet.microsoft.com/wiki/contents/articles/overview-of-tools-to-use-with-sql-azure.aspx
        SQL Azure Migration Wizard: http://sqlazuremw.codeplex.com/
        Managing Databases and Login Security: http://msdn.microsoft.com/en-us/library/ee336235.aspx
        General Security for SQL Azure: http://msdn.microsoft.com/en-us/library/ff394108.aspx
        Backup and Recovery: http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-backup-and-restore-strategy.aspx
        More Backup and Recovery Options: http://social.technet.microsoft.com/wiki/contents/articles/current-options-for-backing-up-data-with-sql-azure.aspx
        Syncing Large Databases to SQL Azure: http://blogs.msdn.com/b/sync/archive/2010/09/24/how-to-sync-large-sql-server-databases-to-sql-azure.aspx

    Programming
        Programming patterns and architectures for SQL Azure systems.
        How to Build and Manage a Business Database on SQL Azure: http://tinyurl.com/25q5v6g
        Connection Management: http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-connection-management-in-sql-azure.aspx
        Transact-SQL Supported by SQL Azure: http://msdn.microsoft.com/en-us/library/ee336250.aspx

    Read the article

  • What’s New in JD Edwards EnterpriseOne Release 9.1

    - by Breanne Cooley
    Oracle JD Edwards EnterpriseOne 9.1 offers customers significant updates to usability and business processes to improve productivity and bolster business value. This release addresses critical user needs while delivering key enhancements in several areas, including:

    New User Experience: Release 9.1 offers significant enhancements to the user experience. New Web 2.0 features reduce task time and enable you to access meaningful information.

    Enhanced Reporting: Oracle's JD Edwards EnterpriseOne One View Reporting is a breakthrough solution that allows business users to create interactive reports, without IT support.

    Industry-Specific Functionality: This new release delivers key enhancements for the consumer goods, real estate management, and manufacturing and distribution industries.

    Enhanced Support for Global Operations: JD Edwards EnterpriseOne 9.1 supports global operations with several new features, including enhancements that cover the entire ERP business process associated with managing country-of-origin requirements.

    Productivity Features: This new release offers more tightly integrated business processes and other productivity advancements, such as improved data access and enhanced financial controls.

    Want to find out more?
        Bookmark the JD Edwards EnterpriseOne web page
        Listen to the Podcast: Announcing JD Edwards EnterpriseOne 9.1
        Watch the Demo: JD Edwards EnterpriseOne 9.1 Features Demo
        Watch the Demo: JD Edwards EnterpriseOne One View Reporting Demo
        Review the Data Sheet: JD Edwards EnterpriseOne Tools 9.1

    New Training: JD Edwards EnterpriseOne 9.1 training through Oracle University is designed to help you leverage these usability and productivity enhancements. The curriculum is aligned to the JD Edwards EnterpriseOne products and will teach your team how to efficiently and effectively implement and use your applications. Get started today! View available training and schedules.

    -Jim Vonick, Oracle University Market Development Manager

    Read the article

  • Navigating the Unpredictable Swinging of the Financial Regulation Pendulum

    - by Sylvie MacKenzie, PMP
    Written by guest blogger Maureen Clifford, Sr. Product Marketing Manager, Oracle. The pendulum of the regulatory clock is constantly in motion, albeit often not in any particular rhythm. Nevertheless, given what many insurers have been through economically, any movement can send shock waves through critical innovation and operational plans. As pointed out in Deloitte's 2012 Global Insurance Outlook, the impact of regulatory reform can cause major uncertainty in the area of costs. As the reality of increasing government regulation settles in, the change that comes along with it creates more challenges in compliance and, ultimately, in delivering the optimum return on investment. The result of this changing environment is a proliferation of compliance projects that must be executed with an already constrained set of resources, budget, and time. Insurers are confronted by the need to gain visibility into all of their compliance efforts and proactively manage them. Currently that is very difficult to do, as these projects are often managed by groups across the enterprise that lack a way to coordinate their efforts and drive greater synergies. With limited visibility and equally limited resources, it is no surprise that reporting on project status and determining realistic completion dates for these projects is only a dream. As a result, compliance deadlines are missed, penalties are incurred, credibility with key stakeholders and the public is jeopardized, and returns and competitive advantage go unrealized. Insurers need to ask themselves some key questions:

    Do I have "one stop" visibility into all of my compliance efforts? If not, what can I do to change that?
    What is top priority, and how does that impact my already taxed resources?
    How can I best balance my resources to get these compliance projects done while keeping key innovation and operational efforts on track?
    How can I ensure that I have all the requisite documentation for each compliance project I undertake?

    Complying with regulatory requirements is a necessary evil. Don't let the regulatory pendulum sideline your efforts to generate the greatest return on investment for your key stakeholders.

    Read the article

  • Oracle University Begins Beta Testing For New "Oracle Application Express Developer Certified Expert" Certification

    - by Paul Sorensen
    Oracle University has begun beta testing for the new Oracle Application Express Developer Certified Expert certification, which requires passing one exam: "Oracle Application Express 3.2: Developing Web Applications" (#1Z1-450). In this video, Marcie Young of Oracle Server Technologies gives you a quick preview of what is on the exam, how to prepare, and what to expect. The "Oracle Application Express: Developing Web Applications" training course teaches many of the key concepts that are tested in the exam. This course is not a requirement for taking the exam, but it is highly recommended. Additionally, Marcie refers to several helpful resources to use while preparing, including the Oracle Application Express hosted instance at apex.oracle.com and the Oracle Application Express product page on OTN. You can take the "Oracle Application Express 3.2: Developing Web Applications" exam now for only $50 USD while it is in beta. Beta exams are an excellent way to provide direct input into the final version of the certification exam as well as to be one of the very first certified in the track. Furthermore, passing the beta counts for full final exam credit. Note that beta testing is offered for a limited time only. Register now at pearsonvue.com/oracle to take the exam at the Pearson VUE testing center nearest you.

    QUICK LINKS
        Register For Exam: Pearson VUE
        About Certification Track: Oracle Application Express Developer Certified Expert
        About Certification Exam: Oracle Application Express 3.2: Developing Web Applications (1Z1-450)
        Introductory Training (Recommended): "Oracle Application Express: Developing Web Applications"
        Advanced Training (Suggested): "Oracle Application Express: Advanced Workshop"
        Oracle Application Express Hosted Instance: apex.oracle.com
        Oracle Application Express Product Page: on OTN
        Learn More: Oracle Certification Beta Exams

    Read the article

  • UPK Professional Customer Success Story: Medtronic

    - by [email protected]
    In case you missed the live event, be sure to listen to last week's UPK Customer iSeminar featuring Medtronic. This was the first iSeminar in our quarterly series to showcase UPK Professional (UPK and Knowledge Pathways). Donna Miller and Staci Gilbert gave viewers an inside look at samples of Medtronic's content as they shared their experiences, methodology, and best practices for use of the solution. Here are some highlights of the call:

    • Medtronic initially purchased UPK Professional to support a multi-year, global SAP rollout for 9,000 end users located in 24 countries.
    • As time went on, they expanded their use of UPK Professional to include several of their other enterprise applications: PeopleSoft, Siebel CRM, Hyperion Financial Management, a number of SAP bolt-ons, Documentum, TrackWise, and many others.
    • In combination with their Saba LMS, UPK Professional has allowed Medtronic to create, deploy, track, and certify consistent end user training for critical transactions and processes across their organization worldwide - essential for a company in a heavily regulated industry.
    • For key pieces of content or certain end user populations, some Medtronic business units localize/translate the global UPK content. Staci demonstrated examples of their SAP content that has been translated into Japanese.
    • In the live SAP environment, end users rely on UPK's context-sensitive in-application performance support. Medtronic has found this to be very helpful post go-live, giving just-in-time support so end users are confident in a new system or when performing tasks they don't often touch (at quarter or year end). UPK also serves as Medtronic's internal Google.
    • Medtronic has realized savings on many fronts: a reduction in support calls due to in-application performance support, elimination of their training clients, and speedier training (1.5 days rather than 5-7 days) of temporary workers by moving from ILT to a blended solution that includes UPK simulations for eLearning.

    Thanks again to Donna and Staci for an exceptional presentation. They offered so many great examples for anyone who's looking for ways to get more out of UPK or interested in learning about UPK Professional: Knowledge Pathways.

    - Karen Rihs, Oracle UPK Outbound Product Management

    Read the article

  • Leaving Microsoft

    - by Stephen Walther
    After two and a half years working with the ASP.NET team, I've decided that this is the right time to leave Microsoft and, with the help of some friends, re-launch my ASP.NET training and consulting company. The company has the modest name Superexpert.

    While working on my Ph.D. at MIT, I was surrounded by professors and students who were passionate about knowledge. During the Internet boom, I was lucky enough to work side-by-side with some very smart and hard-working people to create several successful startups. However, the people I worked with at Microsoft were among the smartest and hardest working. Microsoft hires a small number of people and gives them huge responsibilities. It continues to amaze me that so few people work on the ASP.NET team when you consider how much the team produces. I had the opportunity to work with a number of inspiring people at Microsoft. I'll miss working with Scott Hunter, Dave Reed, Boris Moore, Eilon Lipton, Scott Guthrie, James Senior, Jim Wang, Phil Haack, Damian Edwards, Vishal Joshi, Mike Pope, Jon Young, Dmitry Robsman, Simon Calvert, Stefan Schackow, and many others.

    I'm proud of what we accomplished while I was working at Microsoft. We reached out to the jQuery team and changed direction from Microsoft Ajax to jQuery. We successfully contributed several important new features to the open-source jQuery project, including jQuery Templates, jQuery Data-Linking, jQuery Globalization, and (as John Resig announced at the last jQuery conference) jQuery Require.

    I'm looking forward to returning to training and consulting. We want to focus on providing consulting on the "right way" of building ASP.NET websites, which we call Modern ASP.NET applications. By Modern ASP.NET applications, I mean applications built with ASP.NET MVC, jQuery, HTML5, and Visual Studio ALM. Additionally, we want to help companies that have existing ASP.NET Web Forms applications migrate to ASP.NET MVC. If you are interested in having us provide training for your company, or you need help building a custom ASP.NET application, then please contact us at [email protected] or visit our website at Superexpert.com.

    Read the article

  • Windows Azure Boot Camp coming to KC in April

    - by John Alexander
    Interested in getting up to speed on Windows Azure? Then check out this FREE boot camp, offered in cities all across the US.

    What is a Windows Azure Boot Camp?
    Windows Azure Boot Camp is a two-day deep dive class to get you up to speed on developing for Windows Azure. The class includes a trainer with deep real-world Azure experience, as well as a series of labs so you can practice what you just learned. ABC is more than just a class; it is also an event in a box. If you don't see a class near you, then throw your own. We provide all of the materials and training you need to host your own class. This can be for your company, your customers, your friends, or even your family. Please let us know so we can give you all of the details.

    Awesome. How much does it cost?
    Thanks to all of our fantabulous sponsors, this two-day training event is FREE! We will provide drinks and snacks, but you will be on your own for lunch on both days. This is a training class, after all.

    How do I attend one?
    You can click here to register for the Kansas City event on April 8th and 9th, or click here to see where else ABC will be… WHAT TO BRING – important!!!

    Read the article

  • Do you think that exposure to BASIC can mutilate your mind? [closed]

    - by bigown
    "It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration." -- Edsger W. Dijkstra

    I have deep respect for Dijkstra, but I don't agree with everything he said or wrote. I especially disagree with this quote from the linked paper, written 35 years ago about the Dartmouth BASIC implementation. Many of my coworkers and programmer friends started with BASIC, and the answers to the questions linked below indicate that many programmers had their first programming experience in BASIC. As far as I know, many good programmers started with BASIC. I'm not talking about Visual Basic or other "modern" dialects of BASIC running on machines full of resources. I'm talking about old-time BASIC running on a "toy" computer, where the programmer had to worry about storing small numbers that need not be calculated as strings just to save a measly byte (because the computer had only a few hundred of them), or had to use a computed goto for lack of a more powerful feature, and many other things that required the programmer to think hard before doing anything and forced the programmer to be creative.

    If you had experience with old-time BASIC on a machine with limited resources (keep in mind that a simple microcontroller today has far more resources than a computer in 1975), do you think that BASIC helped your mind find better solutions and think like an engineer, or did BASIC drag you to the dark side of programming and mutilate you mentally? Is it good to learn a programming language on a computer full of resources, where the novice programmer can do everything wrong and the program still runs without big problems? Or is it better to learn where the programmer can't afford to go wrong? Do you feel that BASIC helped you become a better or worse programmer? Would you teach old BASIC running on a 2KB (virtual) machine to an aspiring programmer? Sure, exposure to only BASIC is bad. Maybe you share my opinion that modern BASIC doesn't help much, because modern BASIC, like other modern languages, provides facilities that let the programmer avoid thinking deeply.

    Additional information: Why BASIC?

    Read the article

  • The newest OPN Competency Center (OPN CC) enhancements are now available

    - by mseika
    The newest OPN Competency Center (OPN CC) enhancements are now available. This release focuses on a new look, simplified navigation, and Resell Competency Tracking functionality. Key features in this release include:

    1. New Look and Feel with Simplified Navigation. Users are now one click away from the most valuable resources, and focused areas allow users to navigate more effectively. Users can review their individual achievements, create a dedicated Training Plan to broaden their knowledge, or use the new Company Corner to view their company's achievements. Your view as an Oracle employee has been modified, and the Company Corner provides links to Partner Workgroups and other links specific to your partner data.

    2. Resell Competency Tracker. This new functionality allows partners to track their progress toward becoming Resell Authorized. The Resell Competency Tracker highlights those Knowledge Zones where additional requirements must be met before Distribution Rights are granted, and lets partners follow their progress. This tracker is available to all users badged to a company ID, and to internal Oracle employees who already have access to Partner Workgroups.

    3. Enhanced Training Manager Functionality. The existing Training Manager has been enhanced to allow partners to create Workgroups focused either on the competency requirements for becoming Specialized or on the competency requirements needed to apply for resell authorization.

    Please mark your calendars and plan on joining an internal demonstration of these features and enhancements: Wednesday, September 26 @ 7am & 8pm PT. A public Beehive web conference has been scheduled. Intercall: 5210981 / 2423

    Read the article

< Previous Page | 69 70 71 72 73 74 75 76 77 78 79 80  | Next Page >