Search Results

Search found 15798 results on 632 pages for 'authentication required'.


  • Manipulating Human Tasks (for testing) by Mark Nelson

    - by JuergenKress
    A few months ago, while working on a BPM migration, I needed to look at the status of human tasks and to manipulate them – essentially to just have a single user take random actions on them at some interval, to help drive a set of processes that were being tested. To do this, I wrote a little utility called httool.  It reuses some of the core domain classes from my custom worklist sample (with minimal changes to make it a remote client instead of a local one). I have not got around to documenting it yet, but it is pretty simple and fairly self-explanatory.  So I thought I would go ahead and share it with folks, in case anyone is interested in playing with it. You can get the code from my ci-samples repository on java.net: git clone git://java.net/ci4fmw~ci-samples. It is in the httool directory. I do plan to get back to this “one day” and enhance it to be more intelligent – target particular task types, update the payload, follow a set of “rules” about what action to take – so that I can use it for driving more interesting test scenarios.  If anyone is feeling generous with their time, and interested, please feel free to join the java.net project and hack away to your heart’s content. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Technorati Tags: Mark Nelson,Human Task,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Blogger.com kills FTP

    - by Daniel Moth
    History (you can safely ignore) Back in 2002 I came across some (almost) free Linux/Apache space and set up my first manually-created HTML-based home page, which still exists: http://www.danielmoth.com/. In 2004 I wanted to have a blog that would be hosted on a sub-folder of my domain, and at the same time I did not want to mess with setting up a blog engine myself. I found the perfect solution in blogger.com, which offered a web interface for creating blog posts (and managing the pages' template) and it would then use FTP to upload HTML pages to my space (no server-side programming/installation required at all)! FTP feature dropped by blogger.com Unfortunately, along the way Google purchased blogger.com and a couple of months ago they announced that they decided to kill the FTP feature, and they are forcing customers using that feature to have their content hosted (in an opaque way) on Google's servers. Even though I prefer having my content on my own space, I would have considered moving it to Google's servers if I could host my blog in a sub-folder and preserve my full blog URL: http://www.danielmoth.com/Blog/ (including my home pages being hosted at the root of the domain). Sadly, that is not possible. What now So I decided to move my blog somewhere else. I'll document in the next few posts how I did that (including a tool I wrote) in case it helps someone else in the same situation and also as a reminder to me if I need to do something like this again in the future. Comments about this post welcome at the original blog.

    Read the article

  • Professional Scrum Developer (.NET) Training in London

    - by Martin Hinshelwood
    On the 26th - 30th July, in Microsoft’s offices in London, Adam Cogan from SSW will be presenting the first Professional Scrum Developer course in the UK. I will be teaching this course alongside Adam and it is a fantastic experience. You are split into teams and go head-to-head to deliver units of potentially shippable work in four two-hour sprints. The Professional Scrum Developer course is the only course endorsed by both Microsoft and Ken Schwaber, and they have worked together very effectively in bringing this course to fruition. This course is the brainchild of Richard Hundhausen, a Microsoft Regional Director, and both Adam and I attended the Trainer Prep in Sydney when he was there earlier this year. He is a fantastic trainer and no matter where you do this course you can be safe in the knowledge that he has trained and vetted all of the teachers. A tools version of Ken, if you will. Find a course and register Download this syllabus Download the Scrum Guide What is the Professional Scrum Developer course all about? The Professional Scrum Developer course is a unique and intensive five-day experience for software developers. The course guides teams on how to turn product requirements into potentially shippable increments of software using the Scrum framework, Visual Studio 2010, and modern software engineering practices. Attendees will work in self-organizing, self-managing teams using a common instance of Team Foundation Server 2010. Who should attend this course? This course is suitable for any member of a software development team – architect, programmer, database developer, tester, etc. Entire teams are encouraged to attend and experience the course together, but individuals are welcome too. Attendees will self-organize to form cross-functional Scrum teams. These teams require an aggregate of skills specific to the selected case study. Please see the last page of this document for specific details. Product Owners, ScrumMasters, and other stakeholders are welcome too, but keep in mind that everyone who attends will be expected to commit to work and pull their weight on a Scrum team. What should you know by the end of the course? Scrum will be experienced through a combination of lecture, demonstration, discussion, and hands-on exercises. Attendees will learn how to do Scrum correctly while being coached and critiqued by the instructor, in the following topic areas: Form effective teams Explore and understand legacy “Brownfield” architecture Define quality attributes, acceptance criteria, and “done” Create automated builds How to handle software hotfixes Verify that bugs are identified and eliminated Plan releases and sprints Estimate product backlog items Create and manage a sprint backlog Hold an effective sprint review Improve your process by using retrospectives Use emergent architecture to avoid technical debt Use Test Driven Development as a design tool Set up and leverage continuous integration Use Test Impact Analysis to decrease testing times Manage SQL Server development in an Agile way Use .NET and T-SQL refactoring effectively Build, deploy, and test SQL Server databases Create and manage test plans and cases Create, run, record, and play back manual tests Set up a branching strategy and branch code Write more maintainable code Identify and eliminate people and process dysfunctions Inspect and improve your team’s software development process What does the week look like? 
    This course is a mix of lecture, demonstration, group discussion, simulation, and hands-on software development. The bulk of the course will be spent working as a team on a case study application delivering increments of new functionality in mini-sprints. Here is the week at a glance: Monday morning and most of the day Friday will be spent with the computers powered off, so you can focus on sharpening your game of Scrum and avoiding the common pitfalls when implementing it. The Sprints Timeboxing is a critical concept in Scrum as well as in this course. We expect each team and student to understand and obey all of the timeboxes. The timebox duration will always be clearly displayed during each activity. Expect the instructor to enforce it. Each of the ½-day sprints will roughly follow this schedule (component – description – minutes):
        Instruction – Presentation and demonstration of new and relevant tools & practices – 60
        Sprint planning meeting – Product owner presents backlog; each team commits to delivering functionality – 10
        Sprint planning meeting – Each team determines how to build the functionality – 10
        The Sprint – The team self-organizes and self-manages to complete their tasks – 120
        Sprint Review meeting – Each team will present their increment of functionality to the other teams – 30
        Sprint Retrospective – A group retrospective meeting will be held to inspect and adapt – 10
    Each team is expected to self-organize and manage their own work during the sprint. Pairing is highly encouraged. The instructor/product owner will be available if there are questions or impediments, but will be hands-off by default. You should be prepared to communicate and work with your team members in order to achieve your sprint goal. If you have development-related questions or get stuck, your partner or team should be your first level of support. Module 1: INTRODUCTION This module provides a chance for the attendees to get to know the instructors as well as each other. The Professional Scrum Developer program, as well as the day-by-day agenda, will be explained. Finally, the Scrum team will be selected and assembled so that the forming, storming, norming, and performing can begin. Trainer and student introductions Professional Scrum Developer program Agenda Logistics Team formation Retrospective Module 2: SCRUMDAMENTALS This module provides a level-setting understanding of the Scrum framework including the roles, timeboxes, and artifacts. The team will then experience Scrum firsthand by simulating a multi-day sprint of product development, including planning, review, and retrospective meetings. Scrum overview Scrum roles Scrum timeboxes (ceremonies) Scrum artifacts Simulation Retrospective It’s required that you read Ken Schwaber’s Scrum Guide in preparation for this module and course. Module 3: IMPLEMENTING SCRUM IN VISUAL STUDIO 2010 This module demonstrates how to implement Scrum in Visual Studio 2010 using a Scrum process template*. The team will learn the mapping between the Scrum concepts and how they are implemented in the tool. After connecting to the shared Team Foundation Server, the team members will then return to the simulation – this time using Visual Studio to manage their product development. Mapping Scrum to Visual Studio 2010 User Story work items Task work items Bug work items Demonstration Simulation Retrospective Module 4: THE CASE STUDY In this module the team is introduced to their problem domain for the week. 
    A kickoff meeting by the Product Owner (the instructor) will set the stage for the why and the what of the upcoming sprints. The team will then define the quality attributes of the project and their definition of “done.” The legacy application code will be downloaded, built, and explored, so that any bugs can be discovered and reported. Introduction to the case study Download the source code, build, and explore the application Define the quality attributes for the project Define “done” How to file effective bugs in Visual Studio 2010 Retrospective Module 5: HOTFIX This module drops the team directly into a Brownfield (legacy) experience by forcing them to analyze the existing application’s architecture and code in order to locate and fix the Product Owner’s high-priority bug(s). The team will learn best practices around finding, testing, fixing, validating, and closing a bug. How to use Architecture Explorer to visualize and explore the existing code Create a unit test to validate the existence of a bug Find and fix the bug Validate and close the bug Retrospective Module 6: PLANNING This short module introduces the team to release and sprint planning within Visual Studio 2010. The team will define and capture their goals as well as other important planning information. Release vs. Sprint planning Release planning and the Product Backlog Product Backlog prioritization Acceptance criteria and tests Sprint planning and the Sprint Backlog Creating and linking Sprint tasks Retrospective At this point the team will have the knowledge of Scrum, Visual Studio 2010, and the case study application to begin developing increments of potentially shippable functionality that meet their definition of done. Module 7: EMERGENT ARCHITECTURE This module introduces the architectural practices and tools a team can use to develop a valid design on which to build new functionality. The teams will learn how Scrum supports good architecture and design practices. After the discussion, the teams will be presented with the product owner’s prioritized backlog so that they may select and commit to the functionality they can deliver in this sprint. Architecture and Scrum Emergent architecture Principles, patterns, and practices Visual Studio 2010 modeling tools UML and layer diagrams SPRINT 1 Retrospective Module 8: TEST DRIVEN DEVELOPMENT This module introduces Test Driven Development as a design tool and how to implement it using Visual Studio 2010. To maximize productivity and quality, a Scrum team should set up Continuous Integration to regularly build every team member’s code changes and run regression tests. Refactoring will also be defined and demonstrated in combination with Visual Studio’s Test Impact Analysis to efficiently re-run just those tests which were impacted by refactoring. Continuous integration Team Foundation Build Test Driven Development (TDD) Refactoring Test Impact Analysis SPRINT 2 Retrospective Module 9: AGILE DATABASE DEVELOPMENT This module lets the SQL Server database developers in on a little secret – they can be agile too. By using the database projects in Visual Studio 2010, the database developers can join the rest of the team. The students will see how to apply Agile database techniques within Visual Studio to support the SQL Server 2005/2008/2008R2 development lifecycle. 
    Agile database development Visual Studio database projects Importing schema and scripts Building and deploying Generating data Unit testing SPRINT 3 Retrospective Module 10: SHIP IT Teams need to know that just because they like the functionality doesn’t mean the Product Owner will. This module revisits acceptance criteria as it pertains to acceptance testing. By refining acceptance criteria into manual test steps, team members can execute the tests, recording the results and reporting bugs in a number of ways. Manual tests will be defined and executed using the Microsoft Test Manager tool. As the Sprint completes and an increment of functionality is delivered, the team will also learn why and when they should create a branch of the codeline. Acceptance criteria Testing in Visual Studio 2010 Microsoft Test Manager Writing and running manual tests Branching SPRINT 4 Retrospective Module 11: OVERCOMING DYSFUNCTION This module introduces the many types of people, process, and tool dysfunctions that teams face in the real world. Many dysfunctions and scenarios will be identified, along with ideas and discussion for how a team might mitigate them. This module will enable you and your team to move toward independence and improve your game of Scrum when you depart class. Scrum-butts and flaccid Scrum Best practices working as a team Team challenges ScrumMaster challenges Product Owner challenges Stakeholder challenges Course Retrospective What will be expected of you and your team? This is a unique course in that it’s technically focused, team-based, and employs timeboxes. It demands that the members of the teams self-organize and self-manage their own work to collaboratively develop increments of software. All attendees must commit to: Pay attention to all lectures and demonstrations Participate in team and group discussions Work collaboratively with other team members Obey the timebox for each activity Commit to work and do your best to deliver All teams should have these skills: Understanding of Scrum Familiarity with Visual Studio 2010 C#, .NET 4.0 & ASP.NET 4.0 experience* SQL Server 2008 development experience Software testing experience * Check with the instructor ahead of time for the exact technologies Self-organising teams Another unique attribute of this course is that it’s a technical training class being delivered to teams of developers, not pairs, and not individuals. Ideally, your actual software development team will attend the training to ensure that all necessary skills are covered. However, if you wish to attend an open enrolment course alone or with just a couple of colleagues, realize that you may be placed on a team with other attendees. The instructor will do his or her best to ensure that each team is cross-functional to tackle the case study, but there are no guarantees. You may be required to try a new role, learn a new skill, or pair with somebody unfamiliar to you. This is just good Scrum! Who should NOT take this course? 
Because of the nature of this course, as explained above, certain types of people should probably not attend this course: Students requiring command and control style instruction – there are no prescriptive/step-by-step (think traditional Microsoft Learning) labs in this course Students who are unwilling to work within a timebox Students who are unwilling to work collaboratively on a team Students who don’t have any skill in any of the software development disciplines Students who are unable to commit fully to their team – not only will this diminish the student’s learning experience, but it will also impact their team’s learning experience Find a course and register Download this syllabus Download the Scrum Guide Technorati Tags: Scrum,SSW,Pro Scrum Dev

    Read the article

  • Expanding the Oracle Enterprise Repository with functional documentation by Marc Kuijpers

    - by JuergenKress
    Introduction Have you ever experienced the challenge of mapping both your functional and technical assets in one software package? Of finding a software package that is able to describe the metadata about these assets and their mutual relationships? And if you found the correct software package, was it maintainable? The Oracle Enterprise Repository (OER) is a powerful SOA repository. Its core task is to map and visualize the interaction between technical assets generated by the SOA Suite and OSB. However, OER can be configured to not only contain these technical assets, but also to contain functional assets, i.e. functional designs, use cases and a logical data model. Now that’s interesting! OER is able to show all the assets in your system and, if necessary, zoom in on one of the assets and their mutual relationships (Figure 1). This opens a set of doors to powerful features, e.g.: Impact analysis – if a functional design is adjusted, which other functional designs and use cases do I need to adjust? Traceability – if a web service generates an error, in which functional and technical designs is the web service described? This sounds great, but how do we get all the functional and technical documents into OER, and how are we going to keep this repository up-to-date? Read the full article. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: OER,SOA Governance,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Developing geometry-based Web Services for WebLogic | Part 1 by Ronald van Luttikhuizen

    - by JuergenKress
    In a recent project we developed Web Services that expose geographical data in their operations. This blog explains the use case for the service, gives an overview of the software architecture, and briefly discusses GML as markup language for geographical data. Part 2 of this blog provides pointers on the implementation of the service while part 3 discusses the deployment on Oracle WebLogic Server. Use Case The "BAG" (Basisregistratie Adressen en Gebouwen) is a Dutch national database containing information on all addresses and buildings in the Netherlands, and is maintained by Dutch municipalities. For several object types the BAG also maintains the associated geographical location and shape; for example for premises and cities. Read the complete article here. WebLogic Partner Community For regular information become a member in the WebLogic Partner Community please visit: http://www.oracle.com/partners/goto/wls-emea ( OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: Ronald van Luttikhuizen,Vennester,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • /users/tags should contain scores

    - by Sean Patrick Floyd
    I am implementing some simple JavaScript/bookmarklet-based apps that show some reputation info, including the score in the User's top tags (roughly based on this previous bookmarklet of mine). Now I can get a user's top tags (using the API), and I can also get the per-tag score if the user is logged in, by dynamically parsing the tag's top users page. But it costs me one AJAX request per tag and I have to download 10+k to extract a single numeric value. It would save a lot of traffic if the tags in <api>/users/<userid>/tags had a score field. The data seems to be there, after all, the top users pages use it, so it would just be a question of exposing the data. Suggested structure:
        "tags": [
          {
            "name": {
              "description": "name of the tag",
              "values": "string",
              "optional": false,
              "suggested_buffer_size": 25
            },
            "score": {
              "description": "tag score, sum of up votes for answers on non-wiki questions",
              "values": "32-bit signed integer",
              "optional": false
            },
            "count": {
              "description": "tag count, exact meaning depends on context",
              "values": "32-bit signed integer",
              "optional": false
            },
            "restricted_to": {
              "description": "user types that can make use of this tag, lack of this field indicates it is useable by all",
              "values": "one of anonymous, unregistered, registered, or moderator",
              "optional": true
            },
            "fulfills_required": {
              "description": "indicates whether this tag is one of those that is required to be on a post",
              "values": "boolean",
              "optional": false
            },
            "user_id": {
              "description": "user associated with this tag, depends on context",
              "values": "32-bit signed integer",
              "optional": true
            }
          }
        ]
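    For illustration only, a minimal C# sketch (hypothetical class and property names, not part of the actual API) of how a client might model the proposed per-tag entry with Json.NET:

        using System.Collections.Generic;
        using Newtonsoft.Json;

        // Hypothetical DTO matching the suggested "tags" structure above.
        public class UserTag
        {
            [JsonProperty("name")]
            public string Name { get; set; }

            // The proposed addition: sum of up votes for answers on non-wiki questions.
            [JsonProperty("score")]
            public int Score { get; set; }

            [JsonProperty("count")]
            public int Count { get; set; }

            [JsonProperty("user_id")]
            public int? UserId { get; set; }
        }

        // Usage, assuming 'json' holds the "tags" array from the response:
        // List<UserTag> tags = JsonConvert.DeserializeObject<List<UserTag>>(json);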

    Read the article

  • WebLogic history: an interview with Laurie Pitman by Qualogy

    - by JuergenKress
    In all the years that I have been working with WebLogic, the BEA and Oracle eras are the best known, with WebLogic evolving into a worldwide enterprise platform for Java applications used by multinationals around the globe. But how did it all begin? Apart from the sparse info you find on some Internet pages, I was eager to hear it in person from one of the founders of WebLogic back in 1995, before the BEA era: Laurie Pitman. Four young people, Carl Resnikoff, Paul Ambrose, Bob Pasker, and Laurie Pitman, became friends and colleagues around the time of the first release of Java in 1995. Between the four of them, they had an MA in American history, an MA in piano, an MS in library systems, a BS in chemistry, and a BS in computer science. They had come together kind of serendipitously, interested in building some web tools exclusively in Java for the emerging Internet web application market. They found many things to like about each other, some overlap in their interests, but also a lot of well-placed differences which made a partnership particularly interesting. They made it formal in January 1996 by incorporating. Read the complete article here. WebLogic Partner Community For regular information become a member in the WebLogic Partner Community; please visit: http://www.oracle.com/partners/goto/wls-emea (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Wiki Technorati Tags: WebLogic history,Qualogy,WebLogic,WebLogic Community,Oracle,OPN,Jürgen Kress

    Read the article

  • What's the best license for my website?

    - by John Maxim
    I have developed a unique website but do not have a lot of funds to protect it with trademarks or patents. I'm looking for suggestions so that when my supervisor gets my code, the law restricts anyone from copying it and claiming it as their own work. I'm torn between making the application a commercial one and never allowing it to be copied at all. What steps am I required to take to make sure my application is fully protected? I've come across a few licenses; one under consideration is the MIT license. Some say we can have a mixture of both commercial and MIT. I would also like to be able to distribute some functions so they can be modified by others, but I'd still retain the ownership. Last but not least, it's confusing when I think of protecting the whole website versus protecting the code piece by piece. How should we go about this? Thanks. N.B. I have to pass it over to the supervisor as this is a Uni project. I need this to be done within 2 weeks, so time to get my App protected is a factor here.

    Read the article

  • Visual Studio Shortcut: Surround With

    - by Jeff Widmer
    I learned a new Visual Studio keyboard shortcut today that is really awesome: the “Surround With” shortcut.  You can trigger the Surround With context menu by pressing the Ctrl-K, Ctrl-S key combination when on a line of code. Ctrl-K, Ctrl-S means to hold down the Control key, press K, and then, while still holding down the Control key, press S. Here is where this comes in handy: You type a line of code and then realize you need to put it within an if statement block. So you type “if” and hit tab twice to insert the if statement code snippet.  Then you highlight the previous line of code that you typed, and then either drag and drop it into the if-then block or cut and paste it.  That is not too bad but it is a lot of extra key clicks and mouse moves. Now try the same with the Surround With keyboard shortcut.  Just highlight that line of code that you just typed and press Ctrl-K, Ctrl-S and choose the if statement code snippet, hit tab, and POW!... you are done!  No more code moving/indenting required. Here is what the Surround With context menu looks like: Just arrow up or down inside the drop-down list to the code snippet that you want to surround your currently selected text with.  Did I mention this is AWESOME! Now it is so simple to surround lines of code with an if-then block or a try-catch-finally block... things that usually took several key clicks and maybe one or two mouse moves. And this works in both Visual Studio 2008 and Visual Studio 2010, which means it has been around for a long time and I never knew about it.   Technorati Tags: Visual Studio Keyboard Shortcut
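    For example, here is a small self-contained C# sketch (hypothetical code, not from the original post) showing the before/after effect of surrounding a statement with the “if” snippet:

        using System;

        class SurroundWithDemo
        {
            static void Main()
            {
                bool hasPendingChanges = true;   // placeholder condition for the example

                // Before: the statement below was typed on its own.
                // After selecting it, pressing Ctrl-K, Ctrl-S and choosing the "if"
                // snippet, Visual Studio wraps and re-indents it automatically:
                if (hasPendingChanges)
                {
                    Console.WriteLine("Saving changes...");
                }
            }
        }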

    Read the article

  • Principles of Service-Oriented Architecture by Douwe P. van den Bos

    - by JuergenKress
    Today I hosted a session on the Principles of Service-Oriented Architecture for my colleagues at Capgemini. It was a very interesting session because everyone had a very clear view of the Oracle SOA Suite and a technical background. What we wanted to do was create a common view of what a Service-Oriented Architecture is, what benefits can be achieved, and what is needed to create one. During this very interactive session we moved from a purely technology view of the matter (Oracle SOA Suite) to an architectural view slicing from business to technology. And this is where SOA really kicks in, because it is a philosophy. Here is the presentation on SlideShare: Principles of Service-Oriented Architecture. Read also The Maturity of a Service-Oriented Architecture & SOA Maturity Models. Twitter & LinkedIn SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA Governance,SOA Community,Oracle SOA,Oracle BPM,BPM Community,OPN,Jürgen Kress,Douwe P. van den Bos

    Read the article

  • JMS Adapter Step 0 : Configuring the WLS-JMS resources

    - by [email protected]
    Before getting started with the JMS Adapter, we must configure the connection factories and JMS queues on the WLS admin console. In particular, we will be required to follow these steps: (1) create a connection factory (in our case, an "XA Connection Factory"; this step is mandatory if you need your JMS queues to participate in a global transaction) and (2) create the WLS JMS queues. Creating the connection factory:
        1) Log in to the WLS Admin console. On my setup, the url looks like "http://localhost:7001/console".
        2) Select Services -> Messaging -> JMS Modules -> SOAJMSModule as shown below. We can also create a new JMS Module, but I took the easier way out by selecting the SOAJMSModule.
        3) Click on "New" as shown in order to create the Connection Factory.
        4) Select the "Connection Factory" radio button and click "Next".
        5) Enter the Connection Factory properties as shown and click on "Finish".
        6) Target the connection factory to your managed server and click on "Finish".
        7) Now, go back and select the Connection Factory that you've just created (see Step 2 above). Click on "Transactions", enable XA and click on "Save".

    Read the article

  • Ameristar Wins with Oracle GoldenGate’s Heterogeneous Real-Time Data Integration

    - by Irem Radzik
    Today we issued a press release about another successful project with Oracle GoldenGate. This time at Ameristar. Ameristar is a casino gaming company and needed a single data integration solution to connect multiple heterogeneous systems to its Teradata data warehouse. The project involves integration of Ameristar’s promotional and gaming data from 14 data sources across its 7 casino hotel properties in real time into a central Teradata data warehouse. The source systems include the Aristocrat gaming and MGT promotional management platforms running on Microsoft SQL Server 2000 databases. As you may notice, there was no Oracle Database involved in this project, but Ameristar’s IT leadership knew that GoldenGate’s strong heterogeneous and real-time data integration capabilities were the right technology for their data warehousing project. With GoldenGate, Ameristar was able to reduce data latency to the enterprise data warehouse and put this real-time customer information in the hands of its marketing teams to improve the overall customer experience. Ameristar customers receive more targeted and timely campaign offers, and the company has more up-to-date visibility into its financial metrics. One other key benefit the company experienced with GoldenGate is in operational costs. The previous data capture solution Ameristar used was trigger-based and required a lot of effort to manage. They needed dedicated IT staff to maintain it. With GoldenGate, the solution runs seamlessly without needing a fully-dedicated staff, giving the IT team at Ameristar more resources for their other IT projects. If you want to learn more about GoldenGate and the latest features for Oracle Database and non-Oracle databases, please watch our on-demand webcast about Oracle GoldenGate 11g Release 2.

    Read the article

  • SPARC64 VII+ Processor Core License Factor Reduced by 33%

    - by john.shell
    The Oracle processor core license factor has been a popular topic the last few months.  For those partners new to Oracle software licensing, the processor core license factor determines the number of licensed CPUs that are required when running Oracle software (those products charged on a per-CPU basis) on multi-core processors. My last entry talked about the core factor reduction for our T3 processor.  The core license factor for our newly announced SPARC64 VII+ processor is 0.5, which is a 33% reduction from the 0.75 rate used with our SPARC64 VI and VII processors. What does this mean for our partners?  Increased opportunity.  This change, similar to our T3-based systems, means that our hardware is the preferred platform for Oracle software. Still a little dizzy on the breadth of Oracle's software offering?  Do a simple scan of Oracle's software price lists. Consider this your target market. This change allows you to focus on total solution price or price/performance, not server prices or per-core performance (a standard IBM sales tactic). That's the offensive side of the game.  Don't forget your defense.  One of the biggest customer benefits around the M-Series is investment protection.  The combination of a simple processor/board upgrade, along with a reduction in processor core license factor, makes upgrading one of the best financial moves for our customers.  One reminder.  The update to the processor core license factor only applies to the new VII+ processor - NOT the SPARC64 VI or VII processors.  You can find the official table here.
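    As a worked example (a hypothetical configuration, not from the original post): a server with 16 SPARC64 VII+ cores now requires 16 × 0.5 = 8 processor licenses, whereas the same 16 cores under the old 0.75 factor required 16 × 0.75 = 12, i.e. one third fewer licenses with the new factor.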

    Read the article

  • OpenGL ES Shader help (Blending)

    - by Chris
    Earlier I required assistance getting to grips with how to retain the alpha channel of a transparent texture in my colourised texture shader program. Whilst playing with that first version of my program (before obtaining the solution to my first requirement), I managed to enable transparency for the whole texture (effectively blending via GLSL), and I quite liked this. I would now like to know if and how it is possible to retain this blending effect on top of the existing output, without affecting the original alpha channel, as I don't know how to apply this transparency via the parameter that already carries the texture's alpha channel. A basic example of the blending program I am referring to (minus any other functionality) is as follows...
        varying vec2 texCoord;
        uniform sampler2D texSampler;
        void main() {
            gl_FragColor = vec4(texture2D(texSampler,texCoord).xyz, 0.5);
        }
    Where 0.5 is the transparency (blending effect) of the whole texture. This is the current version of my program, which provides the ability to colour a texture according to the colour parameter passed to the program, and retains the alpha channel of the original texture.
        varying vec2 texCoord;
        uniform sampler2D texSampler;
        uniform vec3 colour;
        void main() {
            gl_FragColor = vec4(colour,1) * vec4(texture2D(texSampler,texCoord).xyz, texture2D(texSampler,texCoord).w);
        }
    I need to know if it is possible to apply transparency on top of this program, without affecting the original alpha channel which I have already preserved. I hope this makes enough sense; I am sure it is possible, and if so I should imagine it is rather simple, but this has me stumped. Any help much appreciated. Cheers, Chris

    Read the article

  • Google CDN or Akamai

    - by AlienWebguy
    We combine and minify our JS with Minify - the combined JS is then cached on Akamai. I'm suggesting to my supervisor that we put jQuery and jQuery UI on the Google CDN and take them out of our combined JS. The benefits I see are parallel downloads, a significantly smaller combined file cached on Akamai, and the high likelihood that the user will already have jQuery from the Google CDN cached in his/her browser when he/she visits our site, so no download will be required at all. I also pointed out how pointing to the major-version CDN URL //ajax.googleapis.com/ajax/libs/jquery/1/jquery.min.js will eliminate the need for us to micro-manage our jQuery version control, and that if a new release introduces a bug, we simply point the CDN URL to a stable minor version of our choice until the issue is resolved. My supervisor disagrees and thinks keeping it on Akamai is the way to go. Any insight as to which is going to be faster here? I looked for some benchmarks and references online but they are either outdated or merely talk about Google wanting to acquire Akamai. EDIT: Some further research has pointed me to an article mentioning how 'latest version' CDN URLs use short Expires headers, so it might be more optimal to use /jquery/1.7/. I'm fine with this - the general question still remains.

    Read the article

  • New Demos SOA Suite (11.1.1.6) & SOA Suite Foundation Pack (11.1.1.6)

    - by JuergenKress
    For access to the Oracle demo systems please visit OPN and talk to your Partner Expert GSE: SOA & FP (11.1.1.6) Platforms Portable Version – Available SOA 11g Platform FP 11g Platform All SOA/BPM 11g Solutions OFM Demos Corner GSE Offerings Scheduling Demos on GSE Support GSE is pleased to announce the availability of SOA and Foundation Pack 11g (11.1.1.6) Platform Portable images. Portable images now come as a VBox appliance. SOA 11.1.1.6 Platform Portable Version This portable image comes with the latest SOA Suite products installed and configured. The VBox appliance facilitates easy maintenance of the image. Click here to download the portable image. FP 11.1.1.6 Platform Portable Version Foundation Pack is installed and configured on the SOA image and stands as a base for building cross-application integrations. Click here to download the portable image. In addition to the portable images, Global Sales Engineering would like to announce the availability of a hosted version of the SOA & BPM 11g (11.1.1.6) Solutions. Click here for more information. SOA Suite Foundation Pack Demo Demo Overview Business Process Artifacts Demo Architecture Bill of Materials Demo Collateral DSS Offerings OFM Demos Corner Scheduling Demos on DSS DSS Support The Foundation Pack (FP) demo showcases various tools and utilities of Foundation Pack, like Project Lifecycle Workbench (PLW) JDeveloper - Service Constructor Harvesting services to PLW/ Oracle Enterprise Repository Generation of Bill of Materials (BOM) Creation of Deployment Plans / Harvestor Settings Track Foundation Pack Fusion Order demo flow in Enterprise Manager Console For more information on the demo click here. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: SOA DEmo System,DSS,SOA,sales,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • Questions, Knowledge Checks and Assessments

    - by ted.henson
    Questions should be used to reinforce concepts throughout the title. You have the option to include questions in the course, in assessments, in Knowledge Checks, or in any combination. Questions are required for creating Knowledge Checks and assessments. It is important to remember that questions that are not in assessments are not tracked. Be sure to structure your outline so that questions are added to the appropriate assignable unit. I usually recommend that questions appear directly below their related section. This serves two purposes. First, it helps ensure that the related content and question stay close to one another. Secondly, it ensures that when the "link to subject" option is used it will relate back to the relevant content. Knowledge Checks are created using the questions that have been added to the related assignable unit. Use Knowledge Checks to give users an additional opportunity to review what they have learned. A Knowledge Check allows users to check their own knowledge without being tracked or scored. Many users like having this self-check option, especially if they know they are going to be tested later. Each assignable unit can have its own Knowledge Check. Assessments provide a way to measure knowledge or understanding of the course material. The results of each assessment are scored and tracked. Assessments are created using the questions that have been added to the relevant assignable unit(s). Each assignable unit, including the Title AU, can have multiple assessments. Consider how your knowledge paths will be structured when planning your assessments. For instance, you can create a multiple-activity knowledge path, with multiple assessments from the same title or assignable unit. Also remember, in Manager an assessment can be either a pre- or post-assessment. Pre-assessments allow the student to discover what is already known about a specific topic or subject, which is important if the personal course feature is being used. Post-assessments allow you to test the student's knowledge or understanding after they have completed the material.

    Read the article

  • Partnering with your Applications – The Oracle AppAdvantage Story

    - by JuergenKress
    So, what is Oracle AppAdvantage? A practical approach to adopting cloud, mobile, social and other trends A guided path to aligning IT more closely with business objectives Maximizing the value of existing investments in applications A layered approach to simplifying IT, building differentiation and bringing innovation All of the above? Enhance the value of your existing applications investment with #Oracle #AppAdvantage Aligning biz and IT expectations on Simplifying IT, building Differentiation and Innovation #AppAdvantage Adopt a pace layered approach to extracting biz value from your apps with #AppAdvantage Bringing #cloud, #social, #mobile to your apps with #Oracle #AppAdvantage Embracing Situational IT In the next IT Leaders Editorial, Rick Beers discusses the necessity of IT disruption and #AppAdvantage. Rick Beers sheds light on the Situational Leadership and the path to success #AppAdvantage. Rick Beers draws parallels with CIO’s strategic thinking and #Oracle #AppAdvantage approach. Do you have this paper in your summer reading list? Aligning biz and IT #AppAdvantage What does Situational leadership have to do with Oracle AppAdvantage? Catch the next piece in Rick Beers’ monthly series of IT Leaders Editorial and find out. #AppAdvantage Middleware Minutes with Howard Beader – August edition In the quarterly column, @hbeader discusses impact of #cloud, #mobile, #fastdata on #middleware Making #cloud, #mobile, #fastdata a part of your IT strategy with #middleware What keeps the #oracle #middleware team busy? Find out in the inaugural post in quarterly update on #middleware Recent #middleware news update along with a preview of things to come from #Oracle, in @hbeader ‘s quarterly column In his inaugural post, Howard Beader, senior director for Oracle Fusion Middleware, discusses the recent industry trends including mobile, cloud, fast data, integration and how these are shaping the IT and business requirements. SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Facebook Wiki Mix Forum Technorati Tags: AppAdvantage,SOA Community,Oracle SOA,Oracle BPM,Community,OPN,Jürgen Kress

    Read the article

  • again again again…. it is Oracle Open World 2012

    - by JuergenKress
    Again… again I crashed my knee during kite surfing. Again the right knee, again the outside meniscus, again the same doctor, again the same operation, again they could sew my meniscus, again the same physiotherapy… again I will miss OOW. OOW session you should not miss Oracle PartnerNetwork Exchange Middleware stream Focus on SOA and BPM Focus on BPM For OFM Partner Advisory Councils please contact [email protected] Keynotes and General sessions to attend: Thomas Kurian: Tuesday, October 2 8:45 a.m. 9:45 a.m., Moscone North, Hall D Hasan Rizvi: General session middleware: Tuesday, October 3 10:15 am 11:15 am, Moscone North, Hall D If you can’t make it to San Francisco watch the keynotes live on-demand Tips and tricks for OOW Plan your visit well in advance! Which keynotes & session do you want to attend? Demo Grounds are highly recommended and the best of OOW! Which 1:1 meetings do you want to arrange? Attend a Partner or Customer Advisory Council? Attend a Country or Community Reception? Attire during OOW: casual clothing, comfortable shoes and light luggage! Do not forget to drink water. Sign an international travel and health insurance before you leave home! What we want from you! Send your tweets: twitter.com/soacommunity @soacommunity and share your pictures at http://www.facebook.com/soacommunity SOA & BPM Partner Community For regular information on Oracle SOA Suite become a member in the SOA & BPM Partner Community for registration please visit  www.oracle.com/goto/emea/soa (OPN account required) If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum Technorati Tags: OOW,Oracle Open World,SOA Community,Oracle SOA,Oracle BPM,BPM,Community,OPN,Jürgen Kress

    Read the article

  • Failure to troubleshoot a juju charm deployment

    - by Bruno Pereira
    My environments.yaml looks like this:
        environments:
          test:
            type: local
            control-bucket: juju-a14dfae3830142d9ac23c499395c2785999
            admin-secret: 6608267bbd6b447b8c90934167b2a294999
            default-series: oneiric
            juju-origin: distro
            data-dir: /home/bruno/projects/juju
    juju bootstrap runs perfectly:
        2011-11-22 19:19:31,999 INFO Bootstrapping environment 'test' (type: local)...
        2011-11-22 19:19:32,004 INFO Checking for required packages...
        2011-11-22 19:19:33,584 INFO Starting networking...
        2011-11-22 19:19:34,058 INFO Starting zookeeper...
        2011-11-22 19:19:34,283 INFO Starting storage server...
        2011-11-22 19:19:40,051 INFO Initializing zookeeper hierarchy
        2011-11-22 19:19:40,247 INFO Starting machine agent (origin: distro)...
        [sudo] password for bruno:
        2011-11-22 19:23:16,054 INFO Environment bootstrapped
        2011-11-22 19:23:16,079 INFO 'bootstrap' command finished successfully
    Deploy from a known good charm is accepted (tried it with one that I am trying to create):
        juju deploy --repository=/home/bruno/projects/charms_repo/ local:teamspeak
        2011-11-22 19:28:49,929 INFO Charm deployed as service: 'teamspeak'
        2011-11-22 19:28:49,962 INFO 'deploy' command finished successfully
    After this I can see that juju debug-log shows activity and I can see the network indicator going on and off and activity on my hard-disk. Wait... Looking at juju status I get:
        services:
          teamspeak:
            charm: local:oneiric/teamspeak-1
            relations: {}
            units:
              teamspeak/0:
                machine: 0
                public-address: 192.168.122.226
                relations: {}
                state: start_error
    juju debug-log does not help and I have no files under /var/log/juju or /var/lib/juju. The last juju debug-log only shows this:
        2011-11-22 19:45:20,790 Machine:0: juju.agents.machine DEBUG: Units changed old:set(['wordpress/0']) new:set(['wordpress/0', 'teamspeak/0'])
        2011-11-22 19:45:20,823 Machine:0: juju.agents.machine DEBUG: Starting service unit: teamspeak/0 ...
        2011-11-22 19:45:21,137 Machine:0: juju.agents.machine DEBUG: Downloading charm local:oneiric/teamspeak-1 to /home/bruno/projects/juju/bruno-test/charms
        2011-11-22 19:45:22,115 Machine:0: juju.agents.machine DEBUG: Starting service unit teamspeak/0
        2011-11-22 19:45:22,133 Machine:0: unit.deploy INFO: Creating container teamspeak-0...
        2011-11-22 19:47:04,586 Machine:0: unit.deploy INFO: Container created for teamspeak/0
        2011-11-22 19:47:04,781 Machine:0: unit.deploy DEBUG: Charm extracted into container
        2011-11-22 19:47:04,801 Machine:0: unit.deploy DEBUG: Starting container...
        2011-11-22 19:47:07,086 Machine:0: unit.deploy INFO: Started container for teamspeak/0
        2011-11-22 19:47:07,107 Machine:0: juju.agents.machine INFO: Started service unit teamspeak/0
    How can I troubleshoot what is happening here?

    Read the article

  • Entity Framework & Transactions

    - by Sudheer Kumar
    There are many instances where we might have to use transactions to maintain data consistency. With Entity Framework, it is a little different conceptually. Case 1 – Transaction between multiple SaveChanges() calls: here, if you just use a transaction scope, then Entity Framework (EF) will use distributed transactions instead of local transactions. The reason is that EF opens and closes the connection only when required, which means it may use two different connections for different SaveChanges() calls. To resolve this, use the following method. Here we are opening the connection explicitly so that the transaction does not span multiple connections.
        try
        {
            using (TransactionScope ts = new TransactionScope())
            {
                context.Connection.Open();
                // Operation 1: context.SaveChanges();
                // Operation 2: context.SaveChanges();
                ts.Complete();
            }
        }
        catch (Exception ex)
        {
            // Handle exception
        }
        finally
        {
            // At the end close the connection
            if (context.Connection.State == ConnectionState.Open)
            {
                context.Connection.Close();
            }
        }
    Case 2 – Transaction between DB and non-DB operations: for example, assume that you have a table that keeps track of emails to be sent. Here you want to update certain details, like DateSent, only if the mail was successfully sent by the e-mail client.
        Email eml = GetEmailToSend();
        eml.DateSent = DateTime.Now;
        using (TransactionScope ts = new TransactionScope())
        {
            // Update DB
            context.SaveChanges();
            // If the update is successful, send the email using the SMTP client
            smtpClient.Send();
            // If the send was successful, then commit
            ts.Complete();
        }
    Here, since you are dealing with a single context.SaveChanges(), you just need to wrap the TransactionScope around the point where you save the context. Hope this was helpful!
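    To make Case 1 reusable, here is a minimal sketch (assuming an EF 4 ObjectContext named context; the helper type and method names are my own, not from the original post) that wraps the pattern in a helper:

        using System;
        using System.Data;
        using System.Data.Objects;
        using System.Transactions;

        static class TransactionHelper
        {
            // Runs 'work' (which may call SaveChanges() several times) inside one
            // local transaction by keeping a single connection open throughout.
            public static void ExecuteInTransaction(ObjectContext context, Action work)
            {
                using (TransactionScope ts = new TransactionScope())
                {
                    try
                    {
                        context.Connection.Open();
                        work();
                        ts.Complete();
                    }
                    finally
                    {
                        if (context.Connection.State == ConnectionState.Open)
                        {
                            context.Connection.Close();
                        }
                    }
                }
            }
        }

        // Usage: TransactionHelper.ExecuteInTransaction(context, () =>
        // {
        //     // Operation 1: context.SaveChanges();
        //     // Operation 2: context.SaveChanges();
        // });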

    Read the article

  • SQL SERVER – 2012 RC0 Various Resources and Downloads

    - by pinaldave
    Microsoft SQL Server 2012 Release Candidate 0 (RC0) Microsoft SQL Server 2012 RC0 enables a cloud-ready information platform that will help organizations unlock breakthrough insights across the organization. Microsoft SQL Server 2012 Express RC Microsoft SQL Server 2012 Express RC0 is a powerful and reliable free data management system that delivers a rich set of features, data protection, and performance for embedded applications, lightweight Web Sites, applications, and local data stores. Microsoft SQL Server 2012 Semantic Language Statistics RC0 The Semantic Language Statistics Database is a required component for the Statistical Semantic Search feature in Microsoft SQL Server 2012 Semantic Language Statistics RC0. Microsoft SQL Server 2012 Release Candidate 0 (RC0) Manageability Tool Kit The Microsoft SQL Server 2012 Release Candidate 0 (RC0) Manageability Tool Kit is a collection of stand-alone packages which provide additional value for Microsoft SQL Server 2012 Release Candidate 0 (RC0). Microsoft SQL Server 2012 PowerPivot for Microsoft Excel 2010 Release Candidate 0 (RC0) Microsoft PowerPivot for Microsoft Excel 2010 provides ground-breaking technology; fast manipulation of large data sets, streamlined integration of data, and the ability to effortlessly share your analysis through Microsoft SharePoint Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Database, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • Permission issues cause Unity segmentation fault

    - by Dj Gilcrease
    I upgraded from 13.04 to 13.10 and I can boot to the unity-greeter and log in just fine, but after login I just get a black screen with a cursor. I have tried following http://help.ubuntu.com/community/BinaryDriverHowto/ATI http://help.ubuntu.com/community/RadeonDriver and the manual install of the downloaded AMD drivers. All have the same effect. Also I have read Black screen after login with cursor I get a black screen after logging in ubuntu 13.04 black screen after login Ubuntu 13.10 - Black screen after login session Ubuntu 13.04 - Black screen with unresponsive cursor Upgrade to Ubuntu 13.04 Problem - Boots into Blank Black Screen All of which were no further help than the two support articles about ATI drivers. So I switched back to the default drivers and went a little further into debugging: when I do ctrl+alt+F1 and login and try unity --debug > unity_start.log then ctrl+alt+F8, the screen stays black with a cursor, and when I switch back with ctrl+alt+F1 the contents of the log output are http://pastebin.com/rdQG4Hb0 However when I try sudo unity --debug > unity_start_root.log then ctrl+alt+F8, unity starts and the output of the log is http://pastebin.com/Yv4RD2j7 The fact that it starts as root tells me it is either a permissions issue on some required file or there is some setting that is specific to my user that is causing the SIGSEGV. So to narrow this down I activated the guest account and tried to log in and got the same black screen with only a mouse cursor, so this tells me that it is not a configuration issue, but a permissions issue, so how do I narrow down which file has the wrong permissions? Also is there anything further that may help debug this issue? OK, after a few more hours of googling I found that if I add myself to the video group I can log in and see the desktop, but there are lots of other permission-related issues, so I am thinking something went wonky with PolicyKit during the upgrade. Is there a way to reset PolicyKit settings for a user?

    Read the article

  • unmet dependencies and broken count>0 problem

    - by Simon
    I tried installing fbreader, following all the steps, but ended up with unmet dependencies. I also think a file is referenced in two locations at once, which is killing it. Any ideas how I can fix it? I've done a lot of research and tried:
        simon@simon-Studio-1558:~$ sudo apt-get -f install
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        Correcting dependencies... Done
        The following packages were automatically installed and are no longer required:
          dkms patch
        Use 'apt-get autoremove' to remove them.
        The following extra packages will be installed:
          libzlcore0.12
        The following NEW packages will be installed:
          libzlcore0.12
        0 upgraded, 1 newly installed, 0 to remove and 61 not upgraded.
        6 not fully installed or removed.
        Need to get 0 B/270 kB of archives.
        After this operation, 811 kB of additional disk space will be used.
        Do you want to continue [Y/n]? y
        (Reading database ... 179860 files and directories currently installed.)
        Unpacking libzlcore0.12 (from .../libzlcore0.12_0.12.10dfsg-4_i386.deb) ...
        dpkg: error processing /var/cache/apt/archives/libzlcore0.12_0.12.10dfsg-4_i386.deb (--unpack):
         trying to overwrite '/usr/lib/libzlcore.so.0.12.10', which is also in package libzlcore 0.12.10-1
        No apport report written because MaxReports is reached already
        dpkg-deb: error: subprocess paste was killed by signal (Broken pipe)
        Errors were encountered while processing:
         /var/cache/apt/archives/libzlcore0.12_0.12.10dfsg-4_i386.deb
        E: Sub-process /usr/bin/dpkg returned an error code (1)
    Sorry for the formatting, but it basically isn't liking:
        dpkg: error processing /var/cache/apt/archives/libzlcore0.12_0.12.10dfsg-4_i386.deb (--unpack):
         trying to overwrite '/usr/lib/libzlcore.so.0.12.10', which is also in package libzlcore 0.12.10-1
    Any ideas? Also I don't care about keeping the program, but the error is stopping sudo apt-get remove fbreader from working too.

    Read the article

  • BPMN is dead, long live BPEL!

    - by JuergenKress
    “BPMN is dead, long live BPEL” was the title of our panel discussion during the SOA & BPM Integration Days 2011. At JAXenter my discussion summary was just published (in German). If you want to learn more about SOA & BPM, make sure you register for our upcoming conference on October 12th & 13th 2011 in Düsseldorf. The speakers include the top SOA and BPM experts in Germany: Thilo Frotscher & Kornelius Fuhrer & Björn Hardegen & Nicolai Josuttis & Michael Kopp & Dr. Dirk Krafzig & Jürgen Kress & Frank Leymann & Berthold Maier & Hajo Normann & Max J. Pucher & Bernd Rücker & Dr. Gregor Scheithauer & Danilo Schmiedel & Guido Schmutz & Dirk Slama & Heiko Spindler & Volker Stiehl & Bernd Trops & Clemens Utschig-Utschig & Tammo van Lessen & Dr. Hendrik Voigt & Torsten Winterberg  For details please become a member in the SOA Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). Blog Twitter LinkedIn Mix Forum Wiki Website

    Read the article
