Search Results

Search found 2780 results on 112 pages for 'role'.


  • MVC Automatic Menu

    - by Nuri Halperin
    An ex-colleague of mine used to call his SQL script generator "Super-Scriptmatic 2000". It impressed our then boss little, but was fun to say and use. We called every batch job and script "something 2000" from that day on. I'm tempted to call this one Menu-Matic 2000, except it's waaaay past 2000. Oh well.
The problem: I'm developing a bunch of stuff in MVC. There's no PM to generate mounds of requirements and there's no UX Architect to create wireframes. During development, things change. Specifically, actions get renamed, moved from controller x to y, etc. As the site grows, it becomes a major pain to keep a static menu up to date, because the links change. The HtmlHelper doesn't live up to its name and provides little help. How do I keep this growing list of pesky little forgotten actions reined in?
The general plan is: Decorate every action you want as a menu item with a custom attribute. Reflect out all menu items into a structure at load time. Render the menu as CSS-friendly <ul><li> HTML.
The MvcMenuItemAttribute decorates an action, designating it to be included as a menu item: [AttributeUsage(AttributeTargets.Method, AllowMultiple = true)] public class MvcMenuItemAttribute : Attribute { public string MenuText { get; set; } public int Order { get; set; } public string ParentLink { get; set; } internal string Controller { get; set; } internal string Action { get; set; } #region ctor public MvcMenuItemAttribute(string menuText) : this(menuText, 0) { } public MvcMenuItemAttribute(string menuText, int order) { MenuText = menuText; Order = order; } internal string Link { get { return string.Format("/{0}/{1}", Controller, this.Action); } } internal MvcMenuItemAttribute ParentItem { get; set; } #endregion }
The MenuText allows overriding the text displayed on the menu. The Order allows the items to be ordered. The ParentLink allows you to make this item a child of another menu item. An example action could then be decorated thusly: [MvcMenuItem("Tracks", Order = 20, ParentLink = "/Session/Index")]. All pretty straightforward, methinks.
The challenge with menu hierarchy becomes fairly apparent when you try to render a menu and highlight the "current" item or render a breadcrumb control. Both encounter an ambiguity if you allow a data source to have more than one menu item with the same URL link. The issue is that there is no great way to tell which link a person clicked. Using the referring URL will fail if a user bookmarked the page. Using some extra query string to disambiguate duplicate URLs essentially changes the links, and also adds a chance of collision with other query parameters. Besides, that smells. The stock ASP.NET sitemap provider simply disallows duplicate URLs. I decided not to, and simply pick the first one encountered as the "current" one. Although it doesn't solve the issue completely – one might say they wanted the second of the two links to be "current" – it allows one to include a link twice (home->deals and products->deals etc.), and the logic of deciding "current" is easy enough to explain to the customer.
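To make the ParentLink relationship concrete, here is a minimal, hypothetical controller showing how a parent and a child item might be declared with the attribute above. The SessionController shape and the Index action are only an illustration built from the "/Session/Index" example in the post; they are not taken from the original code.

using System.Web.Mvc;

public class SessionController : Controller
{
    // Top-level menu item; its Link resolves to "/Session/Index".
    [MvcMenuItem("Sessions", Order = 10)]
    public ActionResult Index()
    {
        return View();
    }

    // Child item: ParentLink points at the parent's resolved link, so the
    // hierarchy builder can wire up ParentItem at registration time.
    [MvcMenuItem("Tracks", Order = 20, ParentLink = "/Session/Index")]
    public ActionResult Tracks()
    {
        return View();
    }
}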
Now that we got that out of the way, let's build the menu data structure: public static List<MvcMenuItemAttribute> ListMenuItems(Assembly assembly) { var result = new List<MvcMenuItemAttribute>(); foreach (var type in assembly.GetTypes()) { if (!type.IsSubclassOf(typeof(Controller))) { continue; } foreach (var method in type.GetMethods()) { var items = method.GetCustomAttributes(typeof(MvcMenuItemAttribute), false) as MvcMenuItemAttribute[]; if (items == null) { continue; } foreach (var item in items) { if (String.IsNullOrEmpty(item.Controller)) { item.Controller = type.Name.Substring(0, type.Name.Length - "Controller".Length); } if (String.IsNullOrEmpty(item.Action)) { item.Action = method.Name; } result.Add(item); } } } return result.OrderBy(i => i.Order).ToList(); } Using reflection, the ListMenuItems method takes an assembly (you will hand it your MVC web assembly) and generates a list of menu items. It digs up all the types, and for each one that is an MVC Controller, digs up the methods. Methods decorated with the MvcMenuItemAttribute get plucked and added to the output list. Again, pretty simple. To make the structure hierarchical, a LINQ expression matches up all the items to their parent: public static void RegisterMenuItems(List<MvcMenuItemAttribute> items) { _MenuItems = items; _MenuItems.ForEach(i => i.ParentItem = items.FirstOrDefault(p => String.Equals(p.Link, i.ParentLink, StringComparison.InvariantCultureIgnoreCase))); } The _MenuItems is simply an internal list to keep things around for later rendering. Finally, to package the menu building for easy consumption: public static void RegisterMenuItems(Type mvcApplicationType) { RegisterMenuItems(ListMenuItems(Assembly.GetAssembly(mvcApplicationType))); } To bring this puppy home, a call in Global.asax.cs Application_Start() registers the menu. Notice the ugliness of reflection is tucked away from the innocent developer. All they have to do is call the RegisterMenuItems() and pass in the type of the application. When you use the new project template, global.asax declares a class public class MvcApplication : HttpApplication and that is why the Register call passes in that type. protected void Application_Start() { AreaRegistration.RegisterAllAreas(); RegisterRoutes(RouteTable.Routes);   MvcMenu.RegisterMenuItems(typeof(MvcApplication)); }   What else is left to do? Oh, right, render! 
public static void ShowMenu(this TextWriter output) { var writer = new HtmlTextWriter(output); renderHierarchy(writer, _MenuItems, null); } public static void ShowBreadCrumb(this TextWriter output, Uri currentUri) { var writer = new HtmlTextWriter(output); string currentLink = "/" + currentUri.GetComponents(UriComponents.Path, UriFormat.Unescaped); var menuItem = _MenuItems.FirstOrDefault(m => m.Link.Equals(currentLink, StringComparison.CurrentCultureIgnoreCase)); if (menuItem != null) { renderBreadCrumb(writer, _MenuItems, menuItem); } } private static void renderBreadCrumb(HtmlTextWriter writer, List<MvcMenuItemAttribute> menuItems, MvcMenuItemAttribute current) { if (current == null) { return; } var parent = current.ParentItem; renderBreadCrumb(writer, menuItems, parent); writer.Write(current.MenuText); writer.Write(" / "); } static void renderHierarchy(HtmlTextWriter writer, List<MvcMenuItemAttribute> hierarchy, MvcMenuItemAttribute root) { if (!hierarchy.Any(i => i.ParentItem == root)) return; writer.RenderBeginTag(HtmlTextWriterTag.Ul); foreach (var current in hierarchy.Where(element => element.ParentItem == root).OrderBy(i => i.Order)) { if (ItemFilter == null || ItemFilter(current)) { writer.RenderBeginTag(HtmlTextWriterTag.Li); writer.AddAttribute(HtmlTextWriterAttribute.Href, current.Link); writer.AddAttribute(HtmlTextWriterAttribute.Alt, current.MenuText); writer.RenderBeginTag(HtmlTextWriterTag.A); writer.WriteEncodedText(current.MenuText); writer.RenderEndTag(); // link renderHierarchy(writer, hierarchy, current); writer.RenderEndTag(); // li } } writer.RenderEndTag(); // ul }
The ShowMenu method renders the menu out to the provided TextWriter. In previous posts I've discussed my partiality to using the well-debugged, time-tested HtmlTextWriter to render HTML rather than writing out angle brackets by hand. In addition, I prefer writing out using the actual writer on the actual stream; generating string and byte intermediaries (yes, StringBuilder being no exception) disturbs me. To carry out the rendering of a hierarchical menu, the recursive renderHierarchy() is used. You may notice that an ItemFilter is called before rendering each item. I figured that at some point one might want to exclude certain items from the menu based on security role or context or something. That delegate is the hook for such a future feature. To render the breadcrumb, recursion is used again, this time simply to unwind the parent hierarchy from the leaf node and then render on the return from the recursion rather than on the way down. I guess I was stuck in LISP that day... recursion is fun, though.
Now all that is left is some usage! Open your Site.Master or wherever you'd like to place a menu or breadcrumb, and plant one of these calls: <% MvcMenu.ShowBreadCrumb(this.Writer, Request.Url); %> to show a breadcrumb trail (notice the lack of "=" after <% and the trailing semicolon), or <% MvcMenu.ShowMenu(Writer); %> to show the menu.
As mentioned before, the HTML output is nested <UL> <LI> tags, which should make it easy to style with CSS to produce anything from static horizontal or vertical menus to dynamic drop-downs.
This has been quite a fun little implementation and I was pleased that the code size remained low. The main crux was figuring out how to pass parent information from the attribute to the hierarchy builder, because attributes have restricted parameter types. Once I settled on that implementation, the rest fell into place quite easily.
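The post leaves ItemFilter as the hook for future role- or context-based filtering. As a rough sketch only – the article never shows the ItemFilter declaration, so the Func<MvcMenuItemAttribute, bool> signature, the static MvcMenu.ItemFilter property, and the "Admin" menu text and role name are all assumptions for illustration – wiring it to role membership might look something like this:

using System;
using System.Web;

public static class MenuFilterConfig
{
    public static void Register()
    {
        // Assumed declaration inside MvcMenu (not shown in the post):
        // public static Func<MvcMenuItemAttribute, bool> ItemFilter { get; set; }

        // Hide the assumed "Admin" menu item from users outside the "Admin" role;
        // every other item is always shown.
        MvcMenu.ItemFilter = item =>
            item.MenuText != "Admin" ||
            (HttpContext.Current.User != null &&
             HttpContext.Current.User.IsInRole("Admin"));
    }
}

Calling MenuFilterConfig.Register() from Application_Start(), next to the existing RegisterMenuItems() call, would be a natural place to hook it up.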

    Read the article

  • Running a simple integration scenario using the Oracle Big Data Connectors on Hadoop/HDFS cluster

    - by hamsun
    Between the elephant (the traditional image of the Hadoop framework) and the Oracle Iron Man (Big Data...), an English setter could be seen as the link to the right data.
Data, data, data: we are living in a world where data technology based on popular applications, search engines, web servers, rich SMS messages, email clients, weather forecasts and so on plays a predominant role in our lives. More and more technologies are used to analyze and track our behavior, to detect patterns, and to offer us "the best/right user experience", from Google Ad services to telco companies or large consumer sites (like Amazon :) ). The more we use all these technologies, the more we generate data, and thus there is a need for huge data marts and specific hardware/software servers (such as the Exadata servers) in order to process, analyze and understand the trends and offer new services to the users. Some of these "data feeds" are raw, unstructured data, and cannot be processed effectively by normal SQL queries. Large-scale distributed processing was an emerging infrastructure need, and the solution seemed to be the "collocation of compute nodes with the data", which in turn led to MapReduce parallel patterns and the development of the Hadoop framework, based on MapReduce and a distributed file system (HDFS) that runs on large clusters of rather inexpensive servers. Several Oracle products use the distributed/aggregation pattern for data calculation (Coherence, NoSQL, TimesTen), so once you are familiar with one of these technologies, let's say with Coherence aggregators, you will find the whole Hadoop/MapReduce concept very similar. Oracle Big Data Appliance is based on the Cloudera Distribution (CDH), and the Oracle Big Data Connectors can be plugged into a Hadoop cluster running the CDH distribution or an equivalent Hadoop cluster. In this paper, a "lab-like" implementation of this concept is done on a single Linux x64 server, running Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 and a single-node Apache Hadoop 1.2.1 HDFS cluster, using the SQL Connector for HDFS.
The whole setup is fairly simple: Install on a Linux x64 server (or VirtualBox appliance) an Oracle Database 11g Enterprise Edition Release 11.2.0.4.0 server. Get the Apache Hadoop distribution from: http://mir2.ovh.net/ftp.apache.org/dist/hadoop/common/hadoop-1.2.1. Get the Oracle Big Data Connectors from: http://www.oracle.com/technetwork/bdc/big-data-connectors/downloads/index.html?ssSourceSiteId=ocomen. Check the Java version of your Linux server with the command: java -version java version "1.7.0_40" Java(TM) SE Runtime Environment (build 1.7.0_40-b43) Java HotSpot(TM) 64-Bit Server VM (build 24.0-b56, mixed mode) Decompress the hadoop-1.2.1.tar.gz file to /u01/hadoop-1.2.1. Modify your .bash_profile: export HADOOP_HOME=/u01/hadoop-1.2.1 export PATH=$PATH:$HADOOP_HOME/bin export HIVE_HOME=/u01/hive-0.11.0 export PATH=$PATH:$HADOOP_HOME/bin:$HIVE_HOME/bin (also see my sample .bash_profile). Set up SSH trust for the Hadoop process; this is a mandatory step. In our case we have to establish a "local trust", as we are using a single-node configuration: copy the new public keys to the list of authorized keys, then connect and test the SSH setup to your localhost. We will run a "pseudo-Hadoop cluster" in what is called "local standalone mode": all the Hadoop Java components run in one Java process, which is enough for our demo purposes.
We need to "fine-tune" some Hadoop configuration files: go to $HADOOP_HOME/conf and modify the files core-site.xml, hdfs-site.xml and mapred-site.xml. Check that the Hadoop binaries are referenced correctly from the command line by executing: hadoop -version. As Hadoop is managing our "clustered HDFS" file system, we have to create "the mount point" and format it; the mount point is declared in core-site.xml. The layout under /u01/hadoop-1.2.1/data will be created and used by the other Hadoop components (MapReduce uses /mapred/..., HDFS uses the /dfs/... layout structure). Format the HDFS Hadoop file system, then start the Java components for the HDFS system. As an additional check, you can use the GUI Hadoop browsers to check the content of your HDFS configuration.
Once our HDFS Hadoop setup is done, you can use the HDFS file system to store data (big data :) ) and move it back and forth to Oracle databases by means of the Big Data Connectors (which is the next configuration step). You can create/use a Hive DB, but in our case we will make a simple integration of "raw data", through the creation of an external table in a local Oracle instance (on the same Linux box we run the one-node Hadoop HDFS cluster and one Oracle DB). Download some public "big data"; I use the site http://france.meteofrance.com/france/observations, from which I can get *.csv files for my big data simulations :). Here is the data layout of my example file. Download the Big Data Connector from the OTN (oraosch-2.2.0.zip) and unzip it to your local file system. Modify your environment in order to access the connector libraries, and make the following test: [oracle@dg1 bin]$./hdfs_stream Usage: hdfs_stream locationFile [oracle@dg1 bin]$ Load the data to the Hadoop HDFS file system: hadoop fs -mkdir bgtest_data hadoop fs -put obsFrance.txt bgtest_data/obsFrance.txt hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$ hadoop fs -ls /user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt [oracle@dg1 bg-data-raw]$hadoop fs -ls hdfs:///user/oracle/bgtest_data/obsFrance.txt Found 1 items -rw-r--r-- 1 oracle supergroup 54103 2013-10-22 06:10 /user/oracle/bgtest_data/obsFrance.txt Check the content of HDFS with the browser UI.
Start the Oracle database, and run the following script in order to create the Oracle database user and the Oracle directories for the Oracle Big Data Connector (dg1 is my own DB SID; replace it with yours): #!/bin/bash export ORAENV_ASK=NO export ORACLE_SID=dg1 .
oraenv sqlplus /nolog <<EOF CONNECT / AS sysdba; CREATE OR REPLACE DIRECTORY osch_bin_path AS '/u01/orahdfs-2.2.0/bin'; CREATE USER BGUSER IDENTIFIED BY oracle; GRANT CREATE SESSION, CREATE TABLE TO BGUSER; GRANT EXECUTE ON sys.utl_file TO BGUSER; GRANT READ, EXECUTE ON DIRECTORY osch_bin_path TO BGUSER; CREATE OR REPLACE DIRECTORY BGT_LOG_DIR as '/u01/BG_TEST/logs'; GRANT READ, WRITE ON DIRECTORY BGT_LOG_DIR to BGUSER; CREATE OR REPLACE DIRECTORY BGT_DATA_DIR as '/u01/BG_TEST/data'; GRANT READ, WRITE ON DIRECTORY BGT_DATA_DIR to BGUSER; EOF
Put the following in a file named t3.sh and make it executable: hadoop jar $OSCH_HOME/jlib/orahdfs.jar \ oracle.hadoop.exttab.ExternalTable \ -D oracle.hadoop.exttab.tableName=BGTEST_DP_XTAB \ -D oracle.hadoop.exttab.defaultDirectory=BGT_DATA_DIR \ -D oracle.hadoop.exttab.dataPaths="hdfs:///user/oracle/bgtest_data/obsFrance.txt" \ -D oracle.hadoop.exttab.columnCount=7 \ -D oracle.hadoop.connection.url=jdbc:oracle:thin:@//localhost:1521/dg1 \ -D oracle.hadoop.connection.user=BGUSER \ -D oracle.hadoop.exttab.printStackTrace=true \ -createTable --noexecute
Then test the creation of the external table with it: [oracle@dg1 samples]$ ./t3.sh ./t3.sh: line 2: /u01/orahdfs-2.2.0: Is a directory Oracle SQL Connector for HDFS Release 2.2.0 - Production Copyright (c) 2011, 2013, Oracle and/or its affiliates. All rights reserved. Enter Database Password:] The create table command was not executed. The following table would be created. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081035-74-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files would be created. osch-20131022081035-74-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt
Then remove the --noexecute flag and create the external Oracle table for the Hadoop data. Check the results: The create table command succeeded. CREATE TABLE "BGUSER"."BGTEST_DP_XTAB" ( "C1" VARCHAR2(4000), "C2" VARCHAR2(4000), "C3" VARCHAR2(4000), "C4" VARCHAR2(4000), "C5" VARCHAR2(4000), "C6" VARCHAR2(4000), "C7" VARCHAR2(4000) ) ORGANIZATION EXTERNAL ( TYPE ORACLE_LOADER DEFAULT DIRECTORY "BGT_DATA_DIR" ACCESS PARAMETERS ( RECORDS DELIMITED BY 0X'0A' CHARACTERSET AL32UTF8 STRING SIZES ARE IN CHARACTERS PREPROCESSOR "OSCH_BIN_PATH":'hdfs_stream' FIELDS TERMINATED BY 0X'2C' MISSING FIELD VALUES ARE NULL ( "C1" CHAR(4000), "C2" CHAR(4000), "C3" CHAR(4000), "C4" CHAR(4000), "C5" CHAR(4000), "C6" CHAR(4000), "C7" CHAR(4000) ) ) LOCATION ( 'osch-20131022081719-3239-1' ) ) PARALLEL REJECT LIMIT UNLIMITED; The following location files were created.
osch-20131022081719-3239-1 contains 1 URI, 54103 bytes 54103 hdfs://localhost:19000/user/oracle/bgtest_data/obsFrance.txt This is the view from SQL Developer, and finally the number of lines in the Oracle table, imported from our Hadoop HDFS cluster: SQL> select count(*) from "BGUSER"."BGTEST_DP_XTAB"; COUNT(*) ---------- 1151
In a future post we will integrate data from a Hive database and try some ODI integrations with the ODI Big Data connector. Our simplistic approach is just a first step to show you how this unstructured data world can be integrated into an Oracle infrastructure. Hadoop, Big Data and NoSQL are great technologies; they are widely used, and Oracle offers a large integration infrastructure based on these services. Oracle University presents a complete curriculum on all the Oracle-related technologies: NoSQL: Introduction to Oracle NoSQL Database Using Oracle NoSQL Database Big Data: Introduction to Big Data Oracle Big Data Essentials Oracle Big Data Overview Oracle Data Integrator: Oracle Data Integrator 12c: New Features Oracle Data Integrator 11g: Integration and Administration Oracle Data Integrator: Administration and Development Oracle Data Integrator 11g: Advanced Integration and Development Oracle Coherence 12c: Oracle Coherence 12c: New Features Oracle Coherence 12c: Share and Manage Data in Clusters Oracle Coherence 12c: Oracle GoldenGate 11g: Fundamentals for Oracle Oracle GoldenGate 11g: Fundamentals for SQL Server Oracle GoldenGate 11g Fundamentals for Oracle Oracle GoldenGate 11g Fundamentals for DB2 Oracle GoldenGate 11g Fundamentals for Teradata Oracle GoldenGate 11g Fundamentals for HP NonStop Oracle GoldenGate 11g Management Pack: Overview Oracle GoldenGate 11g Troubleshooting and Tuning Oracle GoldenGate 11g: Advanced Configuration for Oracle
Other Resources: Apache Hadoop: http://hadoop.apache.org/ is the homepage for these technologies. "Hadoop: The Definitive Guide, 3rd Edition" by Tom White is a classic read for people who want to know more about Hadoop, and some active "googling" will also give you more references.
About the author: Eugene Simos is based in France and joined Oracle through the BEA-WebLogic acquisition, where he worked in Professional Services, Support, and Education for major accounts across the EMEA region. He has worked in the banking sector and with AT&T and telco companies, giving him extensive experience with production environments. Eugene currently specializes in Oracle Fusion Middleware, teaching an array of courses on WebLogic/WebCenter, Content, BPM/SOA/Identity-Security/GoldenGate/Virtualisation/Unified Comm Suite throughout the EMEA region.
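Once the external table is in place, it behaves like any other Oracle table as far as client code is concerned. As a small illustrative follow-up to the walkthrough (not part of the original article): a hedged C# sketch that repeats the final COUNT(*) over ODP.NET. The host, port, service name, BGUSER credentials and table name are the ones used in the example above, while the choice of the Oracle.ManagedDataAccess driver is an assumption on my part.

using System;
using Oracle.ManagedDataAccess.Client; // assumes the ODP.NET managed driver is installed

class ExternalTableCheck
{
    static void Main()
    {
        // Connection details taken from the walkthrough: localhost:1521, service dg1, user BGUSER.
        const string connectionString =
            "User Id=BGUSER;Password=oracle;Data Source=localhost:1521/dg1";

        using (var connection = new OracleConnection(connectionString))
        using (var command = new OracleCommand(
            "SELECT COUNT(*) FROM BGUSER.BGTEST_DP_XTAB", connection))
        {
            connection.Open();
            // Each query streams obsFrance.txt from HDFS through the hdfs_stream preprocessor.
            long rowCount = Convert.ToInt64(command.ExecuteScalar());
            Console.WriteLine("Rows visible through the external table: {0}", rowCount);
        }
    }
}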

    Read the article

  • Professional Scrum Developer (.NET) Training in London

    - by Martin Hinshelwood
    On the 26th - 30th July, in Microsoft's offices in London, Adam Cogan from SSW will be presenting the first Professional Scrum Developer course in the UK. I will be teaching this course alongside Adam and it is a fantastic experience. You are split into teams and go head-to-head to deliver units of potentially shippable work in four two-hour sprints. The Professional Scrum Developer course is the only course endorsed by both Microsoft and Ken Schwaber, and they have worked together very effectively in bringing this course to fruition. This course is the brainchild of Richard Hundhausen, a Microsoft Regional Director, and both Adam and I attended the Trainer Prep in Sydney when he was there earlier this year. He is a fantastic trainer and no matter where you do this course you can be safe in the knowledge that he has trained and vetted all of the teachers. A tools version of Ken, if you will. Find a course and register Download this syllabus Download the Scrum Guide
What is the Professional Scrum Developer course all about? The Professional Scrum Developer course is a unique and intensive five-day experience for software developers. The course guides teams on how to turn product requirements into potentially shippable increments of software using the Scrum framework, Visual Studio 2010, and modern software engineering practices. Attendees will work in self-organizing, self-managing teams using a common instance of Team Foundation Server 2010.
Who should attend this course? This course is suitable for any member of a software development team – architect, programmer, database developer, tester, etc. Entire teams are encouraged to attend and experience the course together, but individuals are welcome too. Attendees will self-organize to form cross-functional Scrum teams. These teams require an aggregate of skills specific to the selected case study. Please see the last page of this document for specific details. Product Owners, ScrumMasters, and other stakeholders are welcome too, but keep in mind that everyone who attends will be expected to commit to work and pull their weight on a Scrum team.
What should you know by the end of the course? Scrum will be experienced through a combination of lecture, demonstration, discussion, and hands-on exercises. Attendees will learn how to do Scrum correctly while being coached and critiqued by the instructor, in the following topic areas: Form effective teams Explore and understand legacy "Brownfield" architecture Define quality attributes, acceptance criteria, and "done" Create automated builds How to handle software hotfixes Verify that bugs are identified and eliminated Plan releases and sprints Estimate product backlog items Create and manage a sprint backlog Hold an effective sprint review Improve your process by using retrospectives Use emergent architecture to avoid technical debt Use Test Driven Development as a design tool Set up and leverage continuous integration Use Test Impact Analysis to decrease testing times Manage SQL Server development in an Agile way Use .NET and T-SQL refactoring effectively Build, deploy, and test SQL Server databases Create and manage test plans and cases Create, run, record, and play back manual tests Set up a branching strategy and branch code Write more maintainable code Identify and eliminate people and process dysfunctions Inspect and improve your team's software development process
What does the week look like?
This course is a mix of lecture, demonstration, group discussion, simulation, and hands-on software development. The bulk of the course will be spent working as a team on a case study application delivering increments of new functionality in mini-sprints. Here is the week at a glance: Monday morning and most of the day Friday will be spent with the computers powered off, so you can focus on sharpening your game of Scrum and avoiding the common pitfalls when implementing it.
The Sprints Timeboxing is a critical concept in Scrum as well as in this course. We expect each team and student to understand and obey all of the timeboxes. The timebox duration will always be clearly displayed during each activity. Expect the instructor to enforce it. Each of the ½ day sprints will roughly follow this schedule (component – description – minutes):
Instruction – Presentation and demonstration of new and relevant tools & practices – 60
Sprint planning meeting – Product owner presents backlog; each team commits to delivering functionality – 10
Sprint planning meeting – Each team determines how to build the functionality – 10
The Sprint – The team self-organizes and self-manages to complete their tasks – 120
Sprint Review meeting – Each team will present their increment of functionality to the other teams – 30
Sprint Retrospective – A group retrospective meeting will be held to inspect and adapt – 10
Each team is expected to self-organize and manage their own work during the sprint. Pairing is highly encouraged. The instructor/product owner will be available if there are questions or impediments, but will be hands-off by default. You should be prepared to communicate and work with your team members in order to achieve your sprint goal. If you have development-related questions or get stuck, your partner or team should be your first level of support.
Module 1: INTRODUCTION This module provides a chance for the attendees to get to know the instructors as well as each other. The Professional Scrum Developer program, as well as the day by day agenda, will be explained. Finally, the Scrum team will be selected and assembled so that the forming, storming, norming, and performing can begin. Trainer and student introductions Professional Scrum Developer program Agenda Logistics Team formation Retrospective
Module 2: SCRUMDAMENTALS This module provides a level-setting understanding of the Scrum framework including the roles, timeboxes, and artifacts. The team will then experience Scrum firsthand by simulating a multi-day sprint of product development, including planning, review, and retrospective meetings. Scrum overview Scrum roles Scrum timeboxes (ceremonies) Scrum artifacts Simulation Retrospective It's required that you read Ken Schwaber's Scrum Guide in preparation for this module and course.
MODULE 3: IMPLEMENTING SCRUM IN VISUAL STUDIO 2010 This module demonstrates how to implement Scrum in Visual Studio 2010 using a Scrum process template*. The team will learn the mapping between the Scrum concepts and how they are implemented in the tool. After connecting to the shared Team Foundation Server, the team members will then return to the simulation – this time using Visual Studio to manage their product development. Mapping Scrum to Visual Studio 2010 User Story work items Task work items Bug work items Demonstration Simulation Retrospective
Module 4: THE CASE STUDY In this module the team is introduced to their problem domain for the week.
A kickoff meeting by the Product Owner (the instructor) will set the stage for the why and the what of the work that will take place during the upcoming sprints. The team will then define the quality attributes of the project and their definition of "done." The legacy application code will be downloaded, built, and explored, so that any bugs can be discovered and reported. Introduction to the case study Download the source code, build, and explore the application Define the quality attributes for the project Define "done" How to file effective bugs in Visual Studio 2010 Retrospective
Module 5: HOTFIX This module drops the team directly into a Brownfield (legacy) experience by forcing them to analyze the existing application's architecture and code in order to locate and fix the Product Owner's high-priority bug(s). The team will learn best practices around finding, testing, fixing, validating, and closing a bug. How to use Architecture Explorer to visualize and explore Create a unit test to validate the existence of a bug Find and fix the bug Validate and close the bug Retrospective
Module 6: PLANNING This short module introduces the team to release and sprint planning within Visual Studio 2010. The team will define and capture their goals as well as other important planning information. Release vs. Sprint planning Release planning and the Product Backlog Product Backlog prioritization Acceptance criteria and tests Sprint planning and the Sprint Backlog Creating and linking Sprint tasks Retrospective At this point the team will have the knowledge of Scrum, Visual Studio 2010, and the case study application to begin developing increments of potentially shippable functionality that meet their definition of done.
Module 7: EMERGENT ARCHITECTURE This module introduces the architectural practices and tools a team can use to develop a valid design on which to develop new functionality. The teams will learn how Scrum supports good architecture and design practices. After the discussion, the teams will be presented with the product owner's prioritized backlog so that they may select and commit to the functionality they can deliver in this sprint. Architecture and Scrum Emergent architecture Principles, patterns, and practices Visual Studio 2010 modeling tools UML and layer diagrams SPRINT 1 Retrospective
Module 8: TEST DRIVEN DEVELOPMENT This module introduces Test Driven Development as a design tool and how to implement it using Visual Studio 2010. To maximize productivity and quality, a Scrum team should set up Continuous Integration to regularly build every team member's code changes and run regression tests. Refactoring will also be defined and demonstrated in combination with Visual Studio's Test Impact Analysis to efficiently re-run just those tests that were impacted by the refactoring. Continuous integration Team Foundation Build Test Driven Development (TDD) Refactoring Test Impact Analysis SPRINT 2 Retrospective
Module 9: AGILE DATABASE DEVELOPMENT This module lets the SQL Server database developers in on a little secret – they can be agile too. By using the database projects in Visual Studio 2010, the database developers can join the rest of the team. The students will see how to apply Agile database techniques within Visual Studio to support the SQL Server 2005/2008/2008R2 development lifecycle.
Agile database development Visual Studio database projects Importing schema and scripts Building and deploying Generating data Unit testing SPRINT 3 Retrospective
Module 10: SHIP IT Teams need to know that just because they like the functionality doesn't mean the Product Owner will. This module revisits acceptance criteria as it pertains to acceptance testing. By refining acceptance criteria into manual test steps, team members can execute the tests, recording the results and reporting bugs in a number of ways. Manual tests will be defined and executed using the Microsoft Test Manager tool. As the Sprint completes and an increment of functionality is delivered, the team will also learn why and when they should create a branch of the codeline. Acceptance criteria Testing in Visual Studio 2010 Microsoft Test Manager Writing and running manual tests Branching SPRINT 4 Retrospective
Module 11: OVERCOMING DYSFUNCTION This module introduces the many types of people, process, and tool dysfunctions that teams face in the real world. Many dysfunctions and scenarios will be identified, along with ideas and discussion for how a team might mitigate them. This module will enable you and your team to move toward independence and improve your game of Scrum when you depart class. Scrum-butts and flaccid Scrum Best practices working as a team Team challenges ScrumMaster challenges Product Owner challenges Stakeholder challenges Course Retrospective
What will be expected of you and your team? This is a unique course in that it's technically focused, team-based, and employs timeboxes. It demands that the members of the teams self-organize and self-manage their own work to collaboratively develop increments of software. All attendees must commit to: Pay attention to all lectures and demonstrations Participate in team and group discussions Work collaboratively with other team members Obey the timebox for each activity Commit to work and do your best to deliver All teams should have these skills: Understanding of Scrum Familiarity with Visual Studio 2010 C#, .NET 4.0 & ASP.NET 4.0 experience* SQL Server 2008 development experience Software testing experience * Check with the instructor ahead of time for the exact technologies
Self-organising teams Another unique attribute of this course is that it's a technical training class being delivered to teams of developers, not pairs, and not individuals. Ideally, your actual software development team will attend the training to ensure that all necessary skills are covered. However, if you wish to attend an open enrolment course alone or with just a couple of colleagues, realize that you may be placed on a team with other attendees. The instructor will do his or her best to ensure that each team is cross-functional to tackle the case study, but there are no guarantees. You may be required to try a new role, learn a new skill, or pair with somebody unfamiliar to you. This is just good Scrum!
Who should NOT take this course?
    Because of the nature of this course, as explained above, certain types of people should probably not attend: Students requiring command-and-control style instruction – there are no prescriptive, step-by-step (think traditional Microsoft Learning) labs in this course Students who are unwilling to work within a timebox Students who are unwilling to work collaboratively on a team Students who don't have any skill in any of the software development disciplines Students who are unable to commit fully to their team – not only will this diminish the student's learning experience, but it will also impact their team's learning experience Find a course and register Download this syllabus Download the Scrum Guide Technorati Tags: Scrum,SSW,Pro Scrum Dev
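Module 5 of the syllabus asks attendees to "create a unit test to validate the existence of a bug" before fixing it. The course materials themselves are not reproduced here, so the following is only a small hypothetical MSTest sketch of that practice; the InvoiceCalculator class, its tax scenario, and the expected values are invented purely for illustration.

using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class standing in for the legacy code under test. In the course
// scenario the Total method would initially contain the reported bug (for example,
// forgetting to add the tax); the version here is the fixed one.
public class InvoiceCalculator
{
    private readonly decimal taxRate;

    public InvoiceCalculator(decimal taxRate)
    {
        this.taxRate = taxRate;
    }

    public decimal Total(decimal netAmount)
    {
        return netAmount + netAmount * taxRate;
    }
}

[TestClass]
public class HotfixRegressionTests
{
    // Written first to reproduce the reported bug: it fails against the buggy code,
    // passes once the fix is in, and then stays in the suite as a regression guard.
    [TestMethod]
    public void Total_IncludesTax_ForSingleLineInvoice()
    {
        var calculator = new InvoiceCalculator(taxRate: 0.20m);

        Assert.AreEqual(120m, calculator.Total(netAmount: 100m));
    }
}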

    Read the article

  • CodePlex Daily Summary for Wednesday, October 03, 2012

    CodePlex Daily Summary for Wednesday, October 03, 2012Popular ReleasesSharePoint Column & View Permission: SharePoint Column and View Permission v1.5: Version 1.5 of this project. If you will find any bugs please let me know at enti@zoznam.sk or post your findings in Issue TrackerZ3: Z3 4.1.1 source code: Snapshot corresponding to version 4.1.1.DirectX Tool Kit: October 2012: October 2, 2012 Added ScreenGrab module Added CreateGeoSphere for drawing a geodesic sphere Put DDSTextureLoader and WICTextureLoader into the DirectX C++ namespace Renamed project files for better naming consistency Updated WICTextureLoader for Windows 8 96bpp floating-point formats Win32 desktop projects updated to use Windows Vista (0x0600) rather than Windows 7 (0x0601) APIs Tweaked SpriteBatch.cpp to workaround ARM NEON compiler codegen bugHome Access Plus+: v8.1: HAP+ Web v8.1.1003.000079318 Fixed: Issue with the Help Desk and updating a ticket as an admin 79319 Fixed: formatting issue with the booking system admin header 79321 Moved to using the arrow with a circle symbol on the homepage instead of the > and < 79541 Added: 480px wide mobile theme to login page 79541 Added: 480px wide mobile theme to home page 79541 Added: slide events for homepage 79553 Fixed: Booking System Multiple Lesson Bug 79553 Fixed: IE Error Message 79684 Fixed: jQuery issue ...System.Net.FtpClient: System.Net.FtpClient 2012.10.02: This is the first release of the new code base. It is not compatible with the old API, I repeat it is not a drop in update for projects currently using System.Net.FtpClient. New users should download this release. The old code base (Branch: System.Net.FtpClient_1) will continue to be supported while the new code matures. This release is a complete re-write of System.Net.FtpClient. The API and code are simpler than ever before. There are some new features included as well as an attempt at be...CRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1002.3): Visual Ribbon Editor 1.3.1002.3 What's New: Multi-language support for Labels/Tooltips for custom buttons and groups Support for base language other than English (1033) Connect dialog will not require organization name for ADFS / IFD connections Automatic creation of missing labels for all provisioned languages Minor connection issues fixed Notes: Before saving the ribbon to CRM server, editor will check Ribbon XML for any missing <Title> elements inside existing <LocLabel> elements...YAXLib: Yet Another XML Serialization Library for the .NET Framework: YAXLib 2.10: See change-log for the list of new features added and bugs fixedRenameApp: RenameApp 1.0: First release of RenameAppJsonToStaticTypeGenerator: JsonToStaticTypeGenerator 0.1: This is the first alpha release of JsonToStaticTypeGenerator.XiaoKyun: XiaoKyun V1.00: https://xiaokyun.codeplex.com/CatchThatException: Release 1.12: Wow a very fast change and a much better and faster writing to the text fileNaked Objects: Naked Objects Release 5.0.0: Corresponds to the packaged version 5.0.0 available via NuGet. Please note that the easiest way to install and run the Naked Objects Framework is via the NuGet package manager: just search the Official NuGet Package Source for 'nakedobjects'. It is only necessary to download the source code (from here) if you wish to modify or re-build the framework yourself. If you do wish to re-build the framework, consul the file HowToBuild.txt in the release. 
Major enhancementsNaked Objects 5.0 is desi...WinRT XAML Toolkit: WinRT XAML Toolkit - 1.3.0: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...D3 Loot Tracker: 1.4.1: This version will automatically save a recording session on application exit if the user didn't stop the current session.SubExtractor: Release 1029: Feature: Added option to make i and ¡ characters movie-specific for improved OCR on Spanish subs (Special Characters tab in Options) Feature: Allow switch to Word Spacing dialog directly from Spell Check dialog Fix: Added more default word spacings for accented characters Fix: Changed Word Spacing dialog to show all OCR'd characters in current sub Fix: Removed application focus grab during OCR Fix: Tightened HD subs fuzzy logic to reduce false matches in small characters Fix: Improved Arrow k...MCEBuddy 2.x: MCEBuddy 2.2.18: Recommended download Changelog for 2.2.18 (32bit and 64bit) 1. Added support for checking if Showanalyzer has hung and cancelling it 2. New version of comskip, 0.81.48 3. Speeding up comskip 4. Fixed a build bug in 64bit 2.2.17 5. Added a new comskip.ini, better commercial detection for international channels and less aggressive. Old one has been retained as comskip_old.ini 6. Added support for Audio Offset on Conversion Task page in GUI (this overrides the profiles AudioDelay when specified)Readable Passphrase Generator: KeePass Plugin 0.7.1: See the KeePass Plugin Step By Step Guide for instructions on how to install the plugin. Changes Built against KeePass 2.20Windows 8 Toolkit - Charts and More: Beta 1.0: The First Compiled Version of my LibraryPDF.NET: PDF.NET.Ver4.5-OpenSourceCode: PDF.NET Ver4.5 Open Source Code, includes a sample Web application project.Visual Studio Icon Patcher: Version 1.5.2: This version contains no new images from v1.5.1 Contains the following improvements: Better support for detecting the installed languages The extract & inject commands won't run if Visual Studio is running You may now run in extract or inject mode The p/invoke code was cleaned up based on Code Analysis recommendations When a p/invoke method fails the Win32 error message is now displayed Error messages use red text Status messages use green textNew Projects.Net Exception Reporter: A reusable and extensible exception reporter for Microsoft .NET projects.Aesha Broker: A rich client Auction House Broker application. Built upon Blizzard's new REST API. Provides a client experience which caches historical auction data to provideASP.NET Friendly URLs: A library that enables automatic resolving of extensionless URLs to ASP.NET file-based handlers, e.g. ASPX pages.Astro Power CMS: Astro Power CMS is built on GraffitiCMS, a product of Telligent. GraffitiCMS stopped development, so I created this project under the name Astro Power CMS.aTester: Here is a good place. And now, I can upload my source to it. It's very good.Automacao Residencial: Netduino is a platform where you use the C# language to control hardware.
The goal is to create a communication structure with the Netduino. Derbster: Explore and learn about modern C# architecture and programming by implementing software to support the modern game of roller derby. Dot FPE - A free Format-preserving encryption implementation for .net: There aren't any widely available implementations of a format-preserving encryption in .NET. Thus we aim to be the first!DotNetEx: .NET Framework extended functionality for data access, working with Tasks and asynchronous programming, encryption algorithms such as SkipJack and other stuff.Elemental Development Toolchain (.NET version): A complete toolchain built around the Æthere language.elFinder ASP.NET Connector: The one and only .NET connector for the amazing elFinder 2.X web-based file manager. Finally you can manage your files easily right from your browser!Geosynkronisering: A project to draw up specifications for interfaces that enable synchronization of data stores with geographic data content across different platforms.GIII_P1: If everyone has doubted you, show them they were wrong!IntroduceCompany: A website introducing an enterprise or company.JsonToStaticTypeGenerator: This is the JsonToStaticTypeGenerator project that makes it possible to generate C# classes out of JSON data.kwerty: Coming soonMachine Learning: My machine learning project. Just to figure out things...MicroManager: MaNGOS Web-based ManagerMvcContrib3: This is the version of mvccontrib which works with ASP.Net MVC 3Oracle Destination via ODP.Net (Custom Destination Component): SSIS 2008 R2 solution (custom destination component) to write to Oracle via ODP.NetOrchard Commerce History with PayPal: Project expands on the Nwazet.Commerce module (and is required for this module to work). Adds a purchase history, product role associations, and PayPal.Phoenix Trans: Website for Phoenix Trans, a domestic and international freight transport companyPowerState: PowerState is a .NET application for sending Wake-On-LAN (WOL) requests to computers. It can also shut down, log off and reboot computers using WMI.RenameApp: RenameApp is a free and very simple to use renaming software for Windows. RenameApp allows you to easily rename files based on the specified criteria and order.Rose-Hulman User Experience Design: This project will contain labs intended for use in Rose-Hulman's Computer Science and Software Engineering department.Server d? phòng: This is a backup server.SharePoint BCS External Connector Caching Pattern Library: Library for enabling caching on SharePoint BCS external connectors. Enables BCS .Net Assemblies to be written that are scalable and performant for search.SharpDX.WPF: This project provides DirectX 9, DirectX 10 and DirectX 11 support for WPF.
The assembly contains DXElement - an easy-to-use WPF FrameworkElement.Simple Password Generator Library: The password generator library, written in C#, is a simple assembly which allows generation of passwords with lengths anywhere from 1-99.SisEagle.NET: This system was developed for the presentation of the 2012 TCC (final-year project) at UDF, Brasília.SWebshop: SWebshop is a PHP based webshop system which allows you to insert, edit and delete data easily and is easy to use for customers.Tabular Database Powershell Cmdlets: This project provides a sample of PowerShell Cmdlets to manage Tabular models, from Analysis Services.University timetable using java: The project uses the Java language to create timetables (full timetables with exam and lab tables) and will be free for all users, with a SQL database.URLShoter: This project is for shortening URLs for ASP.NET.Web Input Form Control: This control allows developers to create input forms by configuring the control in HTML mode.Weibo: rtWorkoutMemo: Project description (first draft): Memorise your workout. Keep archive records of your daily training, such as: - series of exercises, - quantity of each series, - weWPF - Automate Acrobat Security Policy: This WPF Tool was created to quickly password protect batches of PDF documents, using a random generator to generate the passwords.XiaoKyun: Hello Page for Web.Z3: Z3 is a high-performance theorem prover being developed at Microsoft Research.

    Read the article

  • This is the End of Business as Usual...

    - by Michael Snow
    This week, we'll be hosting our last Social Business Thought Leader Series Webcast for 2012. Our featured guest this week will be Brian Solis of Altimeter Group. As we've been going through the preparations for Brian's webcast, it became very clear that an hour's time barely scrapes the surface of the depth of Brian's insights and analysis. Accordingly, in the spirit of sharing Brian's perspective with all of our readers, we'll be featuring guest posts all this week pulled from Brian's larger collection of blog postings on his own website. If you like what you've read here this week, we highly recommend digging deeper into his tome of wisdom.
Guest post by Brian Solis, Analyst, Altimeter Group, as originally featured on his site, with the minor change of a video addition at the beginning of the post.
This is the End of Business as Usual and the Beginning of a New Era of Relevance - Brian Solis, Principal Analyst, Altimeter Group
The Times They Are A-Changin' Come gather 'round people Wherever you roam And admit that the waters Around you have grown And accept it that soon You'll be drenched to the bone If your time to you Is worth savin' Then you better start swimmin' Or you'll sink like a stone For the times they are a-changin'. - Bob Dylan
I'm sure you are wondering why I chose lyrics to open this article. If you skimmed through them, stop here for a moment. Go back through Dylan's words and take your time. Carefully read, and feel, what it is he's saying and savor the moment to connect the meaning of his words to the challenges you face today. His message is as important and true today as it was when the words were first written in 1964. The tide is indeed once again turning. And even though the 60s now live in the history books, right here, right now, Dylan is telling us once again that this is our time to not only sink or swim, but to do something amazing.
This is your time. This is our time. But these times are different and what comes next is difficult to grasp. How people communicate. How people learn and share. How people make decisions. Everything is different now. Think about this…you're reading this article because it was sent to you via email. Yet more people spend their online time in social networks than they do in email. Duh. According to Nielsen, 22.5% of the total time spent online is spent connecting and communicating in social networks. To put that in perspective, the time spent in the likes of Facebook, Twitter, and YouTube is greater than online gaming at 9.8%, email at 7.6% and search at 4%. Imagine for a moment if you and I were connected to one another in Facebook, which just so happens to be the largest social network in the world. How big? Well, Facebook is the size today of the entire Internet in 2004. There are over 1 billion people friending, Liking, commenting, sharing, and engaging in Facebook…that's roughly 12% of the world's population. Twitter has over 200 million users. Ever hear of tumblr? More time is spent on this popular microblogging community than on Twitter. The point is that the landscape for communication and all that's affected by human interaction is profoundly different from how you and I learned, shared or talked to one another yesterday. This transformation is only becoming more pervasive, and it's not going back.
Survival of the Fitting But social media is just one of the channels we can use to reach people. I must be honest. I'm as much a part of tomorrow as I am of yesteryear.
It's why I spend all of my time researching the evolution of media and its impact on business and culture. Because of you, I share everything I learn in newsletters, emails, blogs, YouTube videos, and also traditional books. I'm dedicated to helping everyone not only understand, but grasp the change that's before you. Technologies such as social, mobile, virtual, augmented, et al compel us to adapt our story and value proposition and extend our reach to be part of communities we don't realize exist. The people who will keep you in business or running tomorrow are the very people you're not reaching today. Before you continue to read on, allow me to clarify my point of view. My inspiration for writing this is to help you augment, not necessarily replace, the programs you're running today. We must still reach those who matter to us in the ways they prefer to be engaged. To reach what I call the connected consumer of Generation C, we too must reach them in the ways they wish to be engaged. And in all of my work, how they connect, talk to one another, influence others, and make decisions is not at all like the traditional consumers of the past. Nor are they merely the kids…the Millennials. Connected consumers are representative across every age group and demographic. As you can see, use of social networks, media sharing sites, microblogs, blogs, etc. spans equally across Gen Y, Gen X, and Baby Boomers. The DNA of connected customers is indiscriminate of age or any other demographic for that matter. This is more about psychographics, the linkage of people through common interests, than it is about their age, gender, education, nationality or level of income. Once someone is introduced to the marvels of connectedness, the sensation becomes a contagion. It touches and affects everyone. And that's why this isn't going anywhere but toward normalcy. Social networking isn't just about telling people what you're doing. Nor is it just about generic, meaningless conversation. Today's connected consumer is incredibly influential. They're connected to hundreds and even thousands of other like-minded people. What they experience, what they support, is shared throughout these networks, and as the information travels, it shapes and steers the impressions, decisions, and experiences of others. For example, if we revisit the Nielsen research, we get an idea of just how big this is becoming. 75% spend heavily on music. How does that translate to the arts? I'd imagine the number is equally impressive. If 53% follow their favorite brand or organization, imagine what's possible. Just like this email list that connects us, connections in social networks are powerful. The difference, however, is that people spend more time in social networks than they do in email. Everything begins with an understanding of the "5 W's and H.E." – Who, What, When, Where, How, and to What Extent? The data that comes back tells you which networks are important to the people you're trying to reach, how they connect, what they share, what they value, and how to connect with them. From there, your next steps are to create a community strategy that extends your mission, vision, and values, and align it with the interests, behavior, and values of those you wish to reach and galvanize. To help, I've prepared an action list for you, otherwise known as the 10 Steps Toward New Relevance:
1. Answer why you should engage in social networks and why anyone would want to engage with you
2. Observe what brings them together and define how you can add value to the conversation
3. Identify the influential voices that matter to your world, recognize what's important to them, and find a way to start a dialogue that can foster a meaningful and mutually beneficial relationship
4. Study the best practices of not just organizations like yours, but also those who are successfully reaching the type of people you're trying to reach – it's benchmarking against competitors and benchmarking against undefined opportunities
5. Translate all you've learned into a convincing presentation written to demonstrate tangible opportunity to your executive board; make the case through numbers, trends, data, insights – understanding they have no idea what's going on out there and you are both the scout and the navigator (start with a recommended pilot so everyone can learn together)
6. Listen to what they're saying and develop a process to learn from activity and adapt to interests and steer engagement based on insights
7. Recognize how they use social media and innovate based on what you observe to captivate their attention
8. Align your objectives with their objectives. If you're unsure of what they're looking for…ask
9. Invest in the development of content and engagement
10. Build a community, invest in values, spark meaningful dialogue, and offer tangible value…the kind of value they can't get anywhere else. Take advantage of the medium and the opportunity!
The reality is that we live and compete in a perpetual era of Digital Darwinism, the evolution of consumer behavior when society and technology evolve faster than our ability to adapt. This is why it's our time to alter our course. We must connect with those who are defining the future of engagement, commerce, business, and how the arts are appreciated and supported. Even though it is the end of business as usual, it is the beginning of a new age of opportunity. The consumer revolution is already underway, and the question is: how do you better understand the role you play in this production as a connected or social consumer as well as a business professional? Again, this is your time to define a new era of engagement and relevance.
Originally written for The National Arts Marketing Project. Connect with Brian via: Twitter | LinkedIn | Facebook | Google+
--- Note from Michael: If you really like this post above, check out Brian's TEDTalk and his thought process for preparing it in this post: http://www.briansolis.com/2012/10/tedtalk-reinventing-consumer-capitalism-screw-business-as-usual/

    Read the article

  • CodePlex Daily Summary for Saturday, August 16, 2014

    CodePlex Daily Summary for Saturday, August 16, 2014. Popular Releases: TEBookConverter: 1.5: Added: Turkish and French translations Added: A few interface changes Removed: SkinDynamulet: Dynamulet v0.1: DynamoDB Transaction Server v0.1Console parallel nunit tests runner: ConsoleUnitTestsRunner 1.03: bugfixingFluentx: Fluentx v1.5.3: Added few more extension methods.fastBinaryJSON: v1.4.2: v1.4.2 - bug fix circular referencesfastJSON: v2.1.2: 2.1.2 - bug fix circular referencesJPush.NET: JPush Server SDK 1.2.1 (For JPush V3): Assembly: 1.2.1.24728 JPush REST API Version: v3 JPush Documentation Reference .NET framework: v4.0 or above. Sample: class: JPushClientV3 2014 August 15th.SEToolbox: SEToolbox 01.043.008 Release 1: Changed ship/station names to use new DisplayName instead of Beacon/Antenna. Fixed issue with updated SE binaries 01.043.018 using new Voxel Material definitions.Google .Net API: Drive.Sample: Instructions for the Google .NET Client API – Drive.Sample. Browse the source at http://code.google.com/p/google-api-dotnet-client/source/browse/?repo=samples#hg%2FDrive.Sample, or the main file Program.cs at http://code.google.com/p/google-api-dotnet-client/source/browse/Drive.Sample/Program.cs?repo=samples. 1. Checkout Instructions. Prerequisites: Install Visual Studio, and Mercurial (http://mercurial.selenic.com/). ...FineUI - jQuery / ExtJS based ASP.NET Controls: FineUI v4.1.1DNN CMS Platform: 07.03.02: Major Highlights Fixed backwards compatibility issue with 3rd party control panels Fixed issue in the drag and drop functionality of the File Uploader in IE 11 and Safari Fixed issue where users were able to create pages with the same name Fixed issue that affected older versions of DNN that do not include the maxAllowedContentLength during upgrade Fixed issue that stopped some skins from being upgraded to newer versions Fixed issue that randomly showed an unexpected error during us...WordMat: WordMat for Mac: WordMat for Mac has a few limitations compared to the Windows version - Graph is not supported (Gnuplot, GeoGebra and Excel works) - Units are not supported yet (Coming up) The Mac version is not yet as tested as the Windows version.HP OneView PowerShell Library: HP OneView PowerShell Library 1.10.1193: [NOTE]: The installer has been updated, only to allow the user to display What's New at the completion of the install. The contents are the same as the original installer. Branch to HP OneView 1.10 Release. NOTE: This library version does not support older appliance versions. Fixed New-HPOVProfile to check for Firmware and BIOS management for supported platforms. Would erroneously error when neither -firmware or -bios were passed. Fixed Remove-HPOV* cmdlets which did not handle -forc...MFCMAPI: August 2014 Release: Build: 15.0.0.1042 Full release notes at SGriffin's blog. If you just want to run the MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source.
The 64 bit builds will only work on a machine with Outlook 2010/2013 64 bit installed. All other machines should use the 32 bit builds, regardless of the operating system. Facebook BadgeEWSEditor: EwsEditor 1.10 Release: • Export and import of items as a full fidelity steam works - without proxy classes! - I used raw EWS POSTs. • Turned off word wrap for EWS request field in EWS POST windows. • Several windows with scrolling texts boxes were limiting content to 32k - I removed this restriction. • Split server timezone info off to separate menu item from the timezone info windows so that the timezone info window could be used without logging into a mailbox. • Lots of updates to the TimeZone window. • UserAgen...Python Tools for Visual Studio: 2.1 RC: Release notes for PTVS 2.1 RC We’re pleased to announce the release candidate for Python Tools for Visual Studio 2.1. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, editing, IntelliSense, interactive debugging, profiling, Microsoft Azure, IPython, and cross-platform debugging support. PTVS 2.1 RC is available for: Visual Studio Expre...Sense/Net ECM - Enterprise CMS: SenseNet 6.3.1 Community Edition: Sense/Net 6.3.1 Community EditionSense/Net 6.3.1 is an important step toward a more modular infrastructure, robustness and maintainability. With this release we finally introduce a packaging and a task management framework, and the Image Editor that will surely make the job of content editors more fun. Please review the changes and new features since Sense/Net 6.3 and give a feedback on our forum! Main new featuresSnAdmin (packaging framework) Task Management Image Editor OData REST A...Touchmote: Touchmote 1.0 beta 13: Changes Less GPU usage Works together with other Xbox 360 controls Bug fixesModern UI for WPF: Modern UI 1.0.6: The ModernUI assembly including a demo app demonstrating the various features of Modern UI for WPF. BREAKING CHANGE LinkGroup.GroupName renamed to GroupKey NEW FEATURES Improved rendering on high DPI screens, including support for per-monitor DPI awareness available in Windows 8.1 (see also Per-monitor DPI awareness) New ModernProgressRing control with 8 builtin styles New LinkCommands.NavigateLink routed command New Visual Studio project templates 'Modern UI WPF App' and 'Modern UI W...ClosedXML - The easy way to OpenXML: ClosedXML 0.74.0: Multiple thread safe improvements including AdjustToContents XLHelper XLColor_Static IntergerExtensions.ToStringLookup Exception now thrown when saving a workbook with no sheets, instead of creating a corrupt workbook Fix for hyperlinks with non-ASCII Characters Added basic workbook protection Fix for error thrown, when a spreadsheet contained comments and images Fix to Trim function Fix Invalid operation Exception thrown when the formula functions MAX, MIN, and AVG referenc...New Projects2113110030: name: pham van long code: 2113110030 subject: oop2113110033: Name: Nguyen Hoang Minh Class: CCQ1311LA Object: OOP2113110284: name: Vuong Thành Ðô id:2113110284 class: CCQ1311LA2113110286_OOP_kiemtra: Mon:OOP Tên: Lê Th? Ng?c Huy?nCRM Queue Monitor: A small tool to monitor queues in Microsoft dynamics CRM 2011 and following versions. 
It displays the number of items in the queues and the latest item.Dice Bag: A D20 Role Playing Game Dice Bag - A selection of dice for use in the D20 RPG System that can be rolled to any quantity with an applied modifier.DM.Dual-coloredBall: DM.Dual-coloredBallFB Account Data Miner by Bipul Raman: A software which can be used to extract basic metadata of a Facebook profile without logging in to Facebook.huynhtanphat-2113170373: Mon: OPP Name: Huynh Tan Phatkieuquanghuy_OOP: Subject: OOP Name: kieuquanghuy Class: CCQ1311LAMySale: A simple home point-of-sale application, designed for garage sales, and lemonade stalls alike.nguyennhubaongan_OOP: MON: OOP NAME: NGUY?N-NHU-B?O-NGÂNOPP-2113110288: Mon: OOP Ten: Bui Dinh Hoai Nam Pequeño RAE: A Windows application for using the web services of the online Diccionario de la Real Academia Española.phungthiphuonglien_OOP: MONHOC: OOP NAME: PHUNG THI PHUONG LIEN MSSV: 2113110287sharpFlipWall: This is a simple executive toy for Unity that leverages Kinect v2 and Shaders to generate a wall of blocks that move based on player informationTranThanhDanh-2113110282: Mon: OPP Name: Tran Thanh DanhWsSequence: Run a number of WS in a sequence

    Read the article

  • CodePlex Daily Summary for Saturday, May 08, 2010

    CodePlex Daily Summary for Saturday, May 08, 2010New ProjectsBizSpark Camp Sample Code: This sample project demonstrates best practices when developing Windows Phone 7 appliactions with Azure. **This code is meant only as a sample f...CLB Article Module: The CLB Article Module is a simple DotNetNuke Module for Article writers. This module is designed to provide a simple location for article/screenc...cvplus: a computer vision libraryDemina: Demina is a simple keyframe based 2d skeletal animation system. It's developed in C# using XNA.Expo Live: Expo LiveFeedMonster: FeedMonster is a Cross platform Really Simple Syndication reader made to allow you to easily follow updates to your favorite sites. Its made using ...GamePad to KeyBoard: GamePad to KeyBoard (GP2KB) is a small app that lets you define an associated keyboard key for each button in your XBox 360 controller. This way, y...huanhuan's project: moajfiodsafasLazyNet: Lazynet Allows users to save the proxy settings and IP addresses of wireless networks. This makes it easier for users that move a lot between diffe...mfcfetionapi: mfcfetionapiMocking BizTalk: Mocking BizTalk is a set of tools aimed at bringing Mock Object concepts in BizTalk solution testing. Actually it consists in a set of MockPipelin...nkSWFControl: nkSWFControl is a ASP.NET control that provides numberous ways for publishing SWF movies in your pages. The project goal is to become defacto ...Property Pack for EPiServer CMS: With EPiServer CMS 6 it's possible to create a wide variety of custom property types that have customized settings in each usage. This Property Pac...SPRoleAssignment: SPRoleAssignment makes it easier for programmers to assign custom item level permissions. You can now easily grant or remove permissions to certain...Thats-Me Dot Net API: Thats-Me Dot Net API is a library which implements the Thats-Me.ch API for the Dot Net Framework.The Information Literacy Education Learning Environment (ILE): Written in C# utilizing the .net framework the Information Literacy Education (ILE) learning environment is designed to deliver information litera...tinytian: Tools, shared.Unidecode Sharp: UnidecodeSharp is a C# port of Python and Perl Unidecode module. Intended to transliterate an Unicode object into an ASCII string, no matter what l...Wave 4: Wave 4 is a platform for professional bloggers, integrated with many social networking services.New ReleasesµManager for MaNGOS: 0.8.6: Improvements: Supports up to MaNGOS revision 9692 (may support higher revisions, this is dependent on any changes made to the database structure b...BFBC2 PRoCon: PRoCon 0.3.5.0: Release Notes CommingBizSpark Camp Sample Code: Initial Check-in: There are two downloads available for this project. There is a windows phone 7 application that calls webservices hosted in Azure. And there is a...Bojinx: Bojinx Core V4.5.11: Minor release that fixes several minor bugs and adds more error prevention logic. Fixed: You will no longer get a null pointer error when loading ...CuttingEdge.Logging: CuttingEdge.Logging v1.2.2: CuttingEdge.Logging is a library that helps the programmer output log statements to a variety of output targets in .NET applications. The design is...DotNetNuke® Events: 05.01.00 Beta 1: Please take note: this is a Beta releaseThis is a not a formal Release of DNN Events 05.01.00. This is a beta version, which is not extensively te...dylan.NET: dylan.NET v. 
9.8: This is the latest version of dylan.NET features the new locimport statement, improved error handling and other new stuff and bug fixes.Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.0.9 beta 2 Released: Hi, This release contains the following enhancements: * New Property named ClosestPlotDistance has been implemented in Axis. It defines the d...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.5.2 beta 2 Released: Hi, This release contains the following enhancements: * New Property named ClosestPlotDistance has been implemented in Axis. It defines the d...GamePad to KeyBoard: GP2KB v0.1: First public version of GamePad to KeyBoard.Global: Version 0.1: This is the first beta stable release of this assembly. Check out the Test console app to see how some of the stuff works. EnjoyHtml Agility Pack: 1.4.0 Stable: 1.4.0 Adds some serious new features to Html Agility Pack to make it work nicer in a LINQ driven .NET World. The HtmlNodeCollection and HtmlAttribu...LazyNet: Beta Release: This is the Beta Version, All functionality has been implemented but needs further testing for bugs.Mercury Particle Engine: Mercury Particle Engine 3.1.0.0: Long overdue release of the latest version of Mercury Particle Engine, this release is changeset #66009NazTek.Extension.Clr4: NazTek.Extension.Clr4 Binary Cab: Includes bin, config, and chm filesnetDumbster: netDumbster 1.0: netDumbster release 1.0Opalis Community Releases: Workflow Examples (May 7, 2010): The Opalis team is providing some sample Workflows (policies) for customers and partners to help you get started in creating your own workflows in ...patterns & practices SharePoint Guidance: SPG2010 Drop10: SharePoint Guidance Drop Notes Microsoft patterns and practices ****************************************** ***************************************...Shake - C# Make: Shake v0.1.9: First public release including samples. Basic tasks: - MSBuild task (msbuild command line with parameters) - SVN command line client with checkout...SPRoleAssignment: Role Assigment Project [Eng]: SPRoleAssignment makes it easier for programmers to assign custom item level permissions. You can now easily grant or remove permissions to certain...SQL Server Metadata Toolkit 2008: Alpha 7 SQL Server Metadata Toolkit: The changes in this are to the Parser, and a change to handle WITH CTE's within the Analyzers. The Parser now handles CAST better, EXEC, EXECUTE, ...StackOverflow Desktop Client in C# and WPF: StackOverflow Client 0.5: Made the popup thinner, removed the extra icon and title. The questions on the popup automatically change every 5 seconds.TimeSpanExt: TimeSpanExt 1.0: This had been stuck in beta for a long time, so I'm glad to finally make the full release.Transcriber: Transcriber V0.2.0: First alpha release. See Issue Tracker for known bugs and incomplete features.Unidecode Sharp: UnidecodeSharp 0.04: First release. Tested and stable. Versions duplicates Perl and Python ones.Visual Studio 2010 AutoScroller Extension: AutoScroller v0.3: A Visual studio 2010 auto-scroller extension. 
Simply hold down your middle mouse button and drag the mouse in the direction you wish to scroll, fu...Visual Studio 2010 Test Case Import Utilities: V 1.0 (RC2): Work Item Migrator With the prior releases of Test Case Migrator ( Beta and RC1), it was possible to: migrate test cases (along with test steps) ...Word Add-in For Ontology Recognition: Technology Preview, Beta 2 - May 2010: This is a technology preview release of the Word Add-in for Ontology Recognition for Word. These are some of the updates included in this version:...Most Popular ProjectsWBFS ManagerRawrAJAX Control ToolkitMicrosoft SQL Server Product Samples: DatabaseSilverlight Toolkitpatterns & practices – Enterprise LibraryWindows Presentation Foundation (WPF)Microsoft SQL Server Community & SamplesASP.NETPHPExcelMost Active Projectspatterns & practices – Enterprise LibraryAJAX Control FrameworkRawrThe Information Literacy Education Learning Environment (ILE)Caliburn: An Application Framework for WPF and SilverlightBlogEngine.NETpatterns & practices - UnityjQuery Library for SharePoint Web ServicesNB_Store - Free DotNetNuke Ecommerce Catalog ModuleTweetSharp

    Read the article

  • CodePlex Daily Summary for Tuesday, October 01, 2013

    CodePlex Daily Summary for Tuesday, October 01, 2013Popular ReleasesDotNetNuke® Form and List: 06.00.06: DotNetNuke Form and List 06.00.06 Changes to 6.0.6•Add in Sql to remove 'text on row' setting for UserDefinedTable to make SQL Azure compatible. •Add new azureCompatible element to manifest. •Added a fix for importing templates. Changes to 6.0.2•Fix: MakeThumbnail was broken if the application pool was configured to .Net 4 •Change: Data is now stored in nvarchar(max) instead of ntext Changes to 6.0.1•Scripts now compatible with SQL Azure. Changes to 6.0.0•Icons are shown in module action b...BlackJumboDog: Ver5.9.6: 2013.09.30 Ver5.9.6 (1)SMTP???????、???????????????? (2)WinAPI??????? (3)Web???????CGI???????????????????????Microsoft Ajax Minifier: Microsoft Ajax Minifier 5.2: Mostly internal code tweaks. added -nosize switch to turn off the size- and gzip-calculations done after minification. removed the comments in the build targets script for the old AjaxMin build task (discussion #458831). Fixed an issue with extended Unicode characters encoded inside a string literal with adjacent \uHHHH\uHHHH sequences. Fixed an IndexOutOfRange exception when encountering a CSS identifier that's a single underscore character (_). In previous builds, the net35 and net20...AJAX Control Toolkit: September 2013 Release: AJAX Control Toolkit Release Notes - September 2013 Release Version 7.0930September 2013 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - Instructions for using the AJAX Control Toolkit with ASP.NET 4.5 can b...WDTVHubGen - Adds Metadata, thumbnails and subtitles to WDTV Live Hubs: WDTVHubGen.v2.1.4.apifix-alpha: WDTVHubGen.v2.1.4.apifix-alpha is for testers to figure out if we got the NEW api plugged in ok. thanksVisual Log Parser: VisualLogParser: Portable Visual Log Parser for Dotnet 4.0Random searcher i pochodne: Generatorek playlisty: Generuje playlisty w formacie .m3u. Na razie beta z bety - ale juz dziala i mozna uzywac.sb0t v.5: sb0t 5.15: Fixed bug in join filter. Fixed bug in pm blocking. Added new Crypto and Entities static classes to scripting. Updated the default node list.Trace Reader for Microsoft Dynamics CRM: Trace Reader (1.2013.9.29): Initial releaseAudioWordsDownloader: AudioWordsDownloader 1.1 build 88: New features list of words (mp3 files) is available upon typing when a download path is defined list of download paths is added paths history settings added Bug fixed case mismatch in word search field fixed path not exist bug fixed when history has been used path, when filled from dialog, not stored refresh autocomplete list after path change word sought is deleted when path is changed at the end sought word list is deleted word list not refreshed download ends. word lis...HD-Trailers.NET Downloader: HD-Trailer.Net Downloader v 2.1.5: This started out as an effort to improve the search for the corr3ct IMDB page for the movie. I think I have done that here. I have run about 200 movies and the correct movie was identified in all cases including some entries that were problematic in the past. I also swatted several bugs that popped up under special circumstances and resulted in exceptions. This version should be quite a bit better than previous versions. 
Let me know if there are any issues.Wsus Package Publisher: Release v1.3.1309.28: Fix a bug, where WPP crash when running on a computer where Windows was installed in another language than Fr, En or De, and launching the Update Creation Wizard. Fix a bug, where WPP crash if some Multi-Thread job are launch with more than 64 items. Add a button to abort "Install This Update" wizard. Allow WPP to remember which columns are shown last time. Make URL clickable on the Update Information Tab. Add a new feature, when Double-Clicking on an update, the default action exec...Tweetinvi a friendly Twitter C# API: Alpha 0.8.3.0: Version 0.8.3.0 emphasis on the FIlteredStream and ease how to manage Exceptions that can occur due to the network or any other issue you might encounter. Will be available through nuget the 29/09/2013. FilteredStream Features provided by the Twitter Stream API - Ability to track specific keywords - Ability to track specific users - Ability to track specific locations Additional features - Detect the reasons the tweet has been retrieved from the Filtered API. You have access to both the ma...AcDown?????: AcDown????? v4.5: ??●AcDown??????????、??、??、???????。????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。 ●??????AcPlay?????,??????、????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ??v4.5 ???? AcPlay????????v3.5 ????????,???????????30% ?? ???????GoodManga.net???? ?? ?????????? ?? ??Acfun?????????? ??Bilibili??????????? ?????????flvcd???????? ??SfAcg????????????? ???????????? ???????????????? ????32...C# Intellisense for Notepad++: Release v1.0.6.0: Added support for classless scripts To avoid the DLLs getting locked by OS use MSI file for the installation.SimpleExcelReportMaker: Serm 0.02: SourceCode and SampleMagick.NET: Magick.NET 6.8.7.001: Magick.NET linked with ImageMagick 6.8.7.0. Breaking changes: - ToBitmap method of MagickImage returns a png instead of a bmp. - Changed the value for full transparency from 255(Q8)/65535(Q16) to 0. - MagickColor now uses floats instead of Byte/UInt16.Media Companion: Media Companion MC3.578b: With the feedback received over the renaming of Movie Folders, and files, there has been some refinement done. As well as I would like to introduce Blu-Ray movie folder support, for Pre-Frodo and Frodo onwards versions of XBMC. To start with, Context menu option for renaming movies, now has three sub options: Movie & Folder, Movie only & Folder only. The option Manual Movie Rename needs to be selected from Movie Preferences, but the autoscrape boxes do not need to be selected. Blu Ray Fo...FFXIV Crafting Simulator: Crafting Simulator 2.3: - Major refactoring of the code behind. 
- Added a current durability and a current CP textbox.DNN CMS Platform: 07.01.02: Major HighlightsAdded the ability to manage the Vanity URL prefix Added the ability to filter members in the member directory by role Fixed issue where the user could inadvertently click the login button multiple times Fixed issues where core classes could not be used in out of process cache provider Fixed issue where profile visibility submenu was not displayed correctly Fixed issue where the member directory was broken when Convert URL to lowercase setting was enabled Fixed issu...New Projects.netProject: .Net Project 3TINafs-m: Secure file storageASP.NET dhtmlxChart Class: Create different types of charts and render them to webforms.ASP.NET dhtmxGantt Class: Add tasks Add dependencies Use lighboxBestCodeTrainer: Project BestCode is for Training. Black Dragon Online Shop: An online shop with basic functionalities implemented with ASP.NET MVC as a team project in Telerik Academy 2012/2013.Central fovea -X: ????: 1.???????????Cell,??RGB,HSL,X,Y 2.???????? 3.???????????? 4.????????,???????? 5.????????CorvusSKK: SKK-like Japanese Input Method for Windowsdaneshjoo: ?? ??? ???????DBA Toolbox: A tool for the production DBA. A place to store all those scripts, tools, references and queries that make your job as a DBA easier.Devpad IDE: Basic integrated development environment.DXUT for Direct3D 11: Latest version of the Microsoft DXUT framework for Direct3D 11 Win32 desktop applications formerly in the now legacy DirectX SDK.Dynamics CRM Messaging Integration: This project shows how to implement a simple custom messaging solution using Microsoft Dynamics CRM and based on its standard development components.Kockafölde: Kliens program.Medusa's Ebay: Teamwork assignment @ TelerikAcademy 2012/2013. Project is for creating simple (or more complicated) interneTeamwork assignment @ TelerikAcademy 2012/2013.Nauplius.PAS: Nauplius.PAS allows end users to leverage the PowerPoint Conversion Services within SharePoint 2013 to convert PowerPoint documents from one format to another.PayBox payment gateway provider for NB_Store: PayBox payment provider for NB_StorePet passport: This is a teamwork projectProject Euler solutions: Solutions to the ProjectEuler problemsRoboSubSync: Fixes an unsynchronized subtitle in any language according to an already synced subtitle in english, with the help of some advanced algorithm magic... ShopBook: Shopbooksliders: jQuery slidersSQLite to JSON: A simple program to convert SQLite data to JSON.Super64: This is a Commodore 64 emulator that is determined to have extremely unnecessary accuracy.TeamProject1: This is just Demo ProjectXnaHelper: A collection of tools for making 2D XNA Games

    Read the article

  • [GEEK SCHOOL] Network Security 2: Preventing Disaster with User Account Control

    - by Ciprian Rusen
    In this second lesson in our How-To Geek School about securing the Windows devices in your network, we will talk about User Account Control (UAC). Users encounter this feature each time they need to install desktop applications in Windows, when some applications need administrator permissions in order to work, and when they have to change different system settings and files. UAC was introduced in Windows Vista as part of Microsoft’s “Trustworthy Computing” initiative. Basically, UAC is meant to act as a wedge between you and installing applications or making system changes. When you attempt to do either of these actions, UAC will pop up and interrupt you. You may either have to confirm you know what you’re doing, or even enter an administrator password if you don’t have those rights. Some users find UAC annoying and choose to disable this very important security feature of Windows (we strongly caution against doing that). That’s why in this lesson, we will carefully explain what UAC is and everything it does. As you will see, this feature has an important role in keeping Windows safe from all kinds of security problems. In this lesson you will learn which activities may trigger a UAC prompt asking for permissions and how UAC can be set so that it strikes the best balance between usability and security. You will also learn what kind of information you can find in each UAC prompt. Last but not least, you will learn why you should never turn off this feature of Windows. By the time we’re done today, we think you will have a newly found appreciation for UAC, and will be able to find a happy medium between turning it off completely and letting it annoy you to distraction.
    What is UAC and How Does it Work? UAC, or User Account Control, is a security feature that helps prevent unauthorized system changes to your Windows computer or device. These changes can be made by users, applications, and sadly, malware (which is the biggest reason why UAC exists in the first place). When an important system change is initiated, Windows displays a UAC prompt asking for your permission to make the change. If you don’t give your approval, the change is not made. In Windows, you will encounter UAC prompts mostly when working with desktop applications that require administrative permissions. For example, in order to install an application, the installer (generally a setup.exe file) asks Windows for administrative permissions. UAC initiates an elevation prompt like the one shown earlier asking you whether it is okay to elevate permissions or not. If you say “Yes”, the installer starts as administrator and it is able to make the necessary system changes in order to install the application correctly. When the installer is closed, its administrator privileges are gone. If you run it again, the UAC prompt is shown again because your previous approval is not remembered. If you say “No”, the installer is not allowed to run and no system changes are made. If a system change is initiated from a user account that is not an administrator, e.g. the Guest account, the UAC prompt will also ask for the administrator password in order to give the necessary permissions. Without this password, the change won’t be made.
    Which Activities Trigger a UAC Prompt?
    There are many types of activities that may trigger a UAC prompt:
      Running a desktop application as an administrator
      Making changes to settings and files in the Windows and Program Files folders
      Installing or removing drivers and desktop applications
      Installing ActiveX controls
      Changing settings to Windows features like the Windows Firewall, UAC, Windows Update, Windows Defender, and others
      Adding, modifying, or removing user accounts
      Configuring Parental Controls in Windows 7 or Family Safety in Windows 8.x
      Running the Task Scheduler
      Restoring backed-up system files
      Viewing or changing the folders and files of another user account
      Changing the system date and time
    You will encounter UAC prompts during some or all of these activities, depending on how UAC is set on your Windows device. If this security feature is turned off, any user account or desktop application can make any of these changes without a prompt asking for permissions. In this scenario, the different forms of malware existing on the Internet will also have a higher chance of infecting and taking control of your system. In Windows 8.x operating systems, you will never see a UAC prompt when working with apps from the Windows Store. That’s because these apps, by design, are not allowed to modify any system settings or files. You will encounter UAC prompts only when working with desktop programs.
    What Can You Learn from a UAC Prompt? When you see a UAC prompt on the screen, take time to read the information displayed so that you get a better understanding of what is going on. Each prompt first tells you the name of the program that wants to make system changes to your device, then you can see the verified publisher of that program. Dodgy software tends not to display this information and instead of a real company name, you will see an entry that says “Unknown”. If you have downloaded that program from a less than trustworthy source, then it might be better to select “No” in the UAC prompt. The prompt also shares the origin of the file that’s trying to make these changes. In most cases the file origin is “Hard drive on this computer”. You can learn more by pressing “Show details”. You will see an additional entry named “Program location” where you can see the physical location on your hard drive for the file that’s trying to perform system changes. Make your choice based on the trust you have in the program you are trying to run and its publisher. If a less-known file from a suspicious location is requesting a UAC prompt, then you should seriously consider pressing “No”.
    What’s Different About Each UAC Level? Windows 7 and Windows 8.x have four UAC levels:
    Always notify – when this level is used, you are notified before desktop applications make changes that require administrator permissions or before you or another user account changes Windows settings like the ones mentioned earlier. When the UAC prompt is shown, the desktop is dimmed and you must choose “Yes” or “No” before you can do anything else. This is the most secure and also the most annoying way to set UAC because it triggers the most UAC prompts.
    Notify me only when programs/apps try to make changes to my computer (default) – Windows uses this as the default for UAC. When this level is used, you are notified before desktop applications make changes that require administrator permissions. If you are making system changes, UAC doesn’t show any prompts and it automatically gives you the necessary permissions for making the changes you desire.
    When a UAC prompt is shown, the desktop is dimmed and you must choose “Yes” or “No” before you can do anything else. This level is slightly less secure than the previous one because malicious programs can be created to simulate the keystrokes or mouse moves of a user and change system settings for you. If you have a good security solution in place, this scenario should never occur.
    Notify me only when programs/apps try to make changes to my computer (do not dim my desktop) – this level differs from the previous one in that, when the UAC prompt is shown, the desktop is not dimmed. This decreases the security of your system because different kinds of desktop applications (including malware) might be able to interfere with the UAC prompt and approve changes that you might not want to be performed.
    Never notify – this level is the equivalent of turning off UAC. When using it, you have no protection against unauthorized system changes. Any desktop application and any user account can make system changes without your permission.
    How to Configure UAC If you would like to change the UAC level used by Windows, open the Control Panel, then go to “System and Security” and select “Action Center”. On the column on the left you will see an entry that says “Change User Account Control settings”. The “User Account Control Settings” window is now open. Change the position of the UAC slider to the level you want applied, then press “OK”. Depending on how UAC was initially set, you may receive a UAC prompt requiring you to confirm this change.
    Why You Should Never Turn Off UAC If you want to keep the security of your system at decent levels, you should never turn off UAC. When you disable it, everything and everyone can make system changes without your consent. This makes it easier for all kinds of malware to infect and take control of your system. It doesn’t matter whether you have a security suite or third-party antivirus installed; basic common-sense measures like having UAC turned on make a big difference in keeping your devices safe from harm. We have noticed that some users disable UAC prior to setting up their Windows devices and installing third-party software on them. They keep it disabled while installing all the software they will use and enable it when done installing everything, so that they don’t have to deal with so many UAC prompts. Unfortunately, this causes problems with some desktop applications. They may fail to work after you enable UAC. This happens because, when UAC is disabled, the virtualization techniques UAC uses for your applications are inactive. This means that certain user settings and files are installed in a different place, and when you turn on UAC, applications stop working because their settings and files are no longer where they are expected to be. Therefore, whatever you do, do not turn off UAC completely!
    Coming up next … In the next lesson you will learn about Windows Defender, what this tool can do in Windows 7 and Windows 8.x, what’s different about it in these operating systems and how it can be used to increase the security of your system.
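    As a side note for readers who want to check which of the four levels a machine is using without opening the Control Panel: the UAC slider is backed by a handful of registry values under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System. The short C# sketch below simply reads them; the mapping from raw values to slider positions is an approximation based on common documentation, not an official contract, so treat it as illustrative only.

    using System;
    using Microsoft.Win32;

    class UacLevelInspector
    {
        static void Main()
        {
            // Registry values that back the UAC slider on Windows 7/8.x (read-only here).
            const string key = @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System";

            int enableLua = Convert.ToInt32(Registry.GetValue(key, "EnableLUA", 1) ?? 1);
            int adminPrompt = Convert.ToInt32(Registry.GetValue(key, "ConsentPromptBehaviorAdmin", 5) ?? 5);
            int secureDesktop = Convert.ToInt32(Registry.GetValue(key, "PromptOnSecureDesktop", 1) ?? 1);

            // Approximate mapping to the four slider positions described above (assumption, not official).
            string level = enableLua == 0 ? "Never notify (UAC is off)"
                         : adminPrompt == 2 ? "Always notify"
                         : secureDesktop == 1 ? "Notify me only when programs try to make changes (default)"
                         : "Notify me only when programs try to make changes (do not dim my desktop)";

            Console.WriteLine("EnableLUA={0}, ConsentPromptBehaviorAdmin={1}, PromptOnSecureDesktop={2}",
                enableLua, adminPrompt, secureDesktop);
            Console.WriteLine("Approximate UAC level: " + level);
        }
    }

    Changing these values from code is possible but not something to do casually; use the Control Panel slider as described in the lesson.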

    Read the article

  • So… What is a SharePoint Developer?

    - by Mark Rackley
    A few days ago Stacy Draper and I were chatting about what it means to be a SharePoint Developer. That actually turns out to be a conversation with lots of shades of grey. Stacy thought it would make a good blog post… well, I can’t promise this to be a GOOD blog post… So, anyway, I decided to let off a little bomb this morning by posting the following tweet on Twitter: @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? Now, I knew this is a debate that has been going on since the first SharePoint Designer user put SharePoint Developer on their resume. There are probably several blogs out there on the subject, but with the wildfire that is jQuery and a few other new features out there I believe it is an important subject to tackle again. I got a lot of great feedback as well on Twitter. The entire Twitter conversation is at the end of this blog posting. Thanks everyone for their opinions.
    Who cares? Why does it matter? Can’t we all just get along? Yes it matters… everything must be labeled and put in its proper place. Pigeon holing is the only way to go!  Just kidding.. I’m not near that anal, but yes! It is important to be able to properly identify the skill set of those people on your team and correctly identify the role you are wanting to hire. Saying you are a “SharePoint Developer” is just too vague and just barely begins to answer the question. Also, knowing who’s on your team and what they can do will ensure you give your clients the best people for the job.
    A Developer writes code right? So, a Developer uses Visual Studio! Whoa, hold on there Sparky. Even if I concede that to be a developer you have to write code then you still can’t say a SharePoint Developer has to use Visual Studio.  So, you can spell C#, how well can you write XSLT? How’s your jQuery? Sorry bud, that’s code whether you like it or not. There are many ways to write code in SharePoint that have nothing to do with cracking open Visual Studio.
    So, what are the different ways to develop in SharePoint then? How many different ways can you “develop” in SharePoint?? A lot…
    Out of the box features In SharePoint you can create a site, create a custom list on that site, do basic calculations in a calculated column, set up alerts, and add all sorts of web parts to a page. Let’s face it.. that IS development!
    javaScript/jQuery Perhaps you’ve heard by now about this thing called jQuery? It’s all over the place and the answer to a lot of people’s prayers. However, be careful, with great power comes great responsibility. Remember, javaScript is executed on the client side and if you abuse it your performance could be affected. Also, Marc Anderson (@sympmarc) wrote a pretty awesome javaScript library called SPServices.  This allows you to access SharePoint’s Web Services using jQuery. How freakin cool is that? With these tools at your disposal the number of things you CAN’T do without Visual Studio grows smaller and smaller. This is definitely development no matter what anyone else says and there is no Visual Studio involved.
    SharePoint Designer Ahhh.. The cause of and the answer to all of your SharePoint development problems. With SharePoint Designer you can use DataView Web Parts, develop (there’s that word again) your branding, and even connect to external datasources.  There’s a lot you can do in SharePoint Designer. It’s got its shortcomings, but it is an invaluable tool in the SharePoint developer’s toolbox.
InfoPath So, can InfoPath development really be considered SharePoint development? I would say yes. You can connect to SharePoint lists, populate fields in a SharePoint list, and even write code in InfoPath. Sounds like SharePoint development to me. Visual Studio – Web Services/WCF So, get this. You can write code for SharePoint and not have a clue what the 12 hive is, what “site actions” means, or know how to do ANYTHING in SharePoint? Poppycock! You say? SharePoint Web Services I say… With SharePoint Web Services you can totally interact with SharePoint without knowing anything about SharePoint. I don’t recommend it of course, but it’s possible. What can you write using SharePoint Web Services? How about a little application called SharePoint Designer? Visual Studio – Object Model And here we are finally:  the SharePoint Object Model.  When you hear “SharePoint Developer” most people think of someone opening Visual Studio and creating a custom web part, workflow, event receiver, etc.. etc.. but I hope that by now I have made the point that this is NOT the only form of SharePoint Development! Again… Who cares? Just crack open Visual Studio for everything! Problem solved! Let’s ponder for a moment, shall we? The business comes to you with a requirement that involves some pretty fancy business calculations, and a complicated view that they do NOT want to look like SharePoint. “No Problem” you proclaim you mighty SharePoint Developer. You go back to your cube, chuckle at the latest Dilbert comic, and crack open Visual Studio. Then you build your custom web part… fight with all the deployment, migration, and UAT that you must go through and proclaim victory two weeks later!!!! Well done my good sir/ma’am! Oh wait… it turns out Sally who is not a “developer” did the exact same thing with a Dataview web part and some jQuery and it’s been in production for two weeks? #CockinessFail I know there are many ASP.NET developers out there that can create a custom control and wrap it to be a SharePoint Web Part.  That does NOT mean they are SharePoint Developers though as far as I’m concerned and I personally would much rather have someone on my team that can manipulate the heck (yes, I said ‘heck’) out of SharePoint using Dataview Web Parts, jQuery, and a roll of duct tape. Just because you know how to write code in Visual Studio does not mean you are a SharePoint Developer. What’s the conclusion here? How do we define ‘it’ and what ‘it’ is called? Fortunately, this is MY blog. I don’t have to give answers, I can stir the pot, laugh and leave you to ponder what it means! There is obviously no right or wrong answer here (unless you disagree with me,then you are flat out wrong). Anyway, there are many opinions.  Here’s mine.  If you put SharePoint Developer on your resume make sure to clearly specify HOW you develop in SharePoint and what tools you use. If we must label these gurus of jQuery and SPD, how about “SharePoint Client Developer” or “SharePoint Front End Developer”? Just throwing out an idea. Whatever we call them, to say they are not developers is short-sighted, arrogant, and unfair. Of course, then we need to figure out what to call all those other SharePoint development types.  Twitter Conversation @next_connect: RT @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? | I say no.... @mikegil:  @mrackley re: yr Developer question: SPD expert <> SP Developer. Can be "sous-developer," though. 
#SharePoint #SPD @WonderLaura:  Rt @mrackley Can someone be considered a SharePoint Dev if all they know how to do is work in SPD? -- My opinion is that devs write code. @exnav29:  Rt @mrackley Can someone be considered a SharePoint Dev if all they know how to do is work in SPD? => I think devs would use VS as well @ssKevin:  @WonderLaura @mrackley does that mean strictly vb and c# when it comes to #SharePoint ? @jimmywim:  @exnav29 @mrackley nah, I'd say they were a power user. Devs know their way around the 12 hive ;) @sympmarc:  RT @mrackley: Can someone be considered a SharePoint Developer if all they know how to do is work in SPD? -> Fighting words. @sympmarc:  @next_connect @mrackley Besides, we prefer to be called "hacks". ;+) @next_connect:  @sympmarc The important thing is that you don't have to develop code to solve problems and create solutions. @mrackley @mrackley:  @sympmarc @next_connect not tryin to pick fight.. just try and find consensus on definition @usher:  @mrackley I'd still argue that you have a DevLite title that's out there for the collaboration engineers (@sympmarc @next_connect) @next_connect: @usher I agree. I've called it Light Dev/ Configuration before. @sympmarc @mrackley @usher:  @next_connect I like DevLite, low calorie but still same great taste :) @mrackley @sympmarc @mrackley:  @next_connect @usher @sympmarc I don't think there's any "lite" to someone who can bend jQuery and XSLT to their will. @usher:  @mrackley okay, so would you refer to someone that writes user controls and assemblies something different (@next_connect @sympmarc) @usher:  @mrackley when looking for a developer that can write .net code, it's a bit different than an XSLT/jQuery designer. @sympmarc @next_connect @jimmywim:  @mrackley @sympmarc @next_connect I reckon a "dev" does managed code and works in the 12 hive @sympmarc:  @jimmywim @mrackley @next_connect We had a similar debate a few days ago @toddbleeker et al @sympmarc:  @sympmarc @jimmywim @mrackley @next_connect @toddbleeker @stevenmfowler More abt my Middle Tier term, but still connected. Meet bus need. @toddbleeker:  @sympmarc @jimmywim @mrackley @next_connect I used "No Assembly Required" in the past. I also suggested "Supplimenting the SharePoint DOM" @toddbleeker:  @sympmarc @jimmywim @mrackley @next_connect Others suggested Information Worker Solutions/Enhancements @toddbleeker:  @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler I also like "SharePoint Scripting Solutions". All the technologies are script. @jimmywim:  @toddbleeker @sympmarc @mrackley @next_connect I like the IW solutions one... @toddbleeker:  @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler This is like the debate that never ends: it is definitely not called Middle Tier. @jimmywim:  @toddbleeker @sympmarc @mrackley @next_connect @stevenmfowler "Scripting" these days makes me think PowerShell... @sympmarc:  @toddbleeker @jimmywim @mrackley @next_connect @stevenmfowler If it forces a debate on h2 best solve bus probs, I'll keep sayin Middle Tier. @usher:  @sympmarc so we know what we're looking for, we just can't define a name? @toddbleeker @jimmywim @mrackley @next_connect @stevemfowler @sympmarc:  @usher @sympmarc @toddbleeker @jimmywim @mrackley @next_connect @stevemfowler The naming seems to matter more than the substance. 
:-( @jimmywim:  @sympmarc @usher @toddbleeker @mrackley @next_connect @stevemfowler work brkdn defines tasks, defines tools needed, can then b grp'd by user @WonderLaura:  @mrackley @toddbleeker @jimmywim @sympmarc @usher @next_connect Funny you're asking. @johnrossjr and I spent hours this week on the subject. @stevenmfowler:  RT @toddbleeker: @sympmarc @jimmywim @mrackley @next_connect @stevenmfowler it is definitely not called Middle Tier. < I'm with Todd
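    Stepping back from the Twitter thread: to make the “Visual Studio – Object Model” end of the spectrum concrete, here is a minimal, hypothetical sketch of the kind of server-side code that camp writes. The site URL and list name are made up for illustration, it requires a reference to Microsoft.SharePoint.dll, and it must run on a machine where SharePoint is installed; it is a sketch of the approach, not anyone’s production code.

    using System;
    using Microsoft.SharePoint;

    class ObjectModelSample
    {
        static void Main()
        {
            // Hypothetical site URL and list name; adjust for a real farm.
            using (SPSite site = new SPSite("http://intranet/sites/demo"))
            using (SPWeb web = site.OpenWeb())
            {
                SPList list = web.Lists["Announcements"];   // assumes this list exists
                SPListItem item = list.Items.Add();
                item["Title"] = "Added via the server-side object model";
                item.Update();
                Console.WriteLine("Added item {0} to '{1}'.", item.ID, list.Title);
            }
        }
    }

    A DataView Web Part plus SPServices can add the same list item from the browser with no deployment at all, which is exactly the point the article is making about the breadth of “SharePoint development”.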

    Read the article

  • Installing SharePoint 2010 and PowerPivot for SharePoint on Windows 7

    - by smisner
    Many people like me want (or need) to do their business intelligence development work on a laptop. As someone who frequently speaks at various events or teaches classes on all subjects related to the Microsoft business intelligence stack, I need a way to run multiple server products on my laptop with reasonable performance. Once upon a time, that requirement meant only that I had to load the current version of SQL Server and the client tools of choice. In today's post, I'll review my latest experience with trying to make the newly released Microsoft BI products work with a Windows 7 operating system. The entrance of Microsoft Office SharePoint Server 2007 into the BI stack complicated matters and I started using Virtual Server to establish a "suitable" environment. As part of the team that delivered a lot of education as part of the Yukon pre-launch activities (that would be SQL Server 2005 for the uninitiated), I was working with four - yes, four - virtual servers. That was a pretty brutal workload for a 2GB laptop, which worked if I was very, very careful. It could also be a finicky and unreliable configuration as I learned to my dismay at one TechEd session several years ago when I had to reboot a very carefully cached set of servers just minutes before my session started. Although it worked, it came back to life very, very slowly much to the displeasure of the audience. They couldn't possibly have been less pleased than me. At that moment, I resolved to get the beefiest environment I could afford and consolidate to a single virtual server. Enter the 4GB 64-bit laptop to preserve my sanity and my livelihood. Likewise, for SQL Server 2008, I managed to keep everything within a single virtual server and I could function reasonably well with this approach. Now we have SQL Server 2008 R2 plus Office SharePoint Server 2010. That means a 64-bit operating system. Period. That means no more Virtual Server. That means I must use Hyper-V or another alternative. I've heard alternatives exist, but my few dabbles in this area did not yield positive results. It might have been just me having issues rather than any failure of those technologies to adequately support the requirements. My first run at working with the new BI stack configuration was to set up a 64-bit 4GB laptop with a dual-boot to run Windows Server 2008 R2 with Hyper-V. However, I was generally not happy with running Windows Server 2008 R2 on my laptop. For one, I couldn't put it into sleep mode, which is helpful if I want to prepare for a presentation beforehand and then walk to the podium without the need to hold my laptop in its open state along the way (my strategy at the TechEd session long, long ago). Secondly, it was finicky with projectors. I had issues from time to time and while I always eventually got it to work, I didn't appreciate those nerve-wracking moments wondering whether this would be the time that it wouldn't work. Somewhere along the way, I learned that it was possible to load SharePoint 2010 in a Windows 7 which piqued my interest. I had just acquired a new laptop running Windows 7 64-bit, and thought surely running the BI stack natively on my laptop must be better than running Hyper-V. (I have not tried booting to Hyper-V VHD yet, but that's on my list of things to try so the jury of one is still out on this approach.) Recently, I had to build up a server with the RTM versions of SQL Server 2008 R2 and Sharepoint Server 2010 and decided to follow suit on my Windows 7 Ultimate 64-bit laptop. 
The process is slightly different, but I'm happy to report that it IS possible, although I had some fits and starts along the way. DISCLAIMER: These products are NOT intended to be run in production mode on the Windows 7 operating system. The configuration described in this post is strictly for development or learning purposes and not supported by Microsoft. If you have trouble, you will NOT get help from them. I might be able to help, but I provide no guarantees of my ability or availablity to help. I won't provide the step-by-step instructions in this post as there are other resources that provide these details, but I will provide an overview of my approach, point you to the relevant resources, describe some of the problems I encountered, and explain how I addressed those problems to achieve my desired goal. Because my goal was not simply to set up SharePoint Server 2010 on my laptop, but specifically PowerPivot for SharePoint, I started out by referring to the installation instructions at the PowerPiovt-Info site, but mainly to confirm that I was performing steps in the proper sequence. I didn't perform the steps in Part 1 because those steps are applicable only to a server operating system which I am not running on my laptop. Then, the instructions in Part 2, won't work exactly as written for the same reason. Instead, I followed the instructions on MSDN, Setting Up the Development Environment for SharePoint 2010 on Windows Vista, Windows 7, and Windows Server 2008. In general, I found the following differences in installation steps from the steps at PowerPivot-Info: You must copy the SharePoint installation media to the local drive so that you can edit the config.xml to allow installation on a Windows client. You also have to manually install the prerequisites. The instructions provides links to each item that you must manually install and provides a command-line instruction to execute which enables required Windows features. I will digress for a moment to save you some grief in the sequence of steps to perform. I discovered later that a missing step in the MSDN instructions is to install the November CTP Reporting Services add-in for SharePoint. When I went to test my SharePoint site (I believe I tested after I had a successful PowerPivot installation), I ran into the following error: Could not load file or assembly 'RSSharePointSoapProxy, Version=10.0.0.0, Culture=neutral, PublicKeyToken=89845dcd8080cc91' or one of its dependencies. The system cannot find the file specified. I was rather surprised that Reporting Services was required. Then I found an article by Alan le Marquand, Working Together: SQL Server 2008 R2 Reporting Services Integration in SharePoint 2010,that instructed readers to install the November add-in. My first reaction was, "Really?!?" But I confirmed it in another TechNet article on hardware and software requirements for SharePoint Server 2010. It doesn't refer explicitly to the November CTP but following the link took me there. (Interestingly, I retested today and there's no longer any reference to the November CTP. Here's the link to download the latest and greatest Reporting Services Add-in for SharePoint Technologies 2010.) You don't need to download the add-in anymore if you're doing a regular server-based installation of SharePoint because it installs as part of the prerequisites automatically. 
When it was time to start the installation of SharePoint, I deviated from the MSDN instructions and from the PowerPivot-Info instructions: On the Choose the installation you want page of the installation wizard, I chose Server Farm. On the Server Type page, I chose Complete. At the end of the installation, I did not run the configuration wizard. Returning to the PowerPivot-Info instructions, I tried to follow the instructions in Part 3 which describe installing SQL Server 2008 R2 with the PowerPivot option. These instructions tell you to choose the New Server option on the Setup Role page where you add PowerPivot for SharePoint. However, I ran into problems with this approach and got installation errors at the end. It wasn't until much later as I was investigating an error that I encountered Dave Wickert's post that installing PowerPivot for SharePoint on Windows 7 is unsupported. Uh oh. But he did want to hear about it if anyone succeeded, so I decided to take the plunge. Perseverance paid off, and I can happily inform Dave that it does work so far. I haven't tested absolutely everything with PowerPivot for SharePoint but have successfully deployed a workbook and viewed the PowerPivot Management Dashboard. I have not yet tested the data refresh feature, but I have installed. Continue reading to see how I accomplished my objective. I unintalled SQL Server 2008 R2 and started again. I had different problems which I don't recollect now. However, I uninstalled again and approached installation from a different angle and my next attempt succeeded. The downside of this approach is that you must do all of the things yourself that are done automatically when you install PowerPivot as a new server. Here are the steps that I followed: Install SQL Server 2008 R2 to get a database engine instance installed. Run the SharePoint configuration wizard to set up the SharePoint databases. In Central Administration, create a Web application using classic mode authentication as per a TechNet article on PowerPivot Authentication and Authorization. Then I followed the steps I found at How to: Install PowerPivot for SharePoint on an Existing SharePoint Server. Especially important to note - you must launch setup by using Run as administrator. I did not have to manually deploy the PowerPivot solution as the instructions specify, but it's good to know about this step because it tells you where to look in Central Administration to confirm a successful deployment. I did spot some incorrect steps in the instructions (at the time of this writing) in How To: Configure Stored Credentials for PowerPivot Data Refresh. Specifically, in the section entitled Step 1: Create a target application and set the credentials, both steps 10 and 12 are incorrect. They tell you to provide an actual Windows user name and password on the page where you are simply defining the prompts for your application in the Secure Store Service. To add the Windows user name and password that you want to associate with the application - after you have successfully created the target application - you select the target application and then click Set credentials in the ribbon. Lastly, I followed the instructions at How to: Install Office Data Connectivity Components on a PowerPivot server. However, I have yet to test this in my current environment. I did have several stops and starts throughout this process and edited those out to spare you from reading non-essential information. 
I believe the explanation I have provided here accurately reflects the steps I followed to produce a working configuration. If you follow these steps and get a different result, please let me know so that together we can work through the issue and correct these instructions. I'm sure there are many other folks in the Microsoft BI community who will appreciate the ability to set up the BI stack in a Windows 7 environment for development or learning purposes.

    Read the article

  • CodePlex Daily Summary for Monday, June 07, 2010

    CodePlex Daily Summary for Monday, June 07, 2010New ProjectsAgile Personal Development Methodology: 本项目并不是软件开发项目,它是一个关注个人能力发展的项目,希望通过大家的积极参与和实践,在价值观、原则和最佳实践的基础上不断丰富和完善这些内容。我将主要从生活、工作和个人三个主要方面来指导个人如何快速地提高自己的能力。在工作方面,首先关注的是IT技术人员。 希望本方法论的不断完善,能够对不同阶...Altairis Mail Toolkit: Altairis Mail Toolkit is a component for .NET developers making easy to send templated and localized e-mail messages. Uses standard resource mechan...AmazonPriceTicker: AmazonPriceTicker ist ein kleines ASP.NET-Projekt für unser Studium.CC.Hearts Screen Saver: A complete screensaver that draws pretty hearts. Supports all standard screensaver functionality (preview window, options, multi-monitor). Written...Controle Financeiro com DDD: Projeto para fins de estudo das tencologias: - DDD - TDD - Asp Net MVCDebugWriterTextBox: This is modified TextBox which can catch up Debug.Write() and display log. Also it can write log data to file - all you need is to set up file name!DEIConsultingDev: DEI Consulting DEVEasy Augmented Reality Suite for Silverlight: Easy Augmented Reality Suite for SilverlightEvent Broker: Simple event broker with an hierarchical implementation.fleet It: A WPF Client for Team Space. Developed using WPF and C#. Various URL Shortening services integrated.Gest-Bus: Gest-BusHNV Project: Projects created by Hoàng, Ngọc, VũHotel Management System: Hotel Management System : concerned in making a complete website for a hotel every thing in hotel in just one web application : Finance , managemen...ISEN découverte majeure application mobile: ISEN découverte majeure application mobile contrôle d'un pc par un téléphone portable avec plateforme windows 7 LiveSequence: Based in the sequence diagramming control of Nauman Leghari.NHash: This is a simple project that integrates with Explorer and Computes MD5 and SHA1.NHibernate Sidekick Library: NHibernate Sidekick is a library intended to assist in the development of multi-tiered applications using the NHibernate ORM framework.ScrewTurn ASP.NET Proxy Membership Provider: Plugin for the ScrewTurn Wiki System to use Standard ASP.NET Membership and Role providers.Silverlight SNS: We are going to develop a SNS web application based on Silver lightSilverPoiMap: SilverPoiMap provides interactive searching and management of Points of Interest. It is a Facebook client application which allow you to connect to...Simple Resource Localization Editor: Simple editor that simplifies localization and synchronization of .NET resource files (.resx).SimpleBlog.NET MVC: A very simple blog engine developed in ASP.NET 3.5 MVC2 and C#.Siverlight Project: Hope that everybody continue to develop it.SpaceConquest: Incorporated standard design patterns to build a peer to peer game in Java. The game rules were similar in complexity to games like Civilization an...Yasts: Yasts - Yet Another Space Trading Sim - This is a learning environment project.New ReleasesAgile Personal Development Methodology: 敏捷个人-认识自我,管理自我.pdf: 去年我在blog上写了个人管理系列的一些blog,其中一些文章深受大家的喜欢。想到写这个系列是源于在实施敏捷Scrum方法时,对方法实施是否对人的水平需要高要求的一些思考。自组织团队是建立在敏捷个人之上的,没有个人就没有团队,实施Scrum对人要求不高,但想实施得好,那么对人的要求肯定不低。 ...Altairis Mail Toolkit: 1.0.0 RTM: Initial releaseAndrew's XNA Helpers: Andrew's Xna Helpers V1.1: Currently only supports X86 projects since portions of the code have to be reworked to work with the xbox. 
I do plan to code it for the 360 though....C#Mail: Higuchi.Mail.dll (2010.6.6 ver): Higuchi.Mail.dll (2010.4.30 ver)Community Forums NNTP bridge: Community Forums NNTP Bridge V31: Release of the Community Forums NNTP Bridge to access the social and anwsers MS forums with a single, open source NNTP bridge. This release has ad...Html Agility Pack: Experimental Xpath Updates: In efforts to update make Html Agility Packs Xpath support to be closer to the System.Xml.Xpath implementation I have updated HAP to have all nodes...imdb movie downloader: myImdb 0.9.2: myImdb 0.9.2 Fully changed...and added some more features.. working with XML movie list... Used Backgroundworker more clever results and guess m...InfoService: InfoService v1.6 - MPE1 or RAR Package: InfoService Release v1.6.0.136 Please read Plugin installation for installation instructions.ISEN découverte majeure application mobile: appli traitement d'image: but : capturer, redimensionner l'image de l'écran d'un pc.Jeremi Stadler: Clipboard Manager: Version 1.0.5.14 It's finally here! I have been working on this one the whole night but it's worth it ;) The program catches clipboard changes an...mesoBoard: mesoBoard 0.9 - Beta: mesoBoard version 0.9 beta Released under the New BSD License. http://mesoboard.codeplex.com/license http://mesoboard.comMiniTwitter: 1.13: MiniTwitter 1.13 変更内容 修正 ダイレクトメッセージの取得に失敗するバグを修正 タイムラインを編集すると落ちるバグを修正 リストのインポートで最初の 20 人しか取得できない問題を修正 追加 URL マウスオーバーでの画像のプレビュー機能を実装MiniTwitter: 1.13.1: MiniTwitter 1.13.1 更新内容 修正 タイムライン追加、編集ダイアログでキャンセルを選ぶと落ちるバグを修正MSTS Editors & Tools: Simis Editor v0.4: Simis Editor v0.4 Open and Save dialogs support full filename filters from BNFs (e.g. "tsection.dat"). Added statusbar and menu help text. Adde...NHibernate Sidekick Library: 0.6.0: v0.6.0 - Initial Release TODONLog - Advanced .NET Logging: Nightly Build 2010.06.06.001: Changes since the last build:2010-06-05 21:50:21 Jarek Kowalski fixed doc for ${document-uri} 2010-06-05 20:21:53 Jarek Kowalski removed Layout re...NQ - Component-oriented Framework: NQ Core 0.90: Main changes in 0.90 release: Introduction of an additional built-in component loader to load component and service information from XML files ins...PhluffyFotos: v4 Windows Azure: This release has been updated to Visual Studio 2010 as well as the latest StorageClient library. Make sure you run the Provision.cmd in order to b...RIA Services Essentials: Book Club Application (June 6, 2010): The initial release of the BookClub application based on the MIX presentation with a few changes: 1. Some bug fixes 2. Added the ability to Like a ...Samurai.Workflow: 1.1 Stable Release: Removed reference to WPF assemblies so non-WPF applications can use the workflow. For WPF apps, the workflow will use reflection to seek out UI thr...SilverPoiMap: SilverPoiMap Beta: Beta VersionSimple Resource Localization Editor: Release 1: First release.Siverlight Project: Auto Arrange Panel Project: Auto Arrange Panel ProjectSmart Voice: Smart Voice 0.2: Changelog: Corrected a few bugsSmart Voice: Smart Voice 0.2.1: Changelog: Fixed a major bug that was slowing the application Added opt in for usage data In order to contribute with user data, please change t...Spider Compiler: Release 0.1: Contains the setup for the spider compiler. This release includes the changeset #66980.Spider Compiler: Release 0.2: This release includes the changeset #67003.The Lounge Repository: Lounge Repo Binary Release: All in one binary download of the Lounge Repository. 
Improvements: -More tolerant towards schema changes -Bug fixed regarding array normalization ...UrzaGatherer: UrzaGatherer v2.0.1: This release integrate the support of the Full Database Backup.Web/Cloud Applications Development Framework | Visual WebGui: 6.4 beta 3: This is the last beta version before the official release of Visual WebGui, including support for Visual Studio 2010 and it is fully featured with ...WinGet: Alpha 2: This is the second release of WinGet (0.5.0.2). Is is alpha quality not suitable for use in a production environment and it has bugs (see Known Iss...WoW Character Viewer: WoW Viewer: Newly redesigned layout of the original WoW Character Viewer. Faster access, cleaner layout.Most Popular ProjectsCommunity Forums NNTP bridgeASP.NET MVC Time PlannerMoonyDesk (windows desktop widgets)NeatUploadOutSyncFileupload AJAXViperWorks IgnitionAgUnit - Silverlight unit testing with ReSharperSQL Nexus ToolSmith Async .NET Memcached ClientMost Active ProjectsCommunity Forums NNTP bridgepatterns & practices – Enterprise LibraryRawrGMap.NET - Great Maps for Windows Forms & PresentationStyleCopN2 CMSsmark C# LibraryFarseer Physics EngineIonics Isapi Rewrite FilterNB_Store - Free DotNetNuke Ecommerce Catalog Module

    Read the article

  • Issue 15: SVP Focus

    - by rituchhibber
         SVP FOCUS FOCUS -- Chris Baker SVP Oracle Worldwide ISV-OEM-Java Sales Chris Baker is the Global Head of ISV/OEM Sales responsible for working with ISV/OEM partners to maximise Oracle's business through those partners, whilst maximising those partners’ business to their end users. Chris works with partners, customers, innovators, investors and employees to develop innovative business solutions using Oracle products, services and skills. RESOURCES -- Oracle PartnerNetwork (OPN) OPN Solutions Catalog Oracle Exastack Program Oracle Exastack Optimized Oracle Cloud Computing Oracle Engineered Systems Oracle and Java SUBSCRIBE FEEDBACK PREVIOUS ISSUES "By taking part in marketing activities, our partners accelerate their sales cycles." -- Firstly, could you please explain Oracle's current strategy for ISV partners, globally and in EMEA? Oracle customers use independent software vendor (ISV) applications to run their businesses. They use them to generate revenue and to fulfil obligations to their own customers. Our strategy is very straight-forward. We want all of our ISV partners and OEMs to concentrate on the things that they do the best—building applications to meet the unique industry and functional requirements of their customer. We want to ensure that we deliver a best-in-class application platform so ISVs are free to concentrate their effort on their application functionality and user experience We invest over four billion dollars in research and development every year, and we want our ISVs to benefit from all of that investment in operating systems, virtualisation, databases, middleware, engineered systems, and other hardware. By doing this, we help them to reduce their costs, gain more consistency and agility for quicker implementations, and also rapidly differentiate themselves from other application vendors. It's all about simplification because we believe that around 25 to 30 percent of the development costs incurred by many ISVs are caused by customising infrastructure and have nothing to do with their applications. Our strategy is to enable our ISV partners to standardise their application platform using engineered architecture, so they can write once to the Oracle stack and deploy seamlessly in the cloud, on-premise, or in hybrid deployments. It's really important that architecture is the same in order to keep cost and time overheads at a minimum, so we provide standardisation and an environment that enables our ISVs to concentrate on the core business that makes them the most money and brings them success. How do you believe this strategy is helping the ISVs to work hand-in-hand with Oracle to ensure that end customers get the industry-leading solutions that they need? We work with our ISVs not just to help them be successful, but also to help them market themselves. We have something called the 'Oracle Exastack Ready Program', which enables ISVs to publicise themselves as 'Ready' to run the core software platforms that run on Oracle's engineered systems including Exadata and Exalogic. So, for example, they can become 'Database Ready' which means that they use the latest version of Oracle Database and therefore can run their application without modification on Exadata or the Oracle Database Appliance. Alternatively, they can become WebLogic Ready, Oracle Linux Ready and Oracle Solaris Ready which means they run on the latest release and therefore can run their application, with no new porting work, on Oracle Exalogic. 
Those 'Ready' logos are important in helping ISVs advertise to their customers that they are using the latest technologies which have been fully tested. We now also have Exadata Ready and Exalogic Ready programmes which allow ISVs to promote the certification of their applications on these platforms. This highlights these partners to Oracle customers as having solutions that run fluently on the Oracle Exadata Database Machine, the Oracle Exalogic Elastic Cloud or one of our other engineered systems. This makes it easy for customers to identify solutions and provides ISVs with an avenue to connect with Oracle customers who are rapidly adopting engineered systems. We have also taken this programme to the next level in the shape of 'Oracle Exastack Optimized' for partners whose applications run best on the Oracle stack and have invested the time to fully optimise application performance. We ensure that Exastack Optimized partner status is promoted and supported by press releases, and we help our ISVs go to market and differentiate themselves through the use of our technology and the standardisation it delivers. To date we have had several hundred organisations successfully work through our Exastack Optimized programme. How does Oracle's strategy of offering pre-integrated open platform software and hardware allow ISVs to bring their products to market more quickly? One of the problems for many ISVs is that they have to think very carefully about the technology on which their solutions will be deployed, particularly in the cloud or hosted environments. They have to think hard about how they secure these environments, whether the concern is, for example, middleware, identity management, or securing personal data. If they don't use the technology that we build-in to our products to help them to fulfil these roles, they then have to build it themselves. This takes time, requires testing, and must be maintained. By taking advantage of our technology, partners will now know that they have a standard platform. They will know that they can confidently talk about implementation being the same every time they do it. Very large ISV applications could once take a year or two to be implemented at an on-premise environment. But it wasn't just the configuration of the application that took the time, it was actually the infrastructure - the different hardware configurations, operating systems and configurations of databases and middleware. Now we strongly believe that it's all about standardisation and repeatability. It's about making sure that our partners can do it once and are then able to roll it out many different times using standard componentry. What actions would you recommend for existing ISV partners that are looking to do more business with Oracle and its customer base, not only to maximise benefits, but also to maximise partner relationships? My team, around the world and in the EMEA region, is available and ready to talk to any of our ISVs and to explore the possibilities together. We run programmes like 'Excite' and 'Insight' to help us to understand how we can help ISVs with architecture and widen their environments. But we also want to work with, and look at, new opportunities - for example, the Machine-to-Machine (M2M) market or 'The Internet of Things'. Over the next few years, many millions, indeed billions of devices will be collecting massive amounts of data and communicating it back to the central systems where ISVs will be running their applications. 
The only way that our partners will be able to provide a single vendor 'end-to-end' solution is to use Oracle integrated systems at the back end and Java on the 'smart' devices collecting the data - a complete solution from device to data centre. So there are huge opportunities to work closely with our ISVs, using Oracle's complete M2M platform, to provide the infrastructure that enables them to extract maximum value from the data collected. If any partners don't know where to start or who to contact, then they can contact me directly at [email protected] or indeed any of our teams across the EMEA region. We want to work with ISVs to help them to be as successful as they possibly can through simplification and speed to market, and we also want all of the top ISVs in the world based on Oracle.

What opportunities are immediately opened to new ISV partners joining the OPN? As you know, OPN is very, very important. New members will discover a huge amount of content that instantly becomes accessible to them. They can access a wealth of no-cost training and enablement materials to build their expertise in Oracle technology. They can download Oracle software and use it for development projects. They can help themselves become more competent by becoming part of a true community and uncovering new opportunities by working with Oracle and their peers in the Oracle Partner Network. As well as publishing massive amounts of information on OPN, we also hold our global Oracle OpenWorld event, at which partners play a huge role. This takes place at the end of September and the beginning of October in San Francisco. Attending ISV partners have an unrivalled opportunity to contribute to elements such as the OpenWorld / OPN Exchange, at which they can talk to other partners and really begin thinking about how they can move their businesses on and play key roles in a very large ecosystem which revolves around technology and standardisation.

Finally, are there any other messages that you would like to share with the Oracle ISV community? The crucial message that I always like to reinforce is architecture, architecture and architecture! The key opportunities that ISVs have today revolve around standardising their architectures so that they can confidently think: "I will be able to do exactly the same thing whenever a customer is looking to deploy on-premise, hosted or in the cloud". The right architecture is critical to being competitive and to really start changing the game. We want to help our ISV partners to do just that; to establish standard architecture and to seize the opportunities it opens up for them. New market opportunities like M2M are enormous - just look at how many devices are all around you right now. We can help our partners to interface with these devices more effectively while thinking about their entire ecosystem, rather than just the piece that they have traditionally focused upon. With standardised architecture, we can help people dramatically improve their speed, reach, agility and delivery of enhanced customer satisfaction and value all the way from the Java side to their centralised systems. All Oracle ISV partners must take advantage of these opportunities, which is why Oracle will continue to invest in and support them.

Oracle OpenWorld 2010
Whether you attended Oracle OpenWorld 2009 or not, don't forget to save the date now for Oracle OpenWorld 2010. The event will be held a little earlier next year, from 19th-23rd September, so please don't miss out. 
With thousands of sessions and hundreds of exhibits and demos already lined up, there's no better place to learn how to optimise your existing systems, get an inside line on upcoming technology breakthroughs, and meet with your partner peers, Oracle strategists and even the developers responsible for the products and services that help you get better results for your end customers. Register Now for Oracle OpenWorld 2010! Perhaps you are interested in learning more about Oracle OpenWorld 2010, but don't wish to register at this time? Great! Please just enter your contact information here and we will contact you at a later date. How to Exhibit at Oracle OpenWorld 2010 Sponsorship Opportunities at Oracle OpenWorld 2010 Advertising Opportunities at Oracle OpenWorld 2010 -- Back to the welcome page

    Read the article

  • Microsoft&rsquo;s new technical computing initiative

    - by Randy Walker
    I made a mental note from earlier in the year.  Microsoft literally buys computers by the truckload.  From what I understand, it’s a typical practice amongst large software vendors.  You plug a few wires in, you test it, and you instantly have mega tera tera flops (don’t hold me to that number).  Microsoft has been trying to plug away at their cloud services (named Azure).  Which, for the layman, means Microsoft runs your software on their computers, and as demand increases you can allocate more computing power on the fly. With this in mind, it doesn’t surprise me that I was recently sent an executive email concerning Microsoft’s new technical computing initiative.  I find it to be a great marketing idea with actual substance behind their real work.  From the programmer academic perspective, in college we dreamed about this type of processing power.  This has decades of computer science theory behind it. A copy of the email received.  (note that I almost deleted this email, thinking it was spam due to it’s length) We don't often think about how complex life really is. Take the relatively simple task of commuting to and from work: it is, in fact, a complicated interplay of variables such as weather, train delays, accidents, traffic patterns, road construction, etc. You can however, take steps to shorten your commute - using a good, predictive understanding of a few of these variables. In fact, you probably are already taking these inputs and instinctively building a predictive model that you act on daily to get to your destination more quickly. Now, when we apply the same method to very complex tasks, this modeling approach becomes much more challenging. Recent world events clearly demonstrated our inability to process vast amounts of information and variables that would have helped to more accurately predict the behavior of global financial markets or the occurrence and impact of a volcano eruption in Iceland. To make sense of issues like these, researchers, engineers and analysts create computer models of the almost infinite number of possible interactions in complex systems. But, they need increasingly more sophisticated computer models to better understand how the world behaves and to make fact-based predictions about the future. And, to do this, it requires a tremendous amount of computing power to process and examine the massive data deluge from cameras, digital sensors and precision instruments of all kinds. This is the key to creating more accurate and realistic models that expose the hidden meaning of data, which gives us the kind of insight we need to solve a myriad of challenges. We have made great strides in our ability to build these kinds of computer models, and yet they are still too difficult, expensive and time consuming to manage. Today, even the most complicated data-rich simulations cannot fully capture all of the intricacies and dependencies of the systems they are trying to model. That is why, across the scientific and engineering world, it is so hard to say with any certainty when or where the next volcano will erupt and what flight patterns it might affect, or to more accurately predict something like a global flu pandemic. So far, we just cannot collect, correlate and compute enough data to create an accurate forecast of the real world. But this is about to change. Innovations in technology are transforming our ability to measure, monitor and model how the world behaves. 
The implication for scientific research is profound, and it will transform the way we tackle global challenges like health care and climate change. It will also have a huge impact on engineering and business, delivering breakthroughs that could lead to the creation of new products, new businesses and even new industries. Because you are a subscriber to executive e-mails from Microsoft, I want you to be the first to know about a new effort focused specifically on empowering millions of the world's smartest problem solvers. Today, I am happy to introduce Microsoft's Technical Computing initiative. Our goal is to unleash the power of pervasive, accurate, real-time modeling to help people and organizations achieve their objectives and realize their potential. We are bringing together some of the brightest minds in the technical computing community across industry, academia and science at www.modelingtheworld.com to discuss trends, challenges and shared opportunities. New advances provide the foundation for tools and applications that will make technical computing more affordable and accessible where mathematical and computational principles are applied to solve practical problems. One day soon, complicated tasks like building a sophisticated computer model that would typically take a team of advanced software programmers months to build and days to run, will be accomplished in a single afternoon by a scientist, engineer or analyst working at the PC on their desktop. And as technology continues to advance, these models will become more complete and accurate in the way they represent the world. This will speed our ability to test new ideas, improve processes and advance our understanding of systems. Our technical computing initiative reflects the best of Microsoft's heritage. Ever since Bill Gates articulated the then far-fetched vision of "a computer on every desktop" in the early 1980's, Microsoft has been at the forefront of expanding the power and reach of computing to benefit the world. As someone who worked closely with Bill for many years at Microsoft, I am happy to share with you that the passion behind that vision is fully alive at Microsoft and is carried out in the creation of our new Technical Computing group. Enabling more people to make better predictions We have seen the impact of making greater computing power more available firsthand through our investments in high performance computing (HPC) over the past five years. Scientists, engineers and analysts in organizations of all sizes and sectors are finding that using distributed computational power creates societal impact, fuels scientific breakthroughs and delivers competitive advantages. For example, we have seen remarkable results from some of our current customers: Malaria strikes 300,000 to 500,000 people around the world each year. To help in the effort to eradicate malaria worldwide, scientists at Intellectual Ventures use software that simulates how the disease spreads and would respond to prevention and control methods, such as vaccines and the use of bed nets. Technical computing allows researchers to model more detailed parameters for more accurate results and receive those results in less than an hour, rather than waiting a full day. Aerospace engineering firm, a.i. solutions, Inc., needed a more powerful computing platform to keep up with the increasingly complex computational needs of its customers: NASA, the Department of Defense and other government agencies planning space flights. 
To meet that need, it adopted technical computing. Now, a.i. solutions can produce detailed predictions and analysis of the flight dynamics of a given spacecraft, from optimal launch times and orbit determination to attitude control and navigation, up to eight times faster. This enables them to avoid mistakes in any areas that can cause a space mission to fail and potentially result in the loss of life and millions of dollars. Western & Southern Financial Group faced the challenge of running ever larger and more complex actuarial models as its number of policyholders and products grew and regulatory requirements changed. The company chose an actuarial solution that runs on technical computing technology. The solution is easy for the company's IT staff to manage and adjust to meet business needs. The new solution helps the company reduce modeling time by up to 99 percent - letting the team fine-tune its models for more accurate product pricing and financial projections.

Our Technical Computing direction
Collaborating closely with partners across industry and academia, we must now extend the reach of technical computing even further to help predictive modelers and data explorers make faster, more accurate predictions. As we build the Technical Computing initiative, we will invest in three core areas:

Technical computing to the cloud: Microsoft will play a leading role in bringing technical computing power to scientists, engineers and analysts through the cloud. Existing high-performance computing users will benefit from the ability to augment their on-premises systems with cloud resources that enable 'just-in-time' processing. This platform will help ensure processing resources are available whenever they are needed - reliably, consistently and quickly.

Simplify parallel development: Today, computers are shipping with more processing power than ever, including multiple cores, but most modern software only uses a small amount of the available processing power. Parallel programs are extremely difficult to write, test and troubleshoot. However, a consistent model for parallel programming can help more developers unlock the tremendous power in today's modern computers and enable a new generation of technical computing. We are delivering new tools to automate and simplify writing software through parallel processing from the desktop... to the cluster... to the cloud.

Develop powerful new technical computing tools and applications: We know scientists, engineers and analysts are pushing common tools (i.e., spreadsheets and databases) to the limits with complex, data-intensive models. They need easy access to more computing power and simplified tools to increase the speed of their work. We are building a platform to do this. Our development efforts will yield new, easy-to-use tools and applications that automate data acquisition, modeling, simulation, visualization, workflow and collaboration. This will allow them to spend more time on their work and less time wrestling with complicated technology.

Thinking bigger
There is so much left to be discovered and so many questions yet to be answered in the fascinating world around us. We believe the technical computing community will show us that we have not seen anything yet. Imagine just some of the breakthroughs this community could make possible: Better predictions to help improve the understanding of pandemics, contagion and global health trends. 
Climate change models that predict environmental, economic and human impact, accessible in real-time during key discussions and debates. More accurate prediction of natural disasters and their impact to develop more effective emergency response plans. With an ambitious charter in hand, this new team is ready to build on our progress to-date and execute Microsoft's technical computing vision over the months and years ahead. We will steadily invest in the right technologies, tools and talent, and work to bring together the technical computing community. I invite you to visit www.modelingtheworld.com today. We welcome your ideas and feedback. I look forward to making this journey with you and others who want to answer the world's biggest questions, discover solutions to problems that seem impossible and uncover a host of new opportunities to change the world we live in for the better. Bob
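As an aside from the quoted e-mail: the "consistent model for parallel programming" it refers to is already visible in the Task Parallel Library that ships with .NET 4, where a loop can be spread across all available cores without any explicit thread management. The following is my own minimal sketch, not part of the initiative's materials, and the workload inside the loop is a placeholder:

    using System;
    using System.Diagnostics;
    using System.Threading.Tasks;

    class ParallelModelDemo
    {
        static void Main()
        {
            double[] results = new double[10000000];

            Stopwatch timer = Stopwatch.StartNew();

            // Parallel.For partitions the iteration range across the available cores;
            // the loop body itself reads like ordinary sequential code.
            Parallel.For(0, results.Length, i =>
            {
                results[i] = Math.Sqrt(i) * Math.Sin(i); // placeholder for a real model calculation
            });

            timer.Stop();
            Console.WriteLine("Computed {0} values in {1} ms", results.Length, timer.ElapsedMilliseconds);
        }
    }

The point is the shape of the code rather than the arithmetic: the runtime decides how to split the work, which is the kind of simplification the initiative promises to extend from the desktop to the cluster and the cloud.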

    Read the article

  • Get Started using Build-Deploy-Test Workflow with TFS 2012

    - by Jakob Ehn
TFS 2012 introduces a new type of Lab environment called Standard Environment. This allows you to set up a full Build Deploy Test (BDT) workflow that will build your application, deploy it to your target machine(s) and then run a set of tests on that server to verify the deployment. In TFS 2010, you had to use System Center Virtual Machine Manager and involve half of your IT department to get going. Now all you need is a server (virtual or physical) where you want to deploy and test your application. You don't even have to install a test agent on the machine; TFS 2012 will do this for you! Although each step is rather simple, the entire process of setting it up consists of a bunch of steps. So I thought that it could be useful to run through a typical setup. I will also link to some good guidance from MSDN on each topic.

High Level Steps
1. Install and configure Visual Studio 2012 Test Controller on Target Server
2. Create Standard Environment
3. Create Test Plan with Test Case
4. Run Test Case
5. Create Coded UI Test from Test Case
6. Associate Coded UI Test with Test Case
7. Create Build Definition using LabDefaultTemplate

1. Install and Configure Visual Studio 2012 Test Controller on Target Server
First of all, note that you do not have to have the Test Controller running on the target server. It can be running on another server, as long as the Test Agent can communicate with the test controller and the test controller can communicate with the TFS server. If you have several machines in your environment (web server, database server, etc.), the test controller can be installed either on one of those machines or on a dedicated machine. To install the test controller, simply mount the Visual Studio Agents media on the server and browse to the vstf_controller.exe file located in the TestController folder. Run through the installation; you might need to reboot the server since it installs .NET 4.5. When the test controller is installed, the Test Controller configuration tool will launch automatically (if it doesn't, you can start it from the Start menu). Here you will supply the credentials of the account running the test controller service. Note that this account will be given the necessary permissions in TFS during the configuration. Make sure that you have entered a valid account by pressing the Test link. Also, you have to register the test controller with the TFS collection where your test plan is located (and usually the code base, of course). When you press Apply Settings, all the configuration will be done. You might get some warnings at the end that might or might not cause a problem later. Be sure to read them carefully. For more information about configuring your test controllers, see Setting Up Test Controllers and Test Agents to Manage Tests with Visual Studio.

2. Create Standard Environment
Now you need to create a Lab environment in Microsoft Test Manager. Since we are using an existing physical or virtual machine we will create a Standard Environment. Open MTM and go to Lab Center. Click New to create a new environment. Enter a name for the environment. Since this environment will only contain one machine, we will use the machine name for the environment (TargetServer in this case). On the next page, click Add to add a machine to the environment. Enter the name of the machine (TargetServer.Domain.Com), and give it the Web Server role. The name must be reachable both from your machine during configuration and from the TFS app tier server. 
You also need to supply an account that is a local administrator on the target server. This is needed in order to automatically install a test agent later on the machine. On the next page, you can add tags to the machine. This is not needed in this scenario, so go to the next page. Here you will specify which test controller to use and that you want to run UI tests on this environment. This will result in a Test Agent being automatically installed and configured on the target server. The name of the machine where you installed the test controller should be available on the drop down list (TargetServer in this sample). If you can't see it, you might have selected a different TFS project collection. Press Next twice and then Verify to verify all the settings. Press Finish. This will now create and prepare the environment, which means that it will remote install a test agent on the machine. As part of this installation, the remote server will be restarted.

3-5. Create Test Plan, Run Test Case, Create Coded UI Test
I will not cover steps 3-5 here; there is plenty of information on how you create test plans and test cases and automate them using Coded UI Tests. In this example I have a test plan called My Application and it contains, among other things, a test suite called Automated Tests where I plan to put test cases that should be automated and executed as part of the BDT workflow. For more information about Coded UI Tests, see Verifying Code by Using Coded User Interface Tests. (A minimal sketch of a coded UI test method appears at the end of this post.)

6. Associate Coded UI Test with Test Case
OK, so now we want to automate our Coded UI Test and have it run as part of the BDT workflow. You might think that your coded UI test is already automated, but the meaning of the term here is that you link your coded UI test to an existing Test Case, thereby making the Test Case automated. And the test case should be part of the test suite that we will run during the BDT. Open the solution that contains the coded UI test method. Open the Test Case work item that you want to automate. Go to the Associated Automation tab and click on the "…" button. Select the coded UI test that corresponds to the test case. Press OK and then save the test case. For more information about associating an automated test case with a test case, see How to: Associate an Automated Test with a Test Case.

7. Create Build Definition using LabDefaultTemplate
Now we are ready to create a build definition that will implement the full BDT workflow. For this purpose we will use the LabDefaultTemplate.11.xaml that comes out of the box in TFS 2012. This build process template lets you take the output of another build and deploy it to each target machine. Since the deployment process will be running on the target server, you will have fewer problems with permissions and firewalls than if you were to remote deploy your solution. So, before creating a BDT workflow build definition, make sure that you have an existing build definition that produces a release build of your application. Go to the Builds hub in Team Explorer and select New Build Definition. Give the build definition a meaningful name; here I called it MyApplication.Deploy. Set the trigger to Manual. Define a workspace for the build definition. Note that a BDT build doesn't really need a workspace, since all it does is launch another build definition and deploy the output of that build. But TFS doesn't allow you to save a build definition without adding at least one mapping. On Build Defaults, select the build controller. 
Since this build actually won't produce any output, you can select the "This build does not copy output files to a drop folder" option. On the process tab, select the LabDefaultTemplate.11.xaml. This is usually located at $/TeamProject/BuildProcessTemplates/LabDefaultTemplate.11.xaml. To configure it, press the … button on the Lab Process Settings property. First, select the environment that you created before. Then select which build you want to deploy and test. The "Select an existing build" option is very useful when developing the BDT workflow, because you do not have to run through the target build every time; instead, it basically just runs through the deployment and test steps, which speeds up the process. Here I have selected to queue a new build of the MyApplication.Test build definition.

On the deploy tab, you need to specify how the application should be installed on the target server. You can supply a list of deployment scripts with arguments that will be executed on the target server. In this example I execute the generated web deploy command file to deploy the solution. If, for example, you have databases, you can use sqlpackage.exe to deploy the database. If you are producing MSI installers in your build, you can run them using msiexec.exe, and so on. A good practice is to create a batch file that contains the entire deployment and that you can run both locally and on the target server. Then you would just execute the deployment batch file here in one single step. The workflow defines some variables that are useful when running the deployments. These variables are:

$(BuildLocation) - The full path to where your build files are located
$(InternalComputerName_<VM Name>) - The computer name for a virtual machine in a SCVMM environment
$(ComputerName_<VM Name>) - The fully qualified domain name of the virtual machine

As you can see, I specify the path to the myapplication.deploy.cmd file using the $(BuildLocation) variable, which is the drop folder of the MyApplication.Test build. Note: the test agent account must have read permission in this drop location. You can find more information here on Building your Deployment Scripts.

On the last tab, we specify which tests to run after deployment. Here I select the test plan and the Automated Tests test suite that we saw before. Note that I also selected the automated test settings (called TargetServer in this case) that I have defined for my test plan. In here I define what data should be collected as part of the test run. For more information about test settings, see Specifying Test Settings for Microsoft Test Manager Tests.

We are done! Queue your BDT build and wait for it to finish. If the build succeeds, your build summary will show the deployment steps and the results of the test run.
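To round off the walkthrough, here is the minimal sketch of a coded UI test method referred to above - the kind of method you would associate with a test case in step 6 and run from the Automated Tests suite. The URL, control id and class names are hypothetical; assume a web application that the deployment scripts above have installed on the target server.

    using System;
    using Microsoft.VisualStudio.TestTools.UITesting;
    using Microsoft.VisualStudio.TestTools.UITesting.HtmlControls;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [CodedUITest]
    public class DeploymentSmokeTests
    {
        [TestMethod]
        public void HomePage_ShowsMainHeading()
        {
            // Launch the site that the BDT workflow just deployed (hypothetical URL).
            BrowserWindow browser = BrowserWindow.Launch(new Uri("http://targetserver/MyApplication"));

            // Locate a known element on the page by its id (hypothetical id).
            HtmlDiv heading = new HtmlDiv(browser);
            heading.SearchProperties[HtmlControl.PropertyNames.Id] = "main-heading";

            // Fail the test, and therefore flag the build, if the element is not rendered.
            Assert.IsTrue(heading.Exists, "The deployed site did not render the expected heading.");
        }
    }

Because the test case sits in the Automated Tests suite selected on the last tab of the lab workflow, a failure here shows up directly in the BDT build summary.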

    Read the article

  • CodePlex Daily Summary for Tuesday, November 05, 2013

    CodePlex Daily Summary for Tuesday, November 05, 2013Popular ReleasesSocial Network Importer for NodeXL: SocialNetImporter(v.1.9.1): This new version includes: - Include me option is back - Fixed the login bug reported latelyCaptcha MVC: Captcha MVC 2.5: v 2.5: Added support for MVC 5. The DefaultCaptchaManager is no longer throws an error if the captcha values was entered incorrectly. Minor changes. v 2.4.1: Fixed issues with deleting incorrect values of the captcha token in the SessionStorageProvider. This could lead to a situation when the captcha was not working with the SessionStorageProvider. Minor changes. v 2.4: Changed the IIntelligencePolicy interface, added ICaptchaManager as parameter for all methods. Improved font size ...DNN Blog: 06.00.01: 06.00.01 ReleaseThis is the first bugfix release of the new v6 blog module. These are the changes: Added some robustness in v5-v6 scripts to cater for some rare upgrade scenarios Changed the name of the module definition to avoid clash with Evoq Social Addition of sitemap providerStock Track: Version 1.2 Stable: Overhaul and re-think of the user interface in normal mode. Added stock history view in normal mode. Allows user to enter orders in normal mode. Allow advanced user to run database queries within the program. Improved sales statistics feature, able to calculate against a single category. Updated database script file. Now compatible with lower version of SQL Server.VG-Ripper & PG-Ripper: VG-Ripper 2.9.50: changes NEW: Added Support for "ImageHostHQ.com" links NEW: Added Support for "ImgMoney.net" links NEW: Added Support for "ImgSavy.com" links NEW: Added Support for "PixTreat.com" links Bug fixesVidCoder: 1.5.11 Beta: Added Encode Details window. Exposes elapsed time, ETA, current and average FPS, running file size, current pass and pass progress. Open it by going to Windows -> Encode Details while an encode is running. Subtitle dialog now disables the "Burn In" checkbox when it's either unavailable or it's the only option. It also disables the "Forced Only" when the subtitle type doesn't support the "Forced" flag. Updated HandBrake core to SVN 5872. Fixed crash in the preview window when a source fil...Wsus Package Publisher: Release v1.3.1311.02: Add three new Actions in Custom Updates : Work with Files (Copy, Delete, Rename), Work with Folders (Add, Delete, Rename) and Work with Registry Keys (Add, Delete, Rename). Fix a bug, where after resigning an update, the display is not refresh. Modify the way WPP sort rows in 'Updates Detail Viewer' and 'Computer List Viewer' so that dates are correctly sorted. Add a Tab in the settings form to set Proxy settings when WPP needs to go on Internet. Fix a bug where 'Manage Catalogs Subsc...uComponents: uComponents v6.0.0: This release of uComponents will compile against and support the new API in Umbraco v6.1.0. What's new in uComponents v6.0.0? New DataTypesImage Point XML DropDownList XPath Templatable List New features / Resolved issuesThe following workitems have been implemented and/or resolved: 14781 14805 14808 14818 14854 14827 14868 14859 14790 14853 14790 DataType Grid 14788 14810 14873 14833 14864 14855 / 14860 14816 14823 Drag & Drop support for rows Su...SharePoint 2013 Global Metadata Navigation: SP 2013 Metadata Navigation Sandbox Solution: SharePoint 2013 Global Metadata Navigation sandbox solution version 1.0. Upload to your site collection solution store and activate. 
See the Documentation tab for detailed instructions.SmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.1: New FeaturesAdded option Limit to current basket subtotal to HadSpentAmount discount rule Items in product lists can be labelled as NEW for a configurable period of time Product templates can optionally display a discount sign when discounts were applied Added the ability to set multiple favicons depending on stores and/or themes Plugin management: multiple plugins can now be (un)installed in one go Added a field for the HTML body id to store entity (Developer) New property 'Extra...CodeGen Code Generator: CodeGen 4.3.2: Changes in this release include: Removed old tag tokens from several example templates. Fixed a bug which was causing the default author and company names not to be picked up from the registry under .NET. Added several additional tag loop expressions: <IF FIRST_TAG>, <IF LAST_TAG>, <IF MULTIPLE_TAGS> and<IF SINGLE_TAG>. Upgraded to Synergy/DE 10.1.1b, Visual Studio 2013 and Installshield Limited Edition 2013.SQL Power Doc: Version 1.0.2.2 BETA 1: Fixes for issues introduced with PowerShell 4.0 with serialization/deserialization. Fixed an issue with the max length of an Excel cell being exceeded by AD groups with a large number of members.Community Forums NNTP bridge: Community Forums NNTP Bridge V54 (LiveConnect): This is the first release which can be used with the new LiveConnect authentication. Fixes the problem that the authentication will not work after 1 hour. Also a logfile will now be stored in "%AppData%\Community\CommunityForumsNNTPServer". If you have any problems please feel free to sent me the file "LogFile.txt".AutoAudit: AutoAudit 3.20f: Here is a high level list of the things I have changed between AutoAudit 2.00h and 3.20f ... Note: 3.20f corrects a minor bug found in 3.20e in the _RowHistory UDF when using column names with spaces. 1. AutoAudit has been tested on SQL Server 2005, 2008, 2008R2 and 2012. 2. Added the capability for AutoAudit to handle primary keys with up to 5 columns. 3. Added the capability for AutoAudit to save changes for a subset of the columns in a table. 4. Normalized the Audit table and created...Aricie - Friendlier Url Provider: Aricie - Friendlier Url Provider Version 2.5.3: This is mainly a maintenance release to stabilize the new Url Group paradigm. As usual, don't forget to install the Aricie - Shared extension first Highlights Fixed: UI bugs Min Requirements: .Net 3.5+ DotNetNuke 4.8.1+ Aricie - Shared 1.7.7+Aricie Shared: Aricie.Shared Version 1.7.7: This is mainly a maintenance version. Fixes in Property Editor: list import/export Min Requirements: DotNetNuke 4.8.1+ .Net 3.5+WPF Extended DataGrid: WPF Extended DataGrid 2.0.0.9 binaries: Fixed issue with ICollectionView containg null values (AutoFilter issue)Community TFS Build Extensions: October 2013: The October 2013 release contains Scripts - a new addition to our delivery. These are a growing library of PowerShell scripts to use with VS2013. See our documentation for more on scripting. 
VS2010 Activities(target .NET 4.0) VS2012 Activities (target .NET 4.5) VS2013 Activities (target .NET 4.5.1) Community TFS Build Manager VS2012 Community TFS Build Manager VS2013 The Community TFS Build Managers for VS2010, 2012 and 2013 can also be found in the Visual Studio Gallery where upda...SuperSocket, an extensible socket server framework: SuperSocket 1.6 stable: Changes included in this release: Process level isolation SuperSocket ServerManager (include server and client) Connect to client from server side initiatively Client certificate validation New configuration attributes "textEncoding", "defaultCulture", and "storeLocation" (certificate node) Many bug fixes http://docs.supersocket.net/v1-6/en-US/New-Features-and-Breaking-ChangesBarbaTunnel: BarbaTunnel 8.1: Check Version History for more information about this release.New ProjectsAForge.NET FFmpeg - LGPL Version: AForge.Video.FFMPEG recompiled under LGPLv3 which can be distributed with proprietary projects, without requiring you to release all of your proprietary source.Albatross Shell: Albatross Shell is a WPF based Shell program that can host other WPF application modules.Analysis Services ServerActivity Monitor (ASSAM) 2014: Analysis Services Server Activity Monitor (ASSAM 2014) is an upgraded version of the Analysis Services Activity Monitor 2012.AsuraCode: AsuraCode 0.1Chimera - Automated Testing Suite: Chimera is an automated testing suite, intended to simplify and automate system and tests deployment. Oriented around the MVC architecture, it has web based UI.Create Sitemap for your Website (Sitemap Writer): Sitemap Writer help create sitemap for your website. It recursively look for links on your each page and add up in list to write a site map. DataSync: ????HCS Analyzer: HCS Analyzerk? toán thu? dona: chuong trình k? toán thu? donaLanguage improver: startedOrchard Abstractions: An Orchard module for providing higher-level abstractions to Orchard services.Orchard Abstractions - Examples: Examples for Orchard Abstractions (https://orchardabstractions.codeplex.com/)Role Based Views in Microsoft Dynamics CRM 2011: This tool will allow you to configure visibility and setting the default view of a logged in user based on security roles.SF.DotNet.Utilities: DotNet.UtilitiesShot: A real time C# web application framework designed in the image of ASP.NET.Soccer Managment System: Soccer Management SystemSTAR FOX XNA: Remake of a classic videogame. In this case, that videogame will be Star Fox (1993) for SNES game console.The Dargon Project: DargonTweetFeed: A small widget that enables you to display a twitter search or profile on your Sitecore siteWPF Themes HighRobotics: WPF Themes by High Robotics company????????? ????????: ?????????, ??????????? ?????????? ???????? ????????: ??????????? ????? ????? ??????????, ?????????? ????????? ??????.

    Read the article

  • Announcing: Great Improvements to Windows Azure Web Sites

    - by ScottGu
I'm excited to announce some great improvements to the Windows Azure Web Sites capability we first introduced earlier this summer. Today's improvements include: a new low-cost shared mode scaling option, support for custom domains with shared and reserved mode web-sites using both CNAME and A-Records (the latter enabling naked domains), continuous deployment support using both CodePlex and GitHub, and FastCGI extensibility. All of these improvements are now live in production and available to start using immediately.

New "Shared" Scaling Tier
Windows Azure allows you to deploy and host up to 10 web-sites in a free, shared/multi-tenant hosting environment. You can start out developing and testing web sites at no cost using this free shared mode, and it supports the ability to run web sites that serve up to 165MB/day of content (5GB/month). All of the capabilities we introduced in June with this free tier remain the same with today's update. Starting with today's release, you can now elastically scale up your web-site beyond this capability using a new low-cost "shared" option (which we are introducing today) as well as using a "reserved instance" option (which we've supported since June). Scaling to either of these modes is easy. Simply click on the "scale" tab of your web-site within the Windows Azure Portal, choose the scaling option you want to use with it, and then click the "save" button. Changes take only seconds to apply and do not require any code to be changed, nor the app to be redeployed. Below are some more details on the new "shared" option, as well as the existing "reserved" option:

Shared Mode
With today's release we are introducing a new low-cost "shared" scaling mode for Windows Azure Web Sites. A web-site running in shared mode is deployed in a shared/multi-tenant hosting environment. Unlike the free tier, though, a web-site in shared mode has no quotas/upper-limit around the amount of bandwidth it can serve. The first 5 GB/month of bandwidth you serve with a shared web-site is free, and then you pay the standard "pay as you go" Windows Azure outbound bandwidth rate for outbound bandwidth above 5 GB. A web-site running in shared mode also now supports the ability to map multiple custom DNS domain names, using both CNAMEs and A-records, to it. The new A-record support we are introducing with today's release provides the ability for you to support "naked domains" with your web-sites (e.g. http://microsoft.com in addition to http://www.microsoft.com). We will also in the future enable SNI-based SSL as a built-in feature with shared mode web-sites (this functionality isn't supported with today's release - but will be coming later this year to both the shared and reserved tiers). You pay for a shared mode web-site using the standard "pay as you go" model that we support with other features of Windows Azure (meaning no up-front costs, and you pay only for the hours that the feature is enabled). A web-site running in shared mode costs only 1.3 cents/hr during the preview (so on average $9.36/month).

Reserved Instance Mode
In addition to running sites in shared mode, we also support scaling them to run within a reserved instance mode. When running in reserved instance mode your sites are guaranteed to run isolated within your own Small, Medium or Large VM (meaning no other customers run within it). You can run any number of web-sites within a VM, and there are no quotas on CPU or memory limits. 
You can run your sites using either a single reserved instance VM, or scale up to have multiple instances of them (e.g. 2 medium sized VMs, etc).  Scaling up or down is easy – just select the “reserved” instance VM within the “scale” tab of the Windows Azure Portal, choose the VM size you want, the number of instances of it you want to run, and then click save.  Changes take effect in seconds: Unlike shared mode, there is no per-site cost when running in reserved mode.  Instead you pay only for the reserved instance VMs you use – and you can run any number of web-sites you want within them at no extra cost (e.g. you could run a single site within a reserved instance VM or 100 web-sites within it for the same cost).  Reserved instance VMs start at 8 cents/hr for a small reserved VM.  Elastic Scale-up/down Windows Azure Web Sites allows you to scale-up or down your capacity within seconds.  This allows you to deploy a site using the shared mode option to begin with, and then dynamically scale up to the reserved mode option only when you need to – without you having to change any code or redeploy your application. If your site traffic starts to drop off, you can scale back down the number of reserved instances you are using, or scale down to the shared mode tier – all within seconds and without having to change code, redeploy, or adjust DNS mappings.  You can also use the “Dashboard” view within the Windows Azure Portal to easily monitor your site’s load in real-time (it shows not only requests/sec and bandwidth but also stats like CPU and memory usage). Because of Windows Azure’s “pay as you go” pricing model, you only pay for the compute capacity you use in a given hour.  So if your site is running most of the month in shared mode (at 1.3 cents/hr), but there is a weekend when it gets really popular and you decide to scale it up into reserved mode to have it run in your own dedicated VM (at 8 cents/hr), you only have to pay the additional pennies/hr for the hours it is running in the reserved mode.  There is no upfront cost you need to pay to enable this, and once you scale back down to shared mode you return to the 1.3 cents/hr rate.  This makes it super flexible and cost effective. Improved Custom Domain Support Web sites running in either “shared” or “reserved” mode support the ability to associate custom host names to them (e.g. www.mysitename.com).  You can associate multiple custom domains to each Windows Azure Web Site.  With today’s release we are introducing support for A-Records (a big ask by many users). With the A-Record support, you can now associate ‘naked’ domains to your Windows Azure Web Sites – meaning instead of having to use www.mysitename.com you can instead just have mysitename.com (with no sub-name prefix).  Because you can map multiple domains to a single site, you can optionally enable both a www and naked domain for a site (and then use a URL rewrite rule/redirect to avoid SEO problems). We’ve also enhanced the UI for managing custom domains within the Windows Azure Portal as part of today’s release.  Clicking the “Manage Domains” button in the tray at the bottom of the portal now brings up custom UI that makes it easy to manage/configure them: As part of this update we’ve also made it significantly smoother/easier to validate ownership of custom domains, and made it easier to switch existing sites/domains to Windows Azure Web Sites with no downtime. 
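To put the elastic scale-up/down pricing described above into concrete numbers, here is an illustrative month using the preview rates quoted in this post (the traffic pattern is hypothetical, not from the announcement): a site that runs in shared mode for 28 days (672 hours x 1.3 cents = about $8.74) and is scaled up to a single small reserved VM for one busy 48-hour weekend (48 hours x 8 cents = $3.84) would cost roughly $12.58 for the month, versus about $9.36 (720 hours x 1.3 cents) if it had stayed in shared mode the entire time.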
Continuous Deployment Support with Git and CodePlex or GitHub One of the more popular features we released earlier this summer was support for publishing web sites directly to Windows Azure using source control systems like TFS and Git.  This provides a really powerful way to manage your application deployments using source control.  It is really easy to enable this from a website's dashboard page: The TFS option we shipped earlier this summer provides a very rich continuous deployment solution that enables you to automate builds and run unit tests every time you check in your web-site, and then, if they are successful, automatically publish to Azure. With today's release we are expanding our Git support to also enable continuous deployment scenarios and integrate with projects hosted on CodePlex and GitHub.  This support is enabled with all web-sites (including those using the "free" scaling mode). Starting today, when you choose the "Set up Git publishing" link on a website's "Dashboard" page you'll see two additional options show up when Git based publishing is enabled for the web-site: You can click on either the "Deploy from my CodePlex project" link or "Deploy from my GitHub project" link to walk through a simple workflow to configure a connection between your website and a source repository you host on CodePlex or GitHub.  Once this connection is established, CodePlex or GitHub will automatically notify Windows Azure every time a checkin occurs.  This will then cause Windows Azure to pull the source and compile/deploy the new version of your app automatically.  The two videos below walk through how easy it is to enable this workflow, deploying an initial app and then making a change to it: Enabling Continuous Deployment with Windows Azure Websites and CodePlex (2 minutes) Enabling Continuous Deployment with Windows Azure Websites and GitHub (2 minutes) This approach enables a really clean continuous deployment workflow, and makes it much easier to support a team development environment using Git: Note: today's release supports establishing connections with public GitHub/CodePlex repositories.  Support for private repositories will be enabled in a few weeks. Support for multiple branches Previously, we only supported deploying from the git 'master' branch.  Often, though, developers want to deploy from alternate branches (e.g. a staging or future branch). This is now a supported scenario – both with standalone git based projects, as well as ones linked to CodePlex or GitHub.  This enables a variety of useful scenarios.  For example, you can now have two web-sites - a "live" and "staging" version – both linked to the same repository on CodePlex or GitHub.  You can configure one of the web-sites to always pull whatever is in the master branch, and the other to pull what is in the staging branch.  This provides a really clean way to do final testing of your site before it goes live. This 1-minute video demonstrates how to configure which branch to use with a web-site. Summary The above features are all now live in production and available to use immediately.  If you don't already have a Windows Azure account, you can sign up for a free trial and start using them today.  Visit the Windows Azure Developer Center to learn more about how to build apps with it. 
We’ll have even more new features and enhancements coming in the weeks ahead – including support for the recent Windows Server 2012 and .NET 4.5 releases (we will enable new web and worker role images with Windows Server 2012 and .NET 4.5 next month).  Keep an eye out on my blog for details as these new features become available. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
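As a small addendum to the multiple-branch support described above: the live/staging setup boils down to a handful of git commands. The C# sketch below simply shells out to git so the steps can be driven from a build script; the remote name origin and the branch names are illustrative assumptions, so substitute whatever your CodePlex or GitHub repository actually uses.

using System;
using System.Diagnostics;

// Creates and publishes a "staging" branch that a second web-site can be
// configured to deploy from, then shows the merge that promotes staging back
// into master for the "live" site. Only a sketch of the git workflow.
class StagingBranchSetup
{
    static void Main()
    {
        // Create a staging branch from master and publish it so the staging
        // site (configured to pull this branch) starts deploying from it.
        Run("git", "checkout -b staging master");
        Run("git", "push -u origin staging");

        // Promoting staging to live is a merge back into master, which the
        // live site is configured to deploy from.
        Run("git", "checkout master");
        Run("git", "merge staging");
        Run("git", "push origin master");
    }

    static void Run(string file, string args)
    {
        var psi = new ProcessStartInfo(file, args) { UseShellExecute = false };
        using (var p = Process.Start(psi))
        {
            p.WaitForExit();
            if (p.ExitCode != 0)
                throw new InvalidOperationException(file + " " + args + " failed with exit code " + p.ExitCode);
        }
    }
}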

    Read the article

  • CodePlex Daily Summary for Wednesday, December 19, 2012

    CodePlex Daily Summary for Wednesday, December 19, 2012Popular ReleasesToolbox for Dynamics CRM 2011: XrmToolBox (1.0.0.0): Initial Release Contains following tools: Attribute Bulk Updater Iconator Role Updater Scripts Finder SiteMap Editor Solution Import View Layout Replicator WebResources Manager Also include the sample tool described in the documentation "How to develop your own tool" More tools to come later...NB_Store - Free DotNetNuke Ecommerce Catalog Module: NB_Store v2.3.3 Rel0: DNN 5.3.2 and above, and DNN 6 Compatible - Some release notes are discussed here. Important : v2.3.3 requires the Back Office module to be placed onto a seperate page to the productlist. During the upgrade process, this module retains your existing configuration by not overwriting any existing settings and templates. If you wish to reset your configuration to gain all new settings and templates, please review this KB article. Please view the following documentation if you are installing ...F# PowerPack with F# Compiler Source Drops: PowerPack for FSharp 3.0 + .NET 4.x + VS2010: This is a release of the old FSharp Power Pack binaries recompiled for F# 3.0, .NET 4.0/4.5 and Silveright 5. NOTE: This is for F# 3.0 & .NET 4.0 or F# 3.0 & Silverlight 5 NOTE: The assemblies are no longer strong named NOTE: The assemblies are not added to the GAC NOTE: In some cases functionality overlaps with F# 3.0, e.g. SI Units of measurecodeSHOW: codeSHOW AppPackage Release 16: This release is a package file built out of Visual Studio that you can side load onto your machine if for some reason you don't have access to the Windows Store. To install it, just unzip and run the .ps1 file using PowerShell (right click | run with PowerShell). Attention: if you want to download the source code for codeSHOW, you're in the wrong place. You need to go to the SOURCE CODE tab.Move Mouse: Move Mouse 2.5.3: FIXED - Issue where it errors on load if the screen saver interval is over 333 minutes.LINUX????????: LINUX????????: LINUX????????cnbeta: cnbeta: cnbetaCSDN ??: CSDN??????: CSDN??????PowerShell Community Extensions: 2.1.1 Production: PowerShell Community Extensions 2.1.1 Release NotesDec 16, 2012 This version of PSCX supports both Windows PowerShell 2.0 and 3.0. Bug fix for HelpUri error with the Get-Help proxy command. See the ReleaseNotes.txt download above for more information.VidCoder: 1.4.11 Beta: Added Hungarian translation, thanks to Brechler Zsolt. Update HandBrake core to SVN 5098. This update should fix crashes on some files. Updated the enqueue split button to fit in better with the active Windows theme. Updated presets to use x264 preset/profile/level.???: Cnblogs: CNBLOGSSandcastle Help File Builder: SHFB v1.9.6.0 with Visual Studio Package: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This new release contains bug fixes and feature enhancements. There are some potential breaking changes in this release as some features of the Help File Builder have been moved into...Electricity, Gas and Temperature Monitoring with Netduino Plus: V1.0.1 Netduino Plus Monitoring: This is the first stable release from the Netduino Plus Monitoring program. 
Bugfixing The code is enhanced at some places in respect to the V0.6.1 version There is a possibility to add multiple S0 meters Website for realtime display of data Website for configuring the Netduino Comments are welcome! Additions will not be made to this version. This is the first and last major Netduino Plus V1 release. The new development will take place with the Netduino Plus V2 development board in mi...Windows Store Application Library: Windows Store Application Library - 1.1.1.0: Windows Store Application Library StoreAppLib can be installed from Visual Studio "Manage NuGet Packages" tool. This release of Windows Store Application Library contains following controls and utilities. DatePicker control TimePicker control PageHeaderTextBlock control with Global Navigation Menu Popup manager AppBarButton control TapEffect animation DateTimeConverter ConcatenationConverter CountConverterCRM 2011 Visual Ribbon Editor: Visual Ribbon Editor (1.3.1116.8): [FIX] Fixed issue not displaying CRM system button images correctly (incorrect path in file VisualRibbonEditor.exe.config)My Expenses Windows Store LOB App Demo: My Expenses Version 1: This is version 1 of the MyExpenses Windows 8 line of business demo app. The app is written in XAML and C#. It calls a WCF service that works with a SQL Server database. The app uses the Callisto toolkit. You can get it at https://github.com/timheuer/callisto. The Expenses.sql file contains the SQL to create the Expenses database. The ExpensesWCFService.zip file contains the WCF service, also written in C#. You should create a WCF service. Create an Entity Framework model and point it to...Manage upload files for SP 2010: Manage upload files (SP 2010) 1.0.0: Version 1.0.0: Manage upload file size on specific application page Simple in use Simple installation Work on: SharePoint Foundation 2010 Sharepoint Server 2010 (Standard, Enterprise) Localized: English RussianMCEBuddy Viewer: MCEBuddy Viewer 1.0.0 Beta: MCEBuddy Viewer for MCEBuddy 2.x Windows Media Center Add-On to view the current job status of MCEbuddy. Requirements: 1. Installed MCEBuddy 2.x (tested on release 2.3.7-2.3.10) http://mcebuddy2x.codeplex.com 2. Windows Media Center (tested on Windows 7 Pro 32-bit&64-bit, Windows 8 Pro with WMC 64-bit) Before installing the new version of Plug-in uninstall old version through windows control panelBlackJumboDog: Ver5.7.4: 2012.12.13 Ver5.7.4 (1)Web???????、???????????????????????????????????????????VFPX: ssClasses A1.0: My initial release. See https://vfpx.codeplex.com/wikipage?title=ssClasses&referringTitle=Home for a brief description of what is inside this releaseNew ProjectsAutonomic Middleware: Reflective middleware using workflow that makes the middleware autonomic.Battleforge: Assault: A card battle game. This project is being built using Silverlight 4 in C#.ChurajPatterns: Design patterns testing and overview console application.FimExplorer: Tool for executing XPath queries against FIM without going through it's web portal.Financial Management Studio: This is a WPF application for the financial management. Will use bunch of technologies such as: MVVM, Microsoft Extension Framework and Entities Framework.iJob: Do u no what is this?Developed with C#.net3.5 MSSQL2008Interaction Production Server: A software based networked lighting controller. 
It brokers many protocols together to allow easy setup and development of new stage lighting controllers.Interface weaver Custom Tool: A CustomTool for vs auto implementing interface based on aspects (auto INotifyPropertychanged data context implementation, auto command mapping...).IntroductionWeb: This is the second time for us to build something extraordinaryiSteam: iSteam is a product being developed focusing on the Steamworks platform, to advertise and centralize everything about steam in one application.iSun_PLS: iSun PLS is a Photo libary System, to help you manager your pictures on your server!LSharp: A Scripting Language Interpreter for .NETMassive Dangerzone: Don't try this at home.MinGuid: An alternative / replacement for those long Guid strings, simply call the function in the library which generates the Unique 16 character string.Nx Portable Framework: Framework aiming to speed up cross platform movile developement.Orchard Training Demo module: Demo Orchard module for training purposes. This module completes the training materials at http://english.orchardproject.hu/Training/Oriol ASAP: ASAPPanopticonize API: This project provides code samples for the panopticonize.com video thumbnailing API.Registry Editor Navigator: The small utility that takes a path and makes Regedit open to that path. Similar to RegJump but it has UI, uses clipboard and has a global keyboard hook. SafetyExam: ??????????????Simple Queuing System: This project is created to show you how queuing system can be implemented in MSMQ in C#. The sample is implemented from real life queuing system. TypeScript plugin enhancements for VS 2010 add-in: This add-in allows to - Compile .ts files by saveVisual Studio Help Downloader 2012: Tool for downloading the Visual Studio 2012 help packages for offline useWebGoat.NET test repository: THis is a clone of the WebGoat.NET project hosted at GitHub, provided here for those who need to be able to have a TFS-accessible copy of WebGoat.WebPythonHost: A .NET application hosting web python environment.Wpf Sqlite Management: about 1 month i publish this project and use avalon edit and avalon doc in my project WPFSchematics: The WPFSchematics is a vector graphic interactive system,and based on WPF graphics technology.XEasyFileSearch: easy and quick file searcherZweiBooks: Plugin for a Bukkit-Server.

    Read the article

  • Clusterware 11gR2 &ndash; Setting up an Active/Passive failover configuration

    - by Gilles Haro
Oracle provides a large range of interesting solutions to ensure High Availability of the database. Dataguard, RAC or even both configurations (as recommended by Oracle for a Maximum Available Architecture - MAA) are the most frequently found and used solutions. But, when it comes to protecting a system with an Active/Passive architecture with failover capabilities, people often think of other expensive third-party cluster systems. Oracle Clusterware technology, which comes at no extra cost with Oracle Database or Oracle Unbreakable Linux, is, in the minds of most people, often linked to Oracle RAC and is therefore seldom used to implement failover solutions. Oracle Clusterware 11gR2 (a part of Oracle 11gR2 Grid Infrastructure) provides a comprehensive framework to set up automatic failover configurations. It is actually possible to make "failover-able", and then to protect, almost any kind of application (from the simple xclock to the most complex Application Server). Quoting Oracle: "Oracle Clusterware is a portable cluster software that allows clustering of single servers so that they cooperate as a single system. Oracle Clusterware also provides the required infrastructure for Oracle Real Application Clusters (RAC). In addition Oracle Clusterware enables the protection of any Oracle application or any other kind of application within a cluster." In the next couple of lines, I will try to present the different steps to achieve this goal: have a fully operational 11gR2 database protected by automatic failover capabilities. I assume you are fluent in installing Oracle Database 11gR2 and Oracle Grid Infrastructure 11gR2 on a Linux system and that ASM is not a problem for you (as I am using it as a shared storage). If not, please have a look at the Oracle documentation. As often, I made my tests using an Oracle VirtualBox environment. The scripts are tested and functional on my system. Unfortunately, there can always be a typo or a mistake. This blog entry does not replace a course around the Clusterware Framework. I just hope it will let you see how powerful it is and that it will give you the will to go further with it...  Note: This entry has been revised (rev.2) following comments from Philip Newlan.
Prerequisites
2 Linux boxes (OELCluster01 and OELCluster02) at the same OS level. I used OEL 5 Update 5 with an Enterprise Kernel.
Shared Storage (SAN). On my VirtualBox system, I used Openfiler to simulate the SAN.
Oracle 11gR2 Database (11.2.0.1)
Oracle 11gR2 Grid Infrastructure (11.2.0.1)
Step 1 - Install the software
Using asmlib, create 3 ASM disks (ASM_CRS, ASM_DTA and ASM_FRA).
Install Grid Infrastructure for a cluster (OELCluster01 and OELCluster02 are the 2 nodes of the cluster). Use ASM_CRS to store the Voting Disk and OCR. Use SCAN.
Install the Oracle Database Standalone binaries on both nodes.
Use asmca to check/mount the disk groups on the 2 nodes.
Use dbca to create and configure a database on the primary node. Let's name it DB11G.
Copy the pfile and password file to the second node. Create the adump directory on the second node.
Step 2 - Setup the resource to be protected
After its creation with dbca, the database is automatically protected by the Oracle Restart technology available with Grid Infrastructure. Consequently, it restarts automatically (if possible) after a crash (ex: kill -9 smon). A database resource has been created for that in the Cluster Registry. We can observe this with the command crsctl status resource, which shows an ora.db11g.db entry. 
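If you prefer to script your checks, the following C# sketch is one way to confirm that the Oracle Restart resource is registered before going further. It simply shells out to the crsctl command shown above and assumes crsctl is on the PATH of the user running it:

using System;
using System.Diagnostics;

// Runs "crsctl status resource ora.db11g.db" and looks for the resource name
// in the output. Purely a convenience wrapper around the command above.
class CheckRestartResource
{
    static void Main()
    {
        var psi = new ProcessStartInfo("crsctl", "status resource ora.db11g.db")
        {
            RedirectStandardOutput = true,
            UseShellExecute = false
        };

        using (var crsctl = Process.Start(psi))
        {
            string output = crsctl.StandardOutput.ReadToEnd();
            crsctl.WaitForExit();

            Console.WriteLine(output.Contains("NAME=ora.db11g.db")
                ? "ora.db11g.db is registered with Oracle Restart."
                : "ora.db11g.db was not found - check the dbca step.");
        }
    }
}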
Let's save the definition of this resource, for future use:
mkdir -p /crs/11.2.0/HA_scripts
chown oracle:oinstall /crs/11.2.0/HA_scripts
crsctl status resource ora.db11g.db -p > /crs/11.2.0/HA_scripts/myResource.txt
Although very interesting, Oracle Restart is not cluster aware and cannot restart the database on any other node of the cluster. So, let's remove it from the OCR definitions, we don't need it!
srvctl stop database -d DB11G
srvctl remove database -d DB11G
Instead of it, we need to create a new resource of a more general type: cluster_resource. Here are the steps to achieve this: Create an action script: /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh
#!/bin/bash
export ORACLE_HOME=/oracle/product/11.2.0/dbhome_1
export ORACLE_SID=DB11G
case $1 in
'start')
  $ORACLE_HOME/bin/sqlplus /nolog <<EOF
  connect / as sysdba
  startup
EOF
  RET=0
  ;;
'stop')
  $ORACLE_HOME/bin/sqlplus /nolog <<EOF
  connect / as sysdba
  shutdown immediate
EOF
  RET=0
  ;;
'clean')
  $ORACLE_HOME/bin/sqlplus /nolog <<EOF
  connect / as sysdba
  shutdown abort
  ##for i in `ps -ef | grep -i $ORACLE_SID | awk '{print $2}' ` ;do kill -9 $i; done
EOF
  RET=0
  ;;
'check')
  ok=`ps -ef | grep smon | grep $ORACLE_SID | wc -l`
  if [ $ok = 0 ]; then
    RET=1
  else
    RET=0
  fi
  ;;
*)
  RET=0
  ;;
esac
if [ $RET -eq 0 ]; then
  exit 0
else
  exit 1
fi
This script must provide, at least, methods to start, stop, clean and check the database. It is self-explaining and contains nothing special. Just be aware that it must be runnable (+x), it runs as the oracle user (because of the ACL property - see later) and needs to know about the environment. Also make sure it exists on every node of the cluster. Moreover, as of 11.2, the clean method is mandatory. It must provide the "last gasp clean up", for example, a shutdown abort or a kill -9 of all the remaining processes.
chmod +x /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh
scp /crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh oracle@OELCluster02:/crs/11.2.0/HA_scripts
Create a new resource file, based on the information we got from the previous myResource.txt. Name it myNewResource.txt. myResource.txt is shown below. As we can see, it defines an ora.database.type resource, named ora.db11g.db. A lot of properties are related to this type of resource and do not need to be used for a cluster_resource. 
NAME=ora.db11g.db
TYPE=ora.database.type
ACL=owner:oracle:rwx,pgrp:oinstall:rwx,other::r--
ACTION_FAILURE_TEMPLATE=
ACTION_SCRIPT=
ACTIVE_PLACEMENT=1
AGENT_FILENAME=%CRS_HOME%/bin/oraagent%CRS_EXE_SUFFIX%
AUTO_START=restore
CARDINALITY=1
CHECK_INTERVAL=1
CHECK_TIMEOUT=600
CLUSTER_DATABASE=false
DB_UNIQUE_NAME=DB11G
DEFAULT_TEMPLATE=PROPERTY(RESOURCE_CLASS=database) PROPERTY(DB_UNIQUE_NAME= CONCAT(PARSE(%NAME%, ., 2), %USR_ORA_DOMAIN%, .)) ELEMENT(INSTANCE_NAME= %GEN_USR_ORA_INST_NAME%)
DEGREE=1
DESCRIPTION=Oracle Database resource
ENABLED=1
FAILOVER_DELAY=0
FAILURE_INTERVAL=60
FAILURE_THRESHOLD=1
GEN_AUDIT_FILE_DEST=/oracle/admin/DB11G/adump
GEN_USR_ORA_INST_NAME=
GEN_USR_ORA_INST_NAME@SERVERNAME(oelcluster01)=DB11G
HOSTING_MEMBERS=
INSTANCE_FAILOVER=0
LOAD=1
LOGGING_LEVEL=1
MANAGEMENT_POLICY=AUTOMATIC
NLS_LANG=
NOT_RESTARTING_TEMPLATE=
OFFLINE_CHECK_INTERVAL=0
ORACLE_HOME=/oracle/product/11.2.0/dbhome_1
PLACEMENT=restricted
PROFILE_CHANGE_TEMPLATE=
RESTART_ATTEMPTS=2
ROLE=PRIMARY
SCRIPT_TIMEOUT=60
SERVER_POOLS=ora.DB11G
SPFILE=+DTA/DB11G/spfileDB11G.ora
START_DEPENDENCIES=hard(ora.DTA.dg,ora.FRA.dg) weak(type:ora.listener.type,uniform:ora.ons,uniform:ora.eons) pullup(ora.DTA.dg,ora.FRA.dg)
START_TIMEOUT=600
STATE_CHANGE_TEMPLATE=
STOP_DEPENDENCIES=hard(intermediate:ora.asm,shutdown:ora.DTA.dg,shutdown:ora.FRA.dg)
STOP_TIMEOUT=600
UPTIME_THRESHOLD=1h
USR_ORA_DB_NAME=DB11G
USR_ORA_DOMAIN=haroland
USR_ORA_ENV=
USR_ORA_FLAGS=
USR_ORA_INST_NAME=DB11G
USR_ORA_OPEN_MODE=open
USR_ORA_OPI=false
USR_ORA_STOP_MODE=immediate
VERSION=11.2.0.1.0
I removed the database-type-related entries from myResource.txt and modified some others to produce the following myNewResource.txt.
Notice the NAME property, which should not have the ora. prefix.
Notice the TYPE property, which is not ora.database.type but cluster_resource.
Notice the definition of ACTION_SCRIPT.
Notice the HOSTING_MEMBERS property, which enumerates the members of the cluster (as returned by the olsnodes command).
NAME=DB11G.db
TYPE=cluster_resource
DESCRIPTION=Oracle Database resource
ACL=owner:oracle:rwx,pgrp:oinstall:rwx,other::r--
ACTION_SCRIPT=/crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh
PLACEMENT=restricted
ACTIVE_PLACEMENT=0
AUTO_START=restore
CARDINALITY=1
CHECK_INTERVAL=10
DEGREE=1
ENABLED=1
HOSTING_MEMBERS=oelcluster01 oelcluster02
LOGGING_LEVEL=1
RESTART_ATTEMPTS=1
START_DEPENDENCIES=hard(ora.DTA.dg,ora.FRA.dg) weak(type:ora.listener.type,uniform:ora.ons,uniform:ora.eons) pullup(ora.DTA.dg,ora.FRA.dg)
START_TIMEOUT=600
STOP_DEPENDENCIES=hard(intermediate:ora.asm,shutdown:ora.DTA.dg,shutdown:ora.FRA.dg)
STOP_TIMEOUT=600
UPTIME_THRESHOLD=1h
Register the resource. Take care of the resource type. It needs to be a cluster_resource and not an ora.database.type resource (Oracle recommendation).
crsctl add resource DB11G.db -type cluster_resource -file /crs/11.2.0/HA_scripts/myNewResource.txt
Step 3 - Start the resource
crsctl start resource DB11G.db
This command launches the ACTION_SCRIPT with a start and a check parameter on the primary node of the cluster.
Step 4 - Test this
We will test the setup using 2 methods.
crsctl relocate resource DB11G.db
This command calls the ACTION_SCRIPT (on the two nodes) to stop the database on the active node and start it on the other node. Once done, we can revert back to the original node, but, this time, we can use a more "MS$ like" method: turn off the server on which the database is running. After a short delay, you should observe that the database is relocated on node 1. 
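Looking back at the resource-file step, the edit from myResource.txt to myNewResource.txt is mechanical enough to script. The C# sketch below is one possible way to do it, under the assumption that the attribute file is a simple KEY=VALUE listing as saved by crsctl above; the kept keys and overridden values mirror the two listings, so adjust them for your own environment:

using System.Collections.Generic;
using System.IO;
using System.Linq;

// Builds myNewResource.txt from myResource.txt: keep only the attributes that
// make sense for a cluster_resource, then override the handful of values the
// article changes. Paths and values match the listings above.
class BuildClusterResourceFile
{
    static readonly HashSet<string> KeysToKeep = new HashSet<string>
    {
        "DESCRIPTION", "ACL", "PLACEMENT", "AUTO_START", "CARDINALITY", "DEGREE",
        "ENABLED", "LOGGING_LEVEL", "START_DEPENDENCIES", "START_TIMEOUT",
        "STOP_DEPENDENCIES", "STOP_TIMEOUT", "UPTIME_THRESHOLD"
    };

    static readonly Dictionary<string, string> Overrides = new Dictionary<string, string>
    {
        { "NAME", "DB11G.db" },
        { "TYPE", "cluster_resource" },
        { "ACTION_SCRIPT", "/crs/11.2.0/HA_scripts/my_ActivePassive_Cluster.sh" },
        { "ACTIVE_PLACEMENT", "0" },
        { "CHECK_INTERVAL", "10" },
        { "RESTART_ATTEMPTS", "1" },
        { "HOSTING_MEMBERS", "oelcluster01 oelcluster02" } // as returned by olsnodes
    };

    static void Main()
    {
        // Parse KEY=VALUE lines, keeping only the attributes we still want.
        var kept = File.ReadLines("/crs/11.2.0/HA_scripts/myResource.txt")
                       .Select(line => line.Split(new[] { '=' }, 2))
                       .Where(parts => parts.Length == 2 && KeysToKeep.Contains(parts[0]))
                       .ToDictionary(parts => parts[0], parts => parts[1]);

        // Apply the cluster_resource-specific values.
        foreach (var entry in Overrides)
            kept[entry.Key] = entry.Value;

        File.WriteAllLines("/crs/11.2.0/HA_scripts/myNewResource.txt",
                           kept.Select(kv => kv.Key + "=" + kv.Value));
    }
}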
Conclusion Once the software installed and the standalone database created (which is a rather common and usual task), the steps to reach the objective are quite easy : Create an executable action script on every node of the cluster. Create a resource file. Create/Register the resource with OCR using the resource file. Start the resource. This solution is a very interesting alternative to licensable third party solutions. References Clusterware 11gR2 documentation Oracle Clusterware Resource Reference Clusterware for Unbreakable Linux Using Oracle Clusterware to Protect A Single Instance Oracle Database 11gR1 (to have an idea of complexity) Oracle Clusterware on OTN   Gilles Haro Technical Expert - Core Technology, Oracle Consulting   
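To recap the workflow in code form, the following C# sketch drives the same crsctl commands used in this article (add resource, start resource, relocate resource). It is only a sketch: run it as the Grid Infrastructure owner with crsctl on the PATH, and expect the relocate calls to take as long as a database stop/start.

using System;
using System.Diagnostics;

// Registers the cluster_resource, starts it, then exercises the failover test
// by relocating the database to the other node and back.
class ActivePassiveSetup
{
    static void Main()
    {
        Crsctl("add resource DB11G.db -type cluster_resource -file /crs/11.2.0/HA_scripts/myNewResource.txt");
        Crsctl("start resource DB11G.db");

        // Failover test: move the database to the other node, then move it back.
        Crsctl("relocate resource DB11G.db");
        Crsctl("relocate resource DB11G.db");
    }

    static void Crsctl(string arguments)
    {
        var psi = new ProcessStartInfo("crsctl", arguments) { UseShellExecute = false };
        using (var process = Process.Start(psi))
        {
            process.WaitForExit();
            if (process.ExitCode != 0)
                throw new InvalidOperationException("crsctl " + arguments + " failed (exit code " + process.ExitCode + ")");
        }
    }
}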

    Read the article

  • CodePlex Daily Summary for Monday, December 03, 2012

    CodePlex Daily Summary for Monday, December 03, 2012Popular Releasesmenu4web: menu4web 1.1 - free javascript menu: menu4web 1.1 has been tested with all major browsers: Firefox, Chrome, IE, Opera and Safari. Minified m4w.js library is less than 9K. Includes 22 menu examples of different styles. Can be freely distributed under The MIT License (MIT).Quest: Quest 5.3 Beta: New features in Quest 5.3 include: Grid-based map (sponsored by Phillip Zolla) Changable POV (sponsored by Phillip Zolla) Game log (sponsored by Phillip Zolla) Customisable object link colour (sponsored by Phillip Zolla) More room description options (by James Gregory) More mathematical functions now available to expressions Desktop Player uses the same UI as WebPlayer - this will make it much easier to implement customisation options New sorting functions: ObjectListSort(list,...Mi-DevEnv: Development 0.1: First Drop This is the first drop of files now placed under source control. Today the system ran end to end, creating a virtual machine and installing multiple products without a single prompt or key press being required. This is a snapshot of the first release. All files are under source control. Assumes Hyper-V under Server 2012 or Windows 8, using Windows Management Framework with PowerShell 3.Chinook Database: Chinook Database 1.4: Chinook Database 1.4 This is a sample database available in multiple formats: SQL scripts for multiple database vendors, embeded database files, and XML format. The Chinook data model is available here. ChinookDatabase1.4_CompleteVersion.zip is a complete package for all supported databases/data sources. There are also packages for each specific data source. Supported Database ServersDB2 EffiProz MySQL Oracle PostgreSQL SQL Server SQL Server Compact SQLite Issues Resolved293...RiP-Ripper & PG-Ripper: RiP-Ripper 2.9.34: changes FIXED: Thanks Function when "Download each post in it's own folder" is disabled FIXED: "PixHub.eu" linksCleverBobCat: CleverBobCat 1.1.2: Changes: - Control loss of cart when decoupled fixed - Some problems with energy transfer usage if disabled system fixedD3 Loot Tracker: 1.5.6: Updated to work with D3 version 1.0.6.13300DirectQ: DirectQ II 2012-11-29: A (slightly) modernized port of Quake II to D3D9. You need SM3 or better hardware to run this - if you don't have it, then don't even bother. It should work on Windows Vista, 7 or 8; it may also work on XP but I haven't tested. Known bugs include: Some mods may not work. This is unfortunately due to the nature of Quake II's game DLLs; sometimes a recompile of the game DLL is all that's needed. In any event, ensure that the game DLL is compatible with the last release of Quake II first (...Magelia WebStore Open-source Ecommerce software: Magelia WebStore 2.2: new UI for the Administration console Bugs fixes and improvement version 2.2.215.3JayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2.5: What's new in JayData 1.2.5For detailed release notes check the release notes. Handlebars template engine supportImplement data manager applications with JayData using Handlebars.js for templating. Include JayDataModules/handlebars.js and begin typing the mustaches :) Blogpost: Handlebars templates in JayData Handlebars helpers and model driven commanding in JayData Easy JayStorm cloud data managementManage cloud data using the same syntax and data management concept just like any other data ...nopCommerce. 
Open source shopping cart (ASP.NET MVC): nopcommerce 2.70: Highlight features & improvements: • Performance optimization. • Search engine optimization. ID-less URLs for products, categories, and manufacturers. • Added ACL support (access control list) on products and categories. • Minify and bundle JavaScript files. • Allow a store owner to decide which billing/shipping address fields are enabled/disabled/required (like it's already done for the registration page). • Moved to MVC 4 (.NET 4.5 is required). • Now Visual Studio 2012 is required to work ...SQL Server Partition Management: Partition Management Release 3.0: Release 3.0 adds support for SQL Server 2012 and is backward compatible with SQL Server 2008 and 2005. The release consists of: • A Readme file • The Executable • The source code (Visual Studio project) Enhancements include: -- Support for Columnstore indexes in SQL Server 2012 -- Ability to create TSQL scripts for staging table and index creation operations -- Full support for global date and time formats, locale independent -- Support for binary partitioning column types -- Fixes to is...NHook - A debugger API: NHook 1.0: x86 debugger Resolve symbol from MS Public server Resolve RVA from executable's image Add breakpoints Assemble / Disassemble target process assembly More information here, you can also check unit tests that are real sample code.PDF Library: PDFLib v2.0: Release notes This new version include many bug fixes and include support for stream objects and cross-reference object streams. New FeatureExtract images from the PDFDocument.Editor: 2013.5: Whats new for Document.Editor 2013.5: New Read-only File support New Check For Updates support Minor Bug Fix's, improvements and speed upsMCEBuddy 2.x: MCEBuddy 2.3.10: Critical Update to 2.3.9: Changelog for 2.3.10 (32bit and 64bit) 1. AsfBin executable missing from build 2. Removed extra references from build to avoid conflict 3. Showanalyzer installation now checked on remote engine machine Changelog for 2.3.9 (32bit and 64bit) 1. Added support for WTV output profile 2. Added support for minimizing MCEBuddy to the system tray 3. Added support for custom archive folder 4. Added support to disable subdirectory monitoring 5. Added support for better TS fil...DotNetNuke® Community Edition CMS: 07.00.00: Major Highlights Fixed issue that caused profiles of deleted users to be available Removed the postback after checkboxes are selected in Page Settings > Taxonomy Implemented the functionality required to edit security role names and social group names Fixed JavaScript error when using a ";" semicolon as a profile property Fixed issue when using DateTime properties in profiles Fixed viewstate error when using Facebook authentication in conjunction with "require valid profile fo...CODE Framework: 4.0.21128.0: See change notes in the documentation section for details on what's new.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.76: Fixed a typo in ObjectLiteralProperty.IsConstant that caused all object literals to be treated like they were constants, and possibly moved around in the code when they shouldn't be.Kooboo CMS: Kooboo CMS 3.3.0: New features: Dropdown/Radio/Checkbox Lists no longer references the userkey. Instead they refer to the UUID field for input value. You can now delete, export, import content from database in the site settings. Labels can now be imported and exported. You can now set the required password strength and maximum number of incorrect login attempts. 
Child sites can inherit plugins from its parent sites. The view parameter can be changed through the page_context.current value. Addition of c...New ProjectsASP.NET Youtube Clone: ASP.NET Youtube Clone is a complete script with basic and advance features which allow you to build complex social media sharing website in asp.net, c#, vb.net.Assembly - Halo Research Tool: Assembly is a program designed to aid in the development of creative modifications for the Xbox 360 Halo games. It also includes a .NET library for programmers.Async ContentPlaceHolder: Load your ASP.Net content placeholder asynchronously.Automate Microsoft System Center Configuration Manager 2012 with Powershell: Scripts to automate Microsoft System Center 2012 Configuration Manager with PowershellAzMan Contrib: AzMan Contrib aims to provide a better experience when using AzMan with ASP.NET WebForms and ASP.NET MVC.Badhshala Jackpot: Facebook application that features Slotmachine with 3 slots where each slot's position is predicted randomly. Tools Used: ASP.Net MVC 4, SQL Server, csharpsdk BREIN Messaging Infrastructure: This project allows for hiding & encapsulating an (WS based) infrastructure by providing the means for dynamic message routing. The gateway thereby enhances the messaging channels to enforce any amount of policies upon in- and outcoming messages. CricketWorldCup2011Trivia: Simple trivia game written in C# based on the 2011 Cricket World Cup.Flee#: A C# Port of the Flee evaluator. It's an expression evaluator for C# that compiles the expressions into IL code, resulting in very fast and efficient execution.Hamcast for multi station coordination: Amateur Radio multiple station operation tends to have loggers and operators striving to get particular information from each other, like what IP address and so forth, so I write this small multicast utility to help them. Supports N1MM and other popular loggers.LDAP/AD Claim Provider For SharePoint 2010: This claim provider implements search on LDAP and AD for SAML authentication (claims mode) in SharePoint 2010MicroData Parser: This library is preliminary implementation of the HTML5 MicroData specification, in C#.PCC.Framework: NET???????????ResumeSharp: ResumeSharp is a resume building tool designed to help keep your resume up-to-date easily. Additionally, you can quickly generate targeted resumes on the fly. It's developed in C#.Sharepoint SPConstant generator: This utility creates a hierarchally representation of a WSS 3.0 / MOSS 2007 Site Collection and generates a C# Source Code File (SPConstant.cs) with a nested structure of structs with static const string fields. This enables you to do the following: SPList list = web.Lists[SPConstant.Lists.Tasklist.Name]; You will then just have to regenerate the SPConstant file (eg. from within VS 2005 or from Command line) to update the name. Description is added to the XML-comments in the generated file ...SoftServe Tasks: a couple of tasks from the 'softserve' courses.Solid Edge Community Extensions: Solid Edge SDKStar Fox XNA Edition: Development of videogames for Microsoft platforms using XNA Game Studio. 
Remake of the classic videogame Star Fox (1993) for SNES game console.TinySimpleCMS: a tiny and simple cmsUgly Animals: Crossing of Angry Birds and Yeti SportsVIENNA Advantage ERP and CRM: A real cloud based ERP and CRM in C#.Net offering enterprise level functionality for PC, Mac, iPhone, Surface and Android with HTML5 and Silverlight UI.WebSitemap Localizer: WebSitemap Localizer is a utility to auto-convert your Web.sitemap file to support localization.

    Read the article

  • CodePlex Daily Summary for Wednesday, November 07, 2012

    CodePlex Daily Summary for Wednesday, November 07, 2012Popular ReleasesMetodología General Ajustada - MGA: 03.04.03: Cambios Parmenio: Ajustes al formato F02 de programación para que la sincronización de las grillas no afecte el guardado de los datos. Cambios John: Integración de código con cambios enviados por Parmenio. Generación de instaladores. Soporte técnico por correo electrónico, telefónico y en sitio.nAPI for Windows Phone: Naver Open API Library Assemblies and Source Codes: nAPI (Naver Open API Library) for Windows Family Tested on - Windows 8 (Windows Store App) - Windows Phone 7 - Windows Phone 8 (Emulator)X-tee.NET: Xtee.NET 1.0: Generaator ja teegidFiskalizacija za developere: FiskalizacijaDev 1.2: Verzija 1.2. je, prije svega, odgovor na novu verziju Tehnicke specifikacije (v1.1.) koja je objavljena prije nekoliko dana. Pored novosti vezanih uz (sitne) izmjene u spomenutoj novoj verziji Tehnicke dokumentacije, projekt smo prošili sa nekim dodatnim feature-ima od kojih je vecina proizašla iz vaših prijedloga - hvala :) Novosti u v1.2. su: - Neusuglašenost zahtjeva (http://fiskalizacija.codeplex.com/workitem/645) - Sample projekt - iznosi se množe sa 100 (http://fiskalizacija.codeplex.c...PowerComboBox: PowerComboBox VB v1.0: Visual Basic source code class file.Edi: Themable Edi: Completed ExpressionDark theme Improved Error Handling and Reporting feature Refactored all views to be look-less controlsMFCMAPI: October 2012 Release: Build: 15.0.0.1036 Full release notes at SGriffin's blog. If you just want to run the MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64 bit builds will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit builds, regardless of the operating system. Facebook BadgeJayData - The cross-platform HTML5 data-management library for JavaScript: JayData 1.2.3: JayData is a unified data access library for JavaScript to CRUD + Query data from different sources like OData, MongoDB, WebSQL, SqLite, HTML5 localStorage, Facebook or YQL. The library can be integrated with Knockout.js or Sencha Touch 2 and can be used on Node.js as well. See it in action in this 6 minutes video Sencha Touch 2 example app using JayData: Netflix browser. What's new in JayData 1.2.3 For detailed release notes check the release notes. TypeScript supportWrite your code in a ...SSIS Expression Editor & Tester: Expression Editor and Tester v1.0.8.0: Getting Started Download and extract the files, no install required. The ExpressionEditor.zip download contains a folder for each SQL Server version. ExpressionEditor2005 ExpressionEditor2008 ExpressionEditor2012 Changes Fixed issues 32868 and 33291 raised by BIDS Helper users. No functional changes from previous release. Versions There are three versions included, all built from the same code with the same functionality, but each targeting a different release of SQL Server. The downlo...MCEBuddy 2.x: MCEBuddy 2.3.7: Changelog for 2.3.7 (32bit and 64bit) 1. Improved performance of MP4 Fast and M4V Fast Profiles (no deinterlacing, removed --decomb) 2. Improved priority handling 3. Added support for Pausing and Resume conversions 4. Added support for fallback to source directory if network destination directory is unavailable 5. MCEBuddy now installs ShowAnalyzer during installation 6. 
Added support for long description atom in iTunesFoxyXLS: FoxyXLS Releases: Source code and samplesDyanamic Reports (RDLC) - SharePoint 2010 Visual WebPart: Initial Release: This is a Initial Release.HTML Renderer: HTML Renderer 1.0.0.0 (3): Major performance improvement (http://theartofdev.wordpress.com/2012/10/25/how-i-optimized-html-renderer-and-fell-in-love-with-vs-profiler/) Minor fixes raised in issue tracker and discussions.ProDinner - ASP.NET MVC Sample (EF4.4, N-Tier, jQuery): 8: update to ASP.net MVC Awesome 3.0 udpate to EntityFramework 4.4 update to MVC 4 added dinners grid on homepageASP.net MVC Awesome - jQuery Ajax Helpers: 3.0: added Grid helper added XML Documentation added textbox helper added Client Side API for AjaxList removed .SearchButton from AjaxList AjaxForm and Confirm helpers have been merged into the Form helper optimized html output for AjaxDropdown, AjaxList, Autocomplete works on MVC 3 and 4BlogEngine.NET: BlogEngine.NET 2.7: Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take a look at the Upgrading to BlogEngine.NET 2.7 instructions. If you looking for Web Application Project, ...Launchbar: Launchbar 4.2.2.0: This release is the first step in cleaning up the code and using all the latest features of .NET 4.5 Changes 4.2.2 (2012-11-02) Improved handling of left clicks 4.1.0 (2012-10-17) Removed tray icon Assembly renamed and signed with strong name Note When you upgrade, Launchbar will start with the default settings. You can import your previous settings by following these steps: Run Launchbar and just save the settings without configuring anything Shutdown Launchbar Go to the folder %LOCA...Mouse Jiggler: MouseJiggle-1.3: This adds the much-requested minimize-to-tray feature to Mouse Jiggler.Umbraco CMS: Umbraco 4.10.0 Release Candidate: This is a Release Candidate, which means that if we do not find any major issues in the next week, we will release this version as the final release of 4.10.0 on November 9th, 2012. The documentation for the MVC bits still lives in the Github version of the docs for now and will be updated on our.umbraco.org with the final release of 4.10.0. Browse the documentation here: https://github.com/umbraco/Umbraco4Docs/tree/4.8.0/Documentation/Reference/Mvc If you want to do only MVC then make sur...Skype Auto Recorder: SkypeAutoRecorder 1.3.4: New icon and images. Reworked settings window. Implemented high-quality sound encoding. Implemented a possibility to produce stereo records. Added buttons with system-wide hot keys for manual starting and canceling of recording. Added buttons for opening folder with records. Added Help button. Fixed an issue when recording is continuing after call end. Fixed an issue when recording doesn't start. Fixed several bugs and improved stability. Major refactoring and optimization...New Projects"On the Fly Zip and Attach" Windows Live Writer Plugin: This is a windows live writer zipping plug-in that allows you to select files/folders and zip them on the fly that will appear as attachment inserts, while you are writing blogs. 
Find details @ my Blogs Site: [url: Blogs: http://www.geekscafe.net|http://www.geekscafe.net].NET Micro Framework Tools: Collection of tools usefull for creating Netduino and .NET Micro Framework applications.Aktina: Aktina is a game engine written in DirectX 11.AppHub: AppHub??????AppStore???,???????,?????????,?????????,????????????,??,???????,???????????、??、??、??????????????,???????。 ??????????,????。Calcula Calles: Calcula CallesCRM 2011 ASP.NET Membership Provider: The CRM 2011 ASP.NET Membership provider Contains a Membership and Role provider, which can be used in ASP.NET Applications and/or ASP.NET based CMS systems.CSharp Executor: Dynamically execute C# script files in the same way that you might use VB scripts.Deque (by Stephen Cleary): A simple double-ended queue (deque) in C#. Unit tested.DirST: Allows one to replicate a DIRectory STructure without copying files. Written in C#.DNNTaskManager69: Testing DNN Module DevelopmentEchelon OS: an Sister Project to Aza DOS and this one is meant to be as minimilistic as it can with a filesystem and text editorGeek Reader: Lector de noticias para windows 8. Se trata de un template que permite rápidamente construir una aplicación windows 8 store GIII_P2: projekt 2katas-gpa: Codigo de los coding dojo acerca de patrones de diseñoKendoUI: Demonstrations using Kendo UI Complete for ASP.NET MVCLTorrent: C# BitTorrent and peer-to-peer protocol implementation MECopter: A real summary will follow soon...Migrate AD Group Permissions in Sharepoint/WSS: Sharepoint AD Group Migration tool, allows conversion of AD security principals used in Sharepoint ACLs from source domain to destination domainNetSend Manager: NetSend Manager is an Asp.Net console which enables you to send windows popup messages over a domain network. Messages can be send to a single user or to a grouNONAME0: ?? ? ???????-?????? ??? ?????? ???? ?????? ???? ??? ?? ???????????????? ??? ? ?????? ??? ??????????npr2012: Projekt testowy Personal News Assistant: Vision: The Personal News Assistant is an application which collects information from different sources and informs the user either on demand or preventive.PGIrony - Tookit & Examples for AST Generation with Irony: A tool-kit to ease AST generation,, and further ease grammr construction, with Irony.Proboscis: A query provider to work against HBase. Will be developed against HBase on Windows Azure.Project13251106: papaProject13271106: sdgProjet IMA: rtjRenderCraft: This is an editor like a AMD's RenderMonkey. But this supports Direct3D 11.Rx (Reactive Extensions): The Reactive Extensions (Rx) is a library for composing asynchronous and event-based programs using observable sequences and LINQ-style query operators. Sanle: sanle project with c++Screener: Screen sharing software.SemantEx: Adds a validation wrapper around regular expressions, allowing you to automatically apply conditional logic to capture groups.SharePoint CAML Extensions: SharePoint CAML ExtensionsSilentPlace: Turn on silence mode when you are in a silence placeSmartMacros: SmartMacros allows developers to define macros in C# and use them inside other source code. These macros are much stronger and safer than C/C++ macros, therefore they're called "Smart" :-)SoundArea link grabber: Soundarea link grabber the script put a list of links in your clipboard.Time Tracker Kickstart: This project is currently a work-in-progress.TransparentImage: TransparentImage is a console application that converts BMPS into PNG files. 
TransparentImage is written in VB and supports drop n drag.Travis7: A Travis-CI client for Windows PhoneWindows 8 Accelerator: A set of components and controls to accelerate your Windows 8 and Windows Phone 8 application development.Windows and Windows Phone DNS Library: Simple DNS lookup library for Windows and Windows Phone applicationsWinJS Toolkit - JavaScript Toolkit for Windows 8: The WinJS Toolkit is a set of classes, helper functions and tools that help creating Windows Store applications in HTML5, CSS3 and JavaScript.WPF TB: Study WPF????????: ZJU software project homework,One part of the total stocktrade system.

    Read the article

  • The Internet of Things & Commerce: Part 3 -- Interview with Kristen J. Flanagan, Commerce Product Management

    - by Katrina Gosek, Director | Commerce Product Strategy-Oracle
Internet of Things & Commerce Series: Part 3 (of 3) And now for the final installment of my three-part series on the Internet of Things & Commerce. Post one, "The Next 7,000 Days", introduced the idea of the Internet of Things, followed by a second post interviewing one of our chief commerce innovation strategists, Brian Celenza.  This final post in the series is an interview with Kristen J. Flanagan, lead product manager for Oracle Commerce omnichannel strategy. She takes us through the past, present, and future of how our Commerce Solution is re-imagining the way physical and digital shopping come together. ------- QUESTION: It's your job to stay on top of what our customers need to not only run their online businesses effectively, but also to make sure they have product capabilities they can innovate and grow on. What key trend has been top-of-mind for you and our customers around this collision of physical and digital shopping? Kristen: I'll agree with Brian Celenza that hands down mobile has forced a major disruption in shopping and selling behavior. A few years ago, mobile exploded at a pace I don't think anyone was expecting. Early on, we saw our customers scrambling to establish a mobile presence---mostly through "screen scraping" technologies. As smartphones continued to advance (at lightning speed!), our customers started to investigate ways to truly tap into their eCommerce capabilities to deliver the mobile experience. They started looking to us for a means of using the eCommerce services and capabilities to deliver a mobile experience that is tailored for mobile rather than the desktop experience on a smaller screen. In the future, I think we'll see customers starting to really understand what their shoppers need and expect from a mobile offering and how they can adapt their content and delivery of that content to meet those needs. And, mobile shopping doesn't stop at the consumer / buyer. Because the in-store experience is compelling and has advantages that digital just can't offer, we're also starting to see the eCommerce services being leveraged for mobile for in-store sales associates. Brick-and-mortar retailers are interested in putting the omnichannel product catalog, promotions, and cart into the hands of knowledgeable associates. Retailers are now looking to connect and harness the eCommerce data in-store so that shoppers have a reason to walk in. I think we'll be seeing a lot more customers thinking about melding the in-store and digital experiences to present a richer offering for shoppers.    QUESTION: What are some examples of what our customers are doing currently to bring these concepts to reality? Kristen: Well, without question, connecting digital and brick-and-mortar worlds is becoming table stakes for selling experiences. If a brand has a foot in both worlds (i.e., isn't a pure-play online retailer), they have to connect the dots because shoppers – whether consumers or B2B buyers – don't think in clearly defined channels anymore. The expectation is connectedness – for on- and offline experiences, promotions, products, and customer data. What does this mean practically for businesses selling goods on- and offline? It touches a lot of systems: inventory info on the eCommerce site, fulfillment options across channels (buy online/pickup in store), order information (representing various channels for a cohesive view of shopper order history), promotions across digital and store, etc.  A few years ago, the main link between store and digital was the smartphone. 
We all remember when “apps” became a thing and many of our customers were scrambling to get a native app out there. Now we're seeing more strategic thinking around the benefits of mobile web vs. native and how that ties in to the purpose and role of mobile within the digital channel – put more broadly, how these pieces fit together in the overall brand puzzle.

The same could be said for “showrooming.” Where it was a major concern (i.e., shoppers using stores to look at merchandise and then ordering online from Amazon), in recent months it has emerged that the inverse is now becoming a reality as well. "Webrooming" (using digital sites to do research before making a purchase in the store) is a new behavior that pure-play retailers are challenged with. There are many technologies, behaviors, and pieces of information that need to tie together to offer a holistic omnichannel shopping experience. As a result, brands are looking for ways to connect the digital and in-store experiences to bridge the gaps: shared assortments across channels, assisted selling apps that arm associates with information about shoppers, shared promotions, inventory, etc.

QUESTION: How has Oracle Commerce been built to help brands make the link between in-store and digital over the last few years?

Kristen: Over the last seven years, the product has been in step with the changes in industry needs. Here is a brief history of the evolution. Prior to Oracle’s acquisition of ATG and Endeca, key investments were made in cross-channel functionality that we are still building on today.

Commerce Service Center (v2007.1) – ATG introduced the Commerce Service Center in 2007.1, which marked the first entry into what was then called “cross-channel.” The Commerce Service Center is a call-center-agent-facing application that enables agents to see shopper orders, the online catalog, promotions, and pricing. It is tightly integrated with the eCommerce capabilities of the platform and commerce engine and provides a means of connecting data from the call center and online channels.

REST services framework (v9.1) – In v9.1 we introduced the REST services framework and interface in the Platform, which enabled customers to use ATG web services in other applications. This framework has become the basis for our subsequent omnichannel features and functionality.

Multisite Architecture (v10) – With the v10 release, we introduced the Multisite Architecture, which enabled customers to manage multiple sites (and channels) within a single instance of the BCC. Customers could create site- and channel-specific catalogs, promotions, targeters, and scenarios.

Endeca Page Builder (2.x) / Experience Manager (3.x) – With the introduction of Endeca for Mobile (now part of the core platform, available through the reference store – see below) on top of Page Builder (and then eventually Experience Manager), Endeca gave business users the tools to create and manage native and mobile web applications. And since the acquisition of both ATG (2011) and Endeca (2012), Oracle Commerce has leveraged the best of each leading technology’s capabilities for omnichannel commerce to continue to drive innovation for our customers.

Service enablement of core Oracle Commerce capabilities (v10.1.1, 10.2, & 11) – After the establishment of the REST services framework and interface, we followed up in subsequent releases with service enablement of core Oracle Commerce capabilities used throughout the iOS native app, and with the enablement of the core Commerce Service Center features. 
The result is that customers can leverage these services for their integrations with other systems, as well as for their omnichannel initiatives.

Mobile web reference application (v10.1) – In 10.1 we introduced the shopper-facing mobile reference application that showed how to use Oracle Commerce to deliver a mobile web experience for shoppers. This included the use of Experience Manager and cartridges to drive those experiences on select pages.

Native (iOS) reference application (v10.1.1) – We came out with the 10.1.1 shopper-facing native iOS reference app that illustrated how to use the Commerce REST services to deliver an iOS app. It also included Experience Manager-driven pages.

Assisted Selling reference application (v10.2.1) – The Assisted Selling reference application is our first reference application designed for the in-store associate. This iOS app shows customers how they can use Oracle Commerce data and information to provide a high-touch, consultative sales environment, as well as to put the endless aisle into the hands of their associates. Shoppers can start a cart online, and in-store associates can access that cart via the application to provide more information or add products, and then transact using the ATG engine.

Support for Retail promotions (v11) – As part of the v11 release, we worked with teams in the Oracle Retail Global Business Unit (RGBU) to assess which promotion types and capabilities are supported across our products. Those products included Oracle Commerce, Oracle Point of Service (ORPOS), and Oracle Retail Price Management (RPM). The result is that customers can now more easily support omnichannel use cases between the store and digital.

Making sure Oracle Commerce can help support the omnichannel needs of our customers is core to our product strategy. With 89% of consumers now using two or more channels to make a single purchase, ensuring that cross-channel interactions are linked is critical to a great customer experience – and to sales. As Oracle Commerce evolves, we want to make it simple for organizations to create, deliver, and scale experiences across touchpoints with our create once, deploy commerce anywhere framework. We have a flexible, services-oriented architecture that allows data, content, catalogs, cart, experiences, personalization, and merchandising to be shared across touchpoints and easily extended into new environments like mobile, social, in-store, call center, and new websites. [For the latest downloads and Oracle Commerce documentation, please visit the Oracle Technology Network.]

------

Thank you to both Brian and Kristen for their contributions to this blog series and their continued thought leadership for Oracle Commerce. We are all looking forward to the coming months and years of new shopping behaviors and opportunities to innovate. Because – if the digital fabric of our everyday lives continues to change at the same pace – the next five years (that's just under 2,000 days) will be dramatic.

----------

THIS DOCUMENT IS FOR INFORMATIONAL PURPOSES ONLY AND MAY NOT BE INCORPORATED INTO A CONTRACT OR AGREEMENT

    Read the article

  • EM12c Release 4: Database as a Service Enhancements

    - by Adeesh Fulay
Oracle Enterprise Manager 12.1.0.4 (or simply put, EM12c R4) is the latest update to the product. As with previous versions, this release provides tons of enhancements and bug fixes, contributing to improved stability and quality. One of the areas that is most exciting and has seen tremendous growth in the last few years is that of Database as a Service. EM12c R4 provides a significant update to Database as a Service. The key themes are:

1. Comprehensive Database Service Catalog (includes single instance, RAC, and Data Guard)
2. Additional Storage Options for Snap Clone (includes support for the database CloneDB feature)
3. Improved Rapid Start Kits
4. Extensible Metering and Chargeback
5. Miscellaneous Enhancements

1. Comprehensive Database Service Catalog

Before we get deep into the implementation of a service catalog, let's first understand what it is and what benefits it provides. Per ITIL, a service catalog is an exhaustive list of IT services that an organization provides or offers to its employees or customers. Service catalogs have been widely popular in the space of cloud computing, primarily as the medium to provide standardized and pre-approved service definitions. There is already some good collateral out there that talks about Oracle database service catalogs. The two whitepapers I recommend reading are:

- Service Catalogs: Defining Standardized Database Service
- High Availability Best Practices for Database Consolidation: The Foundation for Database as a Service [Oracle MAA]

EM12c has come with an out-of-the-box service catalog and self service portal since release 1. For customers, it provides the following benefits:

- Present a collection of standardized database service definitions
- Define standardized pools of hardware and software for provisioning
- Role based access to cater to different classes of users
- Automated procedures to provision the predefined database definitions
- Setup of chargeback plans based on service tiers and database configuration sizes, etc.

Starting with Release 4, the scope of services offered via the service catalog has been expanded to include databases with varying levels of availability - Single Instance (SI) or Real Application Clusters (RAC) databases with multiple Data Guard based standby databases. Some salient points of the Data Guard integration:

- Standby pools can now be defined across different datacenters or within the same datacenter as the primary (this helps in modelling the concept of near and far DR sites)
- The standby databases can be single instance, RAC, or RAC One Node databases
- Multiple standby databases can be provisioned, where the maximum limit is determined by the version of the database software
- The standby databases can be in either mount or read only (requires the Active Data Guard option) mode
- All database versions 10g to 12c are supported (as certified with EM 12c)
- All three protection modes can be used - maximum availability, maximum performance, and maximum protection
- Log apply can be set to sync or async along with the required apply lag

The different service levels or service tiers are popularly represented using metals - Platinum, Gold, Silver, Bronze, and so on. The Oracle MAA whitepaper (referenced above) calls out the various service tiers as defined by Oracle's best practices, but customers can choose any logical combination from the table below (all supported in EM 12cR4):

  Primary   Standby [1 or more]
  SI        -
  SI        SI
  RAC       -
  RAC       SI
  RAC       RAC
  RON       -
  RON       RON

where RON = RAC One Node, which is supported via custom post-scripts in the service template.
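To make these options concrete, here is a minimal, purely illustrative Python sketch of how a catalog entry might capture the choices discussed above (topology, protection mode, log transport, and standby placement). The tier names, field names, and values are hypothetical and are not part of any EM12c API:

# Purely illustrative model of a DBaaS service catalog entry.
# None of these names come from EM12c; they only mirror the options
# discussed above (topology, protection mode, log transport, placement).
from dataclasses import dataclass, field
from typing import List

@dataclass
class StandbySpec:
    topology: str          # "SI", "RAC", or "RAC One Node"
    datacenter: str        # near or far DR site
    open_mode: str         # "MOUNT" or "READ ONLY" (Active Data Guard)

@dataclass
class ServiceTier:
    name: str              # e.g. "Platinum"
    primary_topology: str  # "SI", "RAC", or "RAC One Node"
    protection_mode: str   # "MAX AVAILABILITY" | "MAX PERFORMANCE" | "MAX PROTECTION"
    log_transport: str     # "SYNC" or "ASYNC"
    standbys: List[StandbySpec] = field(default_factory=list)

catalog = [
    ServiceTier("Platinum", "RAC", "MAX AVAILABILITY", "SYNC",
                [StandbySpec("RAC", "far-dr", "READ ONLY")]),
    ServiceTier("Gold", "RAC", "MAX PERFORMANCE", "ASYNC",
                [StandbySpec("SI", "near-dr", "MOUNT")]),
    ServiceTier("Silver", "SI", "MAX PERFORMANCE", "ASYNC", []),
    ServiceTier("Bronze", "SI", "MAX PERFORMANCE", "ASYNC", []),
]

for tier in catalog:
    print(tier.name, tier.primary_topology, "+", len(tier.standbys),
          "standby(s),", tier.protection_mode)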
A sample service catalog would look like the image below. Here we have defined 4 service levels, which have been deployed across 2 data centers, and have 3 standardized sizes. Again, it is important to note that this is just an example to get the creative juices flowing. I imagine each customer would come up with their own catalog based on the application requirements, their RTO/RPO goals, and the product licenses they own. In the screenwatch titled 'Build Service Catalog using EM12c DBaaS', I walk through the complete steps required to set up this sample service catalog in EM12c.

2. Additional Storage Options for Snap Clone

In my previous blog posts, I have described the Snap Clone feature in detail. Essentially, it provides a storage agnostic, self service, rapid, and space efficient approach to solving your data cloning problems. The net benefit is that you get incredible amounts of storage savings (on average 90%), all while cloning databases in a matter of minutes. Space and time: two things enterprises would love to save on. This feature has been designed with the goal of providing data cloning capabilities while protecting your existing investments in server, storage, and software. With this in mind, we have pursued a dual solution approach of hardware and software. In the hardware approach, we connect directly to your storage appliances and perform all the low level actions required to rapidly clone your databases. In the software approach, we use an intermediate software layer to talk to any storage vendor or any storage configuration to perform the same low level actions. This delivers the benefits of database thin cloning without requiring you to drastically change the infrastructure or IT's operating style.

In release 4, we expand the scope of options supported by Snap Clone with the addition of database CloneDB. While CloneDB is not a new feature (it was first introduced in the 11.2.0.2 patchset), it has over the years become more stable and mature. CloneDB leverages a combination of the Direct NFS (or dNFS) feature of the database, RMAN image copies, sparse files, and copy-on-write technology to create thin clones of databases from existing backups in a matter of minutes. It essentially has all the traits that we want to present to our customers via the Snap Clone feature. For more information on CloneDB, I highly recommend reading the following sources:

- Blog by Tim Hall: Direct NFS (DNFS) CloneDB in Oracle Database 11g Release 2
- Oracle OpenWorld presentation by CERN: Efficient Database Cloning using Direct NFS and CloneDB

The advantages of the new CloneDB integration with EM12c Snap Clone are:

- Space and time savings
- Ease of setup - no additional software is required other than the Oracle database binary
- Works on all platforms
- Reduced dependence on storage administrators
- Cloning process fully orchestrated by EM12c, and delivered to developers/DBAs/QA testers via the self service portal
- Uses dNFS to deliver better performance, availability, and scalability over kernel NFS
- Complete lifecycle of the clones managed by EM12c - performance, configuration, etc.
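As a rough sanity check on the storage savings figure quoted above, here is a small illustrative Python calculation. The database size, clone count, and change rate are made-up inputs chosen only to show how copy-on-write thin clones compare to full copies:

# Back-of-the-envelope illustration of copy-on-write storage savings.
# All inputs are hypothetical; only the arithmetic is the point.
source_gb = 500            # size of the source database image copy
clones = 10                # number of thin clones (dev, QA, test, ...)
change_rate = 0.05         # fraction of blocks each clone ends up rewriting

full_copies_gb = clones * source_gb                             # traditional full clones
thin_clones_gb = source_gb + clones * source_gb * change_rate   # one image copy + per-clone deltas

savings = 1 - thin_clones_gb / full_copies_gb
print(f"Full copies : {full_copies_gb:.0f} GB")
print(f"Thin clones : {thin_clones_gb:.0f} GB")
print(f"Savings     : {savings:.0%}")   # ~85% with these assumptions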
3. Improved Rapid Start Kits

DBaaS deployments tend to be complex, and their setup requires a series of steps. These steps are typically performed across different users and different UIs. The Rapid Start Kit provides a single command solution to set up Database as a Service (DBaaS) and Pluggable Database as a Service (PDBaaS). One command creates all the cloud artifacts like Roles, Administrators, Credentials, Database Profiles, the PaaS Infrastructure Zone, Database Pools, and Service Templates. Once the Rapid Start Kit has been successfully executed, requests can be made to provision databases and PDBs from the self service portal. The Rapid Start Kit can create complex topologies involving multiple zones, pools, and service templates. It also supports standby databases and the use of RMAN image backups.

The Rapid Start Kit is in reality a simple emcli script which takes a bunch of XML files as input and executes the complete automation in a matter of seconds. On a full rack Exadata, it took only 40 seconds to set up PDBaaS end-to-end. The kit works both on Oracle's engineered systems, like Exadata and SuperCluster, and on commodity hardware. One can draw a parallel to the Exadata One Command script, which again takes a bunch of inputs from the administrators and then runs a simple script that configures everything from the network to provisioning the DB software.

Steps to use the kit:

- The kit can be found under the SSA plug-in directory on the OMS: EM_BASE/oracle/MW/plugins/oracle.sysman.ssa.oms.plugin_12.1.0.8.0/dbaas/setup
- It can be run from this default location or from any server which has the emcli client installed
- For most scenarios, you would use the script dbaas/setup/database_cloud_setup.py
- For Exadata, special integration is provided to reduce the number of inputs even further. The script to use for this scenario would be dbaas/setup/exadata_cloud_setup.py

The database_cloud_setup.py script takes two inputs:

- Cloud boundary xml: This file defines the cloud topology in terms of the zones and pools, along with the host names, Oracle home locations, or container database names that would be used as infrastructure for provisioning database services. This file is optional in the case of Exadata, as the boundary is well known via the Exadata system target available in EM.
- Input xml: This file captures inputs for users, roles, profiles, service templates, etc. Essentially, all inputs required to define the DB services and other settings of the self service portal.

Once all the xml files have been prepared, invoke the script as follows for PDBaaS:

emcli @database_cloud_setup.py -pdbaas -cloud_boundary=/tmp/my_boundary.xml -cloud_input=/tmp/pdb_inputs.xml

The script will prompt for passwords a few times for key users like sysman, cloud admin, SSA admin, etc. Once complete, you can simply log into EM as the self service user and request databases from the portal. More information is available in the Rapid Start Kit chapter of the Cloud Administration Guide.

4. Extensible Metering and Chargeback

Last but not least, Metering and Chargeback in release 4 has been made extensible in all possible regards. The new extensibility features allow customers, partners, system integrators, etc. to:

- Extend chargeback to any target type managed in EM
- Promote any metric in EM as a chargeback entity
- Extend the list of charge items via metric or configuration extensions
- Model abstract entities like number of backup requests, job executions, support requests, etc.

A slew of emcli verbs have also been added that allow administrators to create, edit, delete, and import/export charge plans, and assign cost centers, all via the command line. More information is available in the Chargeback API chapter of the Cloud Administration Guide.
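To give a feel for what a charge plan does conceptually, here is a tiny illustrative Python sketch that multiplies per-item rates by metered usage to arrive at a monthly charge. The item names, rates, and usage numbers are invented for illustration and do not reflect the actual EM12c chargeback data model or its emcli verbs:

# Toy chargeback model: rate per charge item, multiplied by metered usage.
# Item names, rates, and usage figures are all hypothetical.
charge_plan = {
    "db.instance.uptime_hours": 0.05,   # currency units per hour
    "db.storage_gb":            0.10,   # per GB-month
    "custom.backup_requests":   1.50,   # example of an "abstract entity"
}

metered_usage = {
    "db.instance.uptime_hours": 720,
    "db.storage_gb":            200,
    "custom.backup_requests":   12,
}

monthly_charge = sum(rate * metered_usage.get(item, 0)
                     for item, rate in charge_plan.items())
print(f"Monthly charge: {monthly_charge:.2f}")   # 36 + 20 + 18 = 74.00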
5. Miscellaneous Enhancements

There are other miscellaneous, yet important, enhancements that are worth a mention. Most of these have been asked for by customers like you. They are:

- Custom naming of DB Services: Self service users can provide custom names for the DB SID, DB service, schemas, and tablespaces. Every custom name is validated for uniqueness in EM.
- 'Create like' of Service Templates: Now creating variants of a service template is only a click away. This is vital when you publish service templates to represent different database sizes or service levels.
- Profile viewer: View the details of a profile, like datafiles, control files, snapshot ids, export/import files, etc., prior to its selection in the service template.
- Cleanup automation for failed and successful requests: A single emcli command cleans up all remnant artifacts of a failed request. Cleanup can be performed on a per-request basis or for the entire pool. As an extension, you can also delete successful requests.
- Improved delete user workflow: Allows administrators to reassign cloud resources to another user or delete all of them.
- Support for multiple tablespaces for Schema as a Service: In addition to multiple schemas, users can also specify multiple tablespaces per request.

I hope this was a good introduction to the new Database as a Service enhancements in EM12c R4. I encourage you to explore many of these new and existing features and give us feedback. Good luck!

References:
- Cloud Management Page on OTN
- Cloud Administration Guide [Documentation]

-- Adeesh Fulay (@adeeshf)

    Read the article

  • E.T. Phone "Home" - Hey I've discovered a leak..!

    - by Martin Deh
As members of the WebCenter ATEAM, we are often asked to performance-tune a WebCenter custom portal application or a WebCenter Spaces deployment. Most of the time, the process is pretty much the same. For example, we often use tools like HttpWatch and Firebug to monitor the application, and then perform load tests using JMeter or Selenium. In addition, there is the fine-tuning of the different performance-based tuning parameters that are outlined in the documentation and in blogs written by my fellow ATEAMers (click on the "performance" tag in this ATEAM blog). When a load test shows a significant reduction in system resources (memory), one of the causes that can play a role in memory "leakage" is the implementation of the navigation menu UI. OOTB in both JDeveloper and WebCenter Spaces, there are sample (page) templates that include a "default" navigation menu. In WebCenter Spaces, this is through the SpacesNavigationModel taskflow region, and in a custom portal (i.e. pageTemplate_globe.jspx) the menu UI is constructed using standard ADF components. These sample menu UIs basically enable the underlying navigation model to visualize itself to some extent. However, due to certain limitations of these sample menu implementations (i.e. deeper sub-levels of navigation items, look and feel, etc.), many customers have developed their own custom navigation menus using a combination of HTML, CSS and jQuery. While this is supported somewhat by the framework, it is important to know some of the best practices for ensuring that the navigation menu does not leak. In addition, in this blog I will point out a leak (BUG) that is in the sample templates.

OK, E.T., the suspense is killing me, what is this leak? Note: for those who don't know, info on E.T. can be found here.

In both of the included templates, the example given for handling navigation back to the "Home" page will essentially produce a nice little memory leak every time the link is clicked. Let's take a look at a simple example, which uses the default template in Spaces. The outlined section below is the "link", which is used to enable a user to navigate back quickly to the Group Space Home page. When you (mouse) hover over the link, the browser displays the target URL. From looking initially at the proposed URL, this is the intended destination. Note: "home" in this case is the navigation model reference (id) that enables the display of the "pretty URL". Next, notice the current URL, which is displayed in the browser. Remember that PortalSiteHome = home. The other highlighted item, adf.ctrl-state, is very important to the framework. This item is basically a persistent query parameter, which is used by the (ADF) framework to manage the current session and page instance. Without this parameter present, among other things, the browser back-button navigation will fail. In this example, the value for this parameter is currently 95K25i7dd_4. Next, through the navigation menu item, I will click on the Page2 link. Inspecting the URL again, I can see that it reports that indeed the navigation is successful and the adf.ctrl-state is also in the URL. For those who are wondering why the URL displays Page3.jspx instead of Page2.jspx: the (file) naming convention for pages created at runtime in Spaces starts at Page1 and then increments as you create additional pages. The name of the actual link (i.e. Page2) is the page "title" attribute. 
So the moral of the story is: unlike design-time created pages, for runtime-created pages the file name will almost never match the name that appears in the link.

Next, I click on the quick link for navigating back to the Home page. Quick investigation yields that the navigation was indeed successful. In the browser's URL there is a home (pretty URL) reference, and there is also a reference to the adf.ctrl-state parameter. So what's the issue? Can you remember what the value was for the adf.ctrl-state? The current value is 95k25i7dd_149. However, the previous value was 95k25i7dd_4. Here is what happened. Remember when (mouse) hovering over the link produced the following target URL: http://localhost:8888/webcenter/spaces/NavigationTest/home This is great for the browser, as this URL will navigate to the intended target. However, what is missing is the adf.ctrl-state parameter. Since this parameter was not present upon navigation "within" the framework, the ADF framework produced another adf.ctrl-state (object). The previous adf.ctrl-state is basically orphaned while continuing to live in memory. Note: the auto-creation of the adf.ctrl-state does happen initially when you invoke the Spaces application for the first time. The following is the line of code which produced the issue:

<af:goLink destination="#{boilerBean.globalLogoURIInSpace} ...

Here the boilerBean is responsible for returning the "string" URL, which in this case is /spaces/NavigationTest/home. Unfortunately, again what is missing is the adf.ctrl-state. Note: there is more than one instance of these goLinks in the sample templates.

So, E.T., how can I correct this? There are 2 simple fixes. For the goLink's destination, use the navigation model to return the actual "node" value, then use the goLinkPrettyUrl method to add the current adf.ctrl-state:

<af:goLink destination="#{navigationContext.defaultNavigationModel.node['home'].goLinkPrettyUrl}" ... />

Note: the node value is the [navigation model id]. Using a goLink does solve the main issue. However, since the link basically does a redirect, some browsers like IE will produce a somewhat significant "flash". In a Spaces application, this may be an annoyance to the users. Another way to solve the leakage problem, and also remove the flash between navigations, is to use an af:commandLink. For example, here is the code example for this scenario:

<af:commandLink id="pt_cl2asf" actionListener="#{navigationContext.processAction}" action="pprnav">
   <f:attribute name="node" value="#{navigationContext.defaultNavigationModel.node['home']}"/>
</af:commandLink>

Here, the navigation node to where home is located is delivered by way of the attribute to the commandLink. The actual navigation is performed by the processAction, which needs the "node" value.
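For readers who want an intuition for why a missing adf.ctrl-state translates into leaked memory, here is a deliberately simplified Python toy model. It is not how ADF is actually implemented; it only mimics the behavior described above, where every request arriving without the state parameter forces the framework to allocate a fresh state object that the session keeps holding on to:

# Toy model of the leak described above (NOT the real ADF implementation).
import itertools

class FakeSession:
    _ids = itertools.count(1)

    def __init__(self):
        self.states = {}  # adf.ctrl-state id -> per-view state

    def handle_request(self, ctrl_state=None):
        # A request without a known adf.ctrl-state gets a brand new state object;
        # the old one stays referenced in self.states and is never reused.
        if ctrl_state is None or ctrl_state not in self.states:
            ctrl_state = f"95k25i7dd_{next(self._ids)}"
            self.states[ctrl_state] = {"view": "home"}  # stand-in for real view state
        return ctrl_state

session = FakeSession()
current = session.handle_request()             # first hit: one state object
for _ in range(5):
    session.handle_request()                   # goLink without adf.ctrl-state: leaks one each time
    current = session.handle_request(current)  # in-framework navigation reuses the existing state

print(len(session.states), "state objects held by the session")  # prints 6, not 1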
E.T., OK, you solved the OOTB sample BUG; what about my custom navigation code? I have seen many implementations of creating a navigation menu through custom code. In addition, there are some blog sites that also give detailed examples. The majority of these implementations are very similar. The code usually involves using standard HTML tags (i.e. divs, ul, li, etc.) and either CSS or JavaScript (jQuery) to produce the flyout/drop-down effect. The navigation links in these cases are standard <a href... > tags. Although this type of approach is not fully accepted by the ADF community, it does work. The important thing to note here is that the <a> tag value must use the goLinkPrettyUrl method of constructing the target URL. For example:

<a href="${contextRoot}${menu.goLinkPrettyUrl}">

The main reason this type of approach is popular is that with links created this way (as with af:goLinks), the pages become crawlable by search engines. CommandLinks are currently not search friendly. However, in the case of a Spaces instance this may be acceptable. So in this use case, af:commandLinks would replace the <a> (or goLink) tags. The example code given above for the af:commandLink is still valid.

One last important item. If you choose to use af:commandLinks, special attention must be given to the scenario in which JavaScript has been used to produce the flyout effect in the custom menu UI. In many cases that I have seen, the commandLink can only be invoked once, since there is a conflict between the custom JavaScript and the ADF framework's own scripting that controls the view. The recommendation here would be to use a pure CSS approach to achieve the dropdown effects.

One very important thing to note: due to another BUG, the WebCenter environment must be patched to BP3 (patch  p14076906). Otherwise the leak is still present even when using the goLinkPrettyUrl method.

Thanks, E.T.! Now I can phone home and not worry about my application running out of resources due to my custom navigation! 

    Read the article

< Previous Page | 104 105 106 107 108 109 110 111 112  | Next Page >