Search Results

Search found 333 results on 14 pages for 'transformations'.


  • The Sensemaking Spectrum for Business Analytics: Translating from Data to Business Through Analysis

    - by Joe Lamantia
    One of the most compelling outcomes of our strategic research efforts over the past several years is a growing vocabulary that articulates our cumulative understanding of the deep structure of the domains of discovery and business analytics. Modes are one example of the deep structure we’ve found. After looking at discovery activities across a very wide range of industries, question types, business needs, and problem solving approaches, we've identified distinct and recurring kinds of sensemaking activity, independent of context. We label these activities Modes: Explore, compare, and comprehend are three of the nine recognizable modes. Modes describe *how* people go about realizing insights. (Read more about the programmatic research and formal academic grounding and discussion of the modes here: https://www.researchgate.net/publication/235971352_A_Taxonomy_of_Enterprise_Search_and_Discovery)

    By analogy to languages, modes are the 'verbs' of discovery activity. When applied to the practical questions of product strategy and development, the modes of discovery allow one to identify what kinds of analytical activity a product, platform, or solution needs to support across a spread of usage scenarios, and then make concrete and well-informed decisions about every aspect of the solution, from high-level capabilities to which specific types of information visualizations best enable these scenarios for the types of data users will analyze.

    The modes are a powerful generative tool for product making, but if you've spent time with young children, or had a really bad hangover (or both at the same time...), you understand the difficulty of communicating using only verbs. So I'm happy to share that we've found traction on another facet of the deep structure of discovery and business analytics. Continuing the language analogy, we've identified some of the ‘nouns’ in the language of discovery: specifically, the consistently recurring aspects of a business that people are looking for insight into. We call these discovery Subjects, since they identify *what* people focus on during discovery efforts, rather than *how* they go about discovery as with the Modes.

    Defining the collection of Subjects people repeatedly focus on allows us to understand and articulate sensemaking needs and activity in a more specific, consistent, and complete fashion. In combination with the Modes, we can use Subjects to concretely identify and define scenarios that describe people’s analytical needs and goals. For example, a scenario such as ‘Explore [a Mode] the attrition rates [a Measure, one type of Subject] of our largest customers [Entities, another type of Subject]’ clearly captures the nature of the activity — exploration of trends vs. deep analysis of underlying factors — and the central focus — attrition rates for customers above a certain set of size criteria — from which follow many of the specifics needed to address this scenario in terms of data, analytical tools, and methods.

    We can also use Subjects to translate effectively between the different perspectives that shape discovery efforts, reducing ambiguity and increasing impact on both sides of the perspective divide. For example, from the language of business, which often motivates analytical work by asking questions in business terms, to the perspective of analysis.
    The question posed to a Data Scientist or analyst may be something like “Why are sales of our new kinds of potato chips to our largest customers fluctuating unexpectedly this year?” or “Where can we innovate, by expanding our product portfolio to meet unmet needs?”. Analysts translate questions and beliefs like these into one or more empirical discovery efforts that more formally and granularly indicate the plan, methods, tools, and desired outcomes of analysis. From the perspective of analysis this second question might become, “Which customer needs of type ‘A', identified and measured in terms of ‘B’, that are not directly or indirectly addressed by any of our current products, offer 'X' potential for ‘Y' positive return on the investment ‘Z' required to launch a new offering, in time frame ‘W’? And how do these compare to each other?”. Translation also happens from the perspective of analysis to the perspective of data, in terms of availability, quality, completeness, format, volume, etc.

    By implication, we are proposing that most working organizations — small and large, for profit and non-profit, domestic and international, and in the majority of industries — can be described for analytical purposes using this collection of Subjects. This is a bold claim, but simplified articulation of complexity is one of the primary goals of sensemaking frameworks such as this one. (And, yes, this is in fact a framework for making sense of sensemaking as a category of activity - but we’re not considering the recursive aspects of this exercise at the moment.)

    Compellingly, we can place the collection of Subjects on a single continuum — we call it the Sensemaking Spectrum — that simply and coherently illustrates some of the most important relationships between the different types of Subjects, and also illuminates several of the fundamental dynamics shaping business analytics as a domain. As a corollary, the Sensemaking Spectrum also suggests innovation opportunities for products and services related to business analytics.

    The first illustration below shows Subjects arrayed along the Sensemaking Spectrum; the second illustration presents examples of each kind of Subject. Subjects appear in colors ranging from blue to reddish-orange, reflecting their place along the Spectrum, which indicates whether a Subject addresses more the viewpoint of systems and data (Data centric and blue) or of people (User centric and orange). This axis is shown explicitly above the Spectrum. Annotations suggest how Subjects align with the three significant perspectives of Data, Analysis, and Business that shape business analytics activity. This rendering makes explicit the translation and bridging function of Analysts as a role, and analysis as an activity.

    Subjects are best understood as fuzzy categories [http://georgelakoff.files.wordpress.com/2011/01/hedges-a-study-in-meaning-criteria-and-the-logic-of-fuzzy-concepts-journal-of-philosophical-logic-2-lakoff-19731.pdf], rather than tightly defined buckets. For each Subject, we suggest some of the most common examples: Entities may be physical things such as named products, or locations (a building, or a city); they could be Concepts, such as satisfaction; or they could be Relationships between entities, such as the variety of possible connections that define linkage in social networks.
    Likewise, Events may indicate a time and place in the dictionary sense; or they may be Transactions involving named entities; or take the form of Signals, such as ‘some Measure had some value at some time’ - what many enterprises understand as alerts.

    The central story of the Spectrum is that though consumers of analytical insights (represented here by the Business perspective) need to work in terms of Subjects that are directly meaningful to their perspective — such as Themes, Plans, and Goals — the working realities of data (condition, structure, availability, completeness, cost) and the changing nature of most discovery efforts make direct engagement with source data in this fashion impossible. Accordingly, business analytics as a domain is structured around the fundamental assumption that sensemaking depends on analytical transformation of data. Analytical activity incrementally synthesizes more complex and larger scope Subjects from data in its starting condition, accumulating insight (and value) by moving through a progression of stages in which increasingly meaningful Subjects are iteratively synthesized from the data, and recombined with other Subjects. The end goal of ‘laddering’ successive transformations is to enable sensemaking from the business perspective, rather than the analytical perspective. Synthesis through laddering is typically accomplished by specialized Analysts using dedicated tools and methods.

    Beginning with some motivating question such as seeking opportunities to increase the efficiency (a Theme) of fulfillment processes to reach some level of profitability by the end of the year (a Plan), Analysts will iteratively wrangle and transform source data Records, Values and Attributes into recognizable Entities, such as Products, that can be combined with Measures or other data into the Events (shipment of orders) that indicate the workings of the business.

    More complex Subjects (to the right of the Spectrum) are composed of or make reference to less complex Subjects: a business Process such as Fulfillment will include Activities such as confirming, packing, and then shipping orders. These Activities occur within or are conducted by organizational units such as teams of staff or partner firms (Networks), composed of Entities which are structured via Relationships, such as supplier and buyer. The fulfillment process will involve other types of Entities, such as the products or services the business provides. The success of the fulfillment process overall may be judged according to a sophisticated operating efficiency Model, which includes tiered Measures of business activity and health for the transactions and activities included. All of this may be interpreted through an understanding of the operational domain of the business's supply chain (a Domain).

    We'll discuss the Spectrum in more depth in succeeding posts.

    Read the article

  • Saving UIImagePickerController images to Core Data without crashing

    - by Gordon Fontenot
    I have been trying to minimize my memory footprint with UIImagePickerController, but I'm starting to think that the memory problems I am having are resulting from poor memory management, instead of a particular way to handle the UIImagePickerController object. My workflow is this: The "Edit Image" button is clicked, which presents a UIActionSheet. This action sheet allows you to delete, take a picture, choose from the library, or cancel. If you select Choose from the library or Take Picture, I alloc an instance of UIImagePickerController and present it, followed by a release of UIImagePickerController: -(void)actionSheet:(UIActionSheet *)actionSheet clickedButtonAtIndex:(NSInteger)buttonIndex { if (actionSheet.tag != 999) { UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init]; imagePicker.delegate = self; BOOL pickImage = nil; if (actionSheet.tag == iPhoneWithDelete) { switch (buttonIndex) { case 0: object.objectImage = nil; pickImage = NO; break; case 1: imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera; pickImage = YES; break; case 2: imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum; pickImage = YES; break; default: pickImage = NO; break; } } else if (actionSheet.tag == iPhoneNoDelete) { switch (buttonIndex) { case 0: imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera; pickImage = YES; break; case 1: imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum; pickImage = YES; break; default: pickImage = NO; break; } } else if (actionSheet.tag == iPodWithDelete) { switch (buttonIndex) { case 0: object.objectImage = nil; pickImage = NO; break; case 1: imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum; pickImage = YES; break; default: pickImage = NO; break; } } else if (actionSheet.tag == iPodNoDelete) { switch (buttonIndex) { case 0: imagePicker.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum; pickImage = YES; break; default: pickImage = NO; break; } } if (pickImage) { imagePicker.allowsEditing = YES; [self presentModalViewController:imagePicker animated:YES]; } else { [self setupImageButton]; [self setupChooseImageButton]; } [imagePicker release]; } } Once I get a selection back from the UIImagePickerController, I save 2 images, a resized version of the edited image to use for a thumbnail, and a 800x600 version of the original unedited image into a relationship attribute (Transformational, using the same UIImage to PNG transformations found in the Recipes demo code) for display use: (the resize methods are based on the one demoed in this SO post.) 
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info { [self dismissModalViewControllerAnimated:YES]; NSManagedObject *oldImage = object.imageFull; if (oldImage != nil) { [object.managedObjectContext deleteObject:oldImage]; } NSManagedObject *image = [NSEntityDescription insertNewObjectForEntityForName:@"Image" inManagedObjectContext:object.managedObjectContext]; object.imageFull = image; UIImage *rawImage = [info objectForKey:@"UIImagePickerControllerOriginalImage"]; CGSize size = CGSizeMake(800, 600); UIImage *fullImage = [UIImageManipulator scaleImage:rawImage toSize:size]; [image setValue:fullImage forKey:@"imageFull"]; UIImage *processedImage = [UIImageManipulator scaleImage:[info objectForKey:@"UIImagePickerControllerEditedImage"] toSize:CGSizeMake(75, 75)]; object.objectImage = processedImage; [self setupImageButton]; [self setupChooseImageButton]; rawImage = nil; fullImage = nil; processedImage = nil; } When I go through viewDidUnload I am setting self.object = nil, and [object release] during dealloc, but I'm still getting memory warnings after about 10 image changes, with a crash at around 20. It leads me to believe that I am not getting that full image out of memory the correct way. What am I missing here? And on a second note, does the Camera source use significantly more memory than the Photo Albums source? I tend to get more crashes when using the camera.

    Read the article

  • How to find problem with PHP XSLTProcessor when return from transformToXML is false and libxml_get_

    - by John
    I'm working on the code below to allow HTTP user agents that cannot perform XSL transformations to view the resources on my server. I'm mystified because the result of transformToXML is false, but the result of libxml_get_errors() is an empty array. As you can see, the code outputs the LibXSLT version ID and I'm getting the problem on WinVista with version 1.1.24. Is libxml_get_errors() not the right function to get the errors from the XSLTProcessor object? If you're interested in the XML documents, you can get them from http://bobberinteractive.com/index.xhtml and .../stylesheets/layout.xsl

    <?php
    //redirect browsers that can handle the source files.
    if (strpos ( $_SERVER ['HTTP_ACCEPT'], 'application/xhtml+xml' )) {
        header ( "HTTP/1.1 301 Moved Permanently" );
        header ( "Location: http://" . $_SERVER ['SERVER_NAME'] . "/index.xhtml" );
        header ( "Content-Type: text/text" );
        echo "\nYour browser is capable of processing the <a href='/index.xhtml'>site contents</a> on its own.";
        die ();
    }
    //start by checking the template
    $baseDir = dirname ( __FILE__ );
    $xslDoc = new DOMDocument ();
    if (! $xslDoc->load ( $baseDir . '/stylesheets/layout.xsl' )) {
        header ( "HTTP/1.1 500 Server Error" );
        header ( "Content-Type: text/plain" );
        echo "\n Can't load " . $baseDir . '/stylesheets/layout.xsl';
        die ();
    }
    //resolve the requested resource (browsers that need transformation will request the resource without the suffix)
    $uri = $_SERVER ['REQUEST_URI'];
    $len = strlen ( $uri );
    if (1 == $len || '/' == substr ( $uri, $len - 1 )) {
        $fileName = $baseDir . "/index.xhtml"; // use 'default' document if pathname ends in '/'
    } else {
        $fileName = $baseDir . $uri . ".xhtml";
    }
    $xmlDoc = new DOMDocument ();
    if (! $xmlDoc->load ( $fileName )) {
        header ( "HTTP/1.1 500 Server Error" );
        echo "\n Can't load " . $fileName;
        die ();
    }
    // now start the XSL template processing
    $proc = new XSLTProcessor ();
    $proc->importStylesheet ( $xslDoc );
    $doc = $proc->transformToXML ( $xmlDoc );
    if (false === $doc) {
        header ( "HTTP/1.1 500 Server Error" );
        header ( "Content-Type: text/plain" );
        echo "\n";
        // HERE is where it gets strange: the value of $doc is false and libxml_get_errors returns 0 entries.
        display_xml_errors ( libxml_get_errors () );
        die ();
    }
    header ( "Content-Type: text/html" );
    echo "\n";
    echo $doc;

    function display_xml_errors($errors) {
        echo count ( $errors ) . " Error(s) from LibXSLT " . LIBXSLT_DOTTED_VERSION;
        for($i = 0; $i < count ( $errors ); $i ++) {
            $error = $errors [$i];
            $return = "";
            switch ($error->level) {
                case LIBXML_ERR_WARNING :
                    $return .= "Warning $error->code: ";
                    break;
                case LIBXML_ERR_ERROR :
                    $return .= "Error $error->code: ";
                    break;
                case LIBXML_ERR_FATAL :
                    $return .= "Fatal Error $error->code: ";
                    break;
            }
            $return .= trim ( $error->message ) . "\n Line: $error->line" . "\n Column: $error->column";
            if ($error->file) {
                $return .= "\n File: $error->file";
            }
            echo "$return\n\n--------------------------------------------\n\n";
        }
    }

    Read the article

  • Can a conforming C implementation #define NULL to be something wacky?

    - by janks
    I'm asking because of the discussion that's been provoked in this thread: http://stackoverflow.com/questions/2597142/when-was-the-null-macro-not-0/2597232 Trying to have a serious back-and-forth discussion using comments under other people's replies is not easy or fun. So I'd like to hear what our C experts think without being restricted to 500 characters at a time. The C standard has precious few words to say about NULL and null pointer constants. There's only two relevant sections that I can find. First: 3.2.2.3 Pointers An integral constant expression with the value 0, or such an expression cast to type void * , is called a null pointer constant. If a null pointer constant is assigned to or compared for equality to a pointer, the constant is converted to a pointer of that type. Such a pointer, called a null pointer, is guaranteed to compare unequal to a pointer to any object or function. and second: 4.1.5 Common definitions <stddef.h> The macros are NULL which expands to an implementation-defined null pointer constant; The question is, can NULL expand to an implementation-defined null pointer constant that is different from the ones enumerated in 3.2.2.3? In particular, could it be defined as: #define NULL __builtin_magic_null_pointer Or even: #define NULL ((void*)-1) My reading of 3.2.2.3 is that it specifies that an integral constant expression of 0, and an integral constant expression of 0 cast to type void* must be among the forms of null pointer constant that the implementation recognizes, but that it isn't meant to be an exhaustive list. I believe that the implementation is free to recognize other source constructs as null pointer constants, so long as no other rules are broken. So for example, it is provable that #define NULL (-1) is not a legal definition, because in if (NULL) do_stuff(); do_stuff() must not be called, whereas with if (-1) do_stuff(); do_stuff() must be called; since they are equivalent, this cannot be a legal definition of NULL. But the standard says that integer-to-pointer conversions (and vice-versa) are implementation-defined, therefore it could define the conversion of -1 to a pointer as a conversion that produces a null pointer. In which case if ((void*)-1) would evaluate to false, and all would be well. So what do other people think? I'd ask for everybody to especially keep in mind the "as-if" rule described in 2.1.2.3 Program execution. It's huge and somewhat roundabout, so I won't paste it here, but it essentially says that an implementation merely has to produce the same observable side-effects as are required of the abstract machine described by the standard. It says that any optimizations, transformations, or whatever else the compiler wants to do to your program are perfectly legal so long as the observable side-effects of the program aren't changed by them. So if you are looking to prove that a particular definition of NULL cannot be legal, you'll need to come up with a program that can prove it. Either one like mine that blatantly breaks other clauses in the standard, or one that can legally detect whatever magic the compiler has to do to make the strange NULL definition work. 
    Steve Jessop found an example of a way for a program to detect that NULL isn't defined to be one of the two forms of null pointer constants in 3.2.2.3, which is to stringize the constant:
    #define stringize_helper(x) #x
    #define stringize(x) stringize_helper(x)
    Using this macro, one could puts(stringize(NULL)); and "detect" that NULL does not expand to one of the forms in 3.2.2.3. Is that enough to render other definitions illegal? I just don't know. Thanks!
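    For readers who want to try this themselves, here is a minimal sketch of a complete program built from the two macros above; the comments and the sample output mentioned in them are illustrative assumptions about a typical implementation, not anything the standard guarantees:

        #include <stdio.h>
        #include <stddef.h> /* supplies the implementation's definition of NULL */

        /* Two-level stringize: the outer macro expands its argument first,
           so the inner #x sees NULL's replacement tokens, not the name "NULL". */
        #define stringize_helper(x) #x
        #define stringize(x) stringize_helper(x)

        int main(void)
        {
            /* Prints whatever NULL expands to, e.g. "((void *)0)" on many compilers. */
            puts(stringize(NULL));
            return 0;
        }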

    Read the article

  • Converting XML to RSS

    - by hkw
    I've got an xml feed that will be published in one location of our website, and would like to repurpose that for an RSS feed. A few different pages on the website will also be referencing the same xml - all of those transformations are set up and working. The base xml file (XMLTEST.xml) is using this structure: <POST> <item> <POST_ID>80000852</POST_ID> <POST_TITLE>title</POST_TITLE> <POST_CHANNEL>I</POST_CHANNEL> <POST_DESC>description</POST_DESC> <LINK>http://www...</LINK> <STOC>N</STOC> </item> </POST> I'm trying to transform the xml into an RSS feed using the following setup (feed.xml + rss.xsl) feed.xml: <?xml-stylesheet href="rss.xsl" type="text/xsl"?> <RSSChannels> <RSSChannel src="XMLTEST.xml"/> </RSSChannels> rss.xsl: <xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform"> <xsl:output method="xml" omit-xml-declaration="yes" /> <xsl:template match="RSSChannels"> <rss version="2.0"> <channel> <title>site title</title> <link></link> <description>Site description...</description> <xsl:apply-templates /> </channel> </rss> </xsl:template> <xsl:template match="RSSChannel"> <xsl:apply-templates select="document(@src)" /> </xsl:template> <xsl:template match="item"> <xsl:choose> <xsl:when test="STOC = 'Y'"></xsl:when> <xsl:when test="POST_CHANNEL = 'I'"></xsl:when> <xsl:otherwise> <item> <title> <xsl:value-of select="*[local-name()='POST_TITLE']" /> </title> <link> <xsl:value-of select="*[local-name()='LINK']" /> </link> <description> <xsl:value-of select="*[local-name()='POST_DESC']" /> </description> </item> </xsl:otherwise> </xsl:choose> </xsl:template> </xsl:stylesheet> When I try to view the output of feed.xml in Firefox, all of the filtering is applied correctly (sorting out items that shouldn't be published in that channel), but the page outputs as plain text, rather than the usual feed-detection that takes place in Firefox. Any ideas on what I'm missing? Thanks for any suggestions you can provide.

    Read the article

  • How to scale rotated objects properly in Actionscript 3?

    - by Tom
    This is unfortunately a quite complex issue to explain, so please don't get discouraged by the wall of text - it's there for a reason. ;) I'm working on a transformation manager for flash, written with Actionscript 3. Users can place objects on the screen, for example a rectangle. This rectangle can then be selected and transformed: move, scale or rotate. Because flash by default rotates around the top left point of the object, and I want it to rotate around the center, I created a wrapper setup for each display object (eg. a rectangle). This is how the wrappers are setup: //the position wrapper makes sure that we do get the top left position when we access x and y var positionWrapper:Sprite = new Sprite(); positionWrapper.x = renderObject.x; positionWrapper.y = renderObject.y; //set the render objects location to center at the rotation wrappers top left renderObject.x = 0 - renderObject.width / 2; renderObject.y = 0 - renderObject.height / 2; //now create a rotation wrapper, at the center of the display object var rotationWrapper:Sprite = new Sprite(); rotationWrapper.x = renderObject.width / 2; rotationWrapper.y = renderObject.height / 2; //put the rotation wrapper inside the position wrapper and the render object inside the rotation wrapper positionWrapper.addChild(rotationWrapper); rotationWrapper.addChild(renderObject); Now, the x and y of the object can be accessed and set directly: mainWrapper.x or mainWrapper.y. The rotation can be set and accessed from the child of this main wrapper: mainWrapper.getChildAt(0).rotation. Finally, the width and height of the display object can be retreived and set by getting the child of the rotation wrapper and accessing the display object directly. An example on how I access them: //get wrappers and render object var positionWrapper:Sprite = currentSelection["render"]; var rotationWrapper:Sprite = positionWrapper.getChildAt(0) as Sprite; var renderObject:DisplayObject = rotationWrapper.getChildAt(0); This works perfectly for all initial transformations: moving, scaling and rotating. However, the problem arises when you first rotate an object (eg. 45 degrees) and then scale it. The scaled object is getting out of shape and doesn't scale as it should. This for example happens when you scale to the left. Scaling left is basically adding n width to the object and then reduce the x coord of the position wrapper by n too: renderObject.width -= diffX; positionWrapper.x += diffX; This works when the object is not rotated. However, when it is, the position wrapper won't be rotated as it is a parent of the rotation wrapper. This will make the position wrapper move left horizontally while the width of the object is increased diagonally. I hope this makes any sense, if not, please tell me and I'll try to elaborate more. Now, to the question: should I use a different kind of setup, system or structure? Should I maybe use matrixes, if so, how would you keep a static width/height after rotation? Or how do I fix my current wrapper system for scaling after rotation? Any help is appreciated.

    Read the article

  • convert pixels into image

    - by Zeta Op
    What I am trying to do is convert pixels from a video cam into an image. To explain it better, imagine a 3D model: the pixels would be each polygon, and what I want to do is convert each polygon into an image. What I have so far is this:

    import processing.video.*;
    import ddf.minim.*;

    PImage hoja;
    Capture cam;
    boolean uno, dos, tres, cuatro;
    Minim minim;
    AudioPlayer audio;
    float set;

    void setup() {
      //audio
      minim = new Minim(this);
      // audio = minim.loadFile("audio");
      // audio.loop();
      //
      uno=false;
      dos=false;
      tres=false;
      cuatro=true;
      size(640, 480);
      hoja=loadImage("hoja.gif");
      cam = new Capture(this, width, height);
      cam.start();
    }

    void draw() {
      if (cam.available() == true) {
        cam.read();
        if (uno==true) {
          filtroUno();
          image(cam, 0, 0, 640, 480);
        }
        if (dos==true) {
          filtroDos();
        }
        if (tres==true) {
          filtroTres();
        }
        if (cuatro==true) {
          filtroCuatro();
          image(cam, set, 0, 640, 480);
        }
      }
      // The following does the same, and is faster when just drawing the image
      // without any additional resizing, transformations, or tint.
      //set(0, 0, cam);
    }

    void filtroUno() {
      cam.loadPixels();
      hoja.loadPixels();
      for (int i=0; i<cam.pixels.length; i++) {
        if (brightness(cam.pixels[i])>110) {
          cam.pixels[i]=color(0, 255, 255);
        } else {
          cam.pixels[i]=color(255, 0, 0);
        }
      }
      for (int i=0; i<cam.width; i+=10) {
        for (int j=0; j<cam.height; j+=10) {
          int loc=i+(j*cam.width);
          if (cam.pixels[loc]==color(255, 0, 0)) {
            for (int x=i; x<i+10; x++) {
              for (int y=j; y<j+10; y++) {
                // println("bla");
                int locDos=i+(j*cam.width);
                cam.pixels[locDos]=hoja.get(x, y);
              }
            }
          }
        }
      }
      cam.updatePixels();
    }

    The problem is that each pixel is creating a matrix, so it is not recreating what I intended it to do. I had the method filtroUno written like this, but it wasn't showing OK, and this was the result:

    void filtroUno() {
      cam.loadPixels();
      hoja.loadPixels();
      for (int i=0; i<cam.pixels.length; i++) {
        if (brightness(cam.pixels[i])>110) {
          cam.pixels[i]=color(0, 255, 255);
        } else {
          cam.pixels[i]=color(255, 0, 0);
        }
      }
      for (int i=0; i<cam.width; i+=10) {
        for (int j=0; j<cam.height; j+=10) {
          int loc=i+j*hoja.width*10;
          if (cam.pixels[loc]==color(255, 0, 0)) {
            for (int x=i; x<i+10; x++) {
              for (int y=j; y<j+10; y++) {
                // println("bla");
                int locDos=x+y*hoja.height*10;
                cam.pixels[locDos]=hoja.get(x, y);
              }
            }
          }
        }
      }
      cam.updatePixels();
    }

    I hope you can help me, thanks. Note: each red pixel should be the gif image; the image size is 10x10.

    Read the article

  • Exam 70-480 Study Material: Programming in HTML5 with JavaScript and CSS3

    - by Stacy Vicknair
    Here’s a list of sources of information for the different elements that comprise the 70-480 exam: General Resources http://www.w3schools.com (As pointed out in David Pallmann’s blog some of this content is unverified, but it is a decent source of information. For more about when it isn’t decent, see http://www.w3fools.com ) http://www.bloggedbychris.com/2012/09/19/microsoft-exam-70-480-study-guide/ (A guy who did a lot of what I did already, sadly I found this halfway through finishing my resources list. This list is expertly put together so I would recommend checking it out.) http://davidpallmann.blogspot.com/2012/08/microsoft-certification-exam-70-480.html http://pluralsight.com/training/Courses (Yes, this isn’t free, but if you look at the course listing there is an entire section on HTML5, CSS3 and Javascript. You can always try the trial!)   Some of the links I put below will overlap with the other resources above, but I tried to find explanations that looked beneficial to me on links outside those already mentioned.   Test Breakdown Implement and Manipulate Document Structures and Objects (24%) Create the document structure. o This objective may include but is not limited to: structure the UI by using semantic markup, including for search engines and screen readers (Section, Article, Nav, Header, Footer, and Aside); create a layout container in HTML http://www.w3schools.com/html/html5_new_elements.asp   Write code that interacts with UI controls. o This objective may include but is not limited to: programmatically add and modify HTML elements; implement media controls; implement HTML5 canvas and SVG graphics http://www.w3schools.com/html/html5_canvas.asp http://www.w3schools.com/html/html5_svg.asp   Apply styling to HTML elements programmatically. o This objective may include but is not limited to: change the location of an element; apply a transform; show and hide elements   Implement HTML5 APIs. o This objective may include but is not limited to: implement storage APIs, AppCache API, and Geolocation API http://www.w3schools.com/html/html5_geolocation.asp http://www.w3schools.com/html/html5_webstorage.asp http://www.w3schools.com/html/html5_app_cache.asp   Establish the scope of objects and variables. o This objective may include but is not limited to: define the lifetime of variables; keep objects out of the global namespace; use the “this” keyword to reference an object that fired an event; scope variables locally and globally http://robertnyman.com/2008/10/09/explaining-javascript-scope-and-closures/ http://www.quirksmode.org/js/this.html   Create and implement objects and methods. o This objective may include but is not limited to: implement native objects; create custom objects and custom properties for native objects using prototypes and functions; inherit from an object; implement native methods and create custom methods http://www.javascriptkit.com/javatutors/object.shtml http://www.crockford.com/javascript/inheritance.html http://stackoverflow.com/questions/1635116/javascript-class-method-vs-class-prototype-method http://www.javascriptkit.com/javatutors/proto.shtml     Implement Program Flow (25%) Implement program flow. 
o This objective may include but is not limited to: iterate across collections and array items; manage program decisions by using switch statements, if/then, and operators; evaluate expressions http://www.javascriptkit.com/jsref/looping.shtml http://www.javascriptkit.com/javatutors/varshort.shtml http://www.javascriptkit.com/javatutors/switch.shtml   Raise and handle an event. o This objective may include but is not limited to: handle common events exposed by DOM (OnBlur, OnFocus, OnClick); declare and handle bubbled events; handle an event by using an anonymous function http://dev.w3.org/2006/webapi/DOM-Level-3-Events/html/DOM3-Events.html http://javascript.info/tutorial/bubbling-and-capturing   Implement exception handling. o This objective may include but is not limited to: set and respond to error codes; throw an exception; request for null checks; implement try-catch-finally blocks http://www.javascriptkit.com/javatutors/trycatch.shtml   Implement a callback. o This objective may include but is not limited to: receive messages from the HTML5 WebSocket API; use jQuery to make an AJAX call; wire up an event; implement a callback by using anonymous functions; handle the “this” pointer http://www.w3.org/TR/2011/WD-websockets-20110419/ http://www.html5rocks.com/en/tutorials/websockets/basics/ http://api.jquery.com/jQuery.ajax/   Create a web worker process. o This objective may include but is not limited to: start and stop a web worker; pass data to a web worker; configure timeouts and intervals on the web worker; register an event listener for the web worker; limitations of a web worker https://developer.mozilla.org/en-US/docs/DOM/Using_web_workers http://www.html5rocks.com/en/tutorials/workers/basics/   Access and Secure Data (26%) Validate user input by using HTML5 elements. o This objective may include but is not limited to: choose the appropriate controls based on requirements; implement HTML input types and content attributes (for example, required) to collect user input http://diveintohtml5.info/forms.html   Validate user input by using JavaScript. o This objective may include but is not limited to: evaluate a regular expression to validate the input format; validate that you are getting the right kind of data type by using built-in functions; prevent code injection http://www.regular-expressions.info/javascript.html http://msdn.microsoft.com/en-us/library/66ztdbe6(v=vs.94).aspx https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Operators/typeof http://blog.stackoverflow.com/2008/06/safe-html-and-xss/ http://stackoverflow.com/questions/942011/how-to-prevent-javascript-injection-attacks-within-user-generated-html   Consume data. o This objective may include but is not limited to: consume JSON and XML data; retrieve data by using web services; load data or get data from other sources by using XMLHTTPRequest http://www.erichynds.com/jquery/working-with-xml-jquery-and-javascript/ http://www.webdevstuff.com/86/javascript-xmlhttprequest-object.html http://www.json.org/ http://stackoverflow.com/questions/4935632/how-to-parse-json-in-javascript   Serialize, deserialize, and transmit data. 
o This objective may include but is not limited to: binary data; text data (JSON, XML); implement the jQuery serialize method; Form.Submit; parse data; send data by using XMLHTTPRequest; sanitize input by using URI/form encoding http://api.jquery.com/serialize/ http://www.javascript-coder.com/javascript-form/javascript-form-submit.phtml http://stackoverflow.com/questions/327685/is-there-a-way-to-read-binary-data-into-javascript https://developer.mozilla.org/en-US/docs/JavaScript/Reference/Global_Objects/encodeURI     Use CSS3 in Applications (25%) Style HTML text properties. o This objective may include but is not limited to: apply styles to text appearance (color, bold, italics); apply styles to text font (WOFF and @font-face, size); apply styles to text alignment, spacing, and indentation; apply styles to text hyphenation; apply styles for a text drop shadow http://www.w3schools.com/css/css_text.asp http://www.w3schools.com/css/css_font.asp http://nicewebtype.com/notes/2009/10/30/how-to-use-css-font-face/ http://webdesign.about.com/od/beginningcss/p/aacss5text.htm http://www.w3.org/TR/css3-text/ http://www.css3.info/preview/box-shadow/   Style HTML box properties. o This objective may include but is not limited to: apply styles to alter appearance attributes (size, border and rounding border corners, outline, padding, margin); apply styles to alter graphic effects (transparency, opacity, background image, gradients, shadow, clipping); apply styles to establish and change an element’s position (static, relative, absolute, fixed) http://net.tutsplus.com/tutorials/html-css-techniques/10-css3-properties-you-need-to-be-familiar-with/ http://www.w3schools.com/css/css_image_transparency.asp http://www.w3schools.com/cssref/pr_background-image.asp http://ie.microsoft.com/testdrive/graphics/cssgradientbackgroundmaker/default.html http://www.w3.org/TR/CSS21/visufx.html http://www.barelyfitz.com/screencast/html-training/css/positioning/ http://davidwalsh.name/css-fixed-position   Create a flexible content layout. o This objective may include but is not limited to: implement a layout using a flexible box model; implement a layout using multi-column; implement a layout using position floating and exclusions; implement a layout using grid alignment; implement a layout using regions, grouping, and nesting http://www.html5rocks.com/en/tutorials/flexbox/quick/ http://www.css3.info/preview/multi-column-layout/ http://msdn.microsoft.com/en-us/library/ie/hh673558(v=vs.85).aspx http://dev.w3.org/csswg/css3-grid-layout/ http://dev.w3.org/csswg/css3-regions/   Create an animated and adaptive UI. o This objective may include but is not limited to: animate objects by applying CSS transitions; apply 3-D and 2-D transformations; adjust UI based on media queries (device adaptations for output formats, displays, and representations); hide or disable controls http://www.bloggedbychris.com/2012/09/19/microsoft-exam-70-480-study-guide/   Find elements by using CSS selectors and jQuery. o This objective may include but is not limited to: choose the correct selector to reference an element; define element, style, and attribute selectors; find elements by using pseudo-elements and pseudo-classes (for example, :before, :first-line, :first-letter, :target, :lang, :checked, :first-child) http://www.bloggedbychris.com/2012/09/19/microsoft-exam-70-480-study-guide/   Structure a CSS file by using CSS selectors. 
o This objective may include but is not limited to: reference elements correctly; implement inheritance; override inheritance by using !important; style an element based on pseudo-elements and pseudo-classes (for example, :before, :first-line, :first-letter, :target, :lang, :checked, :first-child) http://www.bloggedbychris.com/2012/09/19/microsoft-exam-70-480-study-guide/   Technorati Tags: 70-480,CSS3,HTML5,HTML,CSS,JavaScript,Certification

    Read the article

  • CodePlex Daily Summary for Wednesday, April 07, 2010

    CodePlex Daily Summary for Wednesday, April 07, 2010New ProjectsAStar.net: AStar.net is a project for compute the A* path finding algorithm. It expose classes and interface that can be used for all purpose with multi-threa...Auto Complete for MOSS 2007: Auto Complete for MOSS 2007 (WSS 3.0) makes it easier for all user to fill list of nessesary string in the field. AutoPoco: AutoPoco is a framework with the purpose of fluently building test data from Plain Old CLR ObjectsBlueProject: BlueProjectColor Picker for MOSS 2007: Color picker for MOSS 2007 (WSS 3.0) makes it easier for some user to choose color in the field. ComBrowser2: ComBrowser2DCommunication: Communication Components and Software By Delphi On Windows.DirST: Allows one to replicate a DIRectory STructure without copying files. Written in C#.Discussion column for MOSS 2007: Discussion column for MOSS 2007/ (WSS 3.0) is a field column for different type lists such as Custom List, Document Library, Issue Tracking, not on...Effect Custom Tool for Visual Studio: Effect Custom Tool for Visual Studio is a visual studio 2008 extension that helps you generate c# classes from effect (*.fx) files for use with Xna...Energon: Energon is a framework to create and run software energy consumption tests. Requires a couple of PC with a TCP connections, a phidgets ammeter and ...Fiansa: Proyecto FiansaFirstSpark: FirstSpark is a sample Spark View Engine projectISOM: Internetowy System Obsługi Magisterek La Carta Mas Alta: La Carta Mas Alta is an open source card game totally written in PHP and HTML. This cross-platform and cross-browser game was tested under BeOS, Li...Near forums - ASP.NET MVC forum engine: Open source SEO friendly ASP.NET MVC forum engine. Features: Navigation in forums, topics and tags; login with Facebook Connect and Single sign-on;...PEC: Still editingPowershell Zip Export/Import Cmdlet Module: Powershellzip is a powershell module with a set of Cmdlets for zip file export, import and processingProgress bar for MOSS 2007: Progress bar for MOSS 2007 (WSS 3.0) makes it easier for some user to display progress in the field in percent (0-100%). SharePoint Accelerators: This project delivers a number of SharePoint accelerators that make your every-day SharePoint life easier.Shweet: SharePoint 2010 Team Messaging built with Pex: Shweet is a simple SharePoint Foundation 2010 application that allows teams to do messaging and subscriptions in the style similar to twitter. De...SilverlightEncoder: Video and audio encoder for Silverlight 4 Out-of-BrowserStarMath: Static Array Math Library: While there are already countless math libraries for performing common matrix/array functions, StarMath is distinguished by its simplicity and flex...StringToNumber: StringToNumber is a .NET library for parsing numbers in their written form into their numeric equivalent. It's developed in C#. It was created to...SubstitutionITIS: <project name="Substitution" language="c-sharp" to="myschool" whatdo="manageSubstitutionTeachers" />TamTam.SharePoint2010.LinkedIn: SP2010 and LinkedIn working togetherThink And Explore: Think my own wayusenet-o-matic: small mobile optimised webscript to search various NZB sources like newzbin and NZBmatrix and send donwloads to your SABnzbd server. Valid HTML5 Templates for VisualStudio: This project will host valid HTML templates for Visual Studio. 
The primary focus however will be on the newer standard HTML5 and Visual Studio 2010XSS Attack: This tool will simulate an attack on your database and update up to 5000 rows in every table and replace your strings in your database with random ...Yazgelistir Beta: Yazgelistir'in beta sitesiNew ReleasesAStar.net: AStar.net 1.0 downloads: AStar.net 1.0 downloadsAvalaible downloads:Astar.net dll - Runtime library ready to be included in a project. Astar.net source project - The Visu...AutoPoco: 0.1 - Initial Release: This is the initial release, changes pending, lots left to do, check it out though!Bag of Tricks: Bag of Tricks - WPF - Libraries and Sample App: Here is a drop of the Bag of Tricks. It targets the 3.5 Client Profile.Boxee Launcher: Boxee Launcher 1.0.1.4: Taskbar will now be hiddenDesignit Umbraco Newsletter Package: ver 1.0.0 beta1: Please remember this is a beta version. If you have any problems or issues, don't hesitate to use the issue tracker and/or forum on this site. Ple...Effect Custom Tool for Visual Studio: Effect Custom Tool for Visual Studio 2008: Effect Custom Tool for Visual Studio is a visual studio 2008 extension that helps you generate c# classes from effect (*.fx) files for use with Xna...Fluent Ribbon Control Suite: Fluent Ribbon Control Suite 1.1: Fluent Ribbon Control Suite 1.1 Includes: Fluent.dll (with .pdb and .xml) Showcase Application Samples Foundation (Tabs, Groups, Contextual Tab...GsGrid: gsgrid 1.6.5: gsgrid 1.6.5Home Access Plus+: v3.2.5.2: v3.2.5.1 Release Change Log: Attempt to fix Domain Admin Lookup box File Changes: ~/bin/CHS Extranet.dll ~/bin/CHS Extranet.pdbHulu Launcher: Hulu Launcher 1.0.1.4: Will now hide taskbar.Icarus Scene Engine: Icarus Professional 2 Alpha 2a v 1.10.404.936: Alpha release 2 of Icarus Professional. This release includes: IcarusX: The ActiveX-based browser control for rendering IPX projects online. Icaru...kdar: KDAR 0.0.19: KDAR - Kernel Debugger Anti Rootkit - thread start notifier routine check added - registry callback сheck added - NDIS6 checks added - some bug i...Mobile Device Browser File: Mobile Device Browser File (2010-04-07): The Mobile Browser Definition File contains definitions for individual mobile devices and browsers. At run time, ASP.NET uses the information in th...MvcContrib: a Codeplex Foundation project: Portable Area Template: Use this Visual Studio 2008 Project template to create new portable areas. Drop this file in your Documents\Visual Studio 2008\Templates\ProjectTe...Office Apps: 0.8.8: new ui for Document.Viewer bug fix'sProtoforma | Tactica Adversa: Skilful 0.3.4.477: RC1ROOT Builder: ROOT Builder 1.41: Simplifies building of ROOT on the windows platform by generating a Visual Studio C make project that will build and run (and debug ROOT). See Inst...SharePoint Accelerators: Search AutoComplete for SharePoint Lists: Search AutoComplete for SharePoint List is a Web Part that allows you to search contents of a SharePoint. Search is interactive and offers you resu...SharePoint Labs: SPLab4002A-FRA-Level100: SPLab4002A-FRA-Level100 This SharePoint Lab will teach you the 2nd best practice you should apply when writing code with the SharePoint API. Lab La...SharePoint Labs: SPLab4003A-FRA-Level100: SPLab4003A-FRA-Level100 This SharePoint Lab will teach you the 3rd best practice you should apply when writing code with the SharePoint API. 
Lab La...SQL Compact data and schema script utility: Version 3.0: This release contains 4 downloadable files: - SSMS 2008 scripting add-in - SQL Server 2005/2008 command line utility to generate a script with sch...StarMath: Static Array Math Library: StarMath Source Files: The source file package includes the main library StarMath.dll (and it's source files), and an example exe project to invoke commands from StarMath.stefvanhooijdonk.com: Powershell Solution Install Script: Powershell Example script to install a SP2010 Solution, which actually waits for the Retraction and Deployment jobs.TamTam.SharePoint2010.LinkedIn: 0.0.0.1: How to use/install Small instruction to use this code/solution step 1 Add the Farm Solution to your SP2010 installation step 2 Go to the MySite H...usenet-o-matic: V 0.2: supports Newzbin and NZBmatrix as index sources and SABnzbd as download serverVCC: Latest build, v2.1.30406.0: Automatic drop of latest buildVisual Studio DSite: E-Z Image To PDF Converter Beta: This simple little program can convert all common image formats into pdfs and can even encrypt the pdf with an encrpytion key.Web and Load Test Plugins for Visual Studio Team Test: Release 2.0: Release 2.0 is targeted at VS 2010. VS 2010 exposes major new extensibility points: 1) Recorder plugins enable you to do custom correlations and o...Wicked Compression ASP.NET HTTP Module: WickedCompressionModule 4.0 Alpha: New Features, New Enhancements! - Visual Studio 2010 Projects! - Support for ASP.NET 2.0, 3.5, and 4.0 - AJAX Support for ASP.NET 3.5 and 4.0 - Bu...x5s - test encodings and character transformations to find XSS hotspots: x5s v1.0.0 beta: This is the v1.0 beta release of x5s. All feedback welcome in planning for the next release. Make sure Fiddler is installed prior to running th...xvanneste: Silverlight SharePoint: Fichiers du webcast sur silverlight: Silverlight OM Silverlight Webpart Silverlight Embedded ressource Silverlight List ViewYet Another Web App Monitoring Tool (YAWAMT): yawamt v0.5: A new release has seen the light :-) I've lowered the release to version 0.5 but changed some major things: - deletion of URLS works - settings can...Zinc Launcher: Zinc Launcher 1.0.1.1: Taskbar will now be hidden Delay to show Zinc was reduced to improve responsiveness.Most Popular ProjectsRawrWBFS ManagerMicrosoft SQL Server Product Samples: DatabaseASP.NET Ajax LibrarySilverlight ToolkitAJAX Control ToolkitWindows Presentation Foundation (WPF)ASP.NETMicrosoft SQL Server Community & SamplesFacebook Developer ToolkitMost Active ProjectsGraffiti CMSnopCommerce. Open Source online shop e-commerce solution.RawrFacebook Developer ToolkitShweet: SharePoint 2010 Team Messaging built with Pexpatterns & practices – Enterprise LibraryNcqrs Framework - The CQRS framework for .NETjQuery Library for SharePoint Web ServicesIonics Isapi Rewrite FilterAcadsys

    Read the article

  • Demystifying Silverlight Dependency Properties

    - by dwahlin
    I have the opportunity to teach a lot of people about Silverlight (amongst other technologies) and one of the topics that definitely confuses people initially is the concept of dependency properties. I confess that when I first heard about them my initial thought was “Why do we need a specialized type of property?” While you can certainly use standard CLR properties in Silverlight applications, Silverlight relies heavily on dependency properties for just about everything it does behind the scenes. In fact, dependency properties are an essential part of the data binding, template, style and animation functionality available in Silverlight. They simply back standard CLR properties. In this post I wanted to put together a (hopefully) simple explanation of dependency properties and why you should care about them if you’re currently working with Silverlight or looking to move to it.   What are Dependency Properties? XAML provides a great way to define layout controls, user input controls, shapes, colors and data binding expressions in a declarative manner. There’s a lot that goes on behind the scenes in order to make XAML work and an important part of that magic is the use of dependency properties. If you want to bind data to a property, style it, animate it or transform it in XAML then the property involved has to be a dependency property to work properly. If you’ve ever positioned a control in a Canvas using Canvas.Left or placed a control in a specific Grid row using Grid.Row then you’ve used an attached property which is a specialized type of dependency property. Dependency properties play a key role in XAML and the overall Silverlight framework. Any property that you bind, style, template, animate or transform must be a dependency property in Silverlight applications. You can programmatically bind values to controls and work with standard CLR properties, but if you want to use the built-in binding expressions available in XAML (one of my favorite features) or the Binding class available through code then dependency properties are a necessity. Dependency properties aren’t needed in every situation, but if you want to customize your application very much you’ll eventually end up needing them. For example, if you create a custom user control and want to expose a property that consumers can use to change the background color, you have to define it as a dependency property if you want bindings, styles and other features to be available for use. Now that the overall purpose of dependency properties has been discussed let’s take a look at how you can create them. Creating Dependency Properties When .NET first came out you had to write backing fields for each property that you defined as shown next: Brush _ScheduleBackground; public Brush ScheduleBackground { get { return _ScheduleBackground; } set { _ScheduleBackground = value; } } Although .NET 2.0 added auto-implemented properties (for example: public Brush ScheduleBackground { get; set; }) where the compiler would automatically generate the backing field used by get and set blocks, the concept is still the same as shown in the above code; a property acts as a wrapper around a field. Silverlight dependency properties replace the _ScheduleBackground field shown in the previous code and act as the backing store for a standard CLR property. 
The following code shows an example of defining a dependency property named ScheduleBackgroundProperty: public static readonly DependencyProperty ScheduleBackgroundProperty = DependencyProperty.Register("ScheduleBackground", typeof(Brush), typeof(Scheduler), null);   Looking through the code the first thing that may stand out is that the definition for ScheduleBackgroundProperty is marked as static and readonly and that the property appears to be of type DependencyProperty. This is a standard pattern that you’ll use when working with dependency properties. You’ll also notice that the property explicitly adds the word “Property” to the name which is another standard you’ll see followed. In addition to defining the property, the code also makes a call to the static DependencyProperty.Register method and passes the name of the property to register (ScheduleBackground in this case) as a string. The type of the property, the type of the class that owns the property and a null value (more on the null value later) are also passed. In this example a class named Scheduler acts as the owner. The code handles registering the property as a dependency property with the call to Register(), but there’s a little more work that has to be done to allow a value to be assigned to and retrieved from the dependency property. The following code shows the complete code that you’ll typically use when creating a dependency property. You can find code snippets that greatly simplify the process of creating dependency properties out on the web. The MVVM Light download available from http://mvvmlight.codeplex.com comes with built-in dependency properties snippets as well. public static readonly DependencyProperty ScheduleBackgroundProperty = DependencyProperty.Register("ScheduleBackground", typeof(Brush), typeof(Scheduler), null); public Brush ScheduleBackground { get { return (Brush)GetValue(ScheduleBackgroundProperty); } set { SetValue(ScheduleBackgroundProperty, value); } } The standard CLR property code shown above should look familiar since it simply wraps the dependency property. However, you’ll notice that the get and set blocks call GetValue and SetValue methods respectively to perform the appropriate operation on the dependency property. GetValue and SetValue are members of the DependencyObject class which is another key component of the Silverlight framework. Silverlight controls and classes (TextBox, UserControl, CompositeTransform, DataGrid, etc.) ultimately derive from DependencyObject in their inheritance hierarchy so that they can support dependency properties. Dependency properties defined in Silverlight controls and other classes tend to follow the pattern of registering the property by calling Register() and then wrapping the dependency property in a standard CLR property (as shown above). They have a standard property that wraps a registered dependency property and allows a value to be assigned and retrieved. If you need to expose a new property on a custom control that supports data binding expressions in XAML then you’ll follow this same pattern. Dependency properties are extremely useful once you understand why they’re needed and how they’re defined. Detecting Changes and Setting Defaults When working with dependency properties there will be times when you want to assign a default value or detect when a property changes so that you can keep the user interface in-sync with the property value. 
    Silverlight’s DependencyProperty.Register() method provides a fourth parameter that accepts a PropertyMetadata object instance. PropertyMetadata can be used to hook a callback method to a dependency property. The callback method is called when the property value changes. PropertyMetadata can also be used to assign a default value to the dependency property. By assigning a value of null for the final parameter passed to Register() you’re telling the property that you don’t care about any changes and don’t have a default value to apply. Here are the different constructor overloads available on the PropertyMetadata class:
    PropertyMetadata(Object): used to assign a default value to a dependency property.
    PropertyMetadata(PropertyChangedCallback): used to assign a property changed callback method.
    PropertyMetadata(Object, PropertyChangedCallback): used to assign a default property value and a property changed callback.
    There are many situations where you need to know when a dependency property changes or where you want to apply a default. Performing either task is easily accomplished by creating a new instance of the PropertyMetadata class and passing the appropriate values to its constructor. The following code shows an enhanced version of the initial dependency property code shown earlier that demonstrates these concepts: public Brush ScheduleBackground { get { return (Brush)GetValue(ScheduleBackgroundProperty); } set { SetValue(ScheduleBackgroundProperty, value); } } public static readonly DependencyProperty ScheduleBackgroundProperty = DependencyProperty.Register("ScheduleBackground", typeof(Brush), typeof(Scheduler), new PropertyMetadata(new SolidColorBrush(Colors.LightGray), ScheduleBackgroundChanged)); private static void ScheduleBackgroundChanged(DependencyObject d, DependencyPropertyChangedEventArgs e) { var scheduler = d as Scheduler; scheduler.Background = e.NewValue as Brush; } The code wires ScheduleBackgroundProperty to a property change callback method named ScheduleBackgroundChanged. What’s interesting is that this callback method is static (as is the dependency property) so it gets passed the instance of the object that owns the property that has changed (otherwise we wouldn’t be able to get to the object instance). In this example the dependency object is cast to a Scheduler object and its Background property is assigned to the new value of the dependency property. The code also handles assigning a default value of LightGray to the dependency property by creating a new instance of a SolidColorBrush.
    To Sum Up
    In this post you’ve seen the role of dependency properties and how they can be defined in code. They play a big role in XAML and the overall Silverlight framework. You can think of dependency properties as being replacements for fields that you’d normally use with standard CLR properties. In addition to a discussion on how dependency properties are created, you also saw how to use the PropertyMetadata class to define default dependency property values and hook a dependency property to a callback method. The most important thing to understand with dependency properties (especially if you’re new to Silverlight) is that they’re needed if you want a property to support data binding, animations, transformations and styles properly. Any time you create a property on a custom control or user control that has these types of requirements you’ll want to pick a dependency property over a standard CLR property with a backing field.
There’s more that can be covered with dependency properties, including a related kind of property called an attached property. The sketch below gives a quick preview; more to come.
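To give a sense of that related concept, here is a minimal sketch of how an attached property is registered. The class and property names below are invented purely for illustration and are not part of the Scheduler example; the pattern itself is simply RegisterAttached plus static Get/Set accessors.

using System.Windows;

public static class ScheduleAssist
{
    // Attached properties are registered with RegisterAttached instead of Register,
    // and are exposed through static Get/Set methods rather than a wrapper property.
    public static readonly DependencyProperty IsHighlightedProperty =
        DependencyProperty.RegisterAttached(
            "IsHighlighted",
            typeof(bool),
            typeof(ScheduleAssist),
            new PropertyMetadata(false));

    public static bool GetIsHighlighted(DependencyObject obj)
    {
        return (bool)obj.GetValue(IsHighlightedProperty);
    }

    public static void SetIsHighlighted(DependencyObject obj, bool value)
    {
        obj.SetValue(IsHighlightedProperty, value);
    }
}

In XAML such a property can then be set on any element (for example local:ScheduleAssist.IsHighlighted="True"), which is what makes attached properties handy for layout panels and behaviors.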

    Read the article

  • SQL SERVER – Import CSV into Database – Transferring File Content into a Database Table using CSVexpress

    - by pinaldave
    One of the most common data integration tasks I run into is a desire to move data from a file into a database table.  Generally the user is familiar with his data, the structure of the file, and the database table, but is unfamiliar with data integration tools and therefore views this task as something that is difficult.  What these users really need is a point and click approach that minimizes the learning curve for the data integration tool.  This is what CSVexpress (www.CSVexpress.com) is all about!  It is based on expressor Studio, a data integration tool I’ve been reviewing over the last several months. With CSVexpress, moving data between data sources can be as simple as providing the database connection details, describing the structure of the incoming and outgoing data and then connecting two pre-programmed operators.   There’s no need to learn the intricacies of the data integration tool or to write code.  Let’s look at an example. Suppose I have a comma separated value data file with data similar to the following, which is a listing of terminated employees that includes their hiring and termination date, department, job description, and final salary. EMP_ID,STRT_DATE,END_DATE,JOB_ID,DEPT_ID,SALARY 102,13-JAN-93,24-JUL-98 17:00,Programmer,60,"$85,000" 101,21-SEP-89,27-OCT-93 17:00,Account Representative,110,"$65,000" 103,28-OCT-93,15-MAR-97 17:00,Account Manager,110,"$75,000" 304,17-FEB-96,19-DEC-99 17:00,Marketing,20,"$45,000" 333,24-MAR-98,31-DEC-99 17:00,Data Entry Clerk,50,"$35,000" 100,17-SEP-87,17-JUN-93 17:00,Administrative Assistant,90,"$40,000" 334,24-MAR-98,31-DEC-98 17:00,Sales Representative,80,"$40,000" 400,01-JAN-99,31-DEC-99 17:00,Sales Manager,80,"$55,000" Notice the concise format used for the date values, the fact that the termination date includes both date and time information, and that the salary is clearly identified as money by the dollar sign and digit grouping.  In moving this data to a database table I want to express the dates using a format that includes the century since it’s obvious that this listing could include employees who left the company in both the 20th and 21st centuries, and I want the salary to be stored as a decimal value without the currency symbol and grouping character.  Most data integration tools would require coding within a transformation operation to effect these changes, but not expressor Studio.  Directives for these modifications are included in the description of the incoming data. Besides starting the expressor Studio tool and opening a project, the first step is to create connection artifacts, which describe to expressor where data is stored.  For this example, two connection artifacts are required: a file connection, which encapsulates the file system location of my file; and a database connection, which encapsulates the database connection information.  With expressor Studio, I use wizards to create these artifacts. First click New Connection > File Connection in the Home tab of expressor Studio’s ribbon bar, which starts the File Connection wizard.  In the first window, I enter the path to the directory that contains the input file.  Note that the file connection artifact only specifies the file system location, not the name of the file. Then I click Next and enter a meaningful name for this connection artifact; clicking Finish closes the wizard and saves the artifact. 
To create the Database Connection artifact, I must know the location of, or instance name, of the target database and have the credentials of an account with sufficient privileges to write to the target table.  To use expressor Studio’s features to the fullest, this account should also have the authority to create a table. I click the New Connection > Database Connection in the Home tab of expressor Studio’s ribbon bar, which starts the Database Connection wizard.  expressor Studio includes high-performance drivers for many relational database management systems, so I can simply make a selection from the “Supplied database drivers” drop down control.  If my desired RDBMS isn’t listed, I can optionally use an existing ODBC DSN by selecting the “Existing DSN” radio button. In the following window, I enter the connection details.  With Microsoft SQL Server, I may choose to use Windows Authentication rather than rather than account credentials.  After clicking Next, I enter a meaningful name for this connection artifact and clicking Finish closes the wizard and saves the artifact. Now I create a schema artifact, which describes the structure of the file data.  When expressor reads a file, all data fields are typed as strings.  In some use cases this may be exactly what is needed and there is no need to edit the schema artifact.  But in this example, editing the schema artifact will be used to specify how the data should be transformed; that is, reformat the dates to include century designations, change the employee and job ID’s to integers, and convert the salary to a decimal value. Again a wizard is used to create the schema artifact.  I click New Schema > Delimited Schema in the Home tab of expressor Studio’s ribbon bar, which starts the Database Connection wizard.  In the first window, I click Get Data from File, which then displays a listing of the file connections in the project.  When I click on the file connection I previously created, a browse window opens to this file system location; I then select the file and click Open, which imports 10 lines from the file into the wizard. I now view the file’s content and confirm that the appropriate delimiter characters are selected in the “Field Delimiter” and “Record Delimiter” drop down controls; then I click Next. Since the input file includes a header row, I can easily indicate that fields in the file should be identified through the corresponding header value by clicking “Set All Names from Selected Row. “ Alternatively, I could enter a different identifier into the Field Details > Name text box.  I click Next and enter a meaningful name for this schema artifact; clicking Finish closes the wizard and saves the artifact. Now I open the schema artifact in the schema editor.  When I first view the schema’s content, I note that the types of all attributes in the Semantic Type (the right-hand panel) are strings and that the attribute names are the same as the field names in the data file.  To change an attribute’s name and type, I highlight the attribute and click Edit in the Attributes grouping on the Schema > Edit tab of the editor’s ribbon bar.  This opens the Edit Attribute window; I can change the attribute name and select the desired type from the “Data type” drop down control.  In this example, I change the name of each attribute to the name of the corresponding database table column (EmployeeID, StartingDate, TerminationDate, JobDescription, DepartmentID, and FinalSalary).  
Then for the EmployeeID and DepartmentID attributes, I select Integer as the data type, for the StartingDate and TerminationDate attributes, I select Datetime as the data type, and for the FinalSalary attribute, I select the Decimal type. But I can do much more in the schema editor.  For the datetime attributes, I can set a constraint that ensures that the data adheres to some predetermined specifications; a starting date must be later than January 1, 1980 (the date on which the company began operations) and a termination date must be earlier than 11:59 PM on December 31, 1999.  I simply select the appropriate constraint and enter the value (1980-01-01 00:00 as the starting date and 1999-12-31 11:59 as the termination date). As a last step in setting up these datetime conversions, I edit the mapping, describing the format of each datetime type in the source file. I highlight the mapping line for the StartingDate attribute and click Edit Mapping in the Mappings grouping on the Schema > Edit tab of the editor’s ribbon bar.  This opens the Edit Mapping window in which I either enter, or select, a format that describes how the datetime values are represented in the file.  Note the use of Y01 as the syntax for the year.  This syntax is the indicator to expressor Studio to derive the century by setting any year later than 01 to the 20th century and any year before 01 to the 21st century.  As each datetime value is read from the file, the year values are transformed into century and year values. For the TerminationDate attribute, my format also indicates that the datetime value includes hours and minutes. And now to the Salary attribute. I open its mapping and in the Edit Mapping window select the Currency tab and the “Use currency” check box.  This indicates that the file data will include the dollar sign (or in Europe the Pound or Euro sign), which should be removed. And on the Grouping tab, I select the “Use grouping” checkbox and enter 3 into the “Group size” text box, a comma into the “Grouping character” text box, and a decimal point into the “Decimal separator” character text box. These entries allow the string to be properly converted into a decimal value. By making these entries into the schema that describes my input file, I’ve specified how I want the data transformed prior to writing to the database table and completely removed the requirement for coding within the data integration application itself. Assembling the data integration application is simple.  Onto the canvas I drag the Read File and Write Table operators, connecting the output of the Read File operator to the input of the Write Table operator. Next, I select the Read File operator and its Properties panel opens on the right-hand side of expressor Studio.  For each property, I can select an appropriate entry from the corresponding drop down control.  Clicking on the button to the right of the “File name” text box opens the file system location specified in the file connection artifact, allowing me to select the appropriate input file.  I indicate also that the first row in the file, the header row, should be skipped, and that any record that fails one of the datetime constraints should be skipped. I then select the Write Table operator and in its Properties panel specify the database connection, normal for the “Mode,” and the “Truncate” and “Create Missing Table” options.  
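As a rough illustration of what the schema directives described above accomplish, the following C# sketch performs the same two-digit-year pivot and currency conversion by hand. It is only a sketch of the behaviour, not expressor Studio code; the sample values come from the CSV listing earlier, and the culture and calendar settings are assumptions made for the example.

using System;
using System.Globalization;

class MappingSketch
{
    static void Main()
    {
        // The "Y01" pivot: two-digit years later than 01 fall in the 20th century,
        // years of 01 or earlier fall in the 21st century.
        var culture = new CultureInfo("en-US");
        culture.DateTimeFormat.Calendar = new GregorianCalendar { TwoDigitYearMax = 2001 };

        DateTime started = DateTime.ParseExact(
            "13-JAN-93", "dd-MMM-yy", culture);                 // 1993-01-13
        DateTime terminated = DateTime.ParseExact(
            "24-JUL-98 17:00", "dd-MMM-yy HH:mm", culture);     // 1998-07-24 17:00

        // Drop the currency symbol and digit grouping to get a plain decimal value.
        decimal finalSalary = decimal.Parse(
            "$85,000", NumberStyles.Currency, culture);         // 85000

        Console.WriteLine("{0:yyyy-MM-dd} {1:yyyy-MM-dd HH:mm} {2}",
            started, terminated, finalSalary);
    }
}

Within expressor Studio itself none of this code is written by hand; the conversions happen inside the Read File operator as rows flow toward the Write Table operator.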
If my target table does not yet exist, expressor will create the table using the information encapsulated in the schema artifact assigned to the operator. The last task needed to complete the application is to create the schema artifact used by the Write Table operator.  This is extremely easy, as another wizard is capable of using the schema artifact assigned to the Read File operator to create a schema artifact for the Write Table operator.  In the Write Table Properties panel, I click the drop down control to the right of the “Schema” property and select “New Table Schema from Upstream Output…” from the drop down menu. The wizard first displays the table description and in its second screen asks me to select the database connection artifact that specifies the RDBMS in which the target table will exist.  The wizard then connects to the RDBMS and retrieves a list of database schemas from which I make a selection.  The fourth screen gives me the opportunity to fine-tune the table’s description.  In this example, I set the width of the JobDescription column to a maximum of 40 characters and select money as the type of the FinalSalary column.  I also provide the name for the table. This completes development of the application.  The entire application was created through the use of wizards, and the required data transformations were specified through simple constraints and specifications rather than through coding.  To develop this application, I only needed a basic understanding of expressor Studio, a level of expertise that can be gained by working through a few introductory tutorials.  expressor Studio is as close to a point-and-click data integration tool as one could want, and I urge you to try this product if you have a need to move data between files or from files to database tables. Check out CSVexpress in more detail.  It offers a few basic video tutorials and a preview of expressor Studio 3.5, which will support the reading and writing of data to Salesforce.com. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Pinal Dave, PostADay, SQL, SQL Authority, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLServer, T SQL, Technology

    Read the article

  • Tutorial: Getting Started with the NoSQL JavaScript / Node.js API for MySQL Cluster

    - by Mat Keep
    Tutorial authored by Craig Russell and JD Duncan  The MySQL Cluster team are working on a new NoSQL JavaScript connector for MySQL. The objectives are simplicity and high performance for JavaScript users: - allows end-to-end JavaScript development, from the browser to the server and now to the world's most popular open source database - native "NoSQL" access to the storage layer without going first through SQL transformations and parsing. Node.js is a complete web platform built around JavaScript designed to deliver millions of client connections on commodity hardware. With the MySQL NoSQL Connector for JavaScript, Node.js users can easily add data access and persistence to their web, cloud, social and mobile applications. While the initial implementation is designed to plug and play with Node.js, the actual implementation doesn't depend heavily on Node, potentially enabling wider platform support in the future. Implementation The architecture and user interface of this connector are very different from other MySQL connectors in a major way: it is an asynchronous interface that follows the event model built into Node.js. To make it as easy as possible, we decided to use a domain object model to store the data. This allows for users to query data from the database and have a fully-instantiated object to work with, instead of having to deal with rows and columns of the database. The domain object model can have any user behavior that is desired, with the NoSQL connector providing the data from the database. To make it as fast as possible, we use a direct connection from the user's address space to the database. This approach means that no SQL (pun intended) is needed to get to the data, and no SQL server is between the user and the data. The connector is being developed to be extensible to multiple underlying database technologies, including direct, native access to both the MySQL Cluster "ndb" and InnoDB storage engines. The connector integrates the MySQL Cluster native API library directly within the Node.js platform itself, enabling developers to seamlessly couple their high performance, distributed applications with a high performance, distributed, persistence layer delivering 99.999% availability. The following sections take you through how to connect to MySQL, query the data and how to get started. Connecting to the database A Session is the main user access path to the database. You can get a Session object directly from the connector using the openSession function: var nosql = require("mysql-js"); var dbProperties = {     "implementation" : "ndb",     "database" : "test" }; nosql.openSession(dbProperties, null, onSession); The openSession function calls back into the application upon creating a Session. The Session is then used to create, delete, update, and read objects. Reading data The Session can read data from the database in a number of ways. If you simply want the data from the database, you provide a table name and the key of the row that you want. For example, consider this schema: create table employee (   id int not null primary key,   name varchar(32),   salary float ) ENGINE=ndbcluster; Since the primary key is a number, you can provide the key as a number to the find function. function onSession = function(err, session) {   if (err) {     console.log(err);     ... error handling   }   session.find('employee', 0, onData); }; function onData = function(err, data) {   if (err) {     console.log(err);     ... 
error handling   }   console.log('Found: ', JSON.stringify(data));   ... use data in application }; If you want to have the data stored in your own domain model, you tell the connector which table your domain model uses, by specifying an annotation, and pass your domain model to the find function. var annotations = new nosql.Annotations(); function Employee = function(id, name, salary) {   this.id = id;   this.name = name;   this.salary = salary;   this.giveRaise = function(percent) {     this.salary *= percent;   } }; annotations.mapClass(Employee, {'table' : 'employee'}); function onSession = function(err, session) {   if (err) {     console.log(err);     ... error handling   }   session.find(Employee, 0, onData); }; Updating data You can update the emp instance in memory, but to make the raise persistent, you need to write it back to the database, using the update function. function onData = function(err, emp) {   if (err) {     console.log(err);     ... error handling   }   console.log('Found: ', JSON.stringify(emp));   emp.giveRaise(0.12); // gee, thanks!   session.update(emp); // oops, session is out of scope here }; Using JavaScript can be tricky because it does not have the concept of block scope for variables. You can create a closure to handle these variables, or use a feature of the connector to remember your variables. The connector api takes a fixed number of parameters and returns a fixed number of result parameters to the callback function. But the connector will keep track of variables for you and return them to the callback. So in the above example, change the onSession function to remember the session variable, and you can refer to it in the onData function: function onSession = function(err, session) {   if (err) {     console.log(err);     ... error handling   }   session.find(Employee, 0, onData, session); }; function onData = function(err, emp, session) {   if (err) {     console.log(err);     ... error handling   }   console.log('Found: ', JSON.stringify(emp));   emp.giveRaise(0.12); // gee, thanks!   session.update(emp, onUpdate); // session is now in scope }; function onUpdate = function(err, emp) {   if (err) {     console.log(err);     ... error handling   } Inserting data Inserting data requires a mapped JavaScript user function (constructor) and a session. Create a variable and persist it: function onSession = function(err, session) {   var data = new Employee(999, 'Mat Keep', 20000000);   session.persist(data, onInsert);   } }; Deleting data To remove data from the database, use the session remove function. You use an instance of the domain object to identify the row you want to remove. Only the key field is relevant. function onSession = function(err, session) {   var key = new Employee(999);   session.remove(Employee, onDelete);   } }; More extensive queries We are working on the implementation of more extensive queries along the lines of the criteria query api. Stay tuned. How to evaluate The MySQL Connector for JavaScript is available for download from labs.mysql.com. Select the build: MySQL-Cluster-NoSQL-Connector-for-Node-js You can also clone the project on GitHub Since it is still early in development, feedback is especially valuable (so don't hesitate to leave comments on this blog, or head to the MySQL Cluster forum). Try it out and see how easy (and fast) it is to integrate MySQL Cluster into your Node.js platforms. You can learn more about other previewed functionality of MySQL Cluster 7.3 here

    Read the article

  • CodePlex Daily Summary for Tuesday, September 25, 2012

    CodePlex Daily Summary for Tuesday, September 25, 2012Popular ReleasesRawr: Rawr 5.0.0: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Coevery - Free CRM: Coevery 1.0.0.26: The zh-CN issue has been solved. We also add a project management module.VidCoder: 1.4.1 Beta: Updated to HandBrake 4971. This should fix some issues with stuck PGS subtitles. Fixed build break which prevented pre-compiled XML serializers from showing up. Fixed problem where a preset would get errantly marked as modified when re-opening the encode settings window or importing a new preset.D3 Loot Tracker: 1.3: Added the ability to reload a previous session to be able to resume it. Removed goblin detection, let's keep this an item tracking utility only. Fixed a bug with crafting sound setting not working properly. Completely re-styled the UI.JSLint for Visual Studio 2010: 1.4.0: VS2012 support is alphaBlackJumboDog: Ver5.7.2: 2012.09.23 Ver5.7.2 (1)InetTest?? (2)HTTP?????????????????100???????????Player Framework by Microsoft: Player Framework for Windows 8 (Preview 6): IMPORTANT: List of breaking changes from preview 5 Added separate samples download with .vsix dependencies instead of source dependencies Support for FreeWheel SmartXML ad responses Support for Smooth Streaming SDK DownloaderPlugins Support for VMAP and TTML polling for live scenarios Support for custom smooth streaming byte stream and scheme handlers Support for new play time and position tracking plugin Added IsLiveChanged event Added AdaptivePlugin.MaxBitrate property Add...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.8: Version: 2.5.0.8 (Milestone 8): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Mark the class DataModel as serializable. InfoMan: Minor improvements. InfoMan: Add unit tests for all modules. Othe...LogicCircuit: LogicCircuit 2.12.9.20: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionToolbars on text note dialog are more flexible now. You can select font face, size, color, and background of text you are typing. RAM now can be initialized to one of the following: random va...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.2020.421): New features: Disable a specific part of SiteMap to keep the data without displaying them in the CRM application. 
It simply comments XML part of the sitemap (thanks to rboyers for this feature request) Right click an item and click on "Disable" to disable it Items disabled are greyed and a suffix "- disabled" is added Right click an item and click on "Enable" to enable it Refresh list of web resources in the web resources pickerHigLabo: HigLabo_20120919: Add XXXAsync method to all Client class for async await pattern. (HttpClient,BoxNetClient,DropboxClient,FacebookClient,FtpClient,RssClient,SugarSyncClient,TwitterClient,WindowsLiveClient) Add all api to HigLabo.Net.Ftp project. Add strong name to all assembly. Add HttpBodyMultipartFormData to provide upload multipart form data with http protocol. Add HttpBodyFormUrlEncodedData to provide form url encoded post data with http protocol. FacebookClient,RssClient,WindowsLiveClient,BoxNetClient cl...AJAX Control Toolkit: September 2012 Release: AJAX Control Toolkit Release Notes - September 2012 Release Version 60919September 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ...Sense/Net CMS - Enterprise Content Management: SenseNet 6.1.2 Community Edition: Sense/Net 6.1.2 Community EditionMain new featuresOur current release brings a lot of bugfixes, including the resolution of js/css editing cache issues, xlsx file handling from Office, expense claim demo workspace fixes and much more. Besides fixes 6.1.2 introduces workflow start options and other minor features like a reusable Reject client button for approval scenarios and resource editor enhancements. We have also fixed an issue with our install package to bring you a flawless installation...Solution Extender for Microsoft Dynamics CRM 2011: Solution Extender (2.0.0.6): Fix a problem when serializing entity records (this fix the problem when exporting queues)Visual C++ Directories Editor: VC++ Directories 2012 Editor v1.0 ML (x32-x64): version 1.0 ML for Visual C++ 2012WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Python Tools for Visual Studio: 1.5 RC: PTVS 1.5RC Available! We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RC. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. The primary new feature for the 1.5 release is Django including Azure support! 
The http://www.djangoproject.com is a pop...Launchbar: Lanchbar 4.0.0: This application requires .NET 4.5 which you can find here: www.microsoft.com/visualstudio/downloadsAssaultCube Reloaded: 2.5.4 -: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: New logo Improved airstrike! Reset nukes...Extended WPF Toolkit: Extended WPF Toolkit - 1.7.0: Want an easier way to install the Extended WPF Toolkit?The Extended WPF Toolkit is available on Nuget. What's new in the 1.7.0 Release?New controls Zoombox Pie New features / bug fixes PropertyGrid.ShowTitle property added to allow showing/hiding the PropertyGrid title. Modifications to the PropertyGrid.EditorDefinitions collection will now automatically be applied to the PropertyGrid. Modifications to the PropertyGrid.PropertyDefinitions collection will now be reflected automaticaly...New ProjectsAffine transformations (Iterated function system): Small app for generating fractals using iterated function system.aspnet mvc store: Lite ASP.NET MVC CMSAugmented Reality: .Autocomplete: AutocompleteBCS to provide stock information in SharePoint 2013: This .Net assembly BCS external system provides live, read only data on Dow Jones 30 stocks details from MSN money webservices.Blood Alcohol Measurement Tool: Az alkalmazás kijelzi a felhasználó véralkoholszint változását az elfogyasztott alkoholos italok függvényében.Busqueda Incremental con un TEXTBOX: Hola, aqui estoy de nuevo con un aporte mas para la comunidad, en muchos foros he visto que estan buscando como hacer una busqueda incremental en un TEXTBOX.CRM 2011: Reassign or Transfer Personal Views: This Project allows CRM Administrator to quickly transfer (i.e. assign) advanced find views from one CRM user to another CRM user. ctripITSM doc: this is a documentation share for ctripITSM projectet Sprint 3: etsprint3Finance App for Windows 8: This is an WinJS Windows 8 application that computes various financial metrics.gadgets: Windows Sidebar Gargetsjprj: jprjKerosene ORM: Kerosene is a self-adaptive and configuration-less ORM library, with a SQL syntax based on C# dynamics, WCF, and Entity Framework capabilities for POCO objects.Lobster: ?? ?? ?????????.My Google Map: MyGoogleMap est un outils de génération de carte. Onestop.Contrib.CustomAdmin: Onestop.Contrib.CustomAdmin is a Theme for Orchard CMS providing for changing the admin dashboard elements such as Title and Logo.Onestop.Contrib.Disqus: Onestop.Contrib.Disqus is an advanced commenting module for Orchard CMS that uses Disqus.Onestop.Contrib.LayoutSelector: Onestop.Contrib.LayoutSelector is a simple part for switching to different versions of Layout.cshtml when editing Orchard content items.Onestop.Contrib.Navigation: Onestop.Contrib.Navigation is an advanced Menu Management system designed for Orchard CMS. 
Onestop.Contrib.Seo: Onestop.Contrib.Seo is an advanced Search Engine Optimization module for Orchard CMSOnestop.Contrib.SlideShow: Onestop.Contrib.SlideShow is an advanced module for managing slide animations on a page.pf2012: Simple HTML5 game as PF2012 electronic greeting.PHP-2012: this is for php for schoolPruebaProyecto: Carmen Asencio AmbrosioPulawJS: MVC Platform for JavaScript. Inspired by Zend FrameworkRevolution Emulator: Habbo Hotel flash emulator targeting the .NET 4.5 VM, written in C#.Ruby Rookie: This Project is for learning Ruby purposesSISLOG_Proy2: SISLOGSkyShellEx: SkyShellEx allows to sync any folder to SkyDrive via a simple ShellExtension. The sync option appears on the context menu of folders where applicable. sosoft: Sosoft Project.C# WinForm.Include Alarm Clock.sound9: sound9testddgit09242012: dTFS Agile Work Item Rollover: This is a command line utility used to rollover incomplete work from one sprint to the next. The tool has both interactive and silent modes, allowing you to seTFS Deployment Studio: Helps deploying applications built using TFS to the servers where they belong.TreeView In-place Editing in MVVM: This project demonstrates a clean way of doing the in-place editing in the WPF TreeView controlUniSoft: Teste source controlValidation Framework for .NET: Framework for validation of method paramters and return values.WarOfDev: war of developer, make your coding interesting . Waterhouse: A C# console application that takes various text inputs and converts it to Morse Code by blinking the numlock indicator.Web Package Pro: in dev

    Read the article

  • CodePlex Daily Summary for Wednesday, June 04, 2014

    CodePlex Daily Summary for Wednesday, June 04, 2014Popular ReleasesSEToolbox: SEToolbox 01.032.018 Release 1: Added ability to merge/join two ships, regardless of origin. Added Language selection menu to set display text language (for SE resources only), and fixed inherent issues. Added full support for Dedicated Servers, allowing use on PC without Steam present, and only the Dedicated Install. Added Browse button for used specified save game selection in Load dialog. Installation of this version will replace older version.DNN Blog: 06.00.07: Highlights: Enhancement: Option to show management panel on child modules Fix: Changed SPROC and code to ensure the right people are notified of pending comments. (CP-24018) Fix: Fix to have notification actions authentication assume the right module id so these will work from the messaging center (CP-24019) Fix: Fix to issue in categories in a post that will not save when no categories and no tags selectedcsv2xlsx: Source code: Source code of applicationHigLabo: HigLabo_20140603: Bug fix of MimeParser for nested MimeContent parsing.Family Tree Analyzer: Version 4.0.0.0-beta2: This major release introduces a significant new feature the ability to search the UK Ordnance Survey 50k Gazetteer for small placenames that often don't appear on a modern Google Map. This means significant numbers of locations that were previously not found by Google Geocoding will now be found by using OS Geocoding. In addition I've added support for loading the Scottish 1930 Parish boundary maps, whilst slightly different from the parish boundaries in use in the 1800s that are familiar to...TEncoder: 4.0.0: --4.0.0 -Added: Video downloader -Added: Total progress will be updated more smoothly -Added: MP4Box progress will be shown -Added: A tool to create gif image from video -Added: An option to disable trimming -Added: Audio track option won't be used for mpeg sources as default -Fixed: Subtitle position wasn't used -Fixed: Duration info in the file list wasn't updated after trimming -Updated: FFMpegSSIS Multiple Hash: Multiple Hash 1.6.2.3: Release Notes This release is an SQL 2014 fix release. It addresses the following: The 1.6.1.2 release's SQL 2014 installer didn't work for SQL 2014 RTM. The x64 installer also showed x86 for both x64 and x86 in SQL 2014. Please download the x64 of x32 file based on the bitness of your SQL Server installation. BIDS or SSDT/BI ONLY INSTALLS ARE NOT DETECTED. You MUST use the Custom install, to install when the popup is shown, and select which versions of SQL Server are available. A war...QuickMon: Version 3.14 (Pie release): This is unofficially the 'Pie' release. There are two big changes.1. 'Presets' - basically templates. Future releases might build on this to allow users to add more presets. 2. MSI Installer now allows you to choose components (in case you don't want all collectors etc.). This means you don't have to download separate components anymore (AllAgents.zip still included in case you want to use them separately) Some other changes:1. Add/changed default file extension for monitor packs to *.qmp (...VeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Keepass2Android: 0.9.4-pre1: added plug-in support: See settings for how to get plug-ins! 
published QR plug-in (scan passwords, display passwords as QR code, transfer entries to other KP2A devices) published InputStick plugin (transfer credentials to your PC via bluetooth - requires InputStick USB stick) Third party apps can now simply implement querying KP2A for credentials. Are you a developer? Please add this to your app if suitable! added TOTP support (compatible with KeeOTP and TrayTotp) app should no l...The IRIS Toolbox Project: IRIS Reference Manual Release 20140602: Reference Manual for Release 20140602.Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from nuget or the Microsoft Download Center This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy https://www.owasp.org/index....Z SqlBulkCopy Extensions: SqlBulkCopy Extensions 1.0.0: SqlBulkCopy Extensions provide MUST-HAVE methods with outstanding performance missing from the SqlBulkCopy class like Delete, Update, Merge, Upsert. Compatible with .NET 2.0, SQL Server 2000, SQL Azure and more! Bulk MethodsBulkDelete BulkInsert BulkMerge BulkUpdate BulkUpsert Utility MethodsGetSqlConnection GetSqlTransaction You like this library? Find out how and why you should support Z Project Become a Memberhttp://zzzproject.com/resources/images/all/become-a-member.png|ht...Tweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. - This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. Several breaking cha...ClosedXML - The easy way to OpenXML: ClosedXML 0.71.2: More memory and performance improvements. Fixed an issue with pivot table field order.Magick.NET: Magick.NET 6.8.9.101: Magick.NET linked with ImageMagick 6.8.9.1. Breaking changes: - Int/short Set methods of WritablePixelCollection are now unsigned. - The Q16 build no longer uses HDRI, switch to the new Q16-HDRI build if you need HDRI.fnr.exe - Find And Replace Tool: 1.7: Bug fixes Refactored logic for encoding text values to command line to handle common edge cases where find/replace operation works in GUI but not in command line Fix for bug where selection in Encoding drop down was different when generating command line in some cases. 
It was reported in: https://findandreplace.codeplex.com/workitem/34 Fix for "Backslash inserted before dot in replacement text" reported here: https://findandreplace.codeplex.com/discussions/541024 Fix for finding replacing...SoundCloud Downloader: SC Downloader V1.0: Newest release Requirements .NET Framework 4.0/4.5 Changes since releasing the source Replaced ProgressBar and Buttons with the ones i created from scratch, you may use the .dll files on your own risk. Changed the authentication mode to Windows NOTE! When extracted, execute setup.exe to launch the installer! NOTE! ENJOY!VG-Ripper & PG-Ripper: VG-Ripper 2.9.59: changes NEW: Added Support for 'GokoImage.com' links NEW: Added Support for 'ViperII.com' links NEW: Added Support for 'PixxxView.com' links NEW: Added Support for 'ImgRex.com' links NEW: Added Support for 'PixLiv.com' links NEW: Added Support for 'imgsee.me' links NEW: Added Support for 'ImgS.it' linksNew Projects2112110148: 2112110148 Kieu Thi Phuong AnhAutoHelp: Inspired by Docy, AutoHelp exposes XML documentation embedded in sources code. a MVC 5 / typescripted connected on Webapi, AutoHelp is a modern HTML 5 appFeedbackata: An exercise in interactive development.Ideaa: A place to get inspired and find ideas about a topic you are interested inImpression Studio - Free presentation Maker Tool: Impression Studio is a free presentation maker tool based on impress.js library. It aims to be free alternative to prezi. This tool is under heavy development.LavaMite: LavaMite is a program that aims to capture the random wax formations that arise in a lava lamp before it is bubbling freely.Load Runner - HTTP Pressure Test Tool: A Lightweight HTTP pressure test tool supports multi methods / content type / requests, written in C# 4.0.Microsoft Research Software Radio SDK: Microsoft Research Software Radio SDKMsp: MspPower Torrent: Windows PowerShell module enabling BitTorrent (R) functionalities.

    Read the article

  • BizTalk: History of one project architecture

    - by Leonid Ganeline
    "In the beginning God made heaven and earth. Then he started to integrate." At the very start was the requirement: integrate two working systems. Small digging up: It was one system. It was good but IT guys want to change it to the new one, much better, chipper, more flexible, and more progressive in technologies, more suitable for the future, for the faster world and hungry competitors. One thing. One small, little thing. We cannot turn off the old system (call it A, because it was the first), turn on the new one (call it B, because it is second but not the last one). The A has a hundreds users all across a country, they must study B. A still has a lot nice custom features, home-made features that cannot disappear. These features have to be moved to the B and it is a long process, months and months of redevelopment. So, the decision was simple. Let’s move not jump, let’s both systems working side-by-side several months. In this time we could teach the users and move all custom A’s special functionality to B. That automatically means both systems should work side-by-side all these months and use the same data. Data in A and B must be in sync. That’s how the integration projects get birth. Moreover, the specific of the user tasks requires the both systems must be in sync in real-time. Nightly synchronization is not working, absolutely.   First draft The first draft seems simple. Both systems keep data in SQL databases. When data changes, the Create, Update, Delete operations performed on the data, and the sync process could be started. The obvious decision is to use triggers on tables. When we are talking about data, we are talking about several entities. For example, Orders and Items [in Orders]. We decided to use the BizTalk Server to synchronize systems. Why it was chosen is another story. Second draft   Let’s take an example how it works in more details. 1.       User creates a new entity in the A system. This fires an insert trigger on the entity table. Trigger has to pass the message “Entity created”. This message includes all attributes of the new entity, but I focused on the Id of this entity in the A system. Notation for this message is id.A. System A sends id.A to the BizTalk Server. 2.       BizTalk transforms id.A to the format of the system B. This is easiest part and I will not focus on this kind of transformations in the following text. The message on the picture is still id.A but it is in slightly different format, that’s why it is changing in color. BizTalk sends id.A to the system B. 3.       The system B creates the entity on its side. But it uses different id-s for entities, these id-s are id.B. System B saves id.A+id.B. System B sends the message id.A+id.B back to the BizTalk. 4.       BizTalk sends the message id.A+id.B to the system A. 5.       System A saves id.A+id.B. Why both id-s should be saved on both systems? It was one of the next requirements. Users of both systems have to know the systems are in sync or not in sync. Users working with the entity on the system A can see the id.B and use it to switch to the system B and work there with the copy of the same entity. The decision was to store the pairs of entity id-s on both sides. If there is only one id, the entities are not in sync yet (for the Create operation). Third draft Next problem was the reliability of the synchronization. The synchronizing process can be interrupted on each step, when message goes through the wires. 
It can be communication problem, timeout, temporary shutdown one of the systems, the second system cannot be synchronized by some internal reason. There were several potential problems that prevented from enclosing the whole synchronization process in one transaction. Decision was to restart the whole sync process if it was not finished (in case of the error). For this purpose was created an additional service. Let’s call it the Resync service. We still keep the id pairs in both systems, but only for the fast access not for the synchronization process. For the synchronizing these id-s now are kept in one main place, in the Resync service database. The Resync service keeps record as: ·       Id.A ·       Id.B ·       Entity.Type ·       Operation (Create, Update, Delete) ·       IsSyncStarted (true/false) ·       IsSyncFinished (true/false0 The example now looks like: 1.       System A creates id.A. id.A is saved on the A. Id.A is sent to the BizTalk. 2.       BizTalk sends id.A to the Resync and to the B. id.A is saved on the Resync. 3.       System B creates id.B. id.A+id.B are saved on the B. id.A+id.B are sent to the BizTalk. 4.       BizTalk sends id.A+id.B to the Resync and to the A. id.A+id.B are saved on the Resync. 5.       id.A+id.B are saved on the B. Resync changes the IsSyncStarted and IsSyncFinished flags accordingly. The Resync service implements three main methods: ·       Save (id.A, Entity.Type, Operation) ·       Save (id.A, id.B, Entity.Type, Operation) ·       Resync () Two Save() are used to save id-s to the service storage. See in the above example, in 2 and 4 steps. What about the Resync()? It is the method that finishes the interrupted synchronization processes. If Save() is started by the trigger event, the Resync() is working as an independent process. It periodically scans the Resync storage to find out “unfinished” records. Then it restarts the synchronization processes. It tries to synchronize them several times then gives up.     One more thing, both systems A and B must tolerate duplicates of one synchronizing process. Say on the step 3 the system B was not able to send id.A+id.B back. The Resync service must restart the synchronization process that will send the id.A to B second time. In this case system B must just send back again also created id.A+id.B pair without errors. That means “tolerate duplicates”. Fourth draft Next draft was created only because of the aesthetics. As it always happens, aesthetics gave significant performance gain to the whole system. First was the stupid question. Why do we need this additional service with special database? Can we just master the BizTalk to do something like this Resync() does? So the Resync orchestration is doing the same thing as the Resync service. It is started by the Id.A and finished by the id.A+id.B message. The first works as a Start message, the second works as a Finish message.     Here is a diagram the whole process without errors. It is pretty straightforward. The Resync orchestration is waiting for the Finish message specific period of time then resubmits the Id.A message. It resubmits the Id.A message specific number of times then gives up and gets suspended. It can be resubmitted then it starts the whole process again: waiting [, resubmitting [, get suspended]], finishing. Tuning up The Resync orchestration resubmits the id.A message with special “Resubmitted” flag. The subscription filter on the Resync orchestration includes predicate as (Resubmit_Flag != “Resubmitted”). 
That means only the first Sync orchestration starts the Resync orchestration. Other Sync orchestration instantiated by the resubmitting can finish this Resync orchestration but cannot start another instance of the Resync   Here is a diagram where system B was inaccessible for some period of time. The Resync orchestration resubmitted the id.A two times. Then system B got the response the id.A+id.B and this finished the Resync service execution. What is interesting about this, there were submitted several identical id.A messages and only one id.A+id.B message. Because of this, the system B and the Resync must tolerate the duplicate messages. We also told about this requirement for the system B. Now the same requirement is for the Resunc. Let’s assume the system B was very slow in the first response and the Resync service had time to resubmit two id.A messages. System B responded not, as it was in previous case, with one id.A+id.B but with two id.A+id.B messages. First of them finished the Resync execution for the id.A. What about the second id.A+id.B? Where it goes? So, we have to add one more internal requirement. The whole solution must tolerate many identical id.A+id.B messages. It is easy task with the BizTalk. I added the “SinkExtraMessages” subscriber (orchestration with one receive shape), that just get these messages and do nothing. Real design Real architecture is much more complex and interesting. In reality each system can submit several id.A almost simultaneously and completely unordered. There are not only the “Create entity” operation but the Update and Delete operations. And these operations relate each other. Say the Update operation after Delete means not the same as Update after Create. In reality there are entities related each other. Say the Order and Order Items. Change on one of it could start the series of the operations on another. Moreover, the system internals are the “black boxes” and we cannot predict the exact content and order of the operation series. It worth to say, I had to spend a time to manage the zombie message problems. The zombies are still here, but this is not a problem now. And this is another story. What is interesting in the last design? One orchestration works to help another to be more reliable. Why two orchestration design is more reliable, isn’t it something strange? The Synch orchestration takes all the message exchange between systems, here is the area where most of the errors could happen. The Resync orchestration sends and receives messages only within the BizTalk server. Is there another design? Sure. All Resync functionality could be implemented inside the Sync orchestration. Hey guys, some other ideas?
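To make the Resync bookkeeping a little more tangible, here is a hedged C# sketch of the kind of record and retry loop the third draft describes. The post does not show its implementation, so the type names, in-memory storage, and resubmit mechanism below are illustrative assumptions only.

using System;
using System.Collections.Generic;
using System.Linq;

public enum SyncOperation { Create, Update, Delete }

// One row in the Resync storage, as described in the third draft.
public class ResyncRecord
{
    public string IdA { get; set; }
    public string IdB { get; set; }
    public string EntityType { get; set; }
    public SyncOperation Operation { get; set; }
    public bool IsSyncStarted { get; set; }
    public bool IsSyncFinished { get; set; }
    public int ResubmitCount { get; set; }
}

public class ResyncService
{
    private readonly List<ResyncRecord> _storage = new List<ResyncRecord>();
    private readonly Action<ResyncRecord> _resubmit;   // hands id.A back to the sync process
    private const int MaxResubmits = 3;

    public ResyncService(Action<ResyncRecord> resubmit)
    {
        _resubmit = resubmit;
    }

    // Called when id.A first arrives (step 2 in the example above).
    public void Save(string idA, string entityType, SyncOperation operation)
    {
        _storage.Add(new ResyncRecord
        {
            IdA = idA,
            EntityType = entityType,
            Operation = operation,
            IsSyncStarted = true
        });
    }

    // Called when the id.A + id.B pair comes back (step 4 in the example above).
    // The extra parameters mirror the signature given in the post.
    public void Save(string idA, string idB, string entityType, SyncOperation operation)
    {
        var record = _storage.FirstOrDefault(r => r.IdA == idA && !r.IsSyncFinished);
        if (record == null) return;               // a duplicate finish message: tolerate it
        record.IdB = idB;
        record.IsSyncFinished = true;
    }

    // Runs periodically and restarts any synchronization that never finished.
    public void Resync()
    {
        foreach (var record in _storage.Where(r => r.IsSyncStarted && !r.IsSyncFinished))
        {
            if (record.ResubmitCount >= MaxResubmits) continue;   // give up after a few tries
            record.ResubmitCount++;
            _resubmit(record);
        }
    }
}

In the fourth draft the same waiting-and-resubmitting behaviour moves into the Resync orchestration itself, which is why the separate service and its database can disappear.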

    Read the article

  • Managed Service Architectures Part I

    - by barryoreilly
    Instead of thinking about service oriented architecture, a concept that is continually defined, redefined, abused and mistreated, perhaps it is time to drop the acronym and consider what we actually need to get the job done.   ‘Pure’ SOA involves the modeling of an organisation’s processes, the so called ‘Top Down’ approach, followed by the implementation of these processes as services.     Another approach, more commonly seen in the wild, is the bottom up approach. This usually involves services that simply start popping up in the organization, and SOA in this case is often just an attempt to rein in these services. Such projects, although described as SOA projects for a variety of reasons, have clearly little relation to process driven architecture. Much has been written about these two approaches, with many deciding that a hybrid of both methods is needed to succeed with SOA.   These hybrid methods are a sensible compromise, but one gets the feeling that there is too much focus on ‘Succeeding with SOA’. Organisations who focus too much on bottom up development, or who waste too much time and money on top down approaches that don’t produce results, are often recommended to attempt an ‘agile’(Erl) or ‘middle-out’ (Microsoft) approach in order to succeed with SOA.  The problem with recommending this approach is that, in most cases, succeeding with SOA isn’t the aim of the project. If a project is started with the simple aim of ‘Succeeding with SOA’ then the reasons for the projects existence probably need to be questioned.   There are a number of things we can be sure of: ·         An organisation will have a number of disparate IT systems ·         Some of these systems will have redundant data and functionality ·         Integration will give considerable ROI ·         Integration will already be under way. ·         Services will already exist in the organisation ·         These services will be inconsistent in their implementation and in their governance   So there are three goals here: 1.       Alignment between the business and IT 2.     Integration of disparate systems 3.     Management of services.   2 and 3 are going to happen,  in fact they must happen if any degree of return is expected from the IT department. Ignoring 1 is considered a typical mistake in SOA implementations, as it ignores the business implications. However, the business implication of this approach is the money saved in more efficient IT processes. 2 and 3 are ongoing, and they will continue happening, even if a large project to produce a SOA metamodel is started. The result will then be an unstructured cackle of services, and a metamodel that is already going out of date. So we get stuck in and rebuild our services so that they match the metamodel, with the far reaching consequences that this will have on all our LOB systems are current. Lets imagine that this actually works ( how often do we rip and replace working software because it doesn't fit a certain pattern? Never -that's the point of integration), we will now be working with a metamodel that is out of date, and most likely incomplete if the organisation is large.      Accepting that an object can have more than one model over time, with perhaps more than one model being  at any given time will help us realise the limitations of the top down model. It is entirely normal , and perhaps necessary, for an organisation to be able to view an entity from different perspectives.   
So, instead of trying to constantly force these goals in a straight line, why not let them happen in parallel, and manage the changes in each layer.     If  company A has chosen to model their business processes and create a business architecture, there will be a reason behind this. Often the aim is to make the business more flexible and able to cope with change, through alignment between the business and the IT department.   If company B’s IT department recognizes the problem of wild services springing up everywhere, and decides to do something about it, by designing a platform and processes for the introduction of services, is this not a valid approach?   With the hybrid approach, it is recommended that company A begin deploying services as quickly as possible. Based on models that are clearly incomplete, and which will therefore change rapidly and often in the near future. Natural business evolution will also mean that the models can be guaranteed to change in the not so near future. To ‘Succeed with SOA’ Company B needs to go back to the drawing board and start modeling processes and objects. So, in effect, we are telling business analysts to start developing code based on a model they are unsure of, and telling programmers to ignore the obvious and growing problems in their IT department and start drawing lines and boxes.     Could the problem be that there are two different problem domains? And the whole concept of SOA as it being described by clever salespeople today creates an example of oft dreaded ‘tight coupling’ between these two domains?   Could it be that we have taken two large problem areas, and bundled the solution together in order to create a magic bullet? And then convinced ourselves that the bullet actually exists?   Company A wants to have a closer relationship between the business and its IT department, in order to become a more flexible organization. Company B wants to decrease the maintenance costs of its IT infrastructure. If both companies focus on succeeding with SOA, then they aren’t focusing on their actual goals.   If Company A starts building services from incomplete models, without a gameplan, they will end up in the same situation as company B, with wild services. If company B focuses on modeling, they could easily end up with the same problems as company A.   Now we have two companies, who a short while ago had one problem each, that now have two problems each. This has happened because of a focus on ‘Succeeding with SOA’, rather than solving the problem at hand.   This is not to suggest that the two problem domains are unrelated, a strategy that encompasses both will obviously be good for the organization. But only if the organization realizes this and can develop such a strategy. This strategy cannot be bought in a box.       Anyone who has worked with SOA for a while will be used to analyzing the solutions to a problem and judging the solution’s level of coupling. If we have two applications that each perform separate functions, but need to communicate with each other, we create a integration layer between them, perhaps with a service, but we do all we can to reduce the dependency between the two systems. Using the same approach, we can separate the modeling (business architecture) and the service hosting (technical architecture).     The business architecture describes the processes and business objects in the business domain.   The technical architecture describes the hosting and management and implementation of services.   
The glue that binds these together, the integration layer in our analogy, is the service contract, where the operations map the processes to their technical implementation, and the messages map business concepts to software objects in the implementation. If we reduce the coupling between these layers, we should be able to allow developers to develop services, and business analysts to develop models, without changes rippling through from one side to the other. This would allow company A to carry on modeling, and company B to develop a service platform, each achieving their intended goal, without necessarily creating the problems seen in pure top down or bottom up approaches. Company B could then, at a later date, map its service infrastructure to a unified model, and company A could carry on modeling, insulating deployed services from changes in the ongoing modeling.

How do we do this? The concept of service virtualization has been around for a while, and is readily realizable in Microsoft's Managed Services Engine. Here we can create a layer of virtual services, which represent the business analyst's view, presenting uniform contracts to the outside world. These services can then transform and route messages to the actual service implementations. I like to think of the virtual services with their beautifully modeled interfaces as 'SOA services', and the implementations as simple integration 'adapter' services providing an interface to a technical implementation (a minimal sketch of this pattern follows the conclusion below). The Managed Services Engine also provides policy-based control over services, regardless of where they are deployed, simplifying the handling of security, logging, exception handling, etc.

This solves a big problem. The pressure to deliver services quickly is always there in projects, and it is very important to show value quickly when implementing service architectures. There is also pressure to deliver quality, and you can't easily do both at the same time. This approach allows quick delivery with quality increasing over time, allowing modeling and service development to occur in parallel and independently of each other. The link between business modeling and service implementation is not one that is obvious to many organizations, and it requires a certain maturity to realize and drive forward. It is also entirely possible for a company to benefit from one without the other; even if this is frowned upon today, there are many companies doing so and seeing ROI.

Of course there are disadvantages to this. The biggest is the transformations necessary between the virtual interfaces and the service implementations. Bad choices in developing the underlying services could mean that it is impossible to map the modeled processes to the implementation without redevelopment of the service. In many cases the architect will not have a choice here anyway, as proprietary systems are often delivered with predeveloped services. The alternative is to wait until the model is finished and then build the services according to the model. However, if that approach worked we wouldn't be having this discussion! And even when it does work, natural business evolution means that the two concepts (model and implementation) will immediately start to drift apart, so coupling them tightly, binding the implementation forever to a model that only applies at the time of the modeling work, will not really achieve a great deal. Architecture is all about trade-offs, and here a choice has to be made.
The choice is between something that will initially be of low quality but will work, and something that may well be impossible to achieve in most situations.

In conclusion, top-down is a natural approach for business analysts, and bottom-up is a natural approach for developers. Instead of trying to force on both something that neither wants, and which has not shown itself to be successful, why not let them get on with their jobs, and let an enterprise architect coordinate the processes?
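To make the virtualization pattern above a little more concrete, here is a minimal C# sketch of the idea: a 'virtual' contract shaped by the business model, fronted by an 'adapter' service that transforms the business messages into whatever the underlying line-of-business system actually understands. All of the names (IOrderService, LegacyOrderAdapter, LegacySalesSystem, and so on) are hypothetical illustrations, not part of the Managed Services Engine or any other product API, and the mapping shown is deliberately trivial.

// Virtual contract: the business analyst's view of the process.
public interface IOrderService
{
    OrderConfirmation PlaceOrder(OrderRequest request);
}

// Messages expressed in business-model terms.
public class OrderRequest
{
    public string CustomerNumber;
    public string ProductCode;
    public int Quantity;
}

public class OrderConfirmation
{
    public string OrderReference;
}

// Types exposed by an existing line-of-business system (the technical implementation).
public class LegacySalesOrder
{
    public string CustNo;
    public string Sku;
    public int Qty;
}

// Stand-in for the proprietary back end sitting behind the adapter.
public class LegacySalesSystem
{
    public string Submit(LegacySalesOrder order)
    {
        return "SO-" + order.Sku + "-" + order.Qty;
    }
}

// Adapter service: maps business messages onto the legacy implementation, so that
// changes to the model and changes to the implementation both stop at this seam.
public class LegacyOrderAdapter : IOrderService
{
    private readonly LegacySalesSystem backend = new LegacySalesSystem();

    public OrderConfirmation PlaceOrder(OrderRequest request)
    {
        // Transformation: business concept -> implementation object.
        var legacyOrder = new LegacySalesOrder
        {
            CustNo = request.CustomerNumber,
            Sku = request.ProductCode,
            Qty = request.Quantity
        };

        string legacyId = backend.Submit(legacyOrder);

        // Transformation back: implementation result -> business concept.
        return new OrderConfirmation { OrderReference = legacyId };
    }
}

The value of the seam is that the virtual contract can keep tracking the business model while the adapter absorbs change in the legacy system, and vice versa; the cost, as noted above, is the transformation code itself.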

    Read the article

  • CodePlex Daily Summary for Friday, June 06, 2014

    CodePlex Daily Summary for Friday, June 06, 2014Popular ReleasesVirto Commerce Enterprise Open Source eCommerce Platform (asp.net mvc): Virto Commerce 1.10: Virto Commerce Community Edition version 1.10. To install the SDK package, please refer to SDK getting started documentation To configure source code package, please refer to Source code getting started documentation This release includes bug fixes and improvements (including Commerce Manager localization and https support). More details about this release can be found on our blog at http://blog.virtocommerce.com.Load Runner - HTTP Pressure Test Tool: Load Runner 1.3: 1. gracefully stop distributed engine 2. added error handling for distributed load tasks if fail to send the tasks. 3. added easter egg ;-) NPOI: NPOI 2.1: Assembly Version: 2.1.0 New Features a. XSSFSheet.CopySheet b. Excel2Html for XSSF c. insert picture in word 2007 d. Implement IfError function in formula engine Bug Fixes a. fix conditional formatting issue b. fix ctFont order issue c. fix vertical alignment issue in XSSF d. add IndexedColors to NPOI.SS.UserModel e. fix decimal point issue in non-English culture f. fix SetMargin issue in XSSF g.fix multiple images insert issue in XSSF h.fix rich text style missing issue in XSSF i. fix cell...HigLabo: HigLabo_20140605: Modify AsciiCharOnly method implementation. Code clean up of RssItem.Description.51Degrees - Device Detection and Redirection: 3.1.2.3: Version 3.1 HighlightsDevice detection algorithm is over 100 times faster. Regular expressions and levenshtein distance calculations are no longer used. The device detection algorithm performance is no longer limited by the number of device combinations contained in the dataset. Two modes of operation are available: Memory – the detection data set is loaded into memory and there is no continuous connection to the source data file. Slower initialisation time but faster detection performanc...CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.27.0: CodeMap now indicates the type name for all members Implemented running scripts 'as administrator'. Just add '//css_npp asadmin' to the script and run it as usual. 'Prepare script for distribution' now aggregates script dependency assemblies. Various improvements in CodeSnipptet, Autcompletion and MethodInfo interactions with each other. Added printing line number for the entries in CodeMap (subject of configuration value) Improved debugging step indication for classless scripts ...Kartris E-commerce: Kartris v2.6003: Bug fixes: Fixed issue where category could not be saved if parent categories not altered Updated split string function to 500 chars otherwise problems with attribute filtering in PowerPack Improvements: If a user has group pricing below that available through qty discounts, that will show in place of the qty discountClosedXML - The easy way to OpenXML: ClosedXML 0.72.3: 70426e13c415 ClosedXML for .Net 4.0 now uses Open XML SDK 2.5 b9ef53a6654f Merge branch 'master' of https://git01.codeplex.com/forks/vbjay/closedxml 727714e86416 Fix range.Merge(Boolean) for .Net 3.5 eb1ed478e50e Make public range.Merge(Boolean checkIntersects) 6284cf3c3991 More performance improvements when saving.SEToolbox: SEToolbox 01.032.018 Release 1: Added ability to merge/join two ships, regardless of origin. Added Language selection menu to set display text language (for SE resources only), and fixed inherent issues. 
Added full support for Dedicated Servers, allowing use on PC without Steam present, and only the Dedicated Server install. Added Browse button for used specified save game selection in Load dialog. Installation of this version will replace older version.DNN Blog: 06.00.07: Highlights: Enhancement: Option to show management panel on child modules Fix: Changed SPROC and code to ensure the right people are notified of pending comments. (CP-24018) Fix: Fix to have notification actions authentication assume the right module id so these will work from the messaging center (CP-24019) Fix: Fix to issue in categories in a post that will not save when no categories and no tags selectedTEncoder: 4.0.0: --4.0.0 -Added: Video downloader -Added: Total progress will be updated more smoothly -Added: MP4Box progress will be shown -Added: A tool to create gif image from video -Added: An option to disable trimming -Added: Audio track option won't be used for mpeg sources as default -Fixed: Subtitle position wasn't used -Fixed: Duration info in the file list wasn't updated after trimming -Updated: FFMpegVeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from nuget or the Microsoft Download Center This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy https://www.owasp.org/index....Z SqlBulkCopy Extensions: SqlBulkCopy Extensions 1.0.0: SqlBulkCopy Extensions provide MUST-HAVE methods with outstanding performance missing from the SqlBulkCopy class like Delete, Update, Merge, Upsert. Compatible with .NET 2.0, SQL Server 2000, SQL Azure and more! Bulk MethodsBulkDelete BulkInsert BulkMerge BulkUpdate BulkUpsert Utility MethodsGetSqlConnection GetSqlTransaction You like this library? Find out how and why you should support Z Project Become a Memberhttp://zzzproject.com/resources/images/all/become-a-member.png|ht...Tweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. - This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. 
Several breaking cha...Image View Slider: Image View Slider: This is a .NET component. We create this using VB.NET. Here you can use an Image Viewer with several properties to your application form. We wish somebody to improve freely. Try this out! Author : Steven Renaldo Antony Yustinus Arjuna Purnama Putra Andre Wijaya P Martin Lidau PBK GENAP 2014 - TI UKDWAspose for Apache POI: Missing Features of Apache POI WP - v 1.1: Release contain the Missing Features in Apache POI WP SDK in Comparison with Aspose.Words for dealing with Microsoft Word. What's New ?Following Examples: Insert Picture in Word Document Insert Comments Set Page Borders Mail Merge from XML Data Source Moving the Cursor Feedback and Suggestions Many more examples are yet to come here. Keep visiting us. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.babelua: 1.5.6.0: V1.5.6.0 - 2014.5.30New feature: support quick-cocos2d-x project now; support text search in scripts folder now, you can use this function in Search Result Window;Credit Component: Credit Component: This is a sample release of Credit Component that has been made by Microsoft Visual Studio 2010. To try and use it, you need .NET framework 4.0 and Microsoft Visual Studio 2010 or newer as a minimum requirement in this download you will get media player as a sample application that use this component credit component as a main component media player source code as source code and sample usage of credit component credit component source code as source code of credit component important...New ProjectsBack Up Your SharePoint: SPSBackUp is a PowerShell script tool to Backup your SharePoint environment. It's compatible with SharePoint 2010 & 2013.ChoMy: mrkidconsoledemo: a basic app about the knowledge in c# domainDecision Architect Server Client in C#: This project is a client for Decision Architect Server API.InnVIDEO365 Agile SharePoint-Kaltura App: InnVIDEO365 Agile SharePoint - Kaltura App for SharePoint Online is a complete video content solution for Microsoft SharePoint Online.JacOS: JacOS is an open-source operating system created with COSMOSKISS Workflow Foundation: This project was born from the idea that everyone should be able to use Workflow Foundation in best conditions. It contain sample to take WF jump start!Media Player (Technology Area): A now simple, yet somehow complex music, photo viewer I have some plans for the best tools for the player in the futureMetalurgicaInstitucional: MetalurgicaInstitucionalmoodlin8: Moodlin8 try to exploit significant parts of Moodle, the popular LMS, in a mobile or fixed environment based on the Modern Interface of Windows 8.1.NewsletterSystem: This is a test projectpesho2: Ala bala portokalaProyecto web Lab4 Gioia lucas: proyecto para lab4 utn pachecoRallyRacer: TSD Rally Compute helperVideo-JS Webpart for Sharepoint: Video-JS Webpart for SharepointYanZhiwei_CSharp_UtilHelp: ??C#???????????API: ????? Web API

    Read the article

  • CodePlex Daily Summary for Saturday, June 07, 2014

    CodePlex Daily Summary for Saturday, June 07, 2014Popular ReleasesSFDL.NET: SFDL.NET (2.2.9.3): Changelog: Retry Bugfix (Error Counter wurde nicht korrekt zurückgesetzt) Neue Einstellung: Retry Wartezeit ist nun EinstellbarLoad Runner - HTTP Pressure Test Tool: Load Runner 1.2: 1. added support for distributed load test (read documentation for details) 2. added detail documentationAspose for Apache POI: Aspose.Words vs Apache POI WP - v 1.1: Release contain the Code Comparison for Features in Apache POI WP SDK and Aspose.Words What's New ?Following Examples: Working with Headers Working with Footers Access Ranges in Document Delete Ranges in Document Insert Before and After Ranges Feedback and Suggestions Many more examples are yet to come here. Keep visiting us. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.babelua: 1.5.7.0: V1.5.7.0 - 2014.6.6Stability improvement: use "lua scripts folder" as lua search path when debugging;SEToolbox: SEToolbox 01.033.007 Release 1: Fixed breaking changes in Space Engineers in latest update. Installation of this version will replace older version.Virto Commerce Enterprise Open Source eCommerce Platform (asp.net mvc): Virto Commerce 1.10: Virto Commerce Community Edition version 1.10. To install the SDK package, please refer to SDK getting started documentation To configure source code package, please refer to Source code getting started documentation This release includes bug fixes and improvements (including Commerce Manager localization and https support). More details about this release can be found on our blog at http://blog.virtocommerce.com.NPOI: NPOI 2.1: Assembly Version: 2.1.0 New Features a. XSSFSheet.CopySheet b. Excel2Html for XSSF c. insert picture in word 2007 d. Implement IfError function in formula engine Bug Fixes a. fix conditional formatting issue b. fix ctFont order issue c. fix vertical alignment issue in XSSF d. add IndexedColors to NPOI.SS.UserModel e. fix decimal point issue in non-English culture f. fix SetMargin issue in XSSF g.fix multiple images insert issue in XSSF h.fix rich text style missing issue in XSSF i. fix cell...51Degrees - Device Detection and Redirection: 3.1.2.3: Version 3.1 HighlightsDevice detection algorithm is over 100 times faster. Regular expressions and levenshtein distance calculations are no longer used. The device detection algorithm performance is no longer limited by the number of device combinations contained in the dataset. Two modes of operation are available: Memory – the detection data set is loaded into memory and there is no continuous connection to the source data file. Slower initialisation time but faster detection performanc...CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.27.0: CodeMap now indicates the type name for all members Implemented running scripts 'as administrator'. Just add '//css_npp asadmin' to the script and run it as usual. 'Prepare script for distribution' now aggregates script dependency assemblies. Various improvements in CodeSnipptet, Autcompletion and MethodInfo interactions with each other. 
Added printing line number for the entries in CodeMap (subject of configuration value) Improved debugging step indication for classless scripts ...Kartris E-commerce: Kartris v2.6003: Bug fixes: Fixed issue where category could not be saved if parent categories not altered Updated split string function to 500 chars otherwise problems with attribute filtering in PowerPack Improvements: If a user has group pricing below that available through qty discounts, that will show in place of the qty discountClosedXML - The easy way to OpenXML: ClosedXML 0.72.3: 70426e13c415 ClosedXML for .Net 4.0 now uses Open XML SDK 2.5 b9ef53a6654f Merge branch 'master' of https://git01.codeplex.com/forks/vbjay/closedxml 727714e86416 Fix range.Merge(Boolean) for .Net 3.5 eb1ed478e50e Make public range.Merge(Boolean checkIntersects) 6284cf3c3991 More performance improvements when saving.DNN Blog: 06.00.07: Highlights: Enhancement: Option to show management panel on child modules Fix: Changed SPROC and code to ensure the right people are notified of pending comments. (CP-24018) Fix: Fix to have notification actions authentication assume the right module id so these will work from the messaging center (CP-24019) Fix: Fix to issue in categories in a post that will not save when no categories and no tags selectedcsv2xlsx: Source code: Source code of applicationFamily Tree Analyzer: Version 4.0.0.0-beta5: This major release introduces a significant new feature the ability to search the UK Ordnance Survey 50k Gazetteer for small placenames that often don't appear on a modern Google Map. This means significant numbers of locations that were previously not found by Google Geocoding will now be found by using OS Geocoding. In addition I've added support for loading the Scottish 1930 Parish boundary maps, whilst slightly different from the parish boundaries in use in the 1800s that are familiar to...TEncoder: 4.0.0: --4.0.0 -Added: Video downloader -Added: Total progress will be updated more smoothly -Added: MP4Box progress will be shown -Added: A tool to create gif image from video -Added: An option to disable trimming -Added: Audio track option won't be used for mpeg sources as default -Fixed: Subtitle position wasn't used -Fixed: Duration info in the file list wasn't updated after trimming -Updated: FFMpegVeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from nuget or the Microsoft Download Center This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy https://www.owasp.org/index....ConEmu - Windows console with tabs: ConEmu 140602 [Alpha]: ConEmu - developer build x86 and x64 versions. Written in C++, no additional packages required. Run "ConEmu.exe" or "ConEmu64.exe". 
Some useful information you may found: http://superuser.com/questions/tagged/conemu http://code.google.com/p/conemu-maximus5/wiki/ConEmuFAQ http://code.google.com/p/conemu-maximus5/wiki/TableOfContents If you want to use ConEmu in portable mode, just create empty "ConEmu.xml" file near to "ConEmu.exe" Tweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. - This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. Several breaking cha...New Projects.NET Extended Framework: .NET Extended FrameworkActivity Diagram: This is an attempt to parse C# Operation source and convert them into UML Activity Diagrams. The development of a suitable diagram editor is the next priorityBango Payment Flow in-app payments: Bango Payment Flow in-app paymentsDeckEditor: ?????????????????????????????????。 ?????????、CSV????、???????????。demodjango: just a demo project to record how to using djangoExport Bugs: This application will export all the bugs in a current project from TFS. It also has the option to export all attachments in the current projectFantom - Expression Based Dynamic Language Manager: Easy Dynamic Language (over MSSQL) Management for Asp.Net Web Forms. Very easy create dynamic multi language Asp.Net web site.JIRAShell: Provides PowerShell cmdlets for working with JIRA.PipeUtil: PipeUtil is a library that simplifies use the feature PIPE windows. Has the capability of client and server with threads support.Post Deployment Smoke Tester: This tool enables you to easily carry out post deployment smoke tests against your installed deployments to determine if your environment is in a good state.Quan Ly Dang Ki Internet: quan ly dang ki internetToolKD: ToolKD console application designed to change the binary code for directories that are used KitchenDraw program. Wiki Parsing Service: A web service to Parse wiki mark-up to HTML and vice versaXBee-API for .NET: This is .NET library for communication with XBee/XBee-Pro series 1 (IEEE 802.15.4) and series 2 (ZB/ZigBee Pro) OEM RF Modules. XPinterest: XPinterestZSoft-CMS: the project is going to use a new approach other than regular modular cms es.

    Read the article

  • CodePlex Daily Summary for Tuesday, June 03, 2014

    CodePlex Daily Summary for Tuesday, June 03, 2014Popular ReleasesQuickMon: Version 3.14 (Pie release): This is unofficially the 'Pie' release. There are two big changes.1. 'Presets' - basically templates. Future releases might build on this to allow users to add more presets. 2. MSI Installer now allows you to choose components (in case you don't want all collectors etc.). This means you don't have to download separate components anymore (AllAgents.zip still included in case you want to use them separately) Some other changes:1. Add/changed default file extension for monitor packs to *.qmp (...VeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Keepass2Android: 0.9.4-pre1: added plug-in support: See settings for how to get plug-ins! published QR plug-in (scan passwords, display passwords as QR code, transfer entries to other KP2A devices) published InputStick plugin (transfer credentials to your PC via bluetooth - requires InputStick USB stick) Third party apps can now simply implement querying KP2A for credentials. Are you a developer? Please add this to your app if suitable! added TOTP support (compatible with KeeOTP and TrayTotp) app should no l...Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from http://www.microsoft.com/en-us/download/details.aspx?id=43126 This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy htt...Z SqlBulkCopy Extensions: SqlBulkCopy Extensions 1.0.0: SqlBulkCopy Extensions provide MUST-HAVE methods with outstanding performance missing from the SqlBulkCopy class like Delete, Update, Merge, Upsert. Compatible with .NET 2.0, SQL Server 2000, SQL Azure and more! Bulk MethodsBulkDelete BulkInsert BulkMerge BulkUpdate BulkUpsert Utility MethodsGetSqlConnection GetSqlTransaction You like this library? Find out how and why you should support Z Project Become a Memberhttp://zzzproject.com/resources/images/all/become-a-member.png|ht...Portable Class Library for SQLite: Portable Class Library for SQLite - 3.8.4.4: This pull request from mattleibow addresses an issue with custom function creation (define functions in C# code and invoke them from SQLite as id they where regular SQL functions). Impact: Xamarin iOSTweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. - This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. 
Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. Several breaking cha...Image View Slider: Image View Slider: This is a .NET component. We create this using VB.NET. Here you can use an Image Viewer with several properties to your application form. We wish somebody to improve freely. Try this out! Author : Steven Renaldo Antony Yustinus Arjuna Purnama Putra Andre Wijaya P Martin Lidau PBK GENAP 2014 - TI UKDWAspose for Apache POI: Missing Features of Apache POI WP - v 1.1: Release contain the Missing Features in Apache POI WP SDK in Comparison with Aspose.Words for dealing with Microsoft Word. What's New ?Following Examples: Insert Picture in Word Document Insert Comments Set Page Borders Mail Merge from XML Data Source Moving the Cursor Feedback and Suggestions Many more examples are yet to come here. Keep visiting us. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.babelua: V1.5.6.0: V1.5.6.0 - 2014.5.30New feature: support quick-cocos2d-x project now; support text search in scripts folder now, you can use this function in Search Result Window;Credit Component: Credit Component: This is a sample release of Credit Component that has been made by Microsoft Visual Studio 2010. To try and use it, you need .NET framework 4.0 and Microsoft Visual Studio 2010 or newer as a minimum requirement in this download you will get media player as a sample application that use this component credit component as a main component media player source code as source code and sample usage of credit component credit component source code as source code of credit component important...SEToolbox: 01.032.014 Release 1: Added fix when loading game Textures for icons causing 'Unable to read beyond the end of the stream'. Added new Resource Report, that displays all in game resources in a concise report. Added in temp directory cleaner, to keep excess files from building up. Fixed use of colors on the windows, to work better with desktop schemes. Adding base support for multilingual resources. This will allow loading of the Space Engineers resources to show localized names, and display localized date a...ClosedXML - The easy way to OpenXML: ClosedXML 0.71.2: More memory and performance improvements. Fixed an issue with pivot table field order.Composite Iconote: Composite Iconote: This is a composite has been made by Microsoft Visual Studio 2013. Requirement: To develop this composite or use this component in your application, your computer must have .NET framework 4.5 or newer.Magick.NET: Magick.NET 6.8.9.101: Magick.NET linked with ImageMagick 6.8.9.1. Breaking changes: - Int/short Set methods of WritablePixelCollection are now unsigned. - The Q16 build no longer uses HDRI, switch to the new Q16-HDRI build if you need HDRI.fnr.exe - Find And Replace Tool: 1.7: Bug fixes Refactored logic for encoding text values to command line to handle common edge cases where find/replace operation works in GUI but not in command line Fix for bug where selection in Encoding drop down was different when generating command line in some cases. 
It was reported in: https://findandreplace.codeplex.com/workitem/34 Fix for "Backslash inserted before dot in replacement text" reported here: https://findandreplace.codeplex.com/discussions/541024 Fix for finding replacing...VG-Ripper & PG-Ripper: VG-Ripper 2.9.59: changes NEW: Added Support for 'GokoImage.com' links NEW: Added Support for 'ViperII.com' links NEW: Added Support for 'PixxxView.com' links NEW: Added Support for 'ImgRex.com' links NEW: Added Support for 'PixLiv.com' links NEW: Added Support for 'imgsee.me' links NEW: Added Support for 'ImgS.it' linksToolbox for Dynamics CRM 2011/2013: XrmToolBox (v1.2014.5.28): XrmToolbox improvement XrmToolBox updates (v1.2014.5.28)Fix connecting to a connection with custom authentication without saved password Tools improvement New tool!Solution Components Mover (v1.2014.5.22) Transfer solution components from one solution to another one Import/Export NN relationships (v1.2014.3.7) Allows you to import and export many to many relationships Tools updatesAttribute Bulk Updater (v1.2014.5.28) Audit Center (v1.2014.5.28) View Layout Replicator (v1.2014.5.28) Scrip...Microsoft Ajax Minifier: Microsoft Ajax Minifier 5.10: Fix for Issue #20875 - echo switch doesn't work for CSS CSS should honor the SASS source-file comments JS should allow multi-line comment directivesNew ProjectsAirline Management Solutions: Three layers architecture PHP Maria DB Metro-StyleBAOnline: tttboomteam: Fitness videoscsv2xlsx: this project was created to simplify process of converting csv text files to Excel tables. It uses Apache POI to work with Excel Hazza.ShapeField: Adds a field that lets you input the name of a shape to be displayed.HP AGM Monitor Service: Just for internal usage.HP AGM RestAPI Wrapper: HP AGM Rest API .Net WrapperIO Performance Verifier: IO Performance verifier is for verifying IO from fx SAN/NAS in a virtualized environment on Windows servers. Useful to verify SLA or configuration change effectIRIS Tutorials: This a repository of tutorials for the IRIS Toolbox project.Node Service Host: Host application to run node app as a service. Runs service as root with app as specified user, restarts, logging. Linux, OSX and windows.SEND SMS ALERT FROM YOUR SOFTWARE / WEBSITE: SMS API allows you to send SMS to all mobile operators across Pakistan or any other country at very very cheap rates. It allows you to send SMS through http://CSharePoint Audit Facilities Demo: Sample demo code for SharePoint Audit Log extraction and methods used. SharePoint Permission Analyzer: Permission Analyzer will scan through a SharePoint site collection and create a permission structure of the site. Works on SharePoint 2010 and SP 2013Spotify WinRT Component: WinRT Component wrapper for libspotify https://developer.spotify.com/technologies/libspotify/Windows API Interop Library: A collection of interop code in C# for the Windows API. Key exported methods, constants and structures defined. Some extension methods for WinForms controls.WPF Pricing unit: await async wpf Entity framework 6zzswire: ????

    Read the article

  • CodePlex Daily Summary for Thursday, June 05, 2014

    CodePlex Daily Summary for Thursday, June 05, 2014Popular Releases51Degrees - Device Detection and Redirection: 3.1.2.3: Version 3.1 HighlightsDevice detection algorithm is over 100 times faster. Regular expressions and levenshtein distance calculations are no longer used. The device detection algorithm performance is no longer limited by the number of device combinations contained in the dataset. Two modes of operation are available: Memory – the detection data set is loaded into memory and there is no continuous connection to the source data file. Slower initialisation time but faster detection performanc...CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.27.0: CodeMap now indicates the type name for all members Implemented running scripts 'as administrator'. Just add '//css_npp asadmin' to the script and run it as usual. 'Prepare script for distribution' now aggregates script dependency assemblies. Various improvements in CodeSnipptet, Autcompletion and MethodInfo interactions with each other. Added printing line number for the entries in CodeMap (subject of configuration value) Improved debugging step indication for classless scripts ...Load Runner - HTTP Pressure Test Tool: Load Runner 1.1: 1. added support for measuring total in time / hits / traffic / requests 2. abstracted log, default log to console / text file 3. refactored codeKartris E-commerce: Kartris v2.6003: Bug fixes: Fixed issue where category could not be saved if parent categories not altered Updated split string function to 500 chars otherwise problems with attribute filtering in PowerPack Improvements: If a user has group pricing below that available through qty discounts, that will show in place of the qty discountMagicaVoxel: MagicaVoxel Renderer ver 0.01: MagicaVoxel Renderer (Win 0.01)ClosedXML - The easy way to OpenXML: ClosedXML 0.72.3: 70426e13c415 ClosedXML for .Net 4.0 now uses Open XML SDK 2.5 b9ef53a6654f Merge branch 'master' of https://git01.codeplex.com/forks/vbjay/closedxml 727714e86416 Fix range.Merge(Boolean) for .Net 3.5 eb1ed478e50e Make public range.Merge(Boolean checkIntersects) 6284cf3c3991 More performance improvements when saving.MCE Controller: MCE Controller V1.8.6: BETA. Adds option to disable all internal commands. If the "Disable Internal Commands" checkbox in settings is checked, MCE Controller will disable all internally defined commands (e.g. VK key codes, single characters, mouse commands, etc...) and only respond to commands defined in the MCEControl.commmands file. Attempts to fix inability for clients to reconnect after they forcibly close the connection. Build 1.8.6.37445Visual F# Tools: Daily Builds Preview 06-04-2014: This preview is released for use under a proprietary license.SEToolbox: SEToolbox 01.032.018 Release 1: Added ability to merge/join two ships, regardless of origin. Added Language selection menu to set display text language (for SE resources only), and fixed inherent issues. Added full support for Dedicated Servers, allowing use on PC without Steam present, and only the Dedicated Server install. Added Browse button for used specified save game selection in Load dialog. Installation of this version will replace older version.Service monitor: Version 3.1: Added the ability of the tool to restart itself in 'Admin' mode without UAC prompt. Note that this only works after the tool has run in 'Admin' mode at least once before. This is only supported on Vista, Windows 7 or later. 
See my blog post on how it is done.DNN Blog: 06.00.07: Highlights: Enhancement: Option to show management panel on child modules Fix: Changed SPROC and code to ensure the right people are notified of pending comments. (CP-24018) Fix: Fix to have notification actions authentication assume the right module id so these will work from the messaging center (CP-24019) Fix: Fix to issue in categories in a post that will not save when no categories and no tags selectedTEncoder: 4.0.0: --4.0.0 -Added: Video downloader -Added: Total progress will be updated more smoothly -Added: MP4Box progress will be shown -Added: A tool to create gif image from video -Added: An option to disable trimming -Added: Audio track option won't be used for mpeg sources as default -Fixed: Subtitle position wasn't used -Fixed: Duration info in the file list wasn't updated after trimming -Updated: FFMpegQuickMon: Version 3.14 (Pie release): This is unofficially the 'Pie' release. There are two big changes.1. 'Presets' - basically templates. Future releases might build on this to allow users to add more presets. 2. MSI Installer now allows you to choose components (in case you don't want all collectors etc.). This means you don't have to download separate components anymore (AllAgents.zip still included in case you want to use them separately) Some other changes:1. Add/changed default file extension for monitor packs to *.qmp (...VeraCrypt: VeraCrypt version 1.0d: Changes between 1.0c and 1.0d (03 June 2014) : Correct issue while creating hidden operating system. Minor fixes (look at git history for more details).Keepass2Android: 0.9.4-pre1: added plug-in support: See settings for how to get plug-ins! published QR plug-in (scan passwords, display passwords as QR code, transfer entries to other KP2A devices) published InputStick plugin (transfer credentials to your PC via bluetooth - requires InputStick USB stick) Third party apps can now simply implement querying KP2A for credentials. Are you a developer? Please add this to your app if suitable! added TOTP support (compatible with KeeOTP and TrayTotp) app should no l...Microsoft Web Protection Library: AntiXss Library 4.3.0: Download from nuget or the Microsoft Download Center This issue finally addresses the over zealous behaviour of the HTML Sanitizer which should now function as expected once again. HTML encoding has been changed to safelist a few more characters for webforms compatibility. This will be the last version of AntiXSS that contains a sanitizer. Any new releases will be encoding libraries only. We recommend you explore other sanitizer options, for example AntiSamy https://www.owasp.org/index....Z SqlBulkCopy Extensions: SqlBulkCopy Extensions 1.0.0: SqlBulkCopy Extensions provide MUST-HAVE methods with outstanding performance missing from the SqlBulkCopy class like Delete, Update, Merge, Upsert. Compatible with .NET 2.0, SQL Server 2000, SQL Azure and more! Bulk MethodsBulkDelete BulkInsert BulkMerge BulkUpdate BulkUpsert Utility MethodsGetSqlConnection GetSqlTransaction You like this library? Find out how and why you should support Z Project Become a Memberhttp://zzzproject.com/resources/images/all/become-a-member.png|ht...Tweetinvi a friendly Twitter C# API: Tweetinvi 0.9.3.x: Timelines- Added all the parameters available from the Timeline Endpoints in Tweetinvi. 
- This is available for HomeTimeline, UserTimeline, MentionsTimeline // Simple query var tweets = Timeline.GetHomeTimeline(); // Create a parameter for queries with specific parameters var timelineParameter = Timeline.CreateHomeTimelineRequestParameter(); timelineParameter.ExcludeReplies = true; timelineParameter.TrimUser = true; var tweets = Timeline.GetHomeTimeline(timelineParameter); Tweetinvi 0.9.3.1...Sandcastle Help File Builder: Help File Builder and Tools v2014.5.31.0: General InformationIMPORTANT: On some systems, the content of the ZIP file is blocked and the installer may fail to run. Before extracting it, right click on the ZIP file, select Properties, and click on the Unblock button if it is present in the lower right corner of the General tab in the properties dialog. This release completes removal of the branding transformations and implements the new VS2013 presentation style that utilizes the new lightweight website format. Several breaking cha...Magick.NET: Magick.NET 6.8.9.101: Magick.NET linked with ImageMagick 6.8.9.1. Breaking changes: - Int/short Set methods of WritablePixelCollection are now unsigned. - The Q16 build no longer uses HDRI, switch to the new Q16-HDRI build if you need HDRI.New Projects12306helper: 12306 ????2112110044: dasda2112110202: asdfsdfsdfsd2112110298: TieuDan2112110315: vftgcdxfrBaidu PCS: BaiduPCSExcel2Sobek: Here the project Excel2Sobek is hosted. Excel2Sobek is a preprocessor in Excel's VBA for the SOBEK 2 hydrological and hydraulic simulation model by Deltares.Help Desk Pro: Manage a business of customer serviceMARGYE: MARGYE CMS CMR E-CommerceMemberBS: MemberBS20140605MiKroTik API: Api untuk menghubungkan aplikasi anda dengan router Mikrotik.MiniProfiler + Log4Net: Use MiniProfiler and Log4Net in ASP.NET projectStore Apps Unity App Base for Prism: This simple library provides an Application Base for a Windows Store application that automatically connects Unity to provide injection as the container.SuperCaptcha for MVC: Custom captcha control for MVCTC760240: myappwithfailuresWindows Phone 8 Dilbert Comic Reader: Basic WP8.1 app that displays comics from http://www.dilbert.com.x86 Proved: Prove what you run!

    Read the article

  • StreamInsight 2.1, meet LINQ

    - by Roman Schindlauer
    Someone recently called LINQ “magic” in my hearing. I leapt to LINQ’s defense immediately. Turns out some people don’t realize “magic” can be a pejorative term. I thought LINQ needed demystification. Here’s your best demystification resource: http://blogs.msdn.com/b/mattwar/archive/2008/11/18/linq-links.aspx. I won’t repeat much of what Matt Warren says in his excellent series, but will talk about some core ideas and how they affect the 2.1 release of StreamInsight. Let’s tell the story of a LINQ query.

Compile time

It begins with some code:

IQueryable<Product> products = ...;
var query = from p in products
            where p.Name == "Widget"
            select p.ProductID;
foreach (int id in query)
{
    ...

When the code is compiled, the C# compiler (among other things) de-sugars the query expression (see C# spec section 7.16):

...
var query = products.Where(p => p.Name == "Widget").Select(p => p.ProductID);
...

Overload resolution subsequently binds the Queryable.Where<Product> and Queryable.Select<Product, int> extension methods (see C# spec sections 7.5 and 7.6.5). After overload resolution, the compiler knows something interesting about the anonymous functions (lambda syntax) in the de-sugared code: they must be converted to expression trees, i.e., “an object structure that represents the structure of the anonymous function itself” (see C# spec section 6.5). The conversion is equivalent to the following rewrite:

...
var prm1 = Expression.Parameter(typeof(Product), "p");
var prm2 = Expression.Parameter(typeof(Product), "p");
var query = Queryable.Select<Product, int>(
    Queryable.Where<Product>(
        products,
        Expression.Lambda<Func<Product, bool>>(
            Expression.Equal(Expression.Property(prm1, "Name"), Expression.Constant("Widget")), prm1)),
    Expression.Lambda<Func<Product, int>>(Expression.Property(prm2, "ProductID"), prm2));
...

If the “products” expression had type IEnumerable<Product>, the compiler would have chosen the Enumerable.Where and Enumerable.Select extension methods instead, in which case the anonymous functions would have been converted to delegates. At this point, we’ve reduced the LINQ query to familiar code that will compile in C# 2.0. (Note that I’m using C# snippets to illustrate transformations that occur in the compiler, not to suggest a viable compiler design!)

Runtime

When the above program is executed, the Queryable.Where method is invoked. It takes two arguments. The first is an IQueryable<> instance that exposes an Expression property and a Provider property. The second is an expression tree. The Queryable.Where method implementation looks something like this:

public static IQueryable<T> Where<T>(this IQueryable<T> source, Expression<Func<T, bool>> predicate)
{
    // 'thisMethod' stands for the MethodInfo of this Where<T> method.
    return source.Provider.CreateQuery<T>(
        Expression.Call(thisMethod, source.Expression, Expression.Quote(predicate)));
}

Notice that the method is really just composing a new expression tree that calls itself with arguments derived from the source and predicate arguments. Also notice that the query object returned from the method is associated with the same provider as the source query. By invoking operator methods, we’re constructing an expression tree that describes a query. Interestingly, the compiler and operator methods are colluding to construct a query expression tree.
The important takeaway is that expression trees are built in one of two ways: (1) by the compiler when it sees an anonymous function that needs to be converted to an expression tree, and; (2) by a query operator method that constructs a new queryable object with an expression tree rooted in a call to the operator method (self-referential). Next we hit the foreach block. At this point, the power of LINQ queries becomes apparent. The provider is able to determine how the query expression tree is evaluated! The code that began our story was intentionally vague about the definition of the “products” collection. Maybe it is a queryable in-memory collection of products: var products = new[]     { new Product { Name = "Widget", ProductID = 1 } }.AsQueryable(); The in-memory LINQ provider works by rewriting Queryable method calls to Enumerable method calls in the query expression tree. It then compiles the expression tree and evaluates it. It should be mentioned that the provider does not blindly rewrite all Queryable calls. It only rewrites a call when its arguments have been rewritten in a way that introduces a type mismatch, e.g. the first argument to Queryable.Where<Product> being rewritten as an expression of type IEnumerable<Product> from IQueryable<Product>. The type mismatch is triggered initially by a “leaf” expression like the one associated with the AsQueryable query: when the provider recognizes one of its own leaf expressions, it replaces the expression with the original IEnumerable<> constant expression. I like to think of this rewrite process as “type irritation” because the rewritten leaf expression is like a foreign body that triggers an immune response (further rewrites) in the tree. The technique ensures that only those portions of the expression tree constructed by a particular provider are rewritten by that provider: no type irritation, no rewrite. Let’s consider the behavior of an alternative LINQ provider. If “products” is a collection created by a LINQ to SQL provider: var products = new NorthwindDataContext().Products; the provider rewrites the expression tree as a SQL query that is then evaluated by your favorite RDBMS. The predicate may ultimately be evaluated using an index! In this example, the expression associated with the Products property is the “leaf” expression. StreamInsight 2.1 For the in-memory LINQ to Objects provider, a leaf is an in-memory collection. For LINQ to SQL, a leaf is a table or view. When defining a “process” in StreamInsight 2.1, what is a leaf? To StreamInsight a leaf is logic: an adapter, a sequence, or even a query targeting an entirely different LINQ provider! How do we represent the logic? Remember that a standing query may outlive the client that provisioned it. A reference to a sequence object in the client application is therefore not terribly useful. But if we instead represent the code constructing the sequence as an expression, we can host the sequence in the server: using (var server = Server.Connect(...)) {     var app = server.Applications["my application"];     var source = app.DefineObservable(() => Observable.Range(0, 10, Scheduler.NewThread));     var query = from i in source where i % 2 == 0 select i; } Example 1: defining a source and composing a query Let’s look in more detail at what’s happening in example 1. We first connect to the remote server and retrieve an existing app. Next, we define a simple Reactive sequence using the Observable.Range method. 
Notice that the call to the Range method is in the body of an anonymous function. This is important because it means the source sequence definition is in the form of an expression, rather than simply an opaque reference to an IObservable<int> object. The variation in Example 2 fails. Although it looks similar, the sequence is now a reference to an in-memory observable collection: var local = Observable.Range(0, 10, Scheduler.NewThread); var source = app.DefineObservable(() => local); // can’t serialize ‘local’! Example 2: error referencing unserializable local object The Define* methods support definitions of operator tree leaves that target the StreamInsight server. These methods all have the same basic structure. The definition argument is a lambda expression taking between 0 and 16 arguments and returning a source or sink. The method returns a proxy for the source or sink that can then be used for the usual style of LINQ query composition. The “define” methods exploit the compile-time C# feature that converts anonymous functions into translatable expression trees! Query composition exploits the runtime pattern that allows expression trees to be constructed by operators taking queryable and expression (Expression<>) arguments. The practical upshot: once you’ve Defined a source, you can compose LINQ queries in the familiar way using query expressions and operator combinators. Notably, queries can be composed using pull-sequences (LINQ to Objects IQueryable<> inputs), push sequences (Reactive IQbservable<> inputs), and temporal sequences (StreamInsight IQStreamable<> inputs). You can even construct processes that span these three domains using “bridge” method overloads (ToEnumerable, ToObservable and To*Streamable). Finally, the targeted rewrite via type irritation pattern is used to ensure that StreamInsight computations can leverage other LINQ providers as well. Consider the following example (this example depends on Interactive Extensions): var source = app.DefineEnumerable((int id) =>     EnumerableEx.Using(() =>         new NorthwindDataContext(), context =>             from p in context.Products             where p.ProductID == id             select p.ProductName)); Within the definition, StreamInsight has no reason to suspect that it ‘owns’ the Queryable.Where and Queryable.Select calls, and it can therefore defer to LINQ to SQL! Let’s use this source in the context of a StreamInsight process: var sink = app.DefineObserver(() => Observer.Create<string>(Console.WriteLine)); var query = from name in source(1).ToObservable()             where name == "Widget"             select name; using (query.Bind(sink).Run("process")) {     ... } When we run the binding, the source portion which filters on product ID and projects the product name is evaluated by SQL Server. Outside of the definition, responsibility for evaluation shifts to the StreamInsight server where we create a bridge to the Reactive Framework (using ToObservable) and evaluate an additional predicate. It’s incredibly easy to define computations that span multiple domains using these new features in StreamInsight 2.1! Regards, The StreamInsight Team
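As a footnote to the rewrite mechanics described above, the "replace Queryable calls with Enumerable calls" step can be sketched in a few lines of C# using ExpressionVisitor. This is a simplified, hypothetical illustration, not the actual LINQ to Objects or StreamInsight provider implementation (the real providers handle many more cases, including the leaf "type irritation" rewrite); the class name QueryableToEnumerableRewriter is invented for this example.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using System.Reflection;

class Product { public string Name; public int ProductID; }

// Walks an expression tree and swaps Queryable.* calls for their Enumerable.* counterparts,
// unwrapping quoted lambdas so they fit the delegate-typed Enumerable parameters.
class QueryableToEnumerableRewriter : ExpressionVisitor
{
    protected override Expression VisitMethodCall(MethodCallExpression node)
    {
        if (node.Method.DeclaringType != typeof(Queryable))
            return base.VisitMethodCall(node);

        // Rewrite the arguments first, removing Expression.Quote wrappers.
        var args = new List<Expression>();
        foreach (var argument in node.Arguments)
        {
            var visited = Visit(argument);
            args.Add(visited.NodeType == ExpressionType.Quote
                ? ((UnaryExpression)visited).Operand
                : visited);
        }

        // Pick the Enumerable overload whose parameters accept the rewritten arguments.
        MethodInfo target = typeof(Enumerable)
            .GetMethods(BindingFlags.Public | BindingFlags.Static)
            .Where(m => m.Name == node.Method.Name
                     && m.GetGenericArguments().Length == node.Method.GetGenericArguments().Length
                     && m.GetParameters().Length == args.Count)
            .Select(m => m.MakeGenericMethod(node.Method.GetGenericArguments()))
            .First(m => m.GetParameters()
                         .Select(p => p.ParameterType)
                         .Zip(args, (p, a) => p.IsAssignableFrom(a.Type))
                         .All(ok => ok));

        return Expression.Call(target, args);
    }
}

class RewriteDemo
{
    static void Main()
    {
        var products = new[] { new Product { Name = "Widget", ProductID = 1 } }.AsQueryable();
        var query = products.Where(p => p.Name == "Widget").Select(p => p.ProductID);

        // Rewrite the Queryable-based tree and compile it for plain in-memory execution.
        var rewritten = new QueryableToEnumerableRewriter().Visit(query.Expression);
        var compiled = Expression.Lambda<Func<IEnumerable<int>>>(rewritten).Compile();

        foreach (int id in compiled())
            Console.WriteLine(id);   // prints 1
    }
}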

    Read the article

  • Enterprise Process Maps: A Process Picture worth a Million Words

    - by raul.goycoolea
    Getting Started with Business Transformations

A well-known proverb states that "A picture is worth a thousand words." In relation to Business Process Management (BPM), a credible analyst might have a few questions. What if the picture was taken from some particular angle, like directly overhead? What if it was taken from only an inch away or a mile away? What if the photographer did not focus the camera correctly? Does the value of the picture depend on who is looking at it? Enterprise Process Maps are analogous in this sense of relative value. Every BPM project (holistic BPM kick-off, enterprise system implementation, Service-oriented Architecture, business process transformation, corporate performance management, etc.) should begin with a clear understanding of the business environment, from the biggest-picture representations down to the lowest level required or desired for the particular project type, scope and objectives. The Enterprise Process Map serves as an entry point for the process architecture and is defined as the single highest level of process mapping for an organization. It is constructed and evaluated during the Strategy Phase of the Business Process Management Lifecycle (see Figure 1).

Fig. 1: Business Process Management Lifecycle

Many organizations view such maps as visual abstractions, constructed for the single purpose of process categorization. This, in turn, results in a lesser focus on the inherent intricacies of the Enterprise Process view, which are explored in the course of this paper. With the main focus of a large-scale process documentation effort usually underlying an ERP or other system implementation, it is common for the work to be driven by the desire to "get to the details," and to the type of modeling that will derive near-term tangible results. For instance, a project in American Pharmaceutical Company X is driven by the Director of IT. With 120+ systems in place, and a lack of standardized processes across the United States, he and the VP of IT have decided to embark on a long-term ERP implementation. At the forefront for both are questions such as: How does my application architecture map to the business? What are each application's functionalities, and where do the business processes utilize them? Where can we retire legacy systems? Well-developed BPM methodologies prescribe numerous model types to capture such information and allow for thorough analysis in these areas. Process-to-application maps, Event Driven Process Chains, etc. provide this level of detail and facilitate answering such project-specific questions. These models and such analysis are appropriately carried out at a relatively low level of process detail (see Figure 2).

Fig. 2: The Level Concept, Generic Process Hierarchy

Some of the questions remaining are ones of documentation longevity, the continuation of BPM practice in the organization, process governance and ownership, process transparency, and clarity in business process objectives and strategy.

The Level Concept in Brief

Figure 2 shows a generic, four-level process hierarchy depicting the breakdown of a "Process Area" into progressively more detailed process classifications.
The number of levels and the names of these levels are flexible, and can be fitted to the organization's chosen terminology or any other reference model that makes logical sense for both short- and long-term process description. It is at Level 1 (in this case the Process Area level) that the Enterprise Process Map is created. This map and its contained objects become the foundation for a top-down approach to subsequent mapping, object relationship development, and analysis of the organization's processes and its supporting infrastructure. Additionally, this picture serves as a communication device, at an executive level, describing the design of the business in its service to a customer. It seems, then, imperative that the process development effort, and this map, start off on the right foot. Figuring out just what that right foot is, however, is critical and trend-setting in an evolving organization.

Key Considerations

Enterprise Process Maps are usually not as living and breathing as other process maps. Just as it would be an extremely difficult task to change the foundation of the Sears Tower or a city plan for the entire city of Chicago, the Enterprise Process view of an organization usually remains unchanged once developed (unless, of course, an organization is at a stage where it is capable of true, high-level process innovation). Regardless, the Enterprise Process Map is a key first step, and one that must be taken in a precise way. What makes this groundwork solid depends not only on the materials used to construct it (process areas), but also on the layout plan and knowledge base of what will be built (the entire process architecture). It seems reasonable that care and consideration are required to create this critical high-level map... but what are the important factors? Does the process modeler need to worry about how many process areas there are? About who is looking at it? Should he only use the color pink because it's his boss' favorite color? Interestingly, and perhaps surprisingly, these are all valid considerations that may just require a bit of structure. Below are three key factors to consider when building an Enterprise Process Map:

·         Company Strategic Focus
·         Process Categorization: Customer is Core
·         End-to-end versus Functional Processes

Company Strategic Focus

As mentioned above, the Enterprise Process Map is created during the Strategy Phase of the Business Process Management Lifecycle. From the Oracle Business Process Management methodology for business transformation, it is apparent that business processes exist for the purpose of achieving the strategic objectives of an organization. In a prescribed, top-down approach to process development, it must be ensured that each process fulfills its objectives and, in an aggregated manner, drives fulfillment of the strategic objectives of the company, whether for particular business segments or in a broader sense. This is a crucial point, as the strategic messages of the company must therefore resound in its process maps, in particular one that spans the processes of the complete business: the Enterprise Process Map. One simple example from Company X is shown below (see Figure 3).

Fig. 3: Company X Enterprise Process Map

In reviewing Company X's Enterprise Process Map, one can immediately begin to understand the general strategic mindset of the organization. It shows that Company X is focused on its customers, defining 10 of its process areas as belonging to customer-focused categories.
Additionally, the organization views these end-customer-oriented process areas as part of customer-fulfilling value chains, while support process areas do not provide as much contiguous value. However, by including both support and strategic process categorizations, it becomes apparent that all processes are considered vital to the success of the customer-oriented focus processes. Below is an example from Company Y (see Figure 4).

Fig. 4: Company Y Enterprise Process Map

Company Y, although also a customer-oriented company, sends a differently focused message with its depiction of the Enterprise Process Map. Along the top of the map is the company's product tree, overarching the process areas, which when executed deliver the products themselves. This indicates one strategic objective of excellence in product quality. Additionally, the view represents a less linear value chain, with strong overlaps of the various process areas. Marketing and quality management are seen as key support processes, as they span the process lifecycle. Often, companies may incorporate graphics, logos and symbols representing customers and suppliers, and other objects, to truly send the strategic message to the business. Other times, Enterprise Process Maps may show high-level assignments of responsibility to organizational units, or the application types that support the process areas. Hundreds of formats and focuses can be applied to an Enterprise Process Map. What is of vital importance, however, is which formats and focuses are chosen to truly represent the direction of the company, and to serve as a driver for focusing the business on the strategic objectives it sets forth.

Process Categorization: Customer is Core

In the previous two examples, processes were grouped using differing categories and techniques. Company X showed one support and three customer process categorizations using encompassing chevron objects; Company Y achieved a less distinct categorization using a gradual color scheme. Either way, and in general, modeling of the process areas becomes even more valuable and easily understood within the context of business categorization, be it strategic or otherwise. But how one categorizes processes is typically more complex than simply choosing object shapes and colors. Previously, it was stated that the ideal is a prescribed top-down approach to developing processes, to ensure linkages all the way back up to corporate strategy. But what about external influences? What forces push and pull corporate strategy? Industry maturity, product lifecycle, market profitability, competition, etc. can all drive the critical success factors of a particular business segment, or the company as a whole, in addition to previous corporate strategy. This may seem to be turning into a discussion of theory, but that is far from the case. In fact, in years of recent study and evolution of the way businesses operate, across industries and across the globe, one invariable has surfaced with such strength as to make it undeniable in the game plan of any strategy fit for survival. That constant is the customer. Many of a company's critical success factors, in any business segment, relate to the customer: customer retention, satisfaction, loyalty, etc. Businesses serve customers, and so do a business's processes, mapped or unmapped. The most effective way to categorize processes is in a manner that visualizes convergence to what is core for a company.
It is the value chain, beginning with the customer in mind and ending with the fulfillment of that customer, that becomes the core or the centerpiece of the Enterprise Process Map. (See Figure 5)

Fig. 5: Company Z Enterprise Process Map

Company Z has what may be viewed as several different perspectives or "cuts" baked into its Enterprise Process Map. It has divided its processes into three main categories (top, middle, and bottom): Management Processes, the Core Value Chain, and Supporting Processes. The Core category begins with Corporate Marketing (which contains the activities of beginning to engage customers) and ends with Customer Service Management. Within the value chain, this company has divided its processes into the focus areas of its two primary business lines, Foods and Beverages. Does this mean that areas such as Strategy, Information Management or Project Management are not as important as those in the Core category? No! In some cases, though, depending on the organization's understanding of high-level BPM concepts, use of category names such as "Core," "Management" or "Support" can be a touchy subject. What is important to understand is that, no matter the nomenclature chosen, the Core processes are those that drive directly to customer value, Support processes are those which make the Core processes possible to execute, and Management processes are those which steer and influence the Core. Some common terms for these three basic categorizations are Core, Customer Fulfillment, Customer Relationship Management, Governing, Controlling, Enabling, Support, etc.

End-to-end versus Functional Processes

Every level of process, high or low (function, task, activity, process/work step, or whatever an organization calls it), should add value to the flow of business in an organization. Suppose that within the process "Deliver Package," there is a documented task titled "Stop for ice cream." It doesn't take a process expert to deduce the room for improvement. Though stopping for ice cream may create gain for the one person performing it, it likely benefits neither the organization nor, more importantly, the customer. In most cases, "Stop for ice cream" wouldn't make it past the first pass of To-Be process development. What would make the cut, however, would be a flow of tasks that, each having its own value add, build up to greater and greater levels of process objective. In this case, those tasks would combine to achieve a status of "package delivered." Figure 6 shows a simple example: just as the package cannot be delivered (the outcome of the process) without first being retrieved, loaded, and the travel destination reached (the outcomes of the process steps), some higher level of process, "Play Practical Joke" (e.g., a main process or process area), cannot be completed until the package is delivered. It seems that isolated or functionally separated processes, such as "Deliver Package" (shown in Figure 6), are necessary, but are always part of a bigger value chain. Each of these individual processes must be analyzed within the context of that value chain in order to ensure successful end-to-end process performance. For example, this company's "Create Joke Package" process could be operating flawlessly and efficiently, but if a joke is never developed, the package cannot be created, and the end-to-end process breaks.

Fig. 6: End to End Process Construction

That being recognized, it is clear that processes must be viewed as end-to-end, customer-to-customer, and in the context of company strategy.
But as can also be seen from the previous example, these vital end-to-end processes cannot be built without the functionally oriented building blocks. Without one, the other cannot be had, or at least not in a complete and organized fashion. As it turns out (though not discussed in depth here), the process modeling effort, BPM organizational development, and comprehensive coverage cannot be fully realized without a semi-functional, process-oriented approach. An Enterprise Process Map, then, should be concerned with both views: the functional building blocks, and access points to the business-critical end-to-end processes they construct. Without the functional building blocks, all streams of work needed for any business transformation would be lost in a mess of process disorganization. End-to-end views are essential for optimization in context, understanding customer impacts, baselining all project phases, and aligning objectives. Including both views on an Enterprise Process Map allows management to understand the functional orientation of the company's processes, while still providing access to the end-to-end processes, which are most valuable to them. (See Figures 7 and 8)

Fig. 7: Simplified Enterprise Process Map with end-to-end Access Point

The above examples show two unique ways to achieve a successful Enterprise Process Map. The first example is a simple map that shows a high-level set of process areas and a separate section with the end-to-end processes of concern for the organization. This particular map is filtered to show just one vital end-to-end process for a project-specific focus.

Fig. 8: Detailed Enterprise Process Map showing connected Functional Processes

The second example shows a more complex arrangement and categorization of functional processes (the names of the process areas have been removed). The end-to-end perspective is achieved at this level through the connections (interfaces at lower levels) between these functional process areas. An important point to note is that the organization of these two views of the Enterprise Process Map depends, in large part, on the orientation of its audience and the complexity of the landscape at the highest level. If both are not apparent, the Enterprise Process Map is missing an opportunity to serve as a holistic, high-level view.

Conclusion

In the world of BPM, and specifically regarding Enterprise Process Maps, a picture can be worth as many words as the thought and effort that are put into it. Enterprise Process Maps alone cannot change an organization, but they serve more purposes than initially meet the eye, and therefore must be designed in a way that enables a BPM mindset, business process understanding and business transformation efforts. Every Enterprise Process Map will, and should, be different across organizations. Its design will be driven by company strategy, a level of customer focus, and functional versus end-to-end orientations. This high-level description of the considerations behind Enterprise Process Maps is not a prescriptive "how to" guide; moreover, a company attempting to create one may not have the practical BPM experience to truly explore its options or its impacts on the coming work of business process transformation. The biggest takeaway is that process modeling, at all levels, is a science and an art, and art is open to interpretation. It is critical that the modeler of the highest level of process mapping be cognizant of the message he is delivering and the factors at hand.
Without sufficient focus on the design of the Enterprise Process Map, an entire BPM effort may suffer. For additional information, please see Oracle Business Process Management.

    Read the article

  • Oracle BI Server Modeling, Part 1- Designing a Query Factory

    - by bob.ertl(at)oracle.com
Welcome to Oracle BI Development's BI Foundation blog, focused on helping you get the most value from your Oracle Business Intelligence Enterprise Edition (BI EE) platform deployments. In my first series of posts, I plan to show developers the concepts and best practices for modeling in the Common Enterprise Information Model (CEIM), the semantic layer of Oracle BI EE. In this segment, I will lay the groundwork for the modeling concepts. First, I will cover the big picture of how the BI Server fits into the system, and how the CEIM controls the query processing.

Oracle BI EE Query Cycle

The purpose of the Oracle BI Server is to bridge the gap between the presentation services and the data sources. There are typically a variety of data sources in a variety of technologies: relational, normalized transaction systems; relational star-schema data warehouses and marts; multidimensional analytic cubes and financial applications; flat files, Excel files, XML files, and so on. Business datasets can reside in a single type of source, or, most of the time, are spread across various types of sources. Presentation services users are generally business people who need to be able to query that set of sources without any knowledge of technologies, schemas, or how sources are organized in their company. They think of business analysis in terms of measures with specific calculations, hierarchical dimensions for breaking those measures down, and detailed reports of the business transactions themselves. Most of them create queries without knowing it, by picking a dashboard page and some filters. Others create their own analysis by selecting metrics and dimensional attributes, and possibly creating additional calculations. The BI Server bridges that gap from simple business terms to technical physical queries by exposing just the business-focused measures and dimensional attributes that business people can use in their analyses and dashboards. After they make their selections and start the analysis, the BI Server plans the best way to query the data sources, writes the optimized sequence of physical queries to those sources, post-processes the results, and presents them to the client as a single result set suitable for tables, pivots and charts. The CEIM is a model that controls the processing of the BI Server. It provides the subject areas that presentation services exposes for business users to select simplified metrics and dimensional attributes for their analysis. It models the mappings to the physical data access, the calculations and logical transformations, and the data access security rules. The CEIM consists of metadata stored in the repository, authored by developers using the Administration Tool client. Presentation services and other query clients create their queries in BI EE's SQL-92 language, called Logical SQL or LSQL. The API simply uses ODBC or JDBC to pass the query to the BI Server. Presentation services writes the LSQL query in terms of the simplified objects presented to the users. The BI Server creates a query plan, and rewrites the LSQL into fully-detailed SQL or other languages suitable for querying the physical sources. For example, the Logical SQL shown first below was rewritten into the physical SQL that follows it, targeting an Oracle 11g database.
Logical SQL:

SELECT
    "D0 Time"."T02 Per Name Month" saw_0,
    "D4 Product"."P01  Product" saw_1,
    "F2 Units"."2-01  Billed Qty  (Sum All)" saw_2
FROM "Sample Sales"
ORDER BY saw_0, saw_1

Physical SQL:

WITH SAWITH0 AS (
    select T986.Per_Name_Month as c1,
           T879.Prod_Dsc as c2,
           sum(T835.Units) as c3,
           T879.Prod_Key as c4
    from Product T879 /* A05 Product */ ,
         Time_Mth T986 /* A08 Time Mth */ ,
         FactsRev T835 /* A11 Revenue (Billed Time Join) */
    where ( T835.Prod_Key = T879.Prod_Key
            and T835.Bill_Mth = T986.Row_Wid )
    group by T879.Prod_Dsc, T879.Prod_Key, T986.Per_Name_Month
)
select SAWITH0.c1 as c1,
       SAWITH0.c2 as c2,
       SAWITH0.c3 as c3
from SAWITH0
order by c1, c2

Probably everybody reading this blog can write SQL or MDX. However, the trick in designing the CEIM is that you are modeling a query-generation factory. Rather than hand-crafting individual queries, you model behavior and relationships, thus configuring the BI Server machinery to manufacture millions of different queries in response to random user requests. This mass production requires a different mindset and approach than when you are designing individual SQL statements in tools such as Oracle SQL Developer, Oracle Hyperion Interactive Reporting (formerly Brio), or Oracle BI Publisher.

The Structure of the Common Enterprise Information Model (CEIM)

The CEIM has a unique structure specifically for modeling the relationships and behaviors that fill the gap from logical user requests to physical data source queries and back to the result. The model divides the functionality into three specialized layers, called Presentation, Business Model and Mapping, and Physical, as shown below. Presentation services clients can generally only see the presentation layer, and the objects in the presentation layer are normally the only ones used in the LSQL request. When a request comes into the BI Server from presentation services or another client, the relationships and objects in the model allow the BI Server to select the appropriate data sources, create a query plan, and generate the physical queries. That's the left to right flow in the diagram below. When the results come back from the data source queries, the right to left relationships in the model show how to transform the results and perform any final calculations and functions that could not be pushed down to the databases.

Business Model

Think of the business model as the heart of the CEIM you are designing. This is where you define the analytic behavior seen by the users, and the superset library of metric and dimension objects available to the user community as a whole. It also provides the baseline business-friendly names and user-readable dictionary. For these reasons, it is often called the "logical" model: it is a virtual database schema that persists no data, but can be queried as if it is a database. The business model always has a dimensional shape (more on this in future posts), and its simple shape and terminology hide the complexity of the source data models. Besides hiding complexity and normalizing terminology, this layer adds most of the analytic value, as well. This is where you define the rich, dimensional behavior of the metrics and complex business calculations, as well as the conformed dimensions and hierarchies.
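As a small, hypothetical illustration of how those definitions get reused: the request below asks for the same modeled metric as the example above, just in a different shape and with an added filter (the product value is invented). No new SQL is authored against the sources.

-- Hedged sketch of another Logical SQL request, reusing the object names
-- from the example above; 'Example Product' is a made-up filter value.
SELECT
    "D0 Time"."T02 Per Name Month" saw_0,
    "F2 Units"."2-01  Billed Qty  (Sum All)" saw_1
FROM "Sample Sales"
WHERE "D4 Product"."P01  Product" = 'Example Product'
ORDER BY saw_0

The request can stay this simple because the metric's aggregation behavior is already defined once in the business model.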
It contributes to the ease of use for business users, since the dimensional metric definitions apply in any context of filters and drill-downs, and the conformed dimensions enable dashboard-wide filters and guided analysis links that bring context along from one page to the next.  The conformed dimensions also provide a key to hiding the complexity of many sources, including federation of different databases, behind the simple business model. Note that the expression language in this layer is LSQL, so that any expression can be rewritten into any data source's query language at run time.  This is important for federation, where a given logical object can map to several different physical objects in different databases.  It is also important to portability of the CEIM to different database brands, which is a key requirement for Oracle's BI Applications products. Your requirements process with your user community will mostly affect the business model.  This is where you will define most of the things they specifically ask for, such as metric definitions.  For this reason, many of the best-practice methodologies of our consulting partners start with the high-level definition of this layer. Physical Model The physical model connects the business model that meets your users' requirements to the reality of the data sources you have available. In the query factory analogy, think of the physical layer as the bill of materials for generating physical queries.  Every schema, table, column, join, cube, hierarchy, etc., that will appear in any physical query manufactured at run time must be modeled here at design time. Each physical data source will have its own physical model, or "database" object in the CEIM.  The shape of each physical model matches the shape of its physical source.  In other words, if the source is normalized relational, the physical model will mimic that normalized shape.  If it is a hypercube, the physical model will have a hypercube shape.  If it is a flat file, it will have a denormalized tabular shape. To aid in query optimization, the physical layer also tracks the specifics of the database brand and release.  This allows the BI Server to make the most of each physical source's distinct capabilities, writing queries in its syntax, and using its specific functions. This allows the BI Server to push processing work as deep as possible into the physical source, which minimizes data movement and takes full advantage of the database's own optimizer.  For most data sources, native APIs are used to further optimize performance and functionality. The value of having a distinct separation between the logical (business) and physical models is encapsulation of the physical characteristics.  This encapsulation is another enabler of packaged BI applications and federation.  It is also key to hiding the complex shapes and relationships in the physical sources from the end users.  Consider a routine drill-down in the business model: physically, it can require a drill-through where the first query is MDX to a multidimensional cube, followed by the drill-down query in SQL to a normalized relational database.  The only difference from the user's point of view is that the 2nd query added a more detailed dimension level column - everything else was the same. Mappings Within the Business Model and Mapping Layer, the mappings provide the binding from each logical column and join in the dimensional business model, to each of the objects that can provide its data in the physical layer.  
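For instance, sticking to the physical names from the earlier example and adding one assumed table, the logical column "2-01  Billed Qty  (Sum All)" might be bound both to the detail fact table and to a pre-aggregated month-level table. Either physical query below could satisfy a Units-by-Month request; the aggregate table name is an assumption for illustration only.

-- Binding 1: detail-grain fact table (tables taken from the physical SQL above)
SELECT T986.Per_Name_Month, SUM(T835.Units) AS units
FROM   FactsRev T835
JOIN   Time_Mth T986 ON T835.Bill_Mth = T986.Row_Wid
GROUP BY T986.Per_Name_Month;

-- Binding 2: a hypothetical pre-aggregated table already at the month grain
SELECT Per_Name_Month, Units AS units
FROM   Units_Agg_Month;

Both bindings describe the same logical content at different grains.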
When there is more than one option for a physical source, rules in the mappings are applied to the query context to determine which of the data sources should be hit, and how to combine their results if more than one is used.  These rules specify aggregate navigation, vertical partitioning (fragmentation), and horizontal partitioning, any of which can be federated across multiple, heterogeneous sources.  These mappings are usually the most sophisticated part of the CEIM. Presentation You might think of the presentation layer as a set of very simple relational-like views into the business model.  Over ODBC/JDBC, they present a relational catalog consisting of databases, tables and columns.  For business users, presentation services interprets these as subject areas, folders and columns, respectively.  (Note that in 10g, subject areas were called presentation catalogs in the CEIM.  In this blog, I will stick to 11g terminology.)  Generally speaking, presentation services and other clients can query only these objects (there are exceptions for certain clients such as BI Publisher and Essbase Studio). The purpose of the presentation layer is to specialize the business model for different categories of users.  Based on a user's role, they will be restricted to specific subject areas, tables and columns for security.  The breakdown of the model into multiple subject areas organizes the content for users, and subjects superfluous to a particular business role can be hidden from that set of users.  Customized names and descriptions can be used to override the business model names for a specific audience.  Variables in the object names can be used for localization. For these reasons, you are better off thinking of the tables in the presentation layer as folders than as strict relational tables.  The real semantics of tables and how they function is in the business model, and any grouping of columns can be included in any table in the presentation layer.  In 11g, an LSQL query can also span multiple presentation subject areas, as long as they map to the same business model. Other Model Objects There are some objects that apply to multiple layers.  These include security-related objects, such as application roles, users, data filters, and query limits (governors).  There are also variables you can use in parameters and expressions, and initialization blocks for loading their initial values on a static or user session basis.  Finally, there are Multi-User Development (MUD) projects for developers to check out units of work, and objects for the marketing feature used by our packaged customer relationship management (CRM) software.   The Query Factory At this point, you should have a grasp on the query factory concept.  When developing the CEIM model, you are configuring the BI Server to automatically manufacture millions of queries in response to random user requests. You do this by defining the analytic behavior in the business model, mapping that to the physical data sources, and exposing it through the presentation layer's role-based subject areas. While configuring mass production requires a different mindset than when you hand-craft individual SQL or MDX statements, it builds on the modeling and query concepts you already understand. The following posts in this series will walk through the CEIM modeling concepts and best practices in detail.  
We will initially review dimensional concepts so you can understand the business model, and then present a pattern-based approach to learning the mappings from a variety of physical schema shapes and deployments to the dimensional model.  Along the way, we will also present the dimensional calculation template, and learn how to configure the many additivity patterns.
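To close with one more hedged sketch of the "define once, reuse everywhere" idea from the business model discussion above: because the expression language of that layer is LSQL, a calculation can be written in business terms and rewritten into each source's dialect at run time. The revenue column name below is hypothetical; the other object names come from the earlier example, and a calculation like this would more typically be defined once as a logical column rather than typed into every request.

-- Hypothetical Logical SQL with an inline calculation (revenue per billed unit).
SELECT
    "D4 Product"."P01  Product" saw_0,
    "F2 Units"."2-01  Billed Qty  (Sum All)" saw_1,
    CASE
        WHEN "F2 Units"."2-01  Billed Qty  (Sum All)" = 0 THEN NULL
        ELSE "F0 Revenue"."1-01  Revenue  (Sum All)"
             / "F2 Units"."2-01  Billed Qty  (Sum All)"
    END saw_2
FROM "Sample Sales"
ORDER BY saw_0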

    Read the article

  • CodePlex Daily Summary for Saturday, April 07, 2012

    CodePlex Daily Summary for Saturday, April 07, 2012Popular ReleasesHarness - Internet Explorer Automation: Harness 2.0.3: support the operation fo frameset, frame and iframe Add commands SwitchFrame GetUrl GoBack GoForward Refresh SetTimeout GetTimeout Rename commands GetActiveWindow to GetActiveBrowser SetActiveWindow to SetActiveBrowser FindWindowAll to FindBrowser NewWindow to NewBrowser GetMajorVersion to GetVersionBetter Explorer: Better Explorer 2.0.0.861 Alpha: - fixed new folder button operation not work well in some situations - removed some unnecessary code like subclassing that is not needed anymore - Added option to make Better Exlorer default (at least for WIN+E operations) - Added option to enable file operation replacements (like Terracopy) to work with Better Explorer - Added some basic usability to "Share" button - Other fixesLightFarsiDictionary - ??????? ??? ?????/???????: LightFarsiDictionary - v1: LightFarsiDictionary - v1WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.3: Version: 2.5.0.3 (Milestone 3): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete [O] WAF: Mark the StringBuilderExtensions class as obsolete because the AppendInNewLine method can be replaced with string.Jo...RiP-Ripper & PG-Ripper: RiP-Ripper 2.9.30: changes NEW: Added Support for "DirectUpload.net" links NEW: Added Support for "PixRoute.com" links NEW: Added Support for "ImagePicasa.com" links FIXED: "PixHub.eu" linksCommunity TFS Build Extensions: April 2012: Release notes to follow...ClosedXML - The easy way to OpenXML: ClosedXML 0.65.2: Aside from many bug fixes we now have Conditional Formatting The conditional formatting was sponsored by http://www.bewing.nl (big thanks) New on v0.65.1 Fixed issue when loading conditional formatting with default values for icon sets New on v0.65.2 Fixed issue loading conditional formatting Improved inserts performanceLiberty: v3.2.0.0 Release 4th April 2012: Change Log-Added -Halo 3 support (invincibility, ammo editing) -Halo 3: ODST support (invincibility, ammo editing) -The file transfer page now shows its progress in the Windows 7 taskbar -"About this build" settings page -Reach Change what an object is carrying -Reach Change which node a carried object is attached to -Reach Object node viewer and exporter -Reach Change which weapons you are carrying from the object editor -Reach Edit the weapon controller of vehicles and turrets -An error dia...MSBuild Extension Pack: April 2012: Release Blog Post The MSBuild Extension Pack April 2012 release provides a collection of over 435 MSBuild tasks. 
A high level summary of what the tasks currently cover includes the following: System Items: Active Directory, Certificates, COM+, Console, Date and Time, Drives, Environment Variables, Event Logs, Files and Folders, FTP, GAC, Network, Performance Counters, Registry, Services, Sound Code: Assemblies, AsyncExec, CAB Files, Code Signing, DynamicExecute, File Detokenisation, GUID’...DotNetNuke® Community Edition CMS: 06.01.05: Major Highlights Fixed issue that stopped users from creating vocabularies when the portal ID was not zero Fixed issue that caused modules configured to be displayed on all pages to be added to the wrong container in new pages Fixed page quota restriction issue in the Ribbon Bar Removed restriction that would not allow users to use a dash in page names. Now users can create pages with names like "site-map" Fixed issue that was causing the wrong container to be loaded in modules wh...51Degrees.mobi - Mobile Device Detection and Redirection: 2.1.3.1: One Click Install from NuGet Changes to Version 2.1.3.11. [assembly: AllowPartiallyTrustedCallers] has been added back into the AssemblyInfo.cs file to prevent failures with other assemblies in Medium trust environments. 2. The Lite data embedded into the assembly has been updated to include devices from December 2011. The 42 new RingMark properties will return Unknown if RingMark data is not available. Changes to Version 2.1.2.11Code Changes 1. The project is now licenced under the Mozilla...MVC Controls Toolkit: Mvc Controls Toolkit 2.0.0: Added Support for Mvc4 beta and WebApi The SafeqQuery and HttpSafeQuery IQueryable implementations that works as wrappers aroung any IQueryable to protect it from unwished queries. "Client Side" pager specialized in paging javascript data coming either from a remote data source, or from local data. LinQ like fluent javascript api to build queries either against remote data sources, or against local javascript data, with exactly the same interface. There are 3 different query objects exp...ExtAspNet: ExtAspNet v3.1.2: ExtAspNet - ?? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ?????????? ExtAspNet ????? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ??????????。 ExtAspNet ??????? JavaScript,?? CSS,?? UpdatePanel,?? ViewState,?? WebServices ???????。 ??????: IE 7.0, Firefox 3.6, Chrome 3.0, Opera 10.5, Safari 3.0+ ????:Apache License 2.0 (Apache) ??:http://extasp.net/ ??:http://bbs.extasp.net/ ??:http://extaspnet.codeplex.com/ ??:http://sanshi.cnblogs.com/ ????: +2012-04-04 v3.1.2 -??IE?????????????BUG(??"about:blank"?...nopCommerce. Open source shopping cart (ASP.NET MVC): nopcommerce 2.50: Highlight features & improvements: • Significant performance optimization. • Allow store owners to create several shipments per order. Added a new shipping status: “Partially shipped”. • Pre-order support added. Enables your customers to place a Pre-Order and pay for the item in advance. Displays “Pre-order” button instead of “Buy Now” on the appropriate pages. Makes it possible for customer to buy available goods and Pre-Order items during one session. It can be managed on a product variant ...WiX Toolset: WiX v3.6 RC0: WiX v3.6 RC0 (3.6.2803.0) provides support for VS11 and a more stable Burn engine. For more information see Rob's blog post about the release: http://robmensching.com/blog/posts/2012/4/3/WiX-v3.6-Release-Candidate-Zero-availableSageFrame: SageFrame 2.0: Sageframe is an open source ASP.NET web development framework developed using ASP.NET 3.5 with service pack 1 (sp1) technology. 
It is designed specifically to help developers build dynamic website by providing core functionality common to most web applications.iTuner - The iTunes Companion: iTuner 1.5.4475: Fix to parse empty playlists in iTunes LibraryDocument.Editor: 2012.2: Whats New for Document.Editor 2012.2: New Save Copy support New Page Setup support Minor Bug Fix's, improvements and speed upsVidCoder: 1.3.2: Added option for the minimum title length to scan. Added support to enable or disable LibDVDNav. Added option to prompt to delete source files after clearing successful completed items. Added option to disable remembering recent files and folders. Tweaked number box to only select all on a quick click.MJP's DirectX 11 Samples: Light Indexed Deferred Rendering: Implements light indexed deferred using per-tile light lists calculated in a compute shader, as well as a traditional deferred renderer that uses a compute shader for per-tile light culling and per-pixel shading.New ProjectsAdvertising Management: Ph?n m?m qu?n lý qu?ng cáoAgile Compact Database: It is database for all. AssemblyTransformer: AssemblyTransformer is a tool for modifying .NET assemblies using Mono Cecil. It handles the entire transformation process including strong name signing and offers a simple command-line interface and a basic framework for creating and configuring specific transformations.Cafe For You: Ph?n m?m gi?i thi?u và qu?n lý quán cafeClient-side Templated Script Control: Allows a developer to add a repeater-style templated list control to a web page that will be data bound client-side, and may respond to client events. The control may be data bound by a web service call on initialization, and may also have it's data source set via client code.CRM Project - Beginner Sample: Sample to help beginners to start in C# development. Ejemplo para ayudar a quienes inician con el desarrollo en C#.Deployment Made Easy: The goal of this project is to make deployments to windows servers easy using the web deployment toolEasyCMS: EasyCMSExcel to SQL Server Database Bulk Transfer: Quick and simple WPF tool to allow users export data from an Excel spreadsheet to a SQL Server database table. Provided as is. But if you need any help let me know. HTML Client demo for WCF RIA Services: Demo application with HTML client (upshot.js + knockout.js) on WCF RIA ServicesKOI: Kinect Open Interface: Kinect Open Interface, KOI, provides a way to detect and have the user confirm 11 gestures for your UI. Please read my blog for info: http://www.kinecthelp.com/2012/04/koi-kinect-open-interface.htmlLazyWinAdmin: LazyWinAdmin is a Powershell script to manage local or remote machine ressources.LCDSmartie dll to display Audio spectrum on Windows 7: An LCDSmartie plugin that displays anything being played as an audio spectrum.LiveHelpChatApp: With Live chat help you can provide online / Offline help to your client it has facebook style chat for online and offline users Download and EnjoyMailSender: Small tool for sending mail messages contains multiple attachements with sum size bigger than allowed size. You can drag'n'drop attachments and click send - application split all attachments to parts and sent it separately. There is not address book yet. 
Mauricio: Mauricio Lima PageMiddleware and Enterprise services foundation: Define a model of deployment and management for Middleware and enterprise applicationsMyFirstPro: This is a test projOld Games Launcher: Old Games Launcher is a combined DosBox frontend & a Direct Draw game/application starter.Pharmakos Studio: Pharmakos Studio is an extensible IDE. It was originally written specifically as an UnrealScript editor for the UDK, however it is being written so that any language can be supported via plugins.Proyecto Eclipse-Android: Proyectos con Eclipse-AdroidProyectos II: Proyecto para Farmaciapullsource: pull source directsource filterQuizzer: Awesome program for quizzes and tests.Solution Settings for Visual Studio: Solution Settings for Visual Studio allows a file containing settings such as formatting, fonts and colors to be included with a project. When the solution is opened, these settings are automatically applied, and when it is closed, the changes are reverted.sundance: test test testWebcomic Reader: A little Idea for an on-, and offline usable, touch-friendly Windows 7 Webcomic Reader.WinRT PathTextBlock: WinRT PathTextBlock is a control that overcomes some of the limitations in the built in WinRT TextBlock, such as not being able to outline the text, and not being able to distort the text, for example to draw it along a circle. Previously, you could use a tool like Expression Design to create the text and export it as a Path, but this wouldn't work for text that needed to be specified at run time. This control allows you to specify the Text property and it will generate the proper Path obj...Yaplex open source projects: Yaplex open source projects????API SDK-VB6(oauth2): ????API SDK-VB6(oauth2)????????API SDK VB6: ??????????API SDK vb?

    Read the article

  • CodePlex Daily Summary for Wednesday, November 28, 2012

    CodePlex Daily Summary for Wednesday, November 28, 2012Popular ReleasesCommand Line Parser Library: 1.9.3.23 beta: Fixes an issue notified by github user sbambrick about parsing negative numbers.MCEBuddy 2.x: MCEBuddy 2.3.10: Critical Update to 2.3.9: Changelog for 2.3.10 (32bit and 64bit) 1. AsfBin executable missing from build 2. Removed extra references from build to avoid conflict 3. Showanalyzer installation now checked on remote engine machine Changelog for 2.3.9 (32bit and 64bit) 1. Added support for WTV output profile 2. Added support for minimizing MCEBuddy to the system tray 3. Added support for custom archive folder 4. Added support to disable subdirectory monitoring 5. Added support for better TS fil...DotNetNuke® Community Edition CMS: 07.00.00: Major Highlights Fixed issue that caused profiles of deleted users to be available Removed the postback after checkboxes are selected in Page Settings > Taxonomy Implemented the functionality required to edit security role names and social group names Fixed JavaScript error when using a ";" semicolon as a profile property Fixed issue when using DateTime properties in profiles Fixed viewstate error when using Facebook authentication in conjunction with "require valid profile fo...CODE Framework: 4.0.21128.0: See change notes in the documentation section for details on what's new.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.76: Fixed a typo in ObjectLiteralProperty.IsConstant that caused all object literals to be treated like they were constants, and possibly moved around in the code when they shouldn't be.Kooboo CMS: Kooboo CMS 3.3.0: New features: Dropdown/Radio/Checkbox Lists no longer references the userkey. Instead they refer to the UUID field for input value. You can now delete, export, import content from database in the site settings. Labels can now be imported and exported. You can now set the required password strength and maximum number of incorrect login attempts. Child sites can inherit plugins from its parent sites. The view parameter can be changed through the page_context.current value. Addition of c...Facebook Windows 8 Sample: Facebook Windows 8 Sample: The current drop holds two versions of the sample: A basic version that uses a Facebook application to list the content of facebook page. A full version including the use of Bing Maps sdk for positioning the restaurant in a map, and showing how to get there. See Developing a Windows Store App to learn how to use the Bing Maps AJAX Control to add Bing Maps to your Windows Store app.Distributed Publish/Subscribe (Pub/Sub) Event System: Distributed Pub Sub Event System Version 3.0: Important Wsp 3.0 is NOT backward compatible with Wsp 2.1. Prerequisites You need to install the Microsoft Visual C++ 2010 Redistributable Package. You can find it at: x64 http://www.microsoft.com/download/en/details.aspx?id=14632x86 http://www.microsoft.com/download/en/details.aspx?id=5555 Wsp now uses Rx (Reactive Extensions) and .Net 4.0 3.0 Enhancements I changed the topology from a hierarchy to peer-to-peer groups. This should provide much greater scalability and more fault-resi...datajs - JavaScript Library for data-centric web applications: datajs version 1.1.0: datajs is a cross-browser and UI agnostic JavaScript library that enables data-centric web applications with the following features: OData client that enables CRUD operations including batching and metadata support using both ATOM and JSON payloads. 
Single store abstraction that provides a common API on top of HTML5 local storage technologies. Data cache component that allows reading data ranges from a collection and storing them locally to reduce the number of network requests. Changes...Team Foundation Server Administration Tool: 2.2: TFS Administration Tool 2.2 supports the Team Foundation Server 2012 Object Model. Visual Studio 2012 or Team Explorer 2012 must be installed before you can install this tool. You can download and install Team Explorer 2012 from http://aka.ms/TeamExplorer2012. There are no functional changes between the previous release (2.1) and this release.XrmServiceToolkit - A CRM 2011 JavaScript Library: XrmServiceToolkit 1.3.1: Version: 1.3.1 Date: November, 2012 Dependency: JSON2, jQuery (latest or 1.7.2 above) New Feature - A change of logic to increase peroformance when returning large number of records New Function - XrmServiceToolkit.Soap.QueryAll: Return all available records by query options (>5k+) New Fix - XrmServiceToolkit.Rest.RetrieveMultiple not returning records more than 50 New Fix - XrmServiceToolkit.Soap.Business error when refering number fields like (int, double, float) New ...Coding Guidelines for C# 3.0, C# 4.0 and C# 5.0: Coding Guidelines for CSharp 3.0, 4.0 and 5.0: See Change History for a detailed list of modifications.Math.NET Numerics: Math.NET Numerics v2.3.0: Portable Library Build: Adds support for WP8 (.Net 4.0 and higher, SL5, WP8 and .NET for Windows Store apps) New: portable build also for F# extensions (.Net 4.5, SL5 and .NET for Windows Store apps) NuGet: portable builds are now included in the main packages, no more need for special portable packages Linear Algebra: Continued major storage rework, in this release focusing on vectors (previous release was on matrices) Thin QR decomposition (in addition to existing full QR) Static Cr...ExtJS based ASP.NET 2.0 Controls: FineUI v3.2.1: +2012-11-25 v3.2.1 +????????。 -MenuCheckBox?CheckedChanged??????,??????????。 -???????window.IDS??????????????。 -?????(??TabCollection,ControlBaseCollection)???,????????????????。 +Grid??。 -??SelectAllRows??。 -??PageItems??,?????????????,?????、??、?????。 -????grid/gridpageitems.aspx、grid/gridpageitemsrowexpander.aspx、grid/gridpageitems_pagesize.aspx。 -???????????????????。 -??ExpandAllRowExpanders??,?????????????????(grid/gridrowexpanderexpandall2.aspx)。 -??????ExpandRowExpande...VidCoder: 1.4.9 Beta: Updated HandBrake core to SVN 5079. Fixed crashes when encoding DVDs with title gaps.ZXing.Net: ZXing.Net 0.10.0.0: On the way to a release 1.0 the API should be stable now with this version. sync with rev. 2521 of the java version windows phone 8 assemblies improvements and fixesBlackJumboDog: Ver5.7.3: 2012.11.24 Ver5.7.3 (1)SMTP???????、?????????、??????????????????????? (2)?????????、?????????????????????????? (3)DNS???????CNAME????CNAME????????????????? (4)DNS????????????TTL???????? (5)???????????????????????、?????????????????? 
(6)???????????????????????????????Liberty: v3.4.3.0 Release 23rd November 2012: Change Log -Added -H4 A dialog which gives further instructions when attempting to open a "Halo 4 Data" file -H4 Added a short note to the weapon editor stating that dropping your weapons will cap their ammo -Reach Edit the world's gravity -Reach Fine invincibility controls in the object editor -Reach Edit object velocity -Reach Change the teams of AI bipeds and vehicles -Reach Enable/disable fall damage on the biped editor screen -Reach Make AIs deaf and/or blind in the objec...Umbraco CMS: Umbraco 4.11.1: NugetNuGet BlogRead the release blog post for 4.11.0. Read the release blog post for 4.11.1. Whats new50 bugfixes (see the issue tracker for a complete list) Read the documentation for the MVC bits. Breaking changesGetPropertyValue now returns an object, not a string (only affects upgrades from 4.10.x to 4.11.0) NoteIf you need Courier use the release candidate (as of build 26). The code editor has been greatly improved, but is sometimes problematic in Internet Explorer 9 and lower. Pr...Audio Pitch & Shift: Audio Pitch And Shift 5.1.0.3: Fixed supported files list on open dialog (added .pls and .m3u) Impulse Media Player splash message (can be disabled anyway)New Projects[Mini Game] Roll the Dice !: By Dede Wahyu H. and Wahyu Dwi W - System Information students of Ma Chung University Malang Indonesia Features: -Dice gambling mini game -With Bet System -Placongtieudung: It can not be sharedCRM 2011 Activity Summary: Activity Summary is add-on to Microsoft Dynamics CRM 2011. It shows list of activities with convenient preview panel. The layout is similar to Microsoft OutlookDas Klub: Das Klub is an international Industrial Dance Kommunity Web Application. Live site is: http://dasklub.comDigital Image Processing Assignment #1: My prof asks me to write it.Eastridge Web Includes: Eastridge Web Includes gives SharePoint developers and designers the ability to include CSS and JavaScript in a more intelligent way.Exception Hub: ????????????,???? “????????” ??? “web????” ??。 ??????(webform/winform)??? “????????” ,????????Exception?????,?????????? “web????” ,??,????????????????,?????。Kanbanize! .NET: The Kanbanize! .NET project is a C#/.NET (4.0) library created for accessing the http://www.kanbanize.com Kanban board API.KnockoutSP: This is a javascript library that will create a knockout ViewModel for the lists and libraries on a sharepoint site. MailTool: MailToolmv0901blog: blog modelMyPocket: Personal Budget PlannerObservableEntityCollection: Observable Entity Collection class that overcome limitations of EntityCollection when binding in MVVM applications. Overview on this link: PCIIApps: Projetos de AulaPIINFO Prototype: Prototipo a ser presentado el ramo PIINFOPIM: Projeto desenvolvido para o curso de Análise e desenvolvimento de sistemas, 3º/4º semestre.Prose: Prose is an playground for an experimental JavaScript like language compiler. Eventually it will implement 0-CFA, CFA2, and a Tracing JITScapLIB: Light weight Screen capture Library for C# .NET. 
Designed to enable programmers to make high quality and performance Screen capture programs quickly and easily.SCD Merge Wizard: SCD Merge Wizard will help you generate query script for Slowly Changing Dimension (SCD) transformations using Microsoft's TSQL Merge statement.Semantria SDK: Turn Unstructured Content Into Actionable Data Semantria’s API helps organizations to extract meaning from large amounts of unstructured text.SharePoint Person or Group columns: In one of my recent client project, I had a requirement send out a reminder to all users in Person or Group columns from custom tasks list in custom timer job.shopabc: an shop demoSID and STAR generator: Generate SID, STAR and transition data for virtual ATC clients such as VRC.Sistema de Gestión de Trámite Documentario (SGTD): Sistema de Gestión de Trámite Documentario (SGTD)Sistema de Gestion Medica WEB: Sistema de Gestion Medica WEB Realido en C# 2010 Express Incluye: 1. Reporta de Evento Adverso 2. Estadistica Diaria de Medicos sp_wcProject: sp_wcProject is a stored procedure designed to allow you to easily define the projection of columns used in a table, view, or table valued function and then optionally run the query. The column names may be specified using wildcards, hence the "wc" name prefix. TimmyPersonal: PersonalTLDRML: Python/JSON inspired markup language designed to be extremely terse.Typing: Like Typeracer.comUdostepnianie plików: Takie tamWeather.com plug-in for HouseBot: This plug-in will parse the weather.com xml feed and display in HouseBot automation software. Please note that it requires the c# wrapper to work found here: http://www.housebot.com/forums/viewtopic.php?f=4&t=856395YoloEngine: Yolo Game Engine????: ???????????。

    Read the article

< Previous Page | 9 10 11 12 13 14  | Next Page >