Search Results

Search found 13516 results on 541 pages for 'common dialog'.


  • referencing part of the composite primary key

    - by Zavael
    I have problems setting up a reference between database tables. I have the following structure:

        CREATE TABLE club(
          id INTEGER NOT NULL,
          name_short VARCHAR(30),
          name_full VARCHAR(70) NOT NULL
        );
        CREATE UNIQUE INDEX club_uix ON club(id);
        ALTER TABLE club ADD CONSTRAINT club_pk PRIMARY KEY (id);

        CREATE TABLE team(
          id INTEGER NOT NULL,
          club_id INTEGER NOT NULL,
          team_name VARCHAR(30)
        );
        CREATE UNIQUE INDEX team_uix ON team(id, club_id);
        ALTER TABLE team ADD CONSTRAINT team_pk PRIMARY KEY (id, club_id);
        ALTER TABLE team ADD FOREIGN KEY (club_id) REFERENCES club(id);

        CREATE TABLE person(
          id INTEGER NOT NULL,
          first_name VARCHAR(20),
          last_name VARCHAR(20) NOT NULL
        );
        CREATE UNIQUE INDEX person_uix ON person(id);
        ALTER TABLE person ADD PRIMARY KEY (id);

        CREATE TABLE contract(
          person_id INTEGER NOT NULL,
          club_id INTEGER NOT NULL,
          wage INTEGER
        );
        CREATE UNIQUE INDEX contract_uix ON contract(person_id);
        ALTER TABLE contract ADD CONSTRAINT contract_pk PRIMARY KEY (person_id);
        ALTER TABLE contract ADD FOREIGN KEY (club_id) REFERENCES club(id);
        ALTER TABLE contract ADD FOREIGN KEY (person_id) REFERENCES person(id);

        CREATE TABLE player(
          person_id INTEGER NOT NULL,
          team_id INTEGER,
          height SMALLINT,
          weight SMALLINT
        );
        CREATE UNIQUE INDEX player_uix ON player(person_id);
        ALTER TABLE player ADD CONSTRAINT player_pk PRIMARY KEY (person_id);
        ALTER TABLE player ADD FOREIGN KEY (person_id) REFERENCES person(id);
        -- ALTER TABLE player ADD FOREIGN KEY (team_id) REFERENCES team(id); -- this is not working

    The last (commented-out) statement gives me this error:

        Error code -5529, SQL state 42529: a UNIQUE constraint does not exist on referenced columns: TEAM in statement [ALTER TABLE player ADD FOREIGN KEY (team_id) REFERENCES team(id)]

    As you can see, the team table has a composite primary key (club_id + id), and a person references a club through a contract. Person holds the attributes common to players and other staff types. One club can have multiple teams. An employed person has to have a contract with a club. A player (a specialization of person), if employed, can be assigned to one of the club's teams. Is there a better way to design my structure? I thought about excluding club_id from team's primary key, but I would like to know if this is the only way. Thanks.

    UPDATE 1: I would like the id to identify a team only within its club, so multiple teams can have the same id as long as they belong to different clubs. Is that possible?

    UPDATE 2: Updated the naming convention as advised by Philip. Some business rules to better understand the structure:

    - One club can have 1..n teams (Main squad, Reserve squad, Youth squad or Team A, Team B... only a team can play a match, not a club)
    - One team belongs to one club only
    - A player is a type of person (other types (staff) are scouts, coaches etc., so they do not need to belong to a specific team, just to the club, if employed)
    - A person can have 0..1 contracts with 1 club (that means he is employed or unemployed)
    - A player (if employed) belongs to one team of the club

    Now thinking about it - moving team_id from player to contract would solve my problem, and it could enforce the rule "Player (if employed) belongs to one team of the club", but it would be redundant for other staff types. What do you think?
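    One commonly suggested fix, sketched here against the schema above (it is not part of the original question, and the constraint name is made up): a foreign key must reference the whole of a unique key, so if team keeps its composite primary key, player can carry a club_id of its own and reference both columns together:

        ALTER TABLE player ADD COLUMN club_id INTEGER;

        -- Reference the full composite key (id, club_id), not id alone
        ALTER TABLE player ADD CONSTRAINT player_team_fk
          FOREIGN KEY (team_id, club_id) REFERENCES team(id, club_id);

    The alternative is the one the question already mentions: give team a single-column surrogate primary key and demote (id, club_id) to a plain unique constraint.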


  • ASP.NET MVC 3 - New Features

    - by imran_ku07
    Introduction: ASP.NET MVC 3 has just been released by the ASP.NET MVC team. It includes some new features, some changes, some improvements and bug fixes. In this article, I will show you the new features of ASP.NET MVC 3, which will help you get started using them. Full details of the announcement are available at Announcing release of ASP.NET MVC 3, IIS Express, SQL CE 4, Web Farm Framework, Orchard, WebMatrix.

    Description:

    New Razor View Engine: The Razor view engine is one of the coolest new features in ASP.NET MVC 3. Razor speeds things up just a little bit more: it is much smaller and lighter in size, and it is very easy to learn. You could say 'write less, do more'. You can get started and learn more about Razor at Introducing "Razor" - a new view engine for ASP.NET.

    Granular Request Validation: Another big new feature in ASP.NET MVC 3 is Granular Request Validation. The default request validator throws an exception when it sees < followed by an exclamation mark (like <!), < followed by one of the letters a through z (like <s), or & followed by a pound sign (like &#123) as part of the query string, posted form, headers or cookie collection. In previous versions of ASP.NET MVC you could control request validation using ValidateInputAttribute. In ASP.NET MVC 3 you can control request validation at the model level by annotating your model properties with a new attribute called AllowHtmlAttribute. For details see Granular Request Validation in ASP.NET MVC 3.

    Sessionless Controller Support: Sessionless controllers are another great new feature in ASP.NET MVC 3. With a sessionless controller you can easily control the session behavior of your controllers. For example, you can mark your HomeController's session state as Disabled or ReadOnly, allowing concurrent request execution for a single user. For details see Concurrent Requests In ASP.NET MVC and HowTo: Sessionless Controller in MVC3 - what & why?.

    Unobtrusive Ajax and Unobtrusive Client Side Validation Support: Another cool new feature in ASP.NET MVC 3 is support for unobtrusive Ajax and unobtrusive client side validation. This feature allows separation of responsibilities within your web application by separating your HTML from your script. For details see Unobtrusive Ajax in ASP.NET MVC 3 and Unobtrusive Client Validation in ASP.NET MVC 3.

    Dependency Resolver: The Dependency Resolver is another great feature of ASP.NET MVC 3. It allows you to register a dependency resolver that will be used by the framework. With this approach your application does not become tightly coupled, and dependencies are injected at run time. For details see ASP.NET MVC 3 Service Location.

    New Helper Methods: ASP.NET MVC 3 includes some helper methods from the ASP.NET Web Pages technology that cover common functionality: Chart, Crypto, WebGrid, WebImage and WebMail. For details of these helper methods, please see the ASP.NET MVC 3 Release Notes. For using other ASP.NET Web Pages helper methods see Using ASP.NET Web Pages Helpers in ASP.NET MVC.

    Child Action Output Caching: ASP.NET MVC 3 also includes a feature called Child Action Output Caching. This allows you to cache only a portion of the response when you are using Html.RenderAction or Html.Action. The cache can be varied by action name, action method signature and action method parameter values. For details see this.
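    As a quick illustration of the Granular Request Validation and Sessionless Controller features described above, here is a minimal sketch (the model and controller shown are hypothetical, not from the article):

        using System.Web.Mvc;
        using System.Web.SessionState;

        public class CommentModel
        {
            public string Author { get; set; }

            [AllowHtml]   // request validation is skipped for this property only
            public string Body { get; set; }
        }

        [SessionState(SessionStateBehavior.Disabled)]  // no session, so requests from one user can run concurrently
        public class HomeController : Controller
        {
            [HttpPost]
            public ActionResult Post(CommentModel comment)
            {
                return View(comment);
            }
        }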
    RemoteAttribute: ASP.NET MVC 3 allows you to validate a form field by making a remote server call through Ajax. This makes it very easy to perform remote validation on the client side and quickly give feedback to the user. For details see How to: Implement Remote Validation in ASP.NET MVC.

    CompareAttribute: ASP.NET MVC 3 includes a new validation attribute called CompareAttribute, which allows you to compare the values of two different properties of a model. For details see CompareAttribute in ASP.NET MVC 3.
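    A minimal sketch of these two attributes on a hypothetical registration model (the action and controller names passed to [Remote] are made up for illustration):

        using System.Web.Mvc;

        public class RegisterModel
        {
            [Remote("IsUserNameAvailable", "Account")]  // Ajax call to a hypothetical Account controller action
            public string UserName { get; set; }

            public string Password { get; set; }

            [Compare("Password", ErrorMessage = "Passwords do not match.")]
            public string ConfirmPassword { get; set; }
        }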
    Miscellaneous New Features:

    - ASP.NET MVC 2 includes FormValueProvider, QueryStringValueProvider, RouteDataValueProvider and HttpFileCollectionValueProvider. ASP.NET MVC 3 adds two additional value providers, ChildActionValueProvider and JsonValueProvider (JsonValueProvider does not physically exist as a registered provider). ChildActionValueProvider is used when you issue a child request using the Html.Action and/or Html.RenderAction methods, so that explicit parameter values passed to Html.Action and/or Html.RenderAction always take precedence over other value providers. JsonValueProvider is used to model bind JSON data. For details see Sending JSON to an ASP.NET MVC Action Method Argument.
    - In ASP.NET MVC 3, a new property named FileExtensions was added to the VirtualPathProviderViewEngine class. This property is used when looking up a view by path (and not by name), so that only views with a file extension contained in the list specified by this property are considered. For details see VirtualPathProviderViewEngine.FileExtensions Property.
    - The ASP.NET MVC 3 installation package also includes the NuGet Package Manager, which is automatically installed when you install ASP.NET MVC 3. NuGet makes it easy to install and update open source libraries and tools in Visual Studio. See this for details.
    - In ASP.NET MVC 2, client side validation does not trigger for overridden model properties. For example, if you have a model that contains some overridden properties, client side validation does not trigger for them in ASP.NET MVC 2, but it does in ASP.NET MVC 3.
    - Client side validation is not supported for the StringLengthAttribute.MinimumLength property in ASP.NET MVC 2. In ASP.NET MVC 3, client side validation works for StringLengthAttribute.MinimumLength.
    - ASP.NET MVC 3 includes new action results like HttpUnauthorizedResult, HttpNotFoundResult and HttpStatusCodeResult.
    - ASP.NET MVC 3 includes some new overloads of the LabelFor and LabelForModel methods. For details see LabelExtensions.LabelForModel and LabelExtensions.LabelFor.
    - In ASP.NET MVC 3, IControllerFactory includes a new method, GetControllerSessionBehavior, which is used to get a controller's session behavior. For details see IControllerFactory.GetControllerSessionBehavior Method.
    - In ASP.NET MVC 3, the Controller class includes a new property, ViewBag, of type dynamic. This property allows you to access the ViewData dictionary using C# 4.0 dynamic features. For details see ControllerBase.ViewBag Property.
    - ModelMetadata includes a property AdditionalValues of type Dictionary. In ASP.NET MVC 3 you can populate this property using AdditionalMetadataAttribute. For details see AdditionalMetadataAttribute Class.
    - In ASP.NET MVC 3 you can also use MvcScaffolding to scaffold your Views and Controllers. For details see Scaffold your ASP.NET MVC 3 project with the MvcScaffolding package.
    - If you want to convert your application from ASP.NET MVC 2 to ASP.NET MVC 3, there is an excellent tool that does the conversion automatically. For details see MVC 3 Project Upgrade Tool.
    - DisplayAttribute is not supported in ASP.NET MVC 2, but in ASP.NET MVC 3 DisplayAttribute works properly.
    - ASP.NET MVC 3 also supports model level validation via the new IValidatableObject interface.
    - ASP.NET MVC 3 includes a new helper method, Html.Raw, which allows you to display unencoded HTML.

    Summary: In this article I showed you the new features of ASP.NET MVC 3, which will help you a lot when you start using it, and I provided links where you can find further details. Hopefully you will enjoy this article too.


  • Version Assemblies with TFS 2010 Continuous Integration

    - by Steve Michelotti
    When I first heard that TFS 2010 had moved to Workflow Foundation for Team Build, I was *extremely* skeptical. I've loved MSBuild and didn't quite understand the reasons for this change. In fact, given that I've been exclusively using CruiseControl for Continuous Integration (CI) for the last 5+ years of my career, I was skeptical of TFS for CI in general. However, after going through the learning process for TFS 2010 recently, I'm starting to become a believer. I'm also starting to see some of the benefits of Workflow Foundation for the overall processing because it gives you constructs not available in MSBuild, such as parallel tasks, better control flow constructs, and a slightly better customization story. The first customization I had to make to the build process was to version the assemblies of my solution. This is not new. In fact, I'd recommend reading Mike Fourie's well known post on Versioning Code in TFS before you get started. That post describes several foundational aspects of versioning assemblies regardless of your version of TFS. The main points are: 1) don't use source control operations for your version file, 2) use a schema like <Major>.<Minor>.<IncrementalNumber>.0, and 3) do not keep AssemblyVersion and AssemblyFileVersion in sync. To do this in TFS 2010, the best post I've found has been Jim Lamb's post on building a custom TFS 2010 workflow activity. Overall, that post is excellent, but the primary issue I have with it is that the assembly version numbers produced are based on a date and look like this: "2010.5.15.1". This is definitely not what I want. I want to be able to communicate to the developers and stakeholders that we are producing the "1.1 release" or "1.2 release" - which would have an assembly version number of "1.1.317.0", for example. In this post, I'll walk through the process of customizing the assembly version number based on this method - customizing the concepts in Lamb's post to suit my needs. I'll also be combining this with the concepts in Fourie's post - particularly with regard to the standards around how to version the assemblies. The first thing I'll do is add a file called SolutionAssemblyVersionInfo.cs to the root of my solution that looks like this:

        using System;
        using System.Reflection;

        [assembly: AssemblyVersion("1.1.0.0")]
        [assembly: AssemblyFileVersion("1.1.0.0")]

    I'll then add that file as a Visual Studio link file to each project in my solution by right-clicking the project, choosing "Add - Existing Item...", then, when I click the SolutionAssemblyVersionInfo.cs file, making sure I "Add As Link". Now the Solution Explorer will show our file. We can see that it's a "link" file because of the black arrow in the icon within all our projects. Of course, you'll need to remove the AssemblyVersion and AssemblyFileVersion attributes from the AssemblyInfo.cs files to avoid duplicate attributes, since they now live in the SolutionAssemblyVersionInfo.cs file. This is an extremely common technique so that all the projects in our solution can be versioned as a unit. At this point, we're ready to write our custom activity. The primary consideration is that I want the developer and/or tech lead to be able to easily be in control of the Major.Minor, and then I want the CI process to add the third number with a unique incremental number. We'll leave the fourth position always "0" for now - it's held in reserve in case the day ever comes where we need to do an emergency patch to Production based on a branched version.
    Writing the Custom Workflow Activity

    Similar to Lamb's post, I'm going to write two custom workflow activities. The "outer" activity (a XAML activity) is pretty straightforward. It checks whether the solution version file exists in the solution root and, if so, delegates the replacement of the version to the AssemblyVersionInfo activity, which is a CodeActivity highlighted in red below. Notice that the arguments of this activity are "solutionVersionFile" and "tfsBuildNumber", which will be passed in. The tfsBuildNumber passed in will look something like this: "CI_MyApplication.4", and we'll need to grab the "4" (i.e., the incremental revision number) and put it in the third position. Then we'll need to honor whatever was specified for Major.Minor in the SolutionAssemblyVersionInfo.cs file. For example, if the SolutionAssemblyVersionInfo.cs file had "1.1.0.0" for the AssemblyVersion (as shown in the first code block near the beginning of this post), then we want the resulting file to have "1.1.4.0". Before we do anything, let's put together a unit test for all this so we can know if we got it right:

        [TestMethod]
        public void Assembly_version_should_be_parsed_correctly_from_build_name()
        {
            // arrange
            const string versionFile = "SolutionAssemblyVersionInfo.cs";
            WriteTestVersionFile(versionFile);
            var activity = new VersionAssemblies();
            var arguments = new Dictionary<string, object> {
                { "tfsBuildNumber", "CI_MyApplication.4" },
                { "solutionVersionFile", versionFile }
            };

            // act
            var result = WorkflowInvoker.Invoke(activity, arguments);

            // assert
            Assert.AreEqual("1.2.4.0", (string)result["newAssemblyFileVersion"]);
            var lines = File.ReadAllLines(versionFile);
            Assert.IsTrue(lines.Contains("[assembly: AssemblyVersion(\"1.2.0.0\")]"));
            Assert.IsTrue(lines.Contains("[assembly: AssemblyFileVersion(\"1.2.4.0\")]"));
        }

        private void WriteTestVersionFile(string versionFile)
        {
            var fileContents = "using System.Reflection;\n" +
                "[assembly: AssemblyVersion(\"1.2.0.0\")]\n" +
                "[assembly: AssemblyFileVersion(\"1.2.0.0\")]";
            File.WriteAllText(versionFile, fileContents);
        }

    At this point, the code for our AssemblyVersionInfo activity is pretty straightforward:

        [BuildActivity(HostEnvironmentOption.Agent)]
        public class AssemblyVersionInfo : CodeActivity
        {
            [RequiredArgument]
            public InArgument<string> FileName { get; set; }

            [RequiredArgument]
            public InArgument<string> TfsBuildNumber { get; set; }

            public OutArgument<string> NewAssemblyFileVersion { get; set; }

            protected override void Execute(CodeActivityContext context)
            {
                var solutionVersionFile = this.FileName.Get(context);

                // Ensure that the file is writeable
                var fileAttributes = File.GetAttributes(solutionVersionFile);
                File.SetAttributes(solutionVersionFile, fileAttributes & ~FileAttributes.ReadOnly);

                // Prepare assembly versions
                var majorMinor = GetAssemblyMajorMinorVersionBasedOnExisting(solutionVersionFile);
                var newBuildNumber = GetNewBuildNumber(this.TfsBuildNumber.Get(context));
                var newAssemblyVersion = string.Format("{0}.{1}.0.0", majorMinor.Item1, majorMinor.Item2);
                var newAssemblyFileVersion = string.Format("{0}.{1}.{2}.0", majorMinor.Item1, majorMinor.Item2, newBuildNumber);
                this.NewAssemblyFileVersion.Set(context, newAssemblyFileVersion);

                // Perform the actual replacement
                var contents = this.GetFileContents(newAssemblyVersion, newAssemblyFileVersion);
                File.WriteAllText(solutionVersionFile, contents);

                // Restore the file's original attributes
                File.SetAttributes(solutionVersionFile, fileAttributes);
            }

            #region Private Methods

            private string GetFileContents(string newAssemblyVersion, string newAssemblyFileVersion)
            {
                var cs = new StringBuilder();
                cs.AppendLine("using System.Reflection;");
                cs.AppendFormat("[assembly: AssemblyVersion(\"{0}\")]", newAssemblyVersion);
                cs.AppendLine();
                cs.AppendFormat("[assembly: AssemblyFileVersion(\"{0}\")]", newAssemblyFileVersion);
                return cs.ToString();
            }

            private Tuple<string, string> GetAssemblyMajorMinorVersionBasedOnExisting(string filePath)
            {
                var lines = File.ReadAllLines(filePath);
                var versionLine = lines.Where(x => x.Contains("AssemblyVersion")).FirstOrDefault();

                if (versionLine == null)
                {
                    throw new InvalidOperationException("File does not contain [assembly: AssemblyVersion] attribute");
                }

                return ExtractMajorMinor(versionLine);
            }

            private static Tuple<string, string> ExtractMajorMinor(string versionLine)
            {
                var firstQuote = versionLine.IndexOf('"') + 1;
                var secondQuote = versionLine.IndexOf('"', firstQuote);
                var version = versionLine.Substring(firstQuote, secondQuote - firstQuote);
                var versionParts = version.Split('.');
                return new Tuple<string, string>(versionParts[0], versionParts[1]);
            }

            private string GetNewBuildNumber(string buildName)
            {
                return buildName.Substring(buildName.LastIndexOf(".") + 1);
            }

            #endregion
        }

    At this point the final step is to incorporate this activity into the overall build template. Make a copy of DefaultTemplate.xaml - we'll call it DefaultTemplateWithVersioning.xaml. Before the build and labeling happens, drag the VersionAssemblies activity in. Then set the LabelName variable to "BuildDetail.BuildDefinition.Name + "-" + newAssemblyFileVersion", since the newAssemblyFileVersion was produced by our activity.

    Configuring CI

    Once you add your solution to source control, you can configure CI with the build definition window as shown here. The main difference is that we'll change the Process tab to reflect a different build number format and choose our custom build process file. When the build completes, we'll see the name of our project with the unique revision number. If we look at the detailed build log for the latest build, we'll see the label being created with our custom task. We can now look at the history of labels in TFS and see the project name with the labels (the Assignment activity I added to the workflow). Finally, if we look at the physical assemblies that are produced, we can right-click on any assembly in Windows Explorer and see the assembly version in its properties.

    Full Traceability

    We now have full traceability for our code. There will never be a question of what code was deployed to Production. You can always see the assembly version in the properties of the physical assembly. That can be traced back to a label in TFS where the unique revision number matches. The label in TFS gives you the complete snapshot of the code in your source control repository at the time the code was built. This type of process for full traceability has been used for many years in CI - in fact, I've done similar things with CCNet and SVN for quite some time. This is simply the TFS implementation of that pattern.
With the new features TFS 2010 gives you, these types of customizations to your build process are quite easy once you get over the initial learning curve.


  • Upgrading from TFS 2010 RC to TFS 2010 RTM done

    - by Martin Hinshelwood
    Today is the big day. With the launch of Visual Studio 2010 already done in Asia, and rolling around the world towards us, we are getting ready for the RTM (released) version. We have had TFS 2010 in production for nearly 6 months and have had only minimal problems. Update 12th April 2010 - Added Scott Hanselman's tweet about the MSDN download release time. SSW was the first company in the world outside of Microsoft to deploy Visual Studio 2010 Team Foundation Server to production, not once, but twice. I am hoping to make it 3 in a row, but with all the hype around the new version, and with it being a production release and not just a go-live, I think there will be a lot of competition.

        Developers: MSDN will be updated with #vs2010 downloads and details at 10am PST *today*! @shanselman - Scott Hanselman

    Same as before, we need to uninstall 2010 RC and install 2010 RTM. The installer will take care of all the complexity of actually upgrading any schema changes. If you are upgrading from TFS 2008 to TFS 2010 you can follow our Rules To Better TFS 2010 Migration and read my post on our successes. We run TFS 2010 in a Hyper-V virtual environment, so we have the advantage of running a snapshot as well as taking a DB backup.

    - Done - Snapshot the Hyper-V server. Microsoft does not support taking a snapshot of a running server, for very good reason, and Brian Harry wrote a post after my last upgrade explaining why you should never snapshot a running server.
    - Done - Uninstall Visual Studio Team Explorer 2010 RC. You will need to uninstall all of the Visual Studio 2010 RC client bits that you have on the server.
    - Done - Uninstall TFS 2010 RC
    - Done - Install TFS 2010 RTM
    - Done - Configure TFS 2010 RTM. Pick the Upgrade option and point it at your existing "tfs_Configuration" database to load all of the existing settings.
    - Done - Upgrade the SharePoint extensions
    - Pending - Upgrade the build servers
    - Test the server

    The back-out plan, and you should always have one, is to restore the snapshot.

    Upgrading to Team Foundation Server 2010 - Done

    The first thing you need to do is turn off the TFS server, then log into the Hyper-V server and create a snapshot. Figure: Make sure you turn the server off and delete all old snapshots before you take a new one. I noticed that the snapshot taken before the Beta 2 to RC upgrade was still there. You should really delete old snapshots before you create a new one, but in this case the SysAdmin (who is currently tucked up in bed) asked me not to. I guess he is worried about a developer messing up his server. Turn your server on and wait for it to boot in anticipation of all the nice shiny RTM'ness that is coming next. The upgrade procedure for TFS 2010 is to uninstall the old version and install the new one. Figure: Remove Visual Studio 2010 Team Foundation Server RC from the system. Figure: Most of the heavy lifting is done by the uninstaller, but make sure you have removed any of the client bits first, specifically Visual Studio 2010 or Team Explorer 2010. Once the uninstall is complete - this took around 5 minutes for me - you can begin the install of the RTM. Running the 64-bit OS allows the application to use more than 2GB RAM, which, while not common, may be of use in heavy load situations. Figure: It is always recommended to install the 64-bit version of a server application where possible.
    With SharePoint 2010, Exchange 2010 and even Windows Server 2008 R2 being 64-bit only, I do not think it is likely there will be another release of a server app that is 32-bit. You then need to choose what it is you want to install. This depends on how you are running TFS and on how many servers. In our case we run TFS and the Team Foundation Build Service (controller only) on our TFS server, along with Analysis Services and Reporting Services, but our SharePoint server lives elsewhere. Figure: This always confuses people, but in reality it makes sense. Don't install what you do not need; every extra you install has an impact on performance. If you are integrating with SharePoint you will need to run this install on every front-end server in your farm, and don't forget to upgrade your build servers and proxy servers later. Figure: Selecting only Team Foundation Server (TFS) and Team Foundation Build Services (TFBS). It is worth noting that if you have a lot of builds kicking off, and hence a lot of get operations against your TFS server, you can use a proxy server to cache the source control on another server between your TFS server and your build servers. Figure: Installing Microsoft .NET Framework 4 takes the most time. Figure: Now run Windows Update and SSW Diagnostic to make sure all your bits and bobs are up to date. Note: SSW Diagnostic will check your Power Tools, add-ons, check-in policies and other bits as well.

    Configure Team Foundation Server 2010 - Done

    Now you can configure the server. If you have no key you will need to pick "Install a Trial Licence", but it is only £500, or free with an MSDN subscription. Anyway, if you pick Trial you get 90 days to get your key. Figure: You can pick Trial and add your key later using the TFS Server Admin. Here is where the real choices happen. We are doing an upgrade from a previous version, so I will pick Upgrade, the same as all of you folks using the RC or TFS 2008. Figure: The upgrade wizard takes your existing 2010 or 2008 databases and upgrades them to the release. Once you have entered your database server name you can click "List available databases" and it will show what it can upgrade. Figure: Select your database from the list and, at this point, make sure you have a valid backup. At this point you have not made ANY changes to the databases. The configuration wizard will now load the configuration from your existing database if you have one. If you are upgrading TFS 2008, refer to Rules To Better TFS 2010 Migration. Mostly the default values in the wizard will suffice, but depending on the configuration you want, you can pick different options. Figure: Set the application tier account and authentication method to use. We use NTLM to keep things simple, as we host our TFS server externally for our remote developers. Figure: Setting your TFS server URLs to the remote URLs allows the reports to be accessed without using VPN. Very handy for those remote developers. Figure: Detected the existing warehouse, no problem. Figure: Again, we love green ticks. It gives us a warm fuzzy feeling. Figure: The username for connecting to Reporting Services should be a domain account (if you are on a domain, that is). Figure: Set up the SharePoint integration to connect to your external SharePoint server. You can take the option to connect later. You then need to run all of your readiness checks. These checks can save your life!
    They will check all of the settings that you have entered, as well as checking that all the external services are configured and running properly. There are two reasons that TFS 2010 is so easy and painless to install where previous versions were not. First, Microsoft changed the install into two steps: install and configuration. Second, they have pulled out all of the stops in making the install run all the checks necessary to make sure that once you start the install, it will complete. If you find any errors I recommend that you report them on http://connect.microsoft.com so everyone can benefit from your misery. Figure: Now we have everything set up, the configuration wizard can do its work. Figure: It took a while on the "Web site" stage at some point, but zipped through after that. Figure: The last wee bit. TFS needs to do a little tinkering with the data to complete the upgrade. Figure: All upgraded. I am not worried about the yellow triangle, as SharePoint was being a little silly:

        Exception Message: TF254021: The account name or password that you specified is not valid. (type TfsAdminException)
        Exception Stack Trace:
           at Microsoft.TeamFoundation.Management.Controls.WizardCommon.AccountSelectionControl.TestLogon(String connectionString)
           at System.ComponentModel.BackgroundWorker.WorkerThreadStart(Object argument)

        [Info @16:10:16.307] Benign exception caught as part of verify:
        Exception Message: TF255329: The following site could not be accessed: http://projects.ssw.com.au/. The server that you specified did not return the expected response. Either you have not installed the Team Foundation Server Extensions for SharePoint Products on this server, or a firewall is blocking access to the specified site or the SharePoint Central Administration site. For more information, see the Microsoft Web site (http://go.microsoft.com/fwlink/?LinkId=161206). (type TeamFoundationServerException)
        Exception Stack Trace:
           at Microsoft.TeamFoundation.Client.SharePoint.WssUtilities.VerifyTeamFoundationSharePointExtensions(ICredentials credentials, Uri url)
           at Microsoft.TeamFoundation.Admin.VerifySharePointSitesUrl.Verify()
        Inner Exception Details:
        Exception Message: TF249064: The following Web service returned an response that is not valid: http://projects.ssw.com.au/_vti_bin/TeamFoundationIntegrationService.asmx. This Web service is used for the Team Foundation Server Extensions for SharePoint Products. Either the extensions are not installed, the request resulted in HTML being returned, or there is a problem with the URL. Verify that the following URL points to a valid SharePoint Web application and that the application is available: http://projects.ssw.com.au. If the URL is correct and the Web application is operating normally, verify that a firewall is not blocking access to the Web application. (type TeamFoundationServerInvalidResponseException)
        Exception Data Dictionary: ResponseStatusCode = InternalServerError

    I'll look at SharePoint afterwards; probably the SharePoint box just needs a restart or a kick. If there is a problem with SharePoint it will come out in testing, but I will definitely be passing this on to Microsoft.

    Upgrading the SharePoint connector to TFS 2010

    You will need to upgrade the Extensions for SharePoint Products and Technologies on all of your SharePoint farm front-end servers. To do this, uninstall the TFS 2010 RC from each of them in the same way as the server, and then install just the RTM extensions.
    Figure: Only install the SharePoint extensions on your SharePoint front-end servers. TFS 2010 supports both SharePoint 2007 and SharePoint 2010. Figure: When you configure SharePoint it uploads all of the solutions and templates. Figure: Everything is uploaded successfully. Figure: TFS even remembered the settings from the previous installation - fantastic.

    Upgrading the Team Foundation Build servers to TFS 2010

    Just like on the SharePoint servers, you will need to upgrade the build server to the RTM. Just uninstall TFS 2010 RC and then install only the Team Foundation Build Services component. Unlike on the SharePoint server, you will probably have some version of Visual Studio installed. You will need to remove this as well. (Coming soon)

    Connecting Visual Studio 2010 / 2008 / 2005 and Eclipse to TFS 2010

    If you have developers still on Visual Studio 2005 or 2008 you will need to download the respective compatibility pack: Visual Studio Team System 2005 Service Pack 1 Forward Compatibility Update for Team Foundation Server 2010, or Visual Studio Team System 2008 Service Pack 1 Forward Compatibility Update for Team Foundation Server 2010. If you are using Eclipse, you can download the new Team Explorer Everywhere install for connecting to TFS. Get your developers to check that they have the latest versions of their applications with SSW Diagnostic, which will check for service packs and hot fixes to Visual Studio as well.

    Technorati Tags: TFS, TFS2010, TFS 2010, Upgrade


  • Announcing release of ASP.NET MVC 3, IIS Express, SQL CE 4, Web Farm Framework, Orchard, WebMatrix

    - by ScottGu
    I'm excited to announce the release today of several products:

    - ASP.NET MVC 3
    - NuGet
    - IIS Express 7.5
    - SQL Server Compact Edition 4
    - Web Deploy and Web Farm Framework 2.0
    - Orchard 1.0
    - WebMatrix 1.0

    The above products are all free. They build upon the .NET 4 and VS 2010 release, and add a ton of additional value to ASP.NET (both Web Forms and MVC) and the Microsoft Web Server stack.

    ASP.NET MVC 3

    Today we are shipping the final release of ASP.NET MVC 3. You can download and install ASP.NET MVC 3 here. The ASP.NET MVC 3 source code (released under an OSI-compliant open source license) can also optionally be downloaded here. ASP.NET MVC 3 is a significant update that brings with it a bunch of great features. Some of the improvements include:

    Razor

    ASP.NET MVC 3 ships with a new view-engine option called "Razor" (in addition to continuing to support/enhance the existing .aspx view engine). Razor minimizes the number of characters and keystrokes required when writing a view template, and enables a fast, fluid coding workflow. Unlike most template syntaxes, with Razor you do not need to interrupt your coding to explicitly denote the start and end of server blocks within your HTML. The Razor parser is smart enough to infer this from your code. This enables a compact and expressive syntax which is clean, fast and fun to type. You can learn more about Razor from some of the blog posts I've done about it over the last 6 months:

    - Introducing Razor
    - New @model keyword in Razor
    - Layouts with Razor
    - Server-Side Comments with Razor
    - Razor's @: and <text> syntax
    - Implicit and Explicit code nuggets with Razor
    - Layouts and Sections with Razor

    Today's release supports full code intellisense support for Razor (both VB and C#) with Visual Studio 2010 and the free Visual Web Developer 2010 Express.
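    To give a feel for the syntax, here is a minimal Razor view sketch (the model type and its properties are hypothetical, not from the announcement); note how the parser infers where code ends and HTML begins without explicit block delimiters:

        @model MyStore.Models.Product
        <h2>@Model.Name</h2>
        @if (Model.UnitsInStock == 0) {
            <p>Sold out!</p>
        }
        else {
            <p>@Model.UnitsInStock in stock, only $@Model.Price each.</p>
        }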
    JavaScript Improvements

    ASP.NET MVC 3 enables richer JavaScript scenarios and takes advantage of emerging HTML5 capabilities. The AJAX and validation helpers in ASP.NET MVC 3 now use an unobtrusive JavaScript based approach. Unobtrusive JavaScript avoids injecting inline JavaScript into HTML, and enables cleaner separation of behavior using the new HTML5 "data-" attribute convention (which conveniently works on older browsers as well, including IE6). This keeps your HTML tight and clean, and makes it easier to optionally swap out or customize JS libraries. ASP.NET MVC 3 now includes built-in support for posting JSON-based parameters from client-side JavaScript to action methods on the server. This makes it easier to exchange data across the client and server, and build rich JavaScript front-ends. We think this capability will be particularly useful going forward with scenarios involving client templates and data binding (including the jQuery plugins the ASP.NET team recently contributed to the jQuery project). Previous releases of ASP.NET MVC included the core jQuery library. ASP.NET MVC 3 also now ships the jQuery Validate plugin (which our validation helpers use for client-side validation scenarios). We are also now shipping and including jQuery UI by default as well (which provides a rich set of client-side JavaScript UI widgets for you to use within projects).

    Improved Validation

    ASP.NET MVC 3 includes a bunch of validation enhancements that make it even easier to work with data. Client-side validation is now enabled by default with ASP.NET MVC 3 (using an unobtrusive JavaScript implementation). Today's release also includes built-in support for Remote Validation, which enables you to annotate a model class with a validation attribute that causes ASP.NET MVC to perform a remote validation call to a server method when validating input on the client. The validation features introduced within .NET 4's System.ComponentModel.DataAnnotations namespace are now supported by ASP.NET MVC 3. This includes support for the new IValidatableObject interface, which enables you to perform model-level validation and allows you to provide validation error messages specific to the state of the overall model, or between two properties within the model. ASP.NET MVC 3 also supports the improvements made to the ValidationAttribute class in .NET 4. ValidationAttribute now supports a new IsValid overload that provides more information about the current validation context, such as what object is being validated. This enables richer scenarios where you can validate the current value based on another property of the model. We've shipped a built-in [Compare] validation attribute with ASP.NET MVC 3 that uses this support and makes it easy out of the box to compare and validate two property values. You can use any data access API or technology with ASP.NET MVC. This past year, though, we've worked closely with the .NET data team to ensure that the new EF Code First library works really well for ASP.NET MVC applications. These two posts of mine cover the latest EF Code First preview and demonstrate how to use it with ASP.NET MVC 3 to enable easy editing of data (with end-to-end client+server validation support). The final release of EF Code First will ship in the next few weeks. Today we are also publishing the first preview of a new MvcScaffolding project. It enables you to easily scaffold ASP.NET MVC 3 Controllers and Views, and works great with EF Code First (and is pluggable to support other data providers). You can learn more about it, and install it via NuGet today, from Steve Sanderson's MvcScaffolding blog post.

    Output Caching

    Previous releases of ASP.NET MVC supported output caching content at a URL or action-method level. With ASP.NET MVC 3 we are also enabling support for partial page output caching, which allows you to easily output cache regions or fragments of a response as opposed to the entire thing. This ends up being super useful in a lot of scenarios, and enables you to dramatically reduce the work your application does on the server. The new partial page output caching support in ASP.NET MVC 3 enables you to easily re-use cached sub-regions/fragments of a page across multiple URLs on a site. It supports the ability to cache the content either on the web server, or optionally cache it within a distributed cache server like Windows Server AppFabric or memcached. I'll post some tutorials on my blog that show how to take advantage of ASP.NET MVC 3's new output caching support for partial page scenarios in the future.
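    Pending those tutorials, here is a minimal sketch of what partial page caching looks like (the controller and action names are hypothetical): a child action marked with [OutputCache] is rendered into a page via Html.Action, and only that fragment is cached:

        using System.Web.Mvc;

        public class NewsController : Controller
        {
            // Only this fragment is cached (for 60 seconds);
            // the rest of the page is rendered fresh on every request.
            [ChildActionOnly]
            [OutputCache(Duration = 60)]
            public ActionResult LatestHeadlines()
            {
                return PartialView();
            }
        }

        // In a Razor view: @Html.Action("LatestHeadlines", "News")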
    Better Dependency Injection

    ASP.NET MVC 3 provides better support for applying Dependency Injection (DI) and integrating with Dependency Injection/IoC containers. With ASP.NET MVC 3 you no longer need to author custom ControllerFactory classes in order to enable DI with Controllers. You can instead just register a Dependency Injection framework with ASP.NET MVC 3 and it will resolve dependencies not only for Controllers, but also for Views, Action Filters, Model Binders, Value Providers, Validation Providers, and Model Metadata Providers that you use within your application. This makes it much easier to cleanly integrate dependency injection within your projects.
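    The registration hook is the new DependencyResolver class. A minimal sketch, assuming a hand-rolled registry as a stand-in for whatever IoC library you actually use:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Web.Mvc;

        public class SimpleDependencyResolver : IDependencyResolver
        {
            private readonly Dictionary<Type, Func<object>> _registrations = new Dictionary<Type, Func<object>>();

            public void Register<T>(Func<object> factory) { _registrations[typeof(T)] = factory; }

            // Return null (rather than throwing) for unknown types so MVC falls back to its defaults
            public object GetService(Type serviceType)
            {
                Func<object> factory;
                return _registrations.TryGetValue(serviceType, out factory) ? factory() : null;
            }

            public IEnumerable<object> GetServices(Type serviceType)
            {
                var service = GetService(serviceType);
                return service == null ? Enumerable.Empty<object>() : new[] { service };
            }
        }

        // In Application_Start:
        //   DependencyResolver.SetResolver(new SimpleDependencyResolver());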
    Other Goodies

    ASP.NET MVC 3 includes dozens of other nice improvements that help to both reduce the amount of code you write and make the code you do write cleaner. Here are just a few examples:

    - Improved New Project dialog that makes it easy to start new ASP.NET MVC 3 projects from templates.
    - Improved Add->View scaffolding support that enables the generation of even cleaner view templates.
    - New ViewBag property that uses .NET 4's dynamic support to make it easy to pass late-bound data from Controllers to Views.
    - Global Filters support that allows specifying cross-cutting filter attributes (like [HandleError]) across all Controllers within an app.
    - New [AllowHtml] attribute that allows for more granular request validation when binding form posted data to models.
    - Sessionless controller support that allows fine-grained control over whether SessionState is enabled on a Controller.
    - New ActionResult types like HttpNotFoundResult and RedirectPermanent for common HTTP scenarios.
    - New Html.Raw() helper to indicate that output should not be HTML encoded.
    - New Crypto helpers for salting and hashing passwords.
    - And much, much more...

    Learn More about ASP.NET MVC 3

    We will be posting lots of tutorials and samples on the http://asp.net/mvc site in the weeks ahead. Below are two good ASP.NET MVC 3 tutorials available on the site today: Build your First ASP.NET MVC 3 Application: VB and C#, and Building the ASP.NET MVC 3 Music Store. We'll post additional ASP.NET MVC 3 tutorials and videos on the http://asp.net/mvc site in the future. Visit it regularly to find new tutorials as they are published.

    How to Upgrade Existing Projects

    ASP.NET MVC 3 is compatible with ASP.NET MVC 2, which means it should be easy to update existing MVC projects to ASP.NET MVC 3. The new features in ASP.NET MVC 3 build on top of the foundational work we've already done with the MVC 1 and MVC 2 releases, which means that the skills, knowledge, libraries, and books you've acquired are all directly applicable with the MVC 3 release. MVC 3 adds new features and capabilities; it doesn't obsolete existing ones. You can upgrade existing ASP.NET MVC 2 projects by following the manual upgrade steps in the release notes. Alternatively, you can use this automated ASP.NET MVC 3 upgrade tool to easily update your existing projects.

    Localized Builds

    Today's ASP.NET MVC 3 release is available in English. We will be releasing localized versions of ASP.NET MVC 3 (in 9 languages) in a few days. I'll blog pointers to the localized downloads once they are available.

    NuGet

    Today we are also shipping NuGet - a free, open source package manager that makes it easy for you to find, install, and use open source libraries in your projects. It works with all .NET project types (including ASP.NET Web Forms, ASP.NET MVC, WPF, WinForms, Silverlight, and Class Libraries). You can download and install it here. NuGet enables developers who maintain open source projects (for example, .NET projects like Moq, NHibernate, Ninject, StructureMap, NUnit, Windsor, Raven, Elmah, etc.) to package up their libraries and register them with an online gallery/catalog that is searchable. The client-side NuGet tools - which include full Visual Studio integration - make it trivial for any .NET developer who wants to use one of these libraries to easily find and install it within the project they are working on. NuGet handles dependency management between libraries (for example: library1 depends on library2). It also makes it easy to update (and optionally remove) libraries from your projects later. It supports updating web.config files (if a package needs configuration settings). It also allows packages to add PowerShell scripts to a project (for example: scaffold commands). Importantly, NuGet is transparent and clean - it does not install anything at the system level. Instead it is focused on making it easy to manage libraries you use with your projects. Our goal with NuGet is to make it as simple as possible to integrate open source libraries within .NET projects.
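    For example, installing and later removing one of the libraries mentioned above from the Visual Studio Package Manager Console looks like this (the package name is just an example):

        PM> Install-Package NHibernate
        PM> Uninstall-Package NHibernate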
    NuGet Gallery

    This week we also launched a beta version of the http://nuget.org web site, which allows anyone to easily search and browse an online gallery of open source packages available via NuGet. The site also now allows developers to optionally submit new packages that they wish to share with others. You can learn more about how to create and share a package here. There are hundreds of open source .NET projects already within the NuGet Gallery today. We hope to have thousands there in the future.

    IIS Express 7.5

    Today we are also shipping IIS Express 7.5. IIS Express is a free version of IIS 7.5 that is optimized for developer scenarios. It works for both ASP.NET Web Forms and ASP.NET MVC project types. We think IIS Express combines the ease of use of the ASP.NET Web Server (aka Cassini) currently built into Visual Studio today with the full power of IIS. Specifically:

    - It's lightweight and easy to install (less than 5MB download and a quick install)
    - It does not require an administrator account to run/debug applications from Visual Studio
    - It enables a full web server feature set, including SSL, URL Rewrite, and other IIS 7.x modules
    - It supports and enables the same extensibility model and web.config file settings that IIS 7.x supports
    - It can be installed side-by-side with the full IIS web server as well as the ASP.NET Development Server (they do not conflict at all)
    - It works on Windows XP and higher operating systems, giving you a full IIS 7.x developer feature set on all Windows OS platforms

    IIS Express (like the ASP.NET Development Server) can be quickly launched to run a site from a directory on disk. It does not require any registration/configuration steps. This makes it really easy to launch and run for development scenarios. You can also optionally redistribute IIS Express with your own applications if you want a lightweight web server. The standard IIS Express EULA now includes redistributable rights. Visual Studio 2010 SP1 adds support for IIS Express. Read my VS 2010 SP1 and IIS Express blog post to learn more about what it enables.

    SQL Server Compact Edition 4

    Today we are also shipping SQL Server Compact Edition 4 (aka SQL CE 4). SQL CE is a free, embedded database engine that enables easy database storage.

    No Database Installation Required

    SQL CE does not require you to run a setup or install a database server in order to use it. You can simply copy the SQL CE binaries into the \bin directory of your ASP.NET application, and then your web application can use it as a database engine. No setup or extra security permissions are required for it to run. You do not need to have an administrator account on the machine. Just copy your web application onto any server and it will work. This is true even of medium-trust applications running in a web hosting environment. SQL CE runs in-memory within your ASP.NET application; it starts up when you first access a SQL CE database and automatically shuts down when your application is unloaded. SQL CE databases are stored as files that live within the \App_Data folder of your ASP.NET applications.

    Works with Existing Data APIs

    SQL CE 4 works with existing .NET-based data APIs and supports a SQL Server compatible query syntax. This means you can use existing data APIs like ADO.NET, as well as higher-level ORMs like Entity Framework and NHibernate, with SQL CE. This enables you to use the same data programming skills and data APIs you know today.
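    As a sketch of that "same data APIs" point (the database file and table names are hypothetical), plain ADO.NET code works against a SQL CE 4 file living in App_Data:

        using System.Data.SqlServerCe;  // the provider that ships with SQL CE 4

        class Demo
        {
            static void Main()
            {
                // |DataDirectory| resolves to the App_Data folder in an ASP.NET application
                using (var connection = new SqlCeConnection(@"Data Source=|DataDirectory|\Store.sdf"))
                {
                    connection.Open();
                    using (var command = new SqlCeCommand("SELECT COUNT(*) FROM Products", connection))
                    {
                        int count = (int)command.ExecuteScalar();
                    }
                }
            }
        }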
    Supports Development, Testing and Production Scenarios

    SQL CE can be used for development scenarios, testing scenarios, and light production usage scenarios. With the SQL CE 4 release we've done the engineering work to ensure that SQL CE won't crash or deadlock when used in a multi-threaded server scenario (like ASP.NET). This is a big change from previous releases of SQL CE, which were designed for client-only scenarios and which explicitly blocked running in web server environments. Starting with SQL CE 4 you can use it in a web server as well. There are no license restrictions with SQL CE. It is also totally free.

    Tooling Support with VS 2010 SP1

    Visual Studio 2010 SP1 adds support for SQL CE 4 and ASP.NET projects. Read my VS 2010 SP1 and SQL CE 4 blog post to learn more about what it enables.

    Web Deploy and Web Farm Framework 2.0

    Today we are also releasing Microsoft Web Deploy V2 and Microsoft Web Farm Framework V2. These services provide a flexible and powerful way to deploy ASP.NET applications onto either a single server or across a web farm of machines. You can learn more about these capabilities from my previous blog posts on them: Introducing the Microsoft Web Farm Framework and Automating Deployment with Microsoft Web Deploy. Visit the http://iis.net website to learn more and install them. Both are free.

    Orchard 1.0

    Today we are also releasing Orchard v1.0. Orchard is a free, open source, community based project. It provides Content Management System (CMS) and blogging system support out of the box, and makes it possible to easily create and manage web sites without having to write code (site owners can customize a site through the browser-based editing tools built into Orchard). Read these tutorials to learn more about how you can set up and manage your own Orchard site. Orchard itself is built as an ASP.NET MVC 3 application using Razor view templates (and by default uses SQL CE 4 for data storage). Developers wishing to extend an Orchard site with custom functionality can open and edit it as a Visual Studio project, and add new ASP.NET MVC Controllers/Views to it.

    WebMatrix 1.0

    WebMatrix is a new, free web development tool from Microsoft that provides a suite of technologies that make it easier to enable website development. It enables a developer to start a new site by browsing and downloading an app template from an online gallery of web applications (which includes popular apps like Umbraco, DotNetNuke, Orchard, WordPress, Drupal and Joomla). Alternatively, it also enables developers to create and code web sites from scratch. WebMatrix is task focused and helps guide developers as they work on sites. WebMatrix includes IIS Express, SQL CE 4, and ASP.NET, providing an integrated web server, database and programming framework combination. It also includes built-in web publishing support which makes it easy to find and deploy sites to web hosting providers. You can learn more about WebMatrix from my Introducing WebMatrix blog post this summer. Visit http://microsoft.com/web to download and install it today.

    Summary

    I'm really excited about today's releases - they provide a bunch of additional value that makes web development with ASP.NET, Visual Studio and the Microsoft Web Server a lot better. A lot of folks worked hard to share this with you today. On behalf of my whole team - we hope you enjoy them!

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • Opening Skype, Opera, OpenOffice logs me off

    - by anjanesh
    What's common among Skype, Opera and OpenOffice in Ubuntu? Whenever I open these applications I get logged off and am shown the login screen again. This started happening since the 10.10 upgrade. Forgot to mention: yes, it's x64. Each time I open these applications, the UI shows and then crashes. I started each app and logged the last few lines of /var/log/syslog after each crash. Looks like something to do with sound drivers?

    Opera:

        Jan 8 09:33:20 al-ubuntu pulseaudio[11532]: pid.c: Daemon already running.
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: snd_pcm_avail_delay() returned strange values: delay 0 is less than avail 8.
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Most likely this is a bug in the ALSA driver 'snd_hda_intel'. Please report this issue to the ALSA developers.
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: snd_pcm_dump():
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Soft volume PCM
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Control: PCM Playback Volume
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: min_dB: -51
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: max_dB: 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: resolution: 256
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Its setup is:
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   format : S16_LE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   subformat : STD
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   channels : 2
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   rate : 44100
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   msbits : 16
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_size : 44096
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_time : 999909
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_step : 1
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   avail_min : 87310
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_event : 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   start_threshold : -1
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   silence_threshold: 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   silence_size : 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   boundary : 6205960286516543488
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Slave: Hardware PCM card 0 'HDA Intel' device 0 subdevice 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c: Its setup is:
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   format : S16_LE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   subformat : STD
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   channels : 2
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   rate : 44100
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   msbits : 16
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_size : 44096
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_time : 999909
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_step : 1
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   avail_min : 87310
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   period_event : 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   start_threshold : -1
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   silence_threshold: 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   silence_size : 0
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   boundary : 6205960286516543488
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   appl_ptr : 87320
        Jan 8 09:33:21 al-ubuntu pulseaudio[11429]: alsa-util.c:   hw_ptr : 87320
        Jan 8 09:33:22 al-ubuntu kernel: [ 4962.078306] opera[11036]: segfault at 261 ip 0000000000000261 sp 00007fffed7cd9a8 error 14 in opera[400000+122b000]

    Skype:

        Jan 8 09:40:21 al-ubuntu pulseaudio[12602]: pid.c: Daemon already running.
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: snd_pcm_avail_delay() returned strange values: delay 0 is less than avail 8.
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Most likely this is a bug in the ALSA driver 'snd_hda_intel'. Please report this issue to the ALSA developers.
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: snd_pcm_dump():
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Soft volume PCM
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Control: PCM Playback Volume
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: min_dB: -51
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: max_dB: 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: resolution: 256
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Its setup is:
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   format : S16_LE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   subformat : STD
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   channels : 2
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   rate : 44100
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   msbits : 16
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_size : 44096
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_time : 999909
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_step : 1
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   avail_min : 87310
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_event : 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   start_threshold : -1
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   silence_threshold: 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   silence_size : 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   boundary : 6205960286516543488
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Slave: Hardware PCM card 0 'HDA Intel' device 0 subdevice 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c: Its setup is:
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   format : S16_LE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   subformat : STD
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   channels : 2
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   rate : 44100
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   msbits : 16
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_size : 44096
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_time : 999909
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_step : 1
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   avail_min : 87310
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   period_event : 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   start_threshold : -1
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   silence_threshold: 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   silence_size : 0
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   boundary : 6205960286516543488
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   appl_ptr : 87312
        Jan 8 09:40:23 al-ubuntu pulseaudio[12485]: alsa-util.c:   hw_ptr : 87312

    OpenOffice:

        Jan 8 09:43:46 al-ubuntu pulseaudio[13157]: pid.c: Daemon already running.
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: snd_pcm_avail_delay() returned strange values: delay 0 is less than avail 16.
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Most likely this is a bug in the ALSA driver 'snd_hda_intel'. Please report this issue to the ALSA developers.
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: snd_pcm_dump():
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Soft volume PCM
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Control: PCM Playback Volume
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: min_dB: -51
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: max_dB: 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: resolution: 256
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Its setup is:
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   format : S16_LE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   subformat : STD
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   channels : 2
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   rate : 44100
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   msbits : 16
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_size : 44096
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_time : 999909
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_step : 1
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   avail_min : 87310
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_event : 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   start_threshold : -1
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   silence_threshold: 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   silence_size : 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   boundary : 6205960286516543488
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Slave: Hardware PCM card 0 'HDA Intel' device 0 subdevice 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: Its setup is:
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   stream : CAPTURE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   access : MMAP_INTERLEAVED
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   format : S16_LE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   subformat : STD
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   channels : 2
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   rate : 44100
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   exact rate : 44100 (44100/1)
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   msbits : 16
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   buffer_size : 88192
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_size : 44096
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_time : 999909
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   tstamp_mode : ENABLE
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_step : 1
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   avail_min : 87310
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   period_event : 0
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   start_threshold : -1
        Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c:   stop_threshold : 6205960286516543488
        Jan 8
09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: silence_threshold: 0 Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: silence_size : 0 Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: boundary : 6205960286516543488 Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: appl_ptr : 87320 Jan 8 09:43:48 al-ubuntu pulseaudio[13064]: alsa-util.c: hw_ptr : 87320 anjanesh@al-ubuntu:~$

    Read the article

  • West Wind WebSurge - an easy way to Load Test Web Applications

    - by Rick Strahl
A few months ago on a project the subject of load testing came up. We were having some serious issues with a Web application that would start spewing SQL lock errors under somewhat heavy load. These sorts of errors can be tough to catch, precisely because they only occur under load and not during typical development testing. To replicate this error more reliably we needed to put a load on the application and run it for a while before these SQL errors would flare up. It's been a while since I'd looked at load testing tools, so I spent a bit of time looking at different tools and frankly didn't really find anything that was a good fit. A lot of tools were either a pain to use, didn't have the basic features I needed, or were extravagantly expensive. In the end I got frustrated enough to build an initially small custom load test solution that then morphed into a more generic library, then gained a console front end and eventually turned into a full-blown Web load testing tool that is now called West Wind WebSurge. I got seriously frustrated looking for tools every time I needed some quick and dirty load testing for an application. If my aim is just to put an application under heavy enough load to find a scalability problem in code, or to simply try and push an application to its limits on the hardware it's running on, I shouldn't have to struggle to set up tests. It should be easy enough to get going in a few minutes, so that testing can be done on a regular basis without a lot of hassle. And that was the goal when I started to build out my initial custom load tester into a more widely usable tool. If you're in a hurry and you want to check it out, you can find more information and download links here:
West Wind WebSurge Product Page
Walk through Video
Download link (zip)
Install from Chocolatey
Source on GitHub
For a more detailed discussion of the whys and hows and some background, continue reading. How did I get here? When I started out on this path, I wasn't planning on building a tool like this myself – but I got frustrated enough looking at what's out there to think that I can do better than what's available for the most common simple load testing scenarios. When we ran into the SQL lock problems I mentioned, I started looking around at what's available for Web load testing solutions that would work for our whole team, which consisted of a few developers and a couple of IT guys, both of whom needed to be able to run the tests. It had been a while since I looked at tools and I figured that by now there should be some good solutions out there, but as it turns out I didn't really find anything that fit our relatively simple needs without costing an arm and a leg… I spent the better part of a day installing and trying various load testing tools and to be frank most of them were either terrible at what they do, incredibly unfriendly to use, used some terminology I couldn't even parse, or were extremely expensive (and I mean in the 'sell your liver' range of expensive). Pick your poison. There are also a number of online solutions for load testing and they actually looked more promising, but those wouldn't work well for our scenario as the application is running inside of a private VPN with no outside access into the VPN. Most of those online solutions also ended up being very pricey – presumably because the bandwidth required to test over the open Web can be enormous.
When I asked around on Twitter what people were using – I got mostly… crickets. Several people mentioned Visual Studio Load Test, and most other suggestions pointed to online solutions. I did get a bunch of responses though with people asking to let them know what I found – apparently I'm not alone when it comes to finding load testing tools that are effective and easy to use. As to Visual Studio, the higher end SKUs of Visual Studio and the test edition include a Web load testing tool, which is quite powerful, but there are a number of issues with that: First, it's tied to Visual Studio so it's not very portable – you need a VS install. I also find the test setup and terminology used by the VS test runner extremely confusing. Heck, it's complicated enough that there's even a Pluralsight course on using the Visual Studio Web test from Steve Smith. And of course you need to have one of the high end Visual Studio SKUs, and those are mucho Dinero ($$$) – just for load testing that's rarely an option. Some of the tools are ultra-extensive and let you run analysis tools on the target servers, which is useful, but in most cases just plain overkill that only distracts from what I tend to be ultimately interested in: reproducing problems that occur at high load, and finding the upper limits and 'what if' scenarios as load is ramped up increasingly against a site. Yes it's useful to have Web app instrumentation, but often that's not what you're interested in. I still fondly remember the early days of Web testing when Microsoft had WAST (the Web Application Stress Tool), which was rather simple – and also somewhat limited – but easily allowed you to create stress tests very quickly. It had some serious limitations (mainly that it didn't work with SSL), but the idea behind it was excellent: create tests quickly and easily and provide a decent engine to run them locally with minimal setup. You could get set up and run tests within a few minutes. Unfortunately, that tool died a quiet death, as have so many of Microsoft's tools that were probably built by an intern and then abandoned, even though there was a lot of potential and it was actually fairly widely used. Eventually the tool was no longer downloadable and now it simply doesn't work anymore on higher end hardware. West Wind WebSurge – Making Load Testing Quick and Easy So I ended up creating West Wind WebSurge out of rebellious frustration… The goal of WebSurge is to make it drop dead simple to create load tests. It's super easy to capture sessions either using the built-in capture tool (big props to Eric Lawrence, Telerik and FiddlerCore, which made that piece a snap), using the full version of Fiddler and exporting sessions, or by manually or programmatically creating text files based on plain HTTP headers to create requests. I've been using this tool for 4 months now on a regular basis on various projects as a reality check for performance and scalability, and it's worked extremely well for finding small performance issues. I also use it regularly as a simple URL tester, as it allows me to quickly enter a URL plus headers and content, test that URL and its results, and easily save one or more of those URLs. A few weeks back I made a walk-through video that goes over most of the features of WebSurge in some detail: Note that the UI has slightly changed since then, so there are some UI improvements.
Most notably the test results screen has been updated recently to a different layout and to provide more information about each URL in a session at a glance. The video and the main WebSurge site have a lot of info on basic operations. For the rest of this post I'll talk about a few deeper aspects that may be of interest while also giving a glance at how WebSurge works. Session Capturing As you would expect, WebSurge works with Sessions of URLs that are played back under load. Here's what the main Session View looks like: You can create session entries manually by individually adding URLs to test (on the Request tab on the right) and saving them, or you can capture output from Web browsers, Windows desktop applications that call services, or your own applications using the built-in Capture tool. With this tool you can capture anything HTTP – SSL requests and content from Web pages, AJAX calls, SOAP or REST services – again, anything that uses Windows or .NET HTTP APIs. Behind the scenes the capture tool uses FiddlerCore, so basically anything you can capture with Fiddler you can also capture with the WebSurge session capture tool. Alternatively you can use Fiddler itself, and then export the captured Fiddler trace to a file, which can then be imported into WebSurge. This is a nice way to let somebody capture a session without having to actually install WebSurge, or for your customers to provide an exact playback scenario for a given set of URLs that cause a problem. Note that not all applications work with Fiddler's proxy unless you configure a proxy. For example, .NET Web applications that make HTTP calls usually don't show up in Fiddler by default. For those .NET applications you can explicitly override proxy settings to capture those service call requests. The capture tool also has handy optional filters that allow you to filter by domain, to help block out noise that you typically don't want to include in your requests. For example, if your pages include links to CDNs, Google Analytics or social links, you typically don't want to include those in your load test, so by capturing just from a specific domain you are guaranteed content from only that one domain. Additionally you can provide URL filters in the configuration file – strings that, if contained in a URL, will cause matching requests to be ignored. Again this is useful if you don't filter by domain but you want to filter out things like static image, CSS and script files. Often you're not interested in the load characteristics of these static and usually cached resources, as they just add noise to tests and often skew the overall URL performance results. In my testing I tend to care only about my dynamic requests. SSL Captures require Fiddler Note that in order to capture SSL requests you'll have to install Fiddler's SSL certificate. The easiest way to do this is to install Fiddler and use its SSL configuration options to get the certificate into the local certificate store. There's a document on the Telerik site that provides the exact steps to get SSL captures to work with Fiddler and therefore with WebSurge. Session Storage A group of URLs entered or captured makes up a Session. Sessions can be saved and restored easily, as they use a very simple text format that is simply stored on disk. The format is slightly customized HTTP header traces separated by a separator line.
The headers are standard HTTP headers, except that the full URL instead of just the domain-relative path is stored as part of the 1st HTTP header line for easier parsing. Because it's just text and uses the same format that Fiddler uses for exports, it's super easy to create Sessions by hand or under program control, writing out to a simple text file. You can see what this format looks like in the Capture window figure above – the raw captured format is also what's stored to disk and what WebSurge parses from. The only 'custom' part of these headers is that the 1st line contains the full URL instead of the domain-relative path and Host: header. The rest of each entry is just plain standard HTTP headers, with each individual URL isolated by a separator line. The format used here also matches what Fiddler produces for exports, so it's easy to exchange or view data either in Fiddler or WebSurge. URLs can also be edited interactively, so you can modify the headers easily as well: again, it's just plain HTTP headers, so anything you can do with HTTP can be added here. Use it for single URL Testing Incidentally I've also found this form an excellent way to test and replay individual URLs for simple non-load-testing purposes. Because you can capture a single URL or many URLs and store them on disk, this also provides a nice HTTP playground where you can record URLs with their headers, fire them one at a time or as a session, and see results immediately. It's actually handy for REST presentations, and I find the simple UI flow easier than using Fiddler natively. Finally you can save one or more URLs as a session for later retrieval. I'm using this more and more for simple URL checks. Overriding Cookies and Domains Speaking of HTTP headers – you can also overwrite cookies as part of the options. One thing that happens with modern Web applications is that you have session cookies in use for authorization. These cookies tend to expire at some point, which would invalidate a test. Using the Options dialog you can override the cookie, which replaces the cookie for all requests with the cookie value specified there. You can capture a valid cookie from a manual HTTP request in your browser and then paste it into the cookie field, to replace the existing cookie with the new one that is now valid. Likewise you can easily replace the domain, so if you captured URLs on west-wind.com and now you want to test on localhost you can do that easily as well. You could even do something like capture on store.west-wind.com and then test on localhost/store, which would also work. Running Load Tests Once you've created a Session you can specify the length of the test in seconds, and specify the number of simultaneous threads to run each session on. Sessions run through each of the URLs in the session sequentially by default. One option in the options list above is that you can also randomize the URLs so each thread runs requests in a different order. This avoids bunching up URLs initially when tests start, as all threads running the same requests simultaneously can sometimes skew the results of the first few minutes of a test. While sessions run, some progress information is displayed: By default there's a live view of requests displayed in a Console-like window. On the bottom of the window there's a running total summary that displays where you're at in the test, how many requests have been processed and what the requests-per-second count currently is for all requests.
Note that for tests that run over a thousand requests a second it's a good idea to turn off the console display. While the console display is nice to see that something is happening, and also gives you a slight idea of what's happening with actual requests, once a lot of requests are processed this UI updating actually adds a lot of CPU overhead to the application, which may cause the actual load generated to be reduced. If you are running 1000 requests a second there's not much to see anyway, as requests roll by way too fast to read individual lines. If you look on the options panel, there is a NoProgressEvents option that disables the console display. Note that the summary display is still updated approximately once a second, so you can always tell that the test is still running. Test Results When the test is done you get a simple Results display: On the right you get an overall summary as well as a breakdown by each URL in the session. Both successes and failures are highlighted so it's easy to see what's breaking in your load test. The report can be printed, or you can open the HTML document in your default Web browser for printing to PDF or saving to disk. The list on the right shows you a partial list of the URLs that were fired so you can look in detail at the request and response data. The list can be filtered by success and failure requests. Each list is partial only (at the moment) and limited to a max of 1000 items in order to render reasonably quickly. Each item in the list can be clicked to see the full request and response data: This is particularly useful for errors, so you can quickly see and copy what request data was used, and in the case of a GET request you can also just click the link to quickly jump to the page. For non-GET requests you can find the URL in the Session list, and use the context menu to Test the URL as configured, including any HTTP content data to send. You get to see the full HTTP request and response as well as a link in the Request header to go visit the actual page. Not so useful for a POST as above, but definitely useful for GET requests. Finally you can also get a few charts. The most useful one is probably the Requests per Second chart, which can be accessed from the Charts menu or shortcut. Here's what it looks like: Results can also be exported to JSON, XML and HTML. Keep in mind that these files can get very large rather quickly though, so exports can end up taking a while to complete. Command Line Interface WebSurge runs with a small core load engine, and this engine is plugged into the front-end application I've shown so far. There's also a command line interface available to run WebSurge from the Windows command prompt. Using the command line you can run tests for either an individual URL (similar to ab.exe, for example) or a full Session file. By default when it runs, WebSurgeCli shows progress every second, showing the total request count, failures and the requests per second for the entire test. A silent option can turn off this progress display and show only the results. The command line interface can be useful for build integration, which allows checking for failures or verifying that a specific requests-per-second count is hit. It's also nice to use this as a quick and dirty URL test facility, similar to the way you'd use Apache Bench (ab.exe). Unlike ab.exe though, WebSurgeCli supports SSL and makes it much easier to create multi-URL tests using either manual editing or the WebSurge UI.
Current Status Currently West Wind WebSurge is still in Beta status. I'm still adding small new features and tweaking the UI in an attempt to make it as easy and self-explanatory as possible to run. Documentation for the UI and specialty features is also still a work in progress. I plan on open-sourcing this product, but it won't be free. There's a free version available that provides a limited number of threads and request URLs to run. A relatively low-cost license removes the thread and request limitations. Pricing info can be found on the Web site – there's an introductory price which is $99 at the moment, which I think is reasonable compared to most other for-pay solutions out there that are exorbitant by comparison… The reason the code is not available yet is – well, the UI portion of the app is a bit embarrassing in its current monolithic state. The UI started as a very simple interface originally that later got a lot more complex – yeah, that never happens, right? Unless there's a lot of interest I don't foresee re-writing the UI entirely (which would be ideal), but in the meantime at least some cleanup is required before I dare to publish it :-). The code will likely be released with version 1.0. I'm very interested in feedback. Do you think this could be useful to you and provide value over other tools you may or may not have used before? I hope so – it has already provided a ton of value for me and the work I do, which made the development worthwhile at this point. You can leave a comment below, or for more extensive discussions you can post a message on the West Wind Message Board in the WebSurge section. Microsoft MVPs and Insiders get a free License If you're a Microsoft MVP or a Microsoft Insider you can get a full license for free. Send me a link to your current, official Microsoft profile and I'll send you a not-for-resale license. Send any messages to [email protected]. Resources For more info on WebSurge and to download it to try it out, use the following links. West Wind WebSurge Home Download West Wind WebSurge Getting Started with West Wind WebSurge Video © Rick Strahl, West Wind Technologies, 2005-2014. Posted in ASP.NET
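To make the session storage format described above concrete, here is a hypothetical two-URL session in the plain-text trace style WebSurge parses. This is my own illustration, not copied from the product docs – in particular the separator token, header choices and URLs are assumptions:

GET http://localhost/store/ HTTP/1.1
Accept: text/html
User-Agent: WebSurge

----------------------------------------------------

POST http://localhost/store/api/orders HTTP/1.1
Content-Type: application/json

{ "productId": 42, "quantity": 1 }

Note how the first line of each entry carries the full URL rather than a domain-relative path plus Host: header, exactly as discussed in the Session Storage section.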

    Read the article

  • Step by Step:How to use Web Services in ASP.NET AJAX

    - by Yousef_Jadallah
In my article Preventing Duplicate Data With ASP.NET AJAX I used ASP.NET AJAX with Web service technology, so I am adding this topic as an introduction to accessing Web services from client script in AJAX-enabled ASP.NET Web pages. I also write this topic to answer the common questions that many developers face while working with ASP.NET AJAX Web services, especially in the official Microsoft ASP.NET forum http://forums.asp.net/. ASP.NET enables you to create Web services that can be accessed from client script in Web pages, using AJAX technology to make the Web service calls. Data is exchanged asynchronously between client and server, typically in JSON format.

Let's go ahead with the steps:

1-Create a new project; if you are using VS 2005 you have to create an ASP.NET AJAX-Enabled Web site.

2-Add a new item and choose the Web Service file type.

3-To make your Web service accessible from script, it must be an .asmx Web service whose Web service class is qualified with the ScriptServiceAttribute attribute, and every method to be called from client script must be qualified with the WebMethodAttribute attribute. Alternatively, you can use your Web page (CS or VB files) to add static methods accessible from client script; you just need to add the WebMethod attribute and set the EnablePageMethods attribute of the ScriptManager control to true (a short sketch of this variant follows below). The other condition is to register the ScriptHandlerFactory HTTP handler, which processes calls made from script to .asmx Web services:

<system.web>
  <httpHandlers>
    <remove verb="*" path="*.asmx"/>
    <add verb="*" path="*.asmx" type="System.Web.Script.Services.ScriptHandlerFactory" validate="false"/>
  </httpHandlers>
</system.web>

This is already added automatically to the Web.config file of any ASP.NET AJAX-enabled Web site or project, so you don't need to add it yourself.
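As a side note on the page-method option from step 3, here is a minimal sketch of that variant (my own illustration, not part of the original walkthrough; the page and method names are hypothetical):

// Default.aspx.cs - a static page method callable from client script.
// Requires <asp:ScriptManager runat="server" EnablePageMethods="true" /> in the page.
using System;
using System.Web.Services;

public partial class _Default : System.Web.UI.Page
{
    [WebMethod]
    public static string GetServerTime()
    {
        // Return the current server time to the caller.
        return "The Server Date and Time is : " + DateTime.Now.ToString();
    }
}

On the client, the call goes through the generated PageMethods proxy instead of a service proxy, e.g. PageMethods.GetServerTime(OnSucceeded).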
4-Remove the default HelloWorld method, then add your own method to the asmx file – let's say OurServerOutput. Your Web service will then look like this:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Web;
using System.Web.Services;

[WebService(Namespace = "http://tempuri.org/")]
[WebServiceBinding(ConformsTo = WsiProfiles.BasicProfile1_1)]
[System.Web.Script.Services.ScriptService]
public class WebService : System.Web.Services.WebService
{
    [WebMethod]
    public string OurServerOutput()
    {
        return "The Server Date and Time is : " + DateTime.Now.ToString();
    }
}

5-Add a ScriptManager control to your aspx file, then reference the Web service by adding an asp:ServiceReference child element to the ScriptManager control and setting its path attribute to point to the Web service. This generates a JavaScript proxy class for calling the specified Web service from client script.

<asp:ScriptManager runat="server" ID="scriptManager">
  <Services>
    <asp:ServiceReference Path="WebService.asmx" />
  </Services>
</asp:ScriptManager>

Basically, to enable your application to call Web services (.asmx files) by using client script, the server asynchronous communication layer automatically generates JavaScript proxy classes. A proxy class is generated for each Web service for which an <asp:ServiceReference> element is included under the <asp:ScriptManager> control in the page.

6-Create a new button to call the JavaScript function, and a label to display the returned value.
<input id="btnCallDateTime" type="button" value="Call Web Service" onclick="CallDateTime()"/> <asp:Label ID="lblOutupt" runat="server" Text="Label"></asp:Label> .csharpcode, .csharpcode pre { font-size: small; color: black; font-family: consolas, "Courier New", courier, monospace; background-color: #ffffff; /*white-space: pre;*/ } .csharpcode pre { margin: 0em; } .csharpcode .rem { color: #008000; } .csharpcode .kwrd { color: #0000ff; } .csharpcode .str { color: #006080; } .csharpcode .op { color: #0000c0; } .csharpcode .preproc { color: #cc6633; } .csharpcode .asp { background-color: #ffff00; } .csharpcode .html { color: #800000; } .csharpcode .attr { color: #ff0000; } .csharpcode .alt { background-color: #f4f4f4; width: 100%; margin: 0em; } .csharpcode .lnum { color: #606060; }   7-Define the JavaScript code to call the Web Service : <script language="javascript" type="text/javascript">   function CallDateTime() {   WebService.OurServerOutput(OnSucceeded); }   function OnSucceeded(result) { var lblOutput = document.getElementById("lblOutupt"); lblOutput.innerHTML = result; } </script> .csharpcode, .csharpcode pre { font-size: small; color: black; font-family: consolas, "Courier New", courier, monospace; background-color: #ffffff; /*white-space: pre;*/ } .csharpcode pre { margin: 0em; } .csharpcode .rem { color: #008000; } .csharpcode .kwrd { color: #0000ff; } .csharpcode .str { color: #006080; } .csharpcode .op { color: #0000c0; } .csharpcode .preproc { color: #cc6633; } .csharpcode .asp { background-color: #ffff00; } .csharpcode .html { color: #800000; } .csharpcode .attr { color: #ff0000; } .csharpcode .alt { background-color: #f4f4f4; width: 100%; margin: 0em; } .csharpcode .lnum { color: #606060; } CallDateTime function calls the Web Service Method OurServerOutput… OnSucceeded function Used as the callback function that processes the Web Service return value. which the result parameter is a simple parameter contain the Server Date Time value returned from the Web Service . Finally , when you complete these steps and run your application you can press the button and retrieve Server Date time without postback.   Conclusion: In this topic I describes how to access Web services from client script in AJAX-enabled ASP.NET Web pages With a full .NET Framework/JSON serialize, direct integration with the familiar .asmx Web services ,Using  simple example,Also you can connect with the database to return value by create WebMethod in your Web Service file and the same steps you can use. Next time I will show you more complex example which returns a complex type like objects.   Hope this help.

    Read the article

  • PowerShell Script to Deploy Multiple VM on Azure in Parallel #azure #powershell

    - by Marco Russo (SQLBI)
This blog is usually dedicated to Business Intelligence and SQL Server, but I couldn't easily find on the web simple PowerShell scripts to help me deploy a number of virtual machines on Azure that I use for testing and development. Since I need to deploy, start, stop and remove many virtual machines created from a common image I created (you know, Tabular is not part of the standard images provided by Microsoft…), I wanted to minimize the time required to execute every operation from my Windows Azure PowerShell console (though I suggest using Windows PowerShell ISE), so I also wanted to fire the commands as soon as possible in parallel, without losing the results in the console. In order to execute multiple commands in parallel, I used the Start-Job cmdlet, and using Get-Job and Receive-Job I wait for job completion and display the messages generated during background command execution. This technique allows me to reduce execution time when I have to deploy, start, stop or remove virtual machines. Please note that a few operations on Azure acquire an exclusive lock and cannot really be executed in parallel, but only one part of their execution time is subject to this lock. Thus, you obtain a better response time in these scenarios too (this is the case for the provisioning of a new VM). Finally, when you remove the VMs you still have the disks containing the virtual machines to remove. This cannot be done just after the VM removal, because you have to wait until the removal operation has completed on Azure. So I wrote a script that you have to run a few minutes after VM removal; it deletes the disks (and VHDs) no longer related to a VM. I just check that the disks were associated with the original image name used to provision the VMs (so I don't remove other disks deployed by other batches that I might want to preserve). These examples are specific to my scenario; if you need more complex configurations you have to change and adapt the code. But if your need is to create multiple instances of the same VM running in a workgroup, these scripts should be good enough. I prepared the following PowerShell scripts: ProvisionVMs: Provision many VMs in parallel starting from the same image. It creates one service for each VM. RemoveVMs: Remove all the VMs in parallel – it also removes the service created for each VM. StartVMs: Starts all the VMs in parallel. StopVMs: Stops all the VMs in parallel. RemoveOrphanDisks: Remove all the disks no longer used by any VM. Run this script a few minutes after the RemoveVMs script.
ProvisionVMs

# Name of subscription
$SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

# Name of storage account (where VMs will be deployed)
$StorageAccount = "Copy the Label property you get from Get-AzureStorageAccount"

function ProvisionVM( [string]$VmName ) {
    Start-Job -ArgumentList $VmName {
        param($VmName)
        $Location = "Copy the Location property you get from Get-AzureStorageAccount"
        $InstanceSize = "A5" # You can use any other instance, such as Large, A6, and so on
        $AdminUsername = "UserName" # Write the name of the administrator account in the new VM
        $Password = "Password"      # Write the password of the administrator account in the new VM
        $Image = "Copy the ImageName property you get from Get-AzureVMImage"
        # You can list your own images using the following command:
        # Get-AzureVMImage | Where-Object {$_.PublisherName -eq "User" }
        New-AzureVMConfig -Name $VmName -ImageName $Image -InstanceSize $InstanceSize |
            Add-AzureProvisioningConfig -Windows -Password $Password -AdminUsername $AdminUsername |
            New-AzureVM -Location $Location -ServiceName "$VmName" -Verbose
    }
}

# Set the proper storage - you might remove this line if you have only one storage in the subscription
Set-AzureSubscription -SubscriptionName $SubscriptionName -CurrentStorageAccount $StorageAccount

# Select the subscription - this line is fundamental if you have access to multiple subscriptions
# You might remove this line if you have only one subscription
Select-AzureSubscription -SubscriptionName $SubscriptionName

# Every line in the following list provisions one VM using the name specified in the argument
# You can change the number of lines - use a unique name for every VM - don't reuse names
# already used in other VMs already deployed
ProvisionVM "test10"
ProvisionVM "test11"
ProvisionVM "test12"
ProvisionVM "test13"
ProvisionVM "test14"
ProvisionVM "test15"
ProvisionVM "test16"
ProvisionVM "test17"
ProvisionVM "test18"
ProvisionVM "test19"
ProvisionVM "test20"

# Wait for all to complete
While (Get-Job -State "Running") {
    Get-Job -State "Completed" | Receive-Job
    Start-Sleep 1
}

# Display output from all jobs
Get-Job | Receive-Job

# Cleanup of jobs
Remove-Job *

# Displays batch completed
echo "Provisioning VM Completed"

RemoveVMs

# Name of subscription
$SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"

function RemoveVM( [string]$VmName ) {
    Start-Job -ArgumentList $VmName {
        param($VmName)
        Remove-AzureService -ServiceName $VmName -Force -Verbose
    }
}

# Select the subscription - this line is fundamental if you have access to multiple subscriptions
# You might remove this line if you have only one subscription
Select-AzureSubscription -SubscriptionName $SubscriptionName

# Every line in the following list removes one VM using the name specified in the argument
# You can change the number of lines - use a unique name for every VM - don't reuse names
# already used in other VMs already deployed
RemoveVM "test10"
RemoveVM "test11"
RemoveVM "test12"
RemoveVM "test13"
RemoveVM "test14"
RemoveVM "test15"
RemoveVM "test16"
RemoveVM "test17"
RemoveVM "test18"
RemoveVM "test19"
RemoveVM "test20"

# Wait for all to complete
While (Get-Job -State "Running") {
    Get-Job -State "Completed" | Receive-Job
    Start-Sleep 1
}

# Display output from all jobs
Get-Job | Receive-Job

# Cleanup
Remove-Job *

# Displays batch completed
echo "Remove VM Completed"
Completed" StartVMs # Name of subscription $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"   function StartVM( [string]$VmName ) {     Start-Job -ArgumentList $VmName {         param($VmName)         Start-AzureVM -Name $VmName -ServiceName $VmName -Verbose     } }   # Select the subscription - this line is fundamental if you have access to multiple subscription # You might remove this line if you have only one subscription Select-AzureSubscription -SubscriptionName $SubscriptionName   # Every line in the following list starts one VM using the name specified in the argument # You can change the number of lines - use a unique name for every VM - don't reuse names # already used in other VMs already deployed StartVM "test10" StartVM "test11" StartVM "test11" StartVM "test12" StartVM "test13" StartVM "test14" StartVM "test15" StartVM "test16" StartVM "test17" StartVM "test18" StartVM "test19" StartVM "test20"   # Wait for all to complete While (Get-Job -State "Running") {     Get-Job -State "Completed" | Receive-Job     Start-Sleep 1 }   # Display output from all jobs Get-Job | Receive-Job   # Cleanup Remove-Job *   # Displays batch completed echo "Start VM Completed"   StopVMs # Name of subscription $SubscriptionName = "Copy the SubscriptionName property you get from Get-AzureSubscription"   function StopVM( [string]$VmName ) {     Start-Job -ArgumentList $VmName {         param($VmName)         Stop-AzureVM -Name $VmName -ServiceName $VmName -Verbose -Force     } }   # Select the subscription - this line is fundamental if you have access to multiple subscription # You might remove this line if you have only one subscription Select-AzureSubscription -SubscriptionName $SubscriptionName   # Every line in the following list stops one VM using the name specified in the argument # You can change the number of lines - use a unique name for every VM - don't reuse names # already used in other VMs already deployed StopVM "test10" StopVM "test11" StopVM "test12" StopVM "test13" StopVM "test14" StopVM "test15" StopVM "test16" StopVM "test17" StopVM "test18" StopVM "test19" StopVM "test20"   # Wait for all to complete While (Get-Job -State "Running") {     Get-Job -State "Completed" | Receive-Job     Start-Sleep 1 }   # Display output from all jobs Get-Job | Receive-Job   # Cleanup Remove-Job *   # Displays batch completed echo "Stop VM Completed" RemoveOrphanDisks $Image = "Copy the ImageName property you get from Get-AzureVMImage" # You can list your own images using the following command: # Get-AzureVMImage | Where-Object {$_.PublisherName -eq "User" }   # Remove all orphan disks coming from the image specified in $ImageName Get-AzureDisk |     Where-Object {$_.attachedto -eq $null -and $_.SourceImageName -eq $ImageName} |     Remove-AzureDisk -DeleteVHD -Verbose  

    Read the article

  • Parallelism in .NET – Part 3, Imperative Data Parallelism: Early Termination

    - by Reed
Although simple data parallelism allows us to easily parallelize many of our iteration statements, there are cases that it does not handle well.  In my previous discussion, I focused on data parallelism with no shared state, and where every element is processed exactly the same way. Unfortunately, there are many common cases where this does not happen.  If we are dealing with a loop that requires early termination, extra care is required when parallelizing. Often, while processing in a loop, once a certain condition is met, it is no longer necessary to continue processing.  This may be a matter of finding a specific element within the collection, or reaching some error case.  The important distinction here is that it is often impossible to know, until runtime, what set of elements needs to be processed. In my initial discussion of data parallelism, I mentioned that this technique is a candidate when you can decompose the problem based on the data involved, and you wish to apply a single operation concurrently on all of the elements of a collection.  This covers many of the potential cases, but sometimes, after processing some of the elements, we need to stop processing. As an example, let's go back to our previous Parallel.ForEach example with contacting a customer.  However, this time, we'll change the requirements slightly.  In this case, we'll add an extra condition – if the store is unable to email the customer, we will exit gracefully.  The thinking here, of course, is that if the store is currently unable to email, the next time this operation runs, it will handle the same situation, so we can just skip our processing entirely.  The original, serial case, with this extra condition, might look something like the following:

foreach(var customer in customers)
{
    // Run some process that takes some time...
    DateTime lastContact = theStore.GetLastContact(customer);
    TimeSpan timeSinceContact = DateTime.Now - lastContact;

    // If it's been more than two weeks, send an email, and update...
    if (timeSinceContact.Days > 14)
    {
        // Exit gracefully if we fail to email, since this
        // entire process can be repeated later without issue.
        if (theStore.EmailCustomer(customer) == false)
            break;

        customer.LastEmailContact = DateTime.Now;
    }
}

Here, we're processing our loop, but at any point, if we fail to send our email successfully, we just abandon this process, and assume that it will get handled correctly the next time our routine is run.  If we try to parallelize this using Parallel.ForEach, as we did previously, we'll run into an error almost immediately: the break statement we're using is only valid when enclosed within an iteration statement, such as foreach.  When we switch to Parallel.ForEach, we're no longer within an iteration statement – we're a delegate running in a method.
This needs to be handled slightly differently when parallelized.  Instead of using the break statement, we need to utilize a new class in the Task Parallel Library: ParallelLoopState.  The ParallelLoopState class is intended to allow concurrently running loop bodies a way to interact with each other, and provides us with a way to break out of a loop.  In order to use this, we will use a different overload of Parallel.ForEach which takes an IEnumerable<T> and an Action<T, ParallelLoopState> instead of an Action<T>.  Using this, we can parallelize the above operation by doing:

Parallel.ForEach(customers, (customer, parallelLoopState) =>
{
    // Run some process that takes some time...
    DateTime lastContact = theStore.GetLastContact(customer);
    TimeSpan timeSinceContact = DateTime.Now - lastContact;

    // If it's been more than two weeks, send an email, and update...
    if (timeSinceContact.Days > 14)
    {
        // Exit gracefully if we fail to email, since this
        // entire process can be repeated later without issue.
        if (theStore.EmailCustomer(customer) == false)
            parallelLoopState.Break();
        else
            customer.LastEmailContact = DateTime.Now;
    }
});

There are a couple of important points here.  First, we didn't actually instantiate the ParallelLoopState instance.  It was provided directly to us via the Parallel class.  All we needed to do was change our lambda expression to reflect that we want to use the loop state, and the Parallel class creates an instance for our use.  We also needed to change our logic slightly when we call Break().  Since Break() doesn't stop the program flow within our block, we needed to add an else case to only set the property on customer when we succeeded.  This same technique can be used to break out of a Parallel.For loop. That being said, there is a huge difference between using ParallelLoopState to cause early termination and using break in a standard iteration statement.  When dealing with a loop serially, break will immediately terminate the processing within the closest enclosing loop statement.  Calling ParallelLoopState.Break(), however, has a very different behavior. The issue is that, now, we're no longer processing one element at a time.  If we break in one of our threads, there are other threads that will likely still be executing.  This leads to an important observation about termination of parallel code: Early termination in parallel routines is not immediate.  Code will continue to run after you request a termination. This may seem problematic at first, but it is something you just need to keep in mind while designing your routine.  ParallelLoopState.Break() should be thought of as a request.  We are telling the runtime that no elements that were in the collection past the element we're currently processing need to be processed, and leaving it up to the runtime to decide how to handle this as gracefully as possible.  Although this may seem problematic at first, it is a good thing.  If the runtime tried to immediately stop processing, many of our elements would be partially processed.  It would be like putting a return statement in a random location throughout our loop body – which could have horrific consequences for our code's maintainability. In order to understand and effectively write parallel routines, we, as developers, need a subtle, but profound shift in our thinking.  We can no longer think in terms of sequential processes, but rather need to think in terms of requests to the system that may be handled differently than we'd first expect.
This is more natural to developers who have dealt with asynchronous models previously, but is an important distinction when moving to concurrent programming models. As an example, I'll discuss the Break() method.  ParallelLoopState.Break() functions in a way that may be unexpected at first.  When you call Break() from a loop body, the runtime will continue to process all elements of the collection that were found prior to the element that was being processed when the Break() method was called.  This is done to keep the behavior of the Break() method as close to the behavior of the break statement as possible. We can see the behavior in this simple code:

var collection = Enumerable.Range(0, 20);
var pResult = Parallel.ForEach(collection, (element, state) =>
{
    if (element > 10)
    {
        Console.WriteLine("Breaking on {0}", element);
        state.Break();
    }
    Console.WriteLine(element);
});

If we run this, we get a result that may seem unexpected at first:

0
2
1
5
6
3
4
10
Breaking on 11
11
Breaking on 12
12
9
Breaking on 13
13
7
8
Breaking on 15
15

What is occurring here is that we loop until we find the first element where the element is greater than 10.  In this case, this was found, the first time, when one of our threads reached element 11.  It requested that the loop stop by calling Break() at this point.  However, the loop continued processing until all of the elements less than 11 were completed, then terminated.  This means that it will guarantee that elements 9, 7, and 8 are completed before it stops processing.  You can see that our other running threads each tried to break as well, but since Break() was called on the element with a value of 11, it decides which elements (0-10) must be processed. If this behavior is not desirable, there is another option.  Instead of calling ParallelLoopState.Break(), you can call ParallelLoopState.Stop().  The Stop() method requests that the runtime terminate as soon as possible, without guaranteeing that any other elements are processed.  Stop() will not stop the processing within an element, so elements already being processed will continue to be processed.  It will prevent new elements, even ones found earlier in the collection, from being processed.  Also, when Stop() is called, the ParallelLoopState's IsStopped property will return true.  This lets longer-running processes poll for this value, and return after performing any necessary cleanup. The basic rule of thumb for choosing between Break() and Stop() is the following. Use ParallelLoopState.Stop() when possible, since it terminates more quickly.  This is particularly useful in situations where you are searching for an element or a condition in the collection.  Once you've found it, you do not need to do any other processing, so Stop() is more appropriate. Use ParallelLoopState.Break() if you need to more closely match the behavior of the C# break statement. Both methods behave differently than our C# break statement.  Unfortunately, when parallelizing a routine, more thought and care needs to be put into every aspect of your routine than you may otherwise expect.  This is due to my second observation: Parallelizing a routine will almost always change its behavior. This sounds crazy at first, but it's a concept that's so simple it's easy to forget.  We're purposely telling the system to process more than one thing at the same time, which means that the sequence in which things get processed is no longer deterministic.
It is easy to change the behavior of your routine in very subtle ways by introducing parallelism.  Often, the changes are not avoidable, even if they don’t have any adverse side effects.  This leads to my final observation for this post: Parallelization is something that should be handled with care and forethought, added by design, and not just introduced casually.
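To complement the Break() example above, here is a minimal sketch of the Stop() pattern discussed earlier (my own illustration, not from the original post), using IsStopped to short-circuit work once another thread has found a match:

using System;
using System.Linq;
using System.Threading.Tasks;

class StopExample
{
    static void Main()
    {
        var collection = Enumerable.Range(0, 1000);

        Parallel.ForEach(collection, (element, state) =>
        {
            // Another thread already found the target - skip further work.
            if (state.IsStopped)
                return;

            if (element == 42) // search-style early exit
            {
                Console.WriteLine("Found {0} - stopping", element);
                // Request termination as soon as possible; elements found
                // earlier in the collection are not guaranteed to run.
                state.Stop();
            }
        });
    }
}

Since Stop() makes no guarantees about earlier elements, it fits the search scenario described above, where any single match is sufficient.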

    Read the article

  • Access violation in DirectX OMSetRenderTargets

    - by IDWMaster
I receive the following error when running the Triangle sample application for DirectX 11 at D3D_FEATURE_LEVEL_9_1:

Unhandled exception at 0x527DAE81 (d3d11_1sdklayers.dll) in Lesson2.Triangles.exe: 0xC0000005: Access violation reading location 0x00000000

The error occurs at the OMSetRenderTargets call shown below, and does not happen if I remove that call (but then the screen stays blue and the triangle is never rendered).

//// THIS CODE AND INFORMATION IS PROVIDED "AS IS" WITHOUT WARRANTY OF
//// ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING BUT NOT LIMITED TO
//// THE IMPLIED WARRANTIES OF MERCHANTABILITY AND/OR FITNESS FOR A
//// PARTICULAR PURPOSE.
////
//// Copyright (c) Microsoft Corporation. All rights reserved

#include <wrl.h>
#include <d3d11_1.h>
#include "DirectXSample.h"
#include "BasicMath.h"
#include "BasicReaderWriter.h"

using namespace Microsoft::WRL;
using namespace Windows::UI::Core;
using namespace Windows::Foundation;
using namespace Windows::ApplicationModel::Core;
using namespace Windows::ApplicationModel::Infrastructure;

// This class defines the application as a whole.
ref class Direct3DTutorialViewProvider : public IViewProvider
{
private:
    CoreWindow^ m_window;
    ComPtr<IDXGISwapChain1> m_swapChain;
    ComPtr<ID3D11Device1> m_d3dDevice;
    ComPtr<ID3D11DeviceContext1> m_d3dDeviceContext;
    ComPtr<ID3D11RenderTargetView> m_renderTargetView;

public:
    // This method is called on application launch.
    void Initialize(_In_ CoreWindow^ window, _In_ CoreApplicationView^ applicationView)
    {
        m_window = window;
    }

    // This method is called after Initialize.
    void Load(_In_ Platform::String^ entryPoint)
    {
    }

    // This method is called after Load.
    void Run()
    {
        // First, create the Direct3D device.

        // This flag is required in order to enable compatibility with Direct2D.
        UINT creationFlags = D3D11_CREATE_DEVICE_BGRA_SUPPORT;

#if defined(_DEBUG)
        // If the project is in a debug build, enable debugging via SDK Layers with this flag.
        creationFlags |= D3D11_CREATE_DEVICE_DEBUG;
#endif

        // This array defines the ordering of feature levels that D3D should attempt to create.
        D3D_FEATURE_LEVEL featureLevels[] =
        {
            D3D_FEATURE_LEVEL_11_1,
            D3D_FEATURE_LEVEL_11_0,
            D3D_FEATURE_LEVEL_10_1,
            D3D_FEATURE_LEVEL_10_0,
            D3D_FEATURE_LEVEL_9_3,
            D3D_FEATURE_LEVEL_9_1
        };

        ComPtr<ID3D11Device> d3dDevice;
        ComPtr<ID3D11DeviceContext> d3dDeviceContext;
        DX::ThrowIfFailed(
            D3D11CreateDevice(
                nullptr,                    // specify nullptr to use the default adapter
                D3D_DRIVER_TYPE_HARDWARE,
                nullptr,                    // leave as nullptr if hardware is used
                creationFlags,              // optionally set debug and Direct2D compatibility flags
                featureLevels,
                ARRAYSIZE(featureLevels),
                D3D11_SDK_VERSION,          // always set this to D3D11_SDK_VERSION
                &d3dDevice,
                nullptr,
                &d3dDeviceContext
                )
            );

        // Retrieve the Direct3D 11.1 interfaces.
        DX::ThrowIfFailed(d3dDevice.As(&m_d3dDevice));
        DX::ThrowIfFailed(d3dDeviceContext.As(&m_d3dDeviceContext));

        // After the D3D device is created, create additional application resources.
        CreateWindowSizeDependentResources();

        // Create a Basic Reader-Writer class to load data from disk. This class is
        // examined in the Resource Loading sample.
        BasicReaderWriter^ reader = ref new BasicReaderWriter();

        // Load the raw vertex shader bytecode from disk and create a vertex shader with it.
        auto vertexShaderBytecode = reader->ReadData("SimpleVertexShader.cso");
        ComPtr<ID3D11VertexShader> vertexShader;
        DX::ThrowIfFailed(
            m_d3dDevice->CreateVertexShader(
                vertexShaderBytecode->Data,
                vertexShaderBytecode->Length,
                nullptr,
                &vertexShader
                )
            );

        // Create an input layout that matches the layout defined in the vertex shader code.
        // For this lesson, this is simply a float2 vector defining the vertex position.
        const D3D11_INPUT_ELEMENT_DESC basicVertexLayoutDesc[] =
        {
            { "POSITION", 0, DXGI_FORMAT_R32G32_FLOAT, 0, 0, D3D11_INPUT_PER_VERTEX_DATA, 0 },
        };

        ComPtr<ID3D11InputLayout> inputLayout;
        DX::ThrowIfFailed(
            m_d3dDevice->CreateInputLayout(
                basicVertexLayoutDesc,
                ARRAYSIZE(basicVertexLayoutDesc),
                vertexShaderBytecode->Data,
                vertexShaderBytecode->Length,
                &inputLayout
                )
            );

        // Load the raw pixel shader bytecode from disk and create a pixel shader with it.
        auto pixelShaderBytecode = reader->ReadData("SimplePixelShader.cso");
        ComPtr<ID3D11PixelShader> pixelShader;
        DX::ThrowIfFailed(
            m_d3dDevice->CreatePixelShader(
                pixelShaderBytecode->Data,
                pixelShaderBytecode->Length,
                nullptr,
                &pixelShader
                )
            );

        // Create vertex and index buffers that define a simple triangle.
        float3 triangleVertices[] =
        {
            float3(-0.5f, -0.5f, 13.5f),
            float3( 0.0f,  0.5f,  0.0f),
            float3( 0.5f, -0.5f,  0.0f),
        };

        D3D11_BUFFER_DESC vertexBufferDesc = {0};
        vertexBufferDesc.ByteWidth = sizeof(float3) * ARRAYSIZE(triangleVertices);
        vertexBufferDesc.Usage = D3D11_USAGE_DEFAULT;
        vertexBufferDesc.BindFlags = D3D11_BIND_VERTEX_BUFFER;
        vertexBufferDesc.CPUAccessFlags = 0;
        vertexBufferDesc.MiscFlags = 0;
        vertexBufferDesc.StructureByteStride = 0;

        D3D11_SUBRESOURCE_DATA vertexBufferData;
        vertexBufferData.pSysMem = triangleVertices;
        vertexBufferData.SysMemPitch = 0;
        vertexBufferData.SysMemSlicePitch = 0;

        ComPtr<ID3D11Buffer> vertexBuffer;
        DX::ThrowIfFailed(
            m_d3dDevice->CreateBuffer(
                &vertexBufferDesc,
                &vertexBufferData,
                &vertexBuffer
                )
            );

        // Once all D3D resources are created, configure the application window.

        // Allow the application to respond when the window size changes.
        m_window->SizeChanged +=
            ref new TypedEventHandler<CoreWindow^, WindowSizeChangedEventArgs^>(
                this,
                &Direct3DTutorialViewProvider::OnWindowSizeChanged
                );

        // Specify the cursor type as the standard arrow cursor.
        m_window->PointerCursor = ref new CoreCursor(CoreCursorType::Arrow, 0);

        // Activate the application window, making it visible and enabling it to receive events.
        m_window->Activate();

        // Enter the render loop. Note that tailored applications should never exit.
        while (true)
        {
            // Process events incoming to the window.
            m_window->Dispatcher->ProcessEvents(CoreProcessEventsOption::ProcessAllIfPresent);

            // Specify the render target we created as the output target.
            ID3D11RenderTargetView* targets[1] = { m_renderTargetView.Get() };
            m_d3dDeviceContext->OMSetRenderTargets(
                1,
                targets,
                NULL // use no depth stencil
                );

            // Clear the render target to a solid color.
            const float clearColor[4] = { 0.071f, 0.04f, 0.561f, 1.0f };

            // Code fails here
            m_d3dDeviceContext->ClearRenderTargetView(
                m_renderTargetView.Get(),
                clearColor
                );

            m_d3dDeviceContext->IASetInputLayout(inputLayout.Get());

            // Set the vertex and index buffers, and specify the way they define geometry.
            UINT stride = sizeof(float3);
            UINT offset = 0;
            m_d3dDeviceContext->IASetVertexBuffers(
                0,
                1,
                vertexBuffer.GetAddressOf(),
                &stride,
                &offset
                );

            m_d3dDeviceContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

            // Set the vertex and pixel shader stage state.
            m_d3dDeviceContext->VSSetShader(vertexShader.Get(), nullptr, 0);
            m_d3dDeviceContext->PSSetShader(pixelShader.Get(), nullptr, 0);

            // Draw the triangle.
            m_d3dDeviceContext->Draw(3, 0);

            // Present the rendered image to the window. Because the maximum frame latency
            // is set to 1, the render loop will generally be throttled to the screen
            // refresh rate, typically around 60Hz, by sleeping the application on Present
            // until the screen is refreshed.
            DX::ThrowIfFailed(m_swapChain->Present(1, 0));
        }
    }

    // This method is called before the application exits.
    void Uninitialize()
    {
    }

private:
    // This method is called whenever the application window size changes.
    void OnWindowSizeChanged(_In_ CoreWindow^ sender, _In_ WindowSizeChangedEventArgs^ args)
    {
        m_renderTargetView = nullptr;
        CreateWindowSizeDependentResources();
    }

    // This method creates all application resources that depend on the application
    // window size. It is called at app initialization, and whenever the application
    // window size changes.
    void CreateWindowSizeDependentResources()
    {
        if (m_swapChain != nullptr)
        {
            // If the swap chain already exists, resize it.
            DX::ThrowIfFailed(
                m_swapChain->ResizeBuffers(2, 0, 0, DXGI_FORMAT_R8G8B8A8_UNORM, 0)
                );
        }
        else
        {
            // If the swap chain does not exist, create it.
            DXGI_SWAP_CHAIN_DESC1 swapChainDesc = {0};
            swapChainDesc.Stereo = false;
            swapChainDesc.BufferUsage = DXGI_USAGE_RENDER_TARGET_OUTPUT;
            swapChainDesc.Scaling = DXGI_SCALING_NONE;
            swapChainDesc.Flags = 0;

            // Use automatic sizing.
            swapChainDesc.Width = 0;
            swapChainDesc.Height = 0;

            // This is the most common swap chain format.
            swapChainDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;

            // Don't use multi-sampling.
            swapChainDesc.SampleDesc.Count = 1;
            swapChainDesc.SampleDesc.Quality = 0;

            // Use two buffers to enable flip effect.
            swapChainDesc.BufferCount = 2;

            // We recommend using this swap effect for all applications.
            swapChainDesc.SwapEffect = DXGI_SWAP_EFFECT_FLIP_SEQUENTIAL;

            // Once the swap chain description is configured, it must be created on the
            // same adapter as the existing D3D Device. First, retrieve the underlying
            // DXGI Device from the D3D Device.
            ComPtr<IDXGIDevice1> dxgiDevice;
            DX::ThrowIfFailed(m_d3dDevice.As(&dxgiDevice));

            // Ensure that DXGI does not queue more than one frame at a time. This both
            // reduces latency and ensures that the application will only render after
            // each VSync, minimizing power consumption.
            DX::ThrowIfFailed(dxgiDevice->SetMaximumFrameLatency(1));

            // Next, get the parent factory from the DXGI Device.
            ComPtr<IDXGIAdapter> dxgiAdapter;
            DX::ThrowIfFailed(dxgiDevice->GetAdapter(&dxgiAdapter));

            ComPtr<IDXGIFactory2> dxgiFactory;
            DX::ThrowIfFailed(
                dxgiAdapter->GetParent(__uuidof(IDXGIFactory2), &dxgiFactory)
                );

            // Finally, create the swap chain.
            DX::ThrowIfFailed(
                dxgiFactory->CreateSwapChainForImmersiveWindow(
                    m_d3dDevice.Get(),
                    DX::GetIUnknown(m_window),
                    &swapChainDesc,
                    nullptr, // allow on all displays
                    &m_swapChain
                    )
                );
        }

        // Once the swap chain is created, create a render target view. This will
        // allow Direct3D to render graphics to the window.
        ComPtr<ID3D11Texture2D> backBuffer;
        DX::ThrowIfFailed(
            m_swapChain->GetBuffer(0, __uuidof(ID3D11Texture2D), &backBuffer)
            );
        DX::ThrowIfFailed(
            m_d3dDevice->CreateRenderTargetView(
                backBuffer.Get(),
                nullptr,
                &m_renderTargetView
                )
            );

        // After the render target view is created, specify that the viewport, which
        // describes what portion of the window to draw to, should cover the entire
        // window.
        D3D11_TEXTURE2D_DESC backBufferDesc = {0};
        backBuffer->GetDesc(&backBufferDesc);

        D3D11_VIEWPORT viewport;
        viewport.TopLeftX = 0.0f;
        viewport.TopLeftY = 0.0f;
        viewport.Width = static_cast<float>(backBufferDesc.Width);
        viewport.Height = static_cast<float>(backBufferDesc.Height);
        viewport.MinDepth = D3D11_MIN_DEPTH;
        viewport.MaxDepth = D3D11_MAX_DEPTH;

        m_d3dDeviceContext->RSSetViewports(1, &viewport);
    }
};

// This class defines how to create the custom View Provider defined above.
ref class Direct3DTutorialViewProviderFactory : IViewProviderFactory
{
public:
    IViewProvider^ CreateViewProvider()
    {
        return ref new Direct3DTutorialViewProvider();
    }
};

[Platform::MTAThread]
int main(array<Platform::String^>^)
{
    auto viewProviderFactory = ref new Direct3DTutorialViewProviderFactory();
    Windows::ApplicationModel::Core::CoreApplication::Run(viewProviderFactory);
    return 0;
}

    Read the article

  • Mauritius Software Craftsmanship Community

There we go! I finally managed to push myself forward and pick up an old (actually too old) idea that I have carried around since I arrived here in Mauritius more than six years ago. I'm talking about a community for all kinds of ICT-connected people. In the past (back in Germany), I used to be involved in various community activities. For example, I was part of the Microsoft Community Leader/Influencer Program (CLIP) in Germany thanks to an FAQ on Visual FoxPro, actually Active FoxPro Pages (AFP) to be more precise. Then in 2003/2004 I approached the person responsible for the dFPUG user group in Speyer in order to assist him in organising monthly user group meetings. Well, he handed the management over to me completely, and attended our meetings regularly.

Why did it take you so long?

Well, I don't want to bother you with the details, but the short version is that I was too busy with either my job (building up new companies), my private life (got married and we have two lovely children, eh 'monsters'), or both. But now the time has come: my businesses are up and running, the kids are in school, and I am finally in a position where I can commit myself to community activities again. And I love to do that!

Why a new user group?

Good question... And 'easy' to answer. Ever since back in 2007 I did my usual research, eh Google searches, to see whether there were existing user groups in Mauritius and in which fields of interest. And yes, there are! If I recall correctly, there are communities for PHP, Drupal, Python (just recently), Oracle, and Linux (which used to be even two). But... either they do not exist anymore, they are dormant, or there is only a low heart-beat, frankly speaking. And yes, I went to meetings of the Linux User Group Meta (Mauritius) back in 2010/2011 and just recently. I really like the setup and the way the LUGM is organised. It's just that I have a slightly different point of view on how a user group or community should organise itself and how to approach future members. Don't get me wrong, I'm not criticising others doing a very good job; I'm only saying that I'd like to do it differently. The last meeting of the LUGM was awesome; read my feedback about it.

Ok, so what's up with the 'Mauritius Software Craftsmanship Community', or short: MSCC?

As I've already written in my article on 'Communities - The importance of exchange and discussion', I think it is essential in the world of IT to stay 'connected' with a good number of other people in the same field. There is so much going on, and so much news every day, that it is almost impossible to keep on track with all of it. The MSCC is going to provide a common platform to exchange experience and share knowledge. You might be a newbie who wants to know what to expect working as a software developer, a database administrator, or an IT systems administrator; you might be an experienced geek who loves to share ideas or solutions implemented to solve a specific problem; or you might be the business (or HR) guy looking for 'fresh' blood to reinforce your existing team. Or... you're just interested and you'd like to communicate with like-minded people.

Meetup of 26.06.2013 @ L'arabica: Of course there are laptops around. Free WiFi, power outlet, coffee, code and Linux in one go.

The MSCC is technology-agnostic and spans an umbrella over any kind of technology, simply because you can't ignore other technologies anymore in a connected IT world like ours.
A front-end developer for iOS applications should have the chance to connect with a Python back-end coder, and eventually with a DBA for MySQL or PostgreSQL, and exchange experience. Furthermore, I'm a huge fan of cross-platform development, and it is very pleasant to have pure web developers (with all that HTML5, CSS3, JavaScript and JS-library stuff) and passionate C# or Java coders at the same table. This diversity of knowledge can assist and boost your personal situation. And last but not least, there are projects and open positions 'flying' around... People might like to hear others' opinions about an employer, or get new impulses on how to tackle an issue at their workplace, and so on. This is about community. And that's how I see the MSCC in general: free of any limitations, be it by programming language or technology. Having the chance to exchange experience and to discuss certain aspects of technology saves you time and money, and it's a pleasure to enjoy, compared to dusty books and remote online resources. It's human!

Organising meetups (meetings, get-togethers, gatherings - you name it!)

As of writing this article, the MSCC currently meets every Wednesday for the weekly 'Code & Coffee' session at various locations (suggestions are welcome!) in Mauritius. This might change in the future, but especially at the beginning I think it is very important to create awareness in the Mauritian IT world. Yes, we are here! Come and join us! ;-)

The MSCC's main online presence is located at Meetup.com because it allows me to handle the organisation of events and meeting appointments very easily, and any member can see who else is involved, so that an exchange of contacts is possible at any time. In combination with the other entities (G+ Communities, FB Pages or Groups) I advertise and manage all future activities there: Mauritius Software Craftsmanship Community. This is a community for those who care about and are proud of what they do. For those developers, regardless of how experienced they are, who want to improve and master their craft. This is a community for those who believe that being average is just not good enough. I know there are not many 'craftsmen' yet, but it's a start... Let's see how it looks by the end of the year.

There are free smartphone apps for Android and iOS from Meetup.com that allow you to keep track of meetings and to stay informed on the latest updates. And last but not least, there is a Trello workspace to collect and share ideas and provide downloads of slides, etc. Trello is also available as a free smartphone app.

Sharing is caring!

As mentioned, the #MSCC is present in various social media networks in order to reach as many people as possible here in Mauritius. Following is an overview of the current networks:
Twitter - Latest updates and quickies
Google+ - Community channel
Facebook - Community Page
LinkedIn - Community Group
Trello - Collaboration workspace to share and develop ideas
Hopefully, this covers the majority of computer-related people in Mauritius. Please spread the word about the #MSCC among your colleagues, your friends and other interested 'geeks'.

Your future looks bright

Running and participating in a user group or any kind of community usually provides quite a number of advantages for anyone.
On the one side it is very joyful for me to organise appointments and get in touch with people who might be interested in presenting a little demo of their projects or of the recent problems they had to tackle, and on the other side there are lots of companies that have various support programmes or sponsorships especially tailored for user groups. At the moment, I already have a couple of gimmicks that I would like to hand out in small contests or raffles during one of the upcoming meetings, and, as said, companies provide all kinds of goodies, books free of charge, and sometimes even licenses for communities. Meeting other software developers or IT guys also broadens your view of the local market, and there might be interesting projects or job offers available, too. A community like the Mauritius Software Craftsmanship Community is great for freelancers, the self-employed, students and, of course, employees.

Meetings will be organised on a regular basis, and I'm open to all kinds of suggestions from you. Please leave a comment here in the blog or join the conversations in the above mentioned social networks. Let's get this community up and running, my fellow Mauritians!

Recent updates

The MSCC is now officially participating in the O'Reilly UK User Group programme and we are allowed to request review copies of recent titles. Additionally, we have a discount code for any books or ebooks that you might like to order on shop.oreilly.com. More applications for user group sponsorship programmes are pending, and I'm looking forward to a couple of announcements very soon. And... we need some kind of 'corporate identity': over at the MSCC website there is a call for action (or, better said, a contest with prizes) to create a unique design for the MSCC. This would include a decent colour palette, a logo, graphical banners for Meetup, Google+, Facebook, LinkedIn, etc. and of course badges for our craftsmen to add to their personal blogs and websites. Please spread the word and contribute. Thanks!

    Read the article

  • 500 Metro Style WP7 Icons

    - by Bil Simser
I was inspired by The Noun Project, a project that offers up “Metro-style” icons in SVG format. The project is licensed under a public domain license and while it’s a great project, all of the content is in SVG format. Jon Galloway has a great post (from 2007) talking about the differences between SVG and XAML, so I highly recommend that for some background. I thought it would be helpful to the WPF/Windows Phone 7/Silverlight community to provide the content in alternative formats for use in your applications.

The Goods

I’ve put together a package of the 500 icons (502 actually) in PNG, XAML and the original SVG format, along with a couple of sample projects so you can see them in action. There’s a WPF desktop app: And a Windows Phone 7 app:

Building It

To get all the content, first I wrote up a quick program to suck down the original SVG files. Luckily they’re all in a common path, just named 1.SVG, 2.SVG, and so on. Easy sleazy to grab the contents. Once I had 500 SVG files I used the latest copy of XamlTune, an open source CodePlex project that has a command line conversion tool to convert the directory of SVG files into XAML (the tool also created a PNG file of each SVG so that’s just icing on the cake).

Conversions

The conversion from SVG to XAML isn’t 100%. While you can just drop the content into a WPF app, it doesn’t work that way for WP7. There are just some small adjustments I made to each format, so you’ll have to do the same. Follow the information below or refer to the sample applications. As a sample, here’s an icon we want to use:

Here’s the original SVG file:

<svg version="1.0" id="Layer_1" xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink"
     x="0px" y="0px" width="100px" height="94.616px" viewBox="0 0 100 94.616"
     enable-background="new 0 0 100 94.616" xml:space="preserve">
  <path d="M25.076,15.639c4.324,0.009,7.824-3.488,7.82-7.82C32.9,3.512,29.4,0.012,25.076,0c-4.313,0.012-7.814,3.512-7.821,7.819 C17.262,12.15,20.763,15.648,25.076,15.639L25.076,15.639z"/>
  <path d="M4.593,43.388h6.861l4.137-15.135h1.716L13.22,43.388h24.318l-4.389-15.135h1.817l2.32,7.415 c1.08,3.131,3.852,3.851,6.003,1.162l8.375-10.142c2.651-3.42-2.104-7.021-4.844-4.035l-4.993,5.952 c0.007,0.095-0.96-3.278-0.96-3.278c-1.135-3.978-4.918-7.903-10.595-7.922H19.576c-5.071,0.019-9.043,4.434-9.888,7.214 L4.593,43.388L4.593,43.388z"/>
  <polygon points="56.206,22.753 56.206,7.163 49.192,7.163 49.192,22.753 56.206,22.753 "/>
  <path d="M79.87,15.738c4.332-0.014,7.831-3.516,7.82-7.82c0.011-4.332-3.488-7.833-7.82-7.82c-4.306-0.013-7.806,3.488-7.821,7.82 C72.064,12.222,75.564,15.725,79.87,15.738L79.87,15.738z"/>
  <path d="M89.759,89.556v-43.19h5.751V22.804c0.007-3.079-2.757-5.448-6.71-5.449H70.436c-3.65,0.001-4.539,1.186-5.551,2.168 L49.597,37.889c-3.098,3.848,2.428,8.333,5.55,4.743L69.88,25.226v64.43c-0.019,6.475,9.06,6.686,9.081,0.201v-36.58h1.765v36.379 C80.748,96.109,89.772,96.13,89.759,89.556L89.759,89.556z"/>
  <polygon points="100,54.035 100,45.155 0,45.155 0,54.035 100,54.035 "/>
</svg>

Here’s the XAML that XamlTune created.
It can be used in any WPF app without any changes:

<Canvas Name="Layer_1" Width="100" Height="94.616" ClipToBounds="True" xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation">
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M25.076,15.639C29.4,15.648 32.9,12.151 32.896,7.819 32.9,3.512 29.4,0.012 25.076,0 20.763,0.012 17.262,3.512 17.255,7.819 17.262,12.15 20.763,15.648 25.076,15.639L25.076,15.639z" />
    </Path.Data>
  </Path>
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M4.593,43.388L11.454,43.388 15.591,28.253 17.307,28.253 13.22,43.388 37.538,43.388 33.149,28.253 34.966,28.253 37.286,35.668C38.366,38.799,41.138,39.519,43.289,36.83L51.664,26.688C54.315,23.268,49.56,19.667,46.82,22.653L41.827,28.605C41.834,28.7 40.867,25.327 40.867,25.327 39.732,21.349 35.949,17.424 30.272,17.405L19.576,17.405C14.505,17.424,10.533,21.839,9.688,24.619L4.593,43.388 4.593,43.388z" />
    </Path.Data>
  </Path>
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M56.206,22.753L56.206,7.163 49.192,7.163 49.192,22.753 56.206,22.753z" />
    </Path.Data>
  </Path>
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M79.87,15.738C84.202,15.724 87.701,12.222 87.69,7.918 87.701,3.586 84.202,0.0849999999999991 79.87,0.097999999999999 75.564,0.084999999999999 72.064,3.586 72.049,7.918 72.064,12.222 75.564,15.725 79.87,15.738L79.87,15.738z" />
    </Path.Data>
  </Path>
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M89.759,89.556L89.759,46.366 95.51,46.366 95.51,22.804C95.517,19.725,92.753,17.356,88.8,17.355L70.436,17.355C66.786,17.356,65.897,18.541,64.885,19.523L49.597,37.889C46.499,41.737,52.025,46.222,55.147,42.632L69.88,25.226 69.88,89.656C69.861,96.131,78.94,96.342,78.961,89.857L78.961,53.277 80.726,53.277 80.726,89.656C80.748,96.109,89.772,96.13,89.759,89.556L89.759,89.556z" />
    </Path.Data>
  </Path>
  <Path Fill="#FF000000">
    <Path.Data>
      <PathGeometry FillRule="Nonzero" Figures="M100,54.035L100,45.155 0,45.155 0,54.035 100,54.035z" />
    </Path.Data>
  </Path>
</Canvas>

The XAML works AS-IS in a WPF application, but there are some changes I did to get it to work in a WP7 app.
Here’s the modified XAML in a WP7 application:

<Canvas Grid.Row="0" Grid.Column="0" Name="Icon_1" Width="100" Height="94.616">
  <Path Fill="#FF000000" Data="M25.076,15.639C29.4,15.648 32.9,12.151 32.896,7.819 32.9,3.512 29.4,0.012 25.076,0 20.763,0.012 17.262,3.512 17.255,7.819 17.262,12.15 20.763,15.648 25.076,15.639L25.076,15.639z" />
  <Path Fill="#FF000000" Data="M4.593,43.388L11.454,43.388 15.591,28.253 17.307,28.253 13.22,43.388 37.538,43.388 33.149,28.253 34.966,28.253 37.286,35.668C38.366,38.799,41.138,39.519,43.289,36.83L51.664,26.688C54.315,23.268,49.56,19.667,46.82,22.653L41.827,28.605C41.834,28.7 40.867,25.327 40.867,25.327 39.732,21.349 35.949,17.424 30.272,17.405L19.576,17.405C14.505,17.424,10.533,21.839,9.688,24.619L4.593,43.388 4.593,43.388z" />
  <Path Fill="#FF000000" Data="M56.206,22.753L56.206,7.163 49.192,7.163 49.192,22.753 56.206,22.753z" />
  <Path Fill="#FF000000" Data="M79.87,15.738C84.202,15.724 87.701,12.222 87.69,7.918 87.701,3.586 84.202,0.0849999999999991 79.87,0.097999999999999 75.564,0.084999999999999 72.064,3.586 72.049,7.918 72.064,12.222 75.564,15.725 79.87,15.738L79.87,15.738z" />
  <Path Fill="#FF000000" Data="M89.759,89.556L89.759,46.366 95.51,46.366 95.51,22.804C95.517,19.725,92.753,17.356,88.8,17.355L70.436,17.355C66.786,17.356,65.897,18.541,64.885,19.523L49.597,37.889C46.499,41.737,52.025,46.222,55.147,42.632L69.88,25.226 69.88,89.656C69.861,96.131,78.94,96.342,78.961,89.857L78.961,53.277 80.726,53.277 80.726,89.656C80.748,96.109,89.772,96.13,89.759,89.556L89.759,89.556z" />
  <Path Fill="#FF000000" Data="M100,54.035L100,45.155 0,45.155 0,54.035 100,54.035z" />
</Canvas>

All I did was take the data portion and put it directly into a Data attribute on the Path. Note that while it does show up in the app (on the emulator or device), it wouldn’t show up in Visual Studio for me. Maybe some XAML guru out there can tell me why. You can just as easily use the PNG files in WP7, but if you want the crispness of vector graphics, go for the XAML version (if you’d rather load the icons at runtime than paste markup into every page, see the sketch after the links below). Of course, with XamlTune being open source you could always modify the output of that program to tailor it to your app. If you do make a change that’s worthy, please consider submitting a patch to the project so everyone can benefit. Hope this helps and happy programming!

Resources and Links
Sample Project and Icons
XamlTune - an open source project to convert SVG to XAML
The Noun Project - source of the original files
Jon Galloway's post on SVG and XAML
StackOverflow question on converting SVG to XAML
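If you'd rather load the icons at runtime, here's a minimal sketch of the idea. It isn't part of the icon package: IconLoader is a hypothetical helper, it assumes the XAML files ship as content in the XAP, and note that XamlReader requires the default XML namespace to be declared on the root Canvas (the WP7 snippet above omits it).

using System;
using System.IO;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Markup;

public static class IconLoader
{
    // Loads one of the converted icons (e.g. "Icons/1.xaml" shipped with
    // Build Action = Content) and parses it into a live element tree.
    public static Canvas Load(string relativePath)
    {
        var resource = Application.GetResourceStream(new Uri(relativePath, UriKind.Relative));
        using (var reader = new StreamReader(resource.Stream))
        {
            // XamlReader.Load needs the default namespace on the root element:
            // <Canvas xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation" ...>
            return (Canvas)XamlReader.Load(reader.ReadToEnd());
        }
    }
}

From a page you could then do something like ContentPanel.Children.Add(IconLoader.Load("Icons/1.xaml")) to drop an icon into the layout.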

    Read the article

  • SQL Server and Hyper-V Dynamic Memory Part 2

    - by SQLOS Team
Part 1 of this series was an introduction and overview of Hyper-V Dynamic Memory. This part looks at SQL Server memory management and how the SQL engine responds to changing OS memory conditions.

Part 2: SQL Server Memory Management

As with any Windows process, sqlservr.exe has a virtual address space (VAS) of 4GB on 32-bit and 8TB in 64-bit editions. Pages in its VAS are mapped to pages in physical memory when the memory is committed and referenced for the first time. The collection of VAS pages that have been recently referenced is known as the working set. How and when SQL Server allocates virtual memory and grows its working set depends on the memory model it uses. SQL Server supports three basic memory models:

1. Conventional Memory Model
The conventional model is the default SQL Server memory model and has the following properties:
- Dynamic: can grow or shrink its working set in response to load and external (operating system) memory conditions.
- OS uses 4K pages (not to be confused with SQL Server "pages", which are 8K regions of committed memory).
- Pageable: can be paged out to disk by the operating system.

2. Locked Page Model
The locked page memory model is set when SQL Server is started with the "Lock Pages in Memory" privilege*. It has the following characteristics:
- Dynamic: can grow or shrink its working set in the same way as the conventional model.
- OS uses 4K pages.
- Non-pageable: when memory is committed it is locked in memory, meaning that it will remain backed by physical memory and will not be paged out by the operating system.
A common misconception is to interpret "locked" as non-dynamic. A SQL Server instance using the locked page memory model will grow and shrink (allocate memory and release memory) in response to changing workload and OS memory conditions in the same way as it does with the conventional model. This is an important consideration when we look at Hyper-V Dynamic Memory: "locked" memory works perfectly well with "dynamic" memory.
* Note: in "Denali" (Standard Edition and above), and in SQL 2008 R2 64-bit (Enterprise and above editions), the Lock Pages in Memory privilege is all that is required to set this model. In 2008 R2 64-bit Standard Edition it also requires trace flag 845 to be set; in 2008 R2 32-bit editions it requires sp_configure 'awe enabled' 1.

3. Large Page Model
The large page model is set using trace flag 834 and potentially offers a small performance boost for systems that are configured with large pages. It is characterized by:
- Static: memory is allocated at startup and does not change.
- OS uses large (>2MB) pages.
- Non-pageable.
The large page model is supported with Hyper-V Dynamic Memory (and Hyper-V also supports large pages), but you get no benefit from using Dynamic Memory with this model since SQL Server memory does not grow or shrink. The rest of this article will focus on the locked and conventional SQL Server memory models.

When does SQL Server grow?
For "dynamic" configurations (the conventional and locked memory models), the sqlservr.exe process grows, that is, allocates and commits memory from the OS, in response to a workload. As much memory is allocated as is required to optimally run the query and buffer data for future queries, subject to limitations imposed by:
- The SQL Server max server memory setting. If this configuration option is set, the buffer pool is not allowed to grow to more than this value.
In SQL Server 2008 this value represents single page allocations, and in "Denali" it represents any size page allocations and also managed CLR procedure allocations.
- Memory signals from the OS. The operating system sets a signal on memory resource notification objects to indicate whether it has memory available or whether it is low on available memory. If there is only 32MB free for every 4GB of memory, a low memory signal is set, which continues until 64MB/4GB is free. If there is 96MB/4GB free, the operating system sets a high memory signal. SQL Server only allocates memory when the high memory signal is set.
To summarize, for SQL Server to grow you need three conditions: a workload, a max server memory setting higher than the current allocation, and high memory signals from the OS.

When does SQL Server shrink caches?
SQL Server as a rule does not like to return memory to the OS, but it will shrink its caches in response to memory pressure. Memory pressure can be divided into "internal" and "external".
- External memory pressure occurs when the operating system is running low on memory and low memory signals are set. The SQL Server Resource Monitor checks for low memory signals approximately every 5 seconds and will attempt to free memory until the signals stop. To free memory SQL Server does the following:
  - Frees unused memory.
  - Notifies Memory Manager clients to release memory:
    - Caches: free unreferenced cache objects.
    - Buffer pool: based on oldest access times.
The freed memory is released back to the operating system. This process continues until the low memory resource notifications stop.
- Internal memory pressure occurs when the size of different caches and allocations increases but the SQL Server process needs to keep its total memory within a target value. For example, if max server memory is set and certain caches are growing large, it will cause SQL to free memory for re-use internally, but not to release memory back to the OS. If you lower the value of max server memory you will generate internal memory pressure that will cause SQL to release memory back to the OS.
Memory pressure handling has not changed much since SQL 2005, and it was described in detail in a blog post by Slava Oks. Note that SQL Server Express is an exception to the above behavior. Unlike other editions it does not assume it is the most important process running on the system, but tries to be more "desktop" friendly: it will empty its working set after a period of inactivity.

How does SQL Server respond to changing OS memory?
In SQL Server 2005 support for hot-add memory was introduced. This feature, available in Enterprise and above editions, allows the server to make use of any extra physical memory that was added after SQL Server started. Being able to add physical memory while the system is running is limited to specialized hardware, but with the Hyper-V Dynamic Memory feature, when new memory is allocated to a guest virtual machine, it looks like hot-added physical memory to the guest. What this means is that, thanks to the hot-add memory feature, SQL Server 2005 and higher can dynamically grow if more "physical" memory is granted to a guest VM by Hyper-V Dynamic Memory. SQL Server checks OS memory every second and dynamically adjusts its "target" (based on available OS memory and max server memory) accordingly. In "Denali", Standard Edition will also have sqlservr.exe support for hot-add memory when running virtualized (i.e.
detecting and acting on Hyper-V Dynamic Memory allocations).

How does a SQL Server workload in a guest VM impact Hyper-V Dynamic Memory scheduling?
When a SQL workload causes the sqlservr.exe process to grow its working set, the Hyper-V memory scheduler will detect memory pressure in the guest VM and add memory to it. SQL Server will then detect the extra memory and grow according to workload demand. In our tests we have seen this feedback process cause a guest VM to grow quickly in response to SQL workload; we are still working on characterizing this ramp-up.

How does SQL Server respond when Hyper-V removes memory from a guest VM through ballooning?
If pressure from other VMs causes Hyper-V Dynamic Memory to take memory away from a VM through ballooning (allocating memory with a virtual device driver and returning it to the host OS), Windows Memory Manager will page out unlocked portions of memory and signal low resource notification events. When SQL Server detects these events it will shrink memory until the low memory notifications stop (see the cache shrinking description above).

This raises another question: can we make SQL Server release memory more readily, and hence behave more "dynamically", without compromising performance? In certain circumstances where the application workload is predictable, it may be possible to have a job which varies "max server memory" according to need, lowering it when the engine is inactive and raising it before a period of activity (see the sketch below). This would have limited applicability, but it is something we're looking into.

What Memory Management changes are there in SQL Server "Denali"?
In SQL Server "Denali" (aka SQL11) the Memory Manager has been re-written to be more efficient. The main changes are summarized in this post. An important change with respect to Hyper-V Dynamic Memory support is that the max server memory setting now includes any size page allocations and managed CLR procedure allocations, so it represents a closer approximation to total sqlservr.exe memory usage. This makes it easier to calculate a value for max server memory, which becomes important when configuring virtual machines to work well with Hyper-V Dynamic Memory Startup and Maximum RAM settings. Another important change is that there is no more AWE or hot-add support for the 32-bit edition. This means if you're running a 32-bit edition of Denali you're limited to a 4GB address space and will not be able to take advantage of dynamically added OS memory that wasn't present when SQL Server started (though Hyper-V Dynamic Memory is still a supported configuration).

In part 3 we'll develop some best practices for configuring and using SQL Server with Dynamic Memory. Originally posted at http://blogs.msdn.com/b/sqlosteam/
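To make the scheduled-job idea above concrete, here is a minimal sketch. It is not from the original post: the connection string, the schedule and the memory values are illustrative, and in practice this would typically run as a SQL Agent job or a scheduled task.

using System;
using System.Data.SqlClient;

class MaxServerMemoryJob
{
    // Sets 'max server memory' (in MB) via sp_configure.
    static void SetMaxServerMemoryMB(string connectionString, int megabytes)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (var cmd = conn.CreateCommand())
            {
                // 'max server memory' is an advanced option, so enable those first.
                cmd.CommandText =
                    "EXEC sp_configure 'show advanced options', 1; RECONFIGURE; " +
                    "EXEC sp_configure 'max server memory', @mb; RECONFIGURE;";
                cmd.Parameters.AddWithValue("@mb", megabytes);
                cmd.ExecuteNonQuery();
            }
        }
    }

    static void Main()
    {
        // Illustrative schedule: a low cap overnight, a higher cap during work hours.
        var cs = "Server=.;Database=master;Integrated Security=true";
        bool quietHours = DateTime.Now.Hour < 6 || DateTime.Now.Hour > 20;
        SetMaxServerMemoryMB(cs, quietHours ? 2048 : 8192);
    }
}

Lowering the cap generates the internal memory pressure described above, so SQL Server releases memory back to the OS, which Hyper-V Dynamic Memory can then reclaim for other guests.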

    Read the article

  • Windows Azure: Backup Services Release, Hyper-V Recovery Manager, VM Enhancements, Enhanced Enterprise Management Support

    - by ScottGu
This morning we released a huge set of updates to Windows Azure. These new capabilities include:
- Backup Services: General availability of Windows Azure Backup Services
- Hyper-V Recovery Manager: Public preview of Windows Azure Hyper-V Recovery Manager
- Virtual Machines: Delete attached disks, availability set warnings, SQL AlwaysOn configuration
- Active Directory: Securely manage hundreds of SaaS applications
- Enterprise Management: Use Active Directory to better manage Windows Azure
- Windows Azure SDK 2.2: A massive update of our SDK + Visual Studio tooling support
All of these improvements are now available to use immediately. Below are more details about them.

Backup Service: General Availability Release of Windows Azure Backup
Today we are releasing Windows Azure Backup Service as a general availability service. This release is now live in production, backed by an enterprise SLA, supported by Microsoft Support, and is ready to use for production scenarios.
Windows Azure Backup is a cloud-based backup solution for Windows Server which allows files and folders to be backed up and recovered from the cloud, and provides off-site protection against data loss. The service provides IT administrators and developers with the option to back up and protect critical data in an easily recoverable way from any location with no upfront hardware cost. Windows Azure Backup is built on the Windows Azure platform and uses Windows Azure blob storage for storing customer data. Windows Server uses the downloadable Windows Azure Backup Agent to transfer file and folder data securely and efficiently to the Windows Azure Backup Service. Along with providing cloud backup for Windows Server, the Windows Azure Backup Service also provides the capability to back up data from System Center Data Protection Manager and Windows Server Essentials to the cloud. All data is encrypted onsite before it is sent to the cloud, and customers retain and manage the encryption key (meaning the data is stored entirely secured and can't be decrypted by anyone but yourself).

Getting Started
To get started with the Windows Azure Backup Service, create a new Backup Vault within the Windows Azure Management Portal. Click New->Data Services->Recovery Services->Backup Vault to do this:
Once the backup vault is created you'll be presented with a simple tutorial that will help guide you on how to register your Windows Servers with it:
Once the servers you want to back up are registered, you can use the appropriate local management interface (such as the Microsoft Management Console snap-in, System Center Data Protection Manager Console, or Windows Server Essentials Dashboard) to configure the scheduled backups and to optionally initiate recoveries. You can follow these tutorials to learn more about how to do this:
- Tutorial: Schedule Backups Using the Windows Azure Backup Agent. This tutorial helps you with setting up a backup schedule for your registered Windows Servers. Additionally, it also explains how to use Windows PowerShell cmdlets to set up a custom backup schedule.
- Tutorial: Recover Files and Folders Using the Windows Azure Backup Agent. This tutorial helps you with recovering data from a backup. Additionally, it also explains how to use Windows PowerShell cmdlets to do the same tasks.
Below are some of the key benefits the Windows Azure Backup Service provides:
- Simple configuration and management.
Windows Azure Backup Service integrates with the familiar Windows Server Backup utility in Windows Server, the Data Protection Manager component in System Center, and Windows Server Essentials, in order to provide a seamless backup and recovery experience to a local disk, or to the cloud.
- Block-level incremental backups. The Windows Azure Backup Agent performs incremental backups by tracking file and block level changes and only transferring the changed blocks, hence reducing the storage and bandwidth utilization. Different point-in-time versions of the backups use storage efficiently by only storing the changed blocks between these versions.
- Data compression, encryption and throttling. The Windows Azure Backup Agent ensures that data is compressed and encrypted on the server before being sent to the Windows Azure Backup Service over the network. As a result, the Windows Azure Backup Service only stores encrypted data in cloud storage. The encryption key is not available to the Windows Azure Backup Service, and as a result the data is never decrypted in the service. Also, users can set up throttling and configure how the Windows Azure Backup Service utilizes the network bandwidth when backing up or restoring information.
- Data integrity is verified in the cloud. In addition to the secure backups, the backed-up data is also automatically checked for integrity once the backup is done. As a result, any corruptions which may arise due to data transfer can be easily identified and are fixed automatically.
- Configurable retention policies for storing data in the cloud. The Windows Azure Backup Service accepts and implements retention policies to recycle backups that exceed the desired retention range, thereby meeting business policies and managing backup costs.

Hyper-V Recovery Manager: Now Available in Public Preview
I'm excited to also announce the public preview of a new Windows Azure service: the Windows Azure Hyper-V Recovery Manager (HRM). Windows Azure Hyper-V Recovery Manager helps protect your business-critical services by coordinating the replication and recovery of System Center Virtual Machine Manager 2012 SP1 and System Center Virtual Machine Manager 2012 R2 private clouds at a secondary location. With automated protection, asynchronous ongoing replication, and orderly recovery, the Hyper-V Recovery Manager service can help you implement disaster recovery and restore important services accurately, consistently, and with minimal downtime. Application data in Hyper-V Recovery Manager scenarios always travels on your on-premises replication channel. Only metadata (such as names of logical clouds, virtual machines, networks, etc.) that is needed for orchestration is sent to Azure. All traffic sent to/from Azure is encrypted.
You can begin using Windows Azure Hyper-V Recovery today by clicking New->Data Services->Recovery Services->Hyper-V Recovery Manager within the Windows Azure Management Portal. You can read more about Windows Azure Hyper-V Recovery Manager in Brad Anderson's 9-part series, Transform the datacenter. To learn more about setting up Hyper-V Recovery Manager follow our detailed step-by-step guide.

Virtual Machines: Delete Attached Disks, Availability Set Warnings, SQL AlwaysOn
Today's Windows Azure release includes a number of nice updates to Windows Azure Virtual Machines.
These improvements include:

Ability to Delete both VM Instances + Attached Disks in One Operation
Prior to today's release, when you deleted VMs within Windows Azure we would delete the VM instance, but not delete the drives attached to the VM; you had to manually delete these yourself from the storage account. With today's update we've added a convenience option that now allows you to either retain or delete the attached disks when you delete the VM. We've also added the ability to delete a cloud service, its deployments, and its role instances with a single action. This can either be a cloud service that has production and staging deployments with web and worker roles, or a cloud service that contains virtual machines. To do this, simply select the cloud service within the Windows Azure Management Portal and click the "Delete" button.

Warnings on Availability Sets with Only One Virtual Machine in Them
One of the nice features that Windows Azure Virtual Machines supports is the concept of "availability sets". An availability set allows you to define a tier/role (e.g. webfrontends, databaseservers, etc.) that you can map virtual machines into, and when you do this Windows Azure separates them across fault domains and ensures that at least one of them is always available during servicing operations. This enables you to deploy applications in a highly available way. One issue we've seen some customers run into is where they define an availability set, but then forget to map more than one VM into it (which defeats the purpose of having an availability set). With today's release we now display a warning in the Windows Azure Management Portal if you have only one virtual machine deployed in an availability set, to help highlight this. You can learn more about configuring the availability of your virtual machines here.

Configuring SQL Server AlwaysOn
SQL Server AlwaysOn is a great feature that you can use with Windows Azure to enable high availability and DR scenarios with SQL Server. Today's Windows Azure release makes it even easier to configure SQL Server AlwaysOn by enabling "Direct Server Return" endpoints to be configured and managed within the Windows Azure Management Portal. Previously, setting this up required using PowerShell to complete the endpoint configuration. Starting today you can enable this simply by checking the "Direct Server Return" checkbox. You can learn more about how to use direct server return for SQL Server AlwaysOn availability groups here.

Active Directory: Application Access Enhancements
This summer we released our initial preview of our Application Access Enhancements for Windows Azure Active Directory. This service enables you to securely implement single-sign-on (SSO) support against SaaS applications (including Office 365, SalesForce, Workday, Box, Google Apps, GitHub, etc.) as well as LOB-based applications (including ones built with the new Windows Azure AD support we shipped last week with ASP.NET and VS 2013). Since the initial preview we've enhanced our SAML federation capabilities, integrated our new password vaulting system, and shipped multi-factor authentication support. We've also turned on our outbound identity provisioning system and have it working with hundreds of additional SaaS applications. Earlier this month we published an update on dates and pricing for when the service will be released in general availability form.
In this blog post we announced our intention to release the service in general availability form by the end of the year. We also announced that the below features would be available in a free tier with it:
- SSO to every SaaS app we integrate with: users can single sign-on to any app we are integrated with at no charge. This includes all the top SaaS apps and every app in our application gallery, whether they use federation or password vaulting.
- Application access assignment and removal: IT admins can assign access privileges to web applications to the users in their Active Directory, assuring that every employee has access to the SaaS apps they need. And when a user leaves the company or changes jobs, the admin can just as easily remove their access privileges, assuring data security and minimizing IP loss.
- User provisioning (and de-provisioning): IT admins will be able to automatically provision users in 3rd-party SaaS applications like Box, Salesforce.com, GoToMeeting, DropBox and others. We are working with key partners in the ecosystem to establish these connections, meaning you no longer have to continually update user records in multiple systems.
- Security and auditing reports: security is a key priority for us. With the free version of these enhancements you'll get access to our standard set of access reports, giving you visibility into which users are using which applications, when they were using them and where they are using them from. In addition, we'll alert you to unusual usage patterns, for instance when a user logs in from multiple locations at the same time.
- Our Application Access Panel: users are logging in from every type of device, including Windows, iOS and Android. Not all of these devices handle authentication in the same manner, but the user doesn't care; they need to access their apps from the devices they love. Our Application Access Panel will support the ability for users to access and launch their apps from any device and anywhere.
You can learn more about our plans for application management with Windows Azure Active Directory here. Try out the preview and start using it today.

Enterprise Management: Use Active Directory to Better Manage Windows Azure
Windows Azure Active Directory provides the ability to manage your organization in a directory which is hosted entirely in the cloud, or alternatively kept in sync with an on-premises Windows Server Active Directory solution (allowing you to seamlessly integrate with the directory you already have). With today's Windows Azure release we are integrating Windows Azure Active Directory even more within the core Windows Azure management experience, and enabling an even richer enterprise security offering. Specifically:
1) All Windows Azure accounts now have a default Windows Azure Active Directory created for them. You can create and map any users you want into this directory, and grant administrative rights to manage resources in Windows Azure to these users.
2) You can keep this directory entirely hosted in the cloud, or optionally sync it with your on-premises Windows Server Active Directory. Both options are free. The latter approach is ideal for companies that wish to use their corporate user identities to sign in and manage Windows Azure resources. It also ensures that if an employee leaves an organization, his or her access control rights to the company's Windows Azure resources are immediately revoked.
3) The Windows Azure Service Management APIs have been updated to support using Windows Azure Active Directory credentials to sign in and perform management operations. Prior to today's release customers had to download and use management certificates (which were not scoped to individual users) to perform management operations. We still support this management certificate approach (don't worry, nothing will stop working), but we think the new Windows Azure Active Directory authentication support enables an even easier and more secure way for customers to manage resources going forward.
4) The Windows Azure SDK 2.2 release (which is also shipping today) includes built-in support for the new Service Management APIs that authenticate with Windows Azure Active Directory, and now allows you to create and manage Windows Azure applications and resources directly within Visual Studio using your Active Directory credentials. This, combined with updated PowerShell scripts that also support Active Directory, enables an end-to-end enterprise authentication story with Windows Azure.
Below are some details on how all of this works:

Subscriptions within a Directory
As part of today's update, we have associated all existing Windows Azure accounts with a Windows Azure Active Directory (and created one for you if you don't already have one). When you log in to the Windows Azure Management Portal you'll now see the directory name in the URI of the browser. For example, in the screenshot below you can see that I have a "scottgu" directory that my subscriptions are hosted within. Note that you can continue to use Microsoft Accounts (formerly known as Microsoft Live IDs) to sign in to Windows Azure. These map just fine to a Windows Azure Active Directory, so there is no need to create new usernames that are specific to a directory if you don't want to. In the scenario above I'm actually logged in using my @hotmail.com-based Microsoft ID, which is now mapped to a "scottgu" Active Directory that was created for me. By default everything will continue to work just like it used to before.

Manage your Directory
You can manage an Active Directory (including the one we now create for you by default) by clicking the "Active Directory" tab on the left-hand side of the portal. This will list all of the directories in your account. Clicking one for the first time will display a getting started page that provides documentation and links to perform common tasks with it. You can use the built-in directory management support within the Windows Azure Management Portal to add/remove/manage users within the directory, enable multi-factor authentication, associate a custom domain (e.g. mycompanyname.com) with the directory, and/or rename the directory to whatever friendly name you want (just click the Configure tab to do this). You can also set up the directory to automatically sync with an on-premises Active Directory using the "Directory Integration" tab.
Note that users within a directory by default do not have admin rights to log in to or manage Windows Azure based resources. You still need to explicitly grant them co-admin permissions on a subscription for them to log in or manage resources in Windows Azure. You can do this by clicking the Settings tab on the left-hand side of the portal and then clicking the Administrators tab within it.
Sign-In Integration within Visual Studio
If you install the new Windows Azure SDK 2.2 release, you can now connect to Windows Azure from directly inside Visual Studio without having to download any management certificates. You can now just right-click on the "Windows Azure" icon within the Server Explorer and choose the "Connect to Windows Azure" context menu option to do so. Doing this will prompt you to enter the email address of the username you wish to sign in with (make sure this account is a user in your directory with co-admin rights on a subscription). You can use either a Microsoft Account (e.g. Windows Live ID) or an Active Directory based organizational account as the email. The dialog will update with an appropriate login prompt depending on which type of email address you enter. Once you sign in you'll see the Windows Azure resources that you have permissions to manage show up automatically within the Visual Studio Server Explorer and be available to start using. No downloading of management certificates required: all of the authentication was handled using your Windows Azure Active Directory!

Manage Subscriptions across Multiple Directories
If you already have multiple directories and multiple subscriptions within your Windows Azure account, we have done our best to create a good default mapping of your subscriptions->directories as part of today's update. If you don't like the default subscription-to-directory mapping we have done, you can click the Settings tab in the left-hand navigation of the Windows Azure Management Portal and browse to the Subscriptions tab within it. If you want to map a subscription under a different directory in your account, simply select the subscription from the list, and then click the "Edit Directory" button to choose which directory to map it to. Mapping a subscription to a different directory takes only seconds and will not cause any of the resources within the subscription to recycle or stop working. We've made the directory->subscription mapping process self-service so that you always have complete control and can map things however you want.

Filtering by Directory and Subscription
Within the Windows Azure Management Portal you can filter resources in the portal by subscription (allowing you to show/hide different subscriptions). If you have subscriptions mapped to multiple directory tenants, we also now have a filter drop-down that allows you to filter the subscription list by directory tenant. This filter is only available if you have multiple subscriptions mapped to multiple directories within your Windows Azure account.

Windows Azure SDK 2.2
Today we are also releasing a major update of our Windows Azure SDK. The Windows Azure SDK 2.2 release adds some great new features including:
- Visual Studio 2013 support
- Integrated Windows Azure sign-in support within Visual Studio
- Remote debugging of Cloud Services with Visual Studio
- Firewall management support within Visual Studio for SQL Databases
- Visual Studio 2013 RTM VM images for MSDN subscribers
- Windows Azure Management Libraries for .NET
- Updated Windows Azure PowerShell cmdlets and ScriptCenter
I'll post a follow-up blog shortly with more details about all of the above. In the meantime, the sketch below shows the new management libraries in action.
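To give a flavor of what the new Windows Azure Management Libraries for .NET enable together with the Active Directory sign-in support, here is a minimal sketch that lists the cloud services in a subscription. It is not from Scott's post: the subscription id is a placeholder, and in practice you would acquire the access token via the Active Directory Authentication Library (ADAL) rather than hard-coding it.

using System;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.Management.Compute;

class ListCloudServices
{
    static void Main()
    {
        // Placeholder credentials: use your own subscription id and a real
        // Windows Azure Active Directory access token (e.g. obtained via ADAL).
        var credentials = new TokenCloudCredentials(
            "00000000-0000-0000-0000-000000000000",
            "<access token from Windows Azure AD>");

        using (var compute = new ComputeManagementClient(credentials))
        {
            // Authenticates with the AAD token; no management certificate needed.
            foreach (var service in compute.HostedServices.List())
            {
                Console.WriteLine(service.ServiceName);
            }
        }
    }
}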
Additional Updates
In addition to the above enhancements, today's release also includes a number of additional improvements:
- AutoScale: Richer time and date based scheduling support (set different rules on different dates)
- AutoScale: Ability to scale to zero virtual machines (very useful for dev/test scenarios)
- AutoScale: Support for time-based scheduling of Mobile Service AutoScale rules
- Operation Logs: Auditing support for Service Bus management operations
Today we also shipped a major update to the Windows Azure SDK: Windows Azure SDK 2.2. It has so much goodness in it that I have a whole second blog post coming shortly on it! :-)

Summary
Today's Windows Azure release enables a bunch of great new scenarios, and enables a much richer enterprise authentication offering. If you don't already have a Windows Azure account, you can sign up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.
Hope this helps,
Scott
P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • CodePlex Daily Summary for Saturday, February 20, 2010

CodePlex Daily Summary for Saturday, February 20, 2010

New Projects

BerkeliumDotNet: BerkeliumDotNet is a .NET wrapper for the Berkelium library written in C++/CLI. Berkelium provides off-screen browser rendering via Google's Chromi...
BoxBinary Descriptive WebCacheManager Framework: Allows you to take a simple, different, and more effective method of caching items in ASP.NET. The developer defines descriptive categories to deco...
CHS Extranet: CHS Extranet Project, working with the RM Community to build the new RM Easylink type application
DbModeller: Generates one class having properties with the basic C# types, aimed to serve as a business object, by using reflection from the passed objects insta...
Dice.NET: Dice.NET is a simple dice roller for those cases where you just kinda forgot your real dice. It's very simple in setup/use.
EBI App with the SQL Server CE and websync: EBI App with the SQL Server CE and SQL Server DEV with Merge Replication (Web Synchronization). Here we are trying to develop an application which yo...
Family Tree Analyzer: This project will be a C# project which aims to allow users to import their GEDCOM files and produce various data analysis reports such as a list o...
Go! Embedded Device Builder: Go! is a complete software engineering environment for the creation of embedded Linux devices. It enables you to engineer, design, develop, build, ...
HiddenWordsReadingPlan: HiddenWordsReadingPlan
Html to OpenXml: A library to convert simple or advanced HTML to a plain OpenXml document.
Jeffrey Palermo's shared source: This project contains multiple samples with various snippets and projects from blog posts, user group talks, and conference sessions.
Krypton Palette Selectors: A small C# control library that allows for simplified palette selection and management. It makes use of and relies on Component Factory's excellen...
OCInject: A DI container on a diet. This is a basic DI container that lives in your project, not an external assembly, with support for auto generated delegat...
Photo Organiser: A small utility to sort photos into a new file structure based on date held in their XMP or EXIF tags (YYYY/MM/DD/hhmmss.jpg). Developed in C# / WPF.
QPAPrintLib: Print every document by its recommended program
Reusable Library: A collection of reusable abstractions for enterprise application developers.
Runtime Intelligence Data Visualizer: WPF application used to visualize Runtime Intelligence data using the Data Query Service from the Runtime Intelligence Endpoint Starter Kit.
ScreenRec: ScreenRec is a program for recording your desktop and saving it to images, or saving a single picture.
Silverlight Internet Desktop Application Guidance: SLIDA (Silverlight Internet Desktop Applications) provides process guidance for developers creating Silverlight applications that run exclusively o...
WSUS Web Administration: Web Application to remotely manage WSUS

New Releases

7zbackup - PowerShell Script to Backup Files with 7zip: 7zBackup v. 1.7.0 Stable: Bug solved: Test-Path-Writable fails on root of system drive on Windows 7. Therefore the function now accepts an optional parameter to specify if ...
aqq: sec 1.02: SEC Project - Sistema Economico Comercial - in Visual FoxPro 9.0, open source, i.e. free and with source code, under the GNU/GPL license; for more information e...
ASP.NET MVC Attribute Based Route Mapper: Attribute Based Routes v0.2: ASP.NET MVC Attribute Based Route Mapper
BoxBinary Descriptive WebCacheManager Framework: Initial release: Initial assembly release for anyone wanting the files referenced in my talk at Umbraco's 5th Birthday London meetup, 16/Feb/2010. The code is fairly...
Build Version Increment Add-In Visual Studio: Build Version Increment v2.2 Beta: 2.2.10050.1548. Added support for custom increment schemes via plugins
Build Version Increment Add-In Visual Studio: BuildVersionIncrement v2.1: 2.1.10050.1458. Fix: localization issues. Feature: unmanaged C support. Feature: multi-monitor support. Feature: global/default settings. Fix: de...
CHS Extranet: Beta 2.3: Beta 2.3 release change log: fixed "update my details" not updating the department/form; tried to fix the issue when the ampersand (&) is in t...
Cover Creator: CoverCreator 1.2.2: Resolved bug: if there is only one CD entry in freedb.org the application does nothing. Installation instructions: just unzip CoverCreator and run CoverCr...
Employee Scheduler: Employee Scheduler 2.3: Extract the files to a directory and run Lab Hours.exe. Add an employee. Double click an employee to modify their times. Please contact me through ...
EnOceanNet: EnOceanNet v1.11: Recompiled for .NET Framework 4 RC
Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts 3.0.3 beta 4 Released: This release contains fixes for the following bugs: DataBinding was not working as expected with RIA services; DataSeries visual wa...
Html to OpenXml: HtmlToOpenXml 0.1 Beta: This is a beta version for testing purposes.
Jeffrey Palermo's shared source: Web Forms front controller: This code goes along with my blog post about adding code that executes before your web form: http://jeffreypalermo.com/blog/add-post-backs-to-mvc-nd...
Krypton Palette Selectors: Initial Release: The initial release. Contains only the KryptonPaletteDropButton control.
LaunchMeNot: LaunchMeNot 1.10: Lots has been added in this release. Feel free, however, to suggest features you'd like on the forums. Changes in LaunchMeNot 1.10 (19th February ...
Magellan: Magellan 1.1.36820.4796 Stable: This is a stable release. It contains a fix for a bug whereby the content of zones in a shared layout couldn't use element bindings (due to name sc...
Magellan: Magellan 1.1.36883.4800 Stable: This release includes a couple of major changes: a new Forms object model (see this page for details); Magellan objects are now part of the defau...
MAISGestão: LayerDiagram: LayerDiagram
Matrix3DEx: Matrix3DEx 1.0.2.0: Fixes the SwapHandedness method. This release includes factory methods for all common transformation matrices like rotation, scaling, translation, ...
MDownloader: MDownloader-0.15.1.55880: Fixed bugs.
NewLineReplacer: QPANewLineReplacer 1.1: Replace letters quickly and easily in large text files
OAuthLib: OAuthLib (1.5.0.1): The only difference from 1.5.0.0 is the version number.
OCInject: First Release: First Release
Photo Organiser: Installer alpha release: First release - contains known bugs, but works for the most part.
Pinger: Pinger-1.0.0.0 Source: The latest and first source code
Pinger: Pinger-1.0.0.2 Binary: This version can work on versions of Windows older than 7, but it hasn't been tested. Thanks!
Pinger: Pinger-1.0.0.2 Source: The raw source.
Reusable Library: v1.0: A collection of reusable abstractions for enterprise application developers.
ScreenRec: Version 1: One version of this program
Sense/Net Enterprise Portal & ECMS: SenseNet 6.0 Beta 5: Sense/Net 6.0 Beta 5 release notes. We are proud to finally present you Sense/Net 6.0 Beta 5, a major feature overhaul over Beta 4, and hopefully th...
Silverlight Internet Desktop Application Guidance: v1: Project templates for Silverlight IDA and Silverlight Navigation IDA.
SLAM! SharePoint List Association Manager: SLAM v1.3: The SharePoint List Association Manager is a platform for managing lists in SharePoint in a relational manner. The SLAM Hierarchy Extension works ...
SQL Server PowerShell Extensions: 2.0.2 Production: Release 2.0.1 re-implements SQLPSX as PowerShell version 2.0 modules. SQLPSX consists of 8 modules with 133 advanced functions, 2 cmdlets and 7 sc...
StoryQ: StoryQ 2.0.2 Library and Converter UI: Fixes: 16086. This release includes the following files: StoryQ.dll - the actual StoryQ library for you to reference; StoryQ.xml - the xmldoc for ...
Text to HTML: 0.4.0 beta: Changes in this version: added Castilian Spanish, English and French languages; added a configuration window; dynamic loading of variab...
thor: Version 1.1: What's new in version 1.1: specify whether or not to display the subject of appointments on a calendar; specify whether or not to use a booking agen...
TweeVo: Tweet What Your TiVo Is Recording: TweeVo v1.0: TweeVo v1.0
VCC: Latest build, v2.1.30219.0: Automatic drop of latest build
VFPnfe: Manually generated XML file: Here is a file that generates the XML for an NF-e manually. I am working on version 1.1 of this project - please wait, and in the meantime download another projec...
Windows Double Explorer: WDE v0.3.7: optimization; locked tabs can be reset to locked directory (single & multi); folder drag and drop onto the tab control creates a new tab; splash screen; direcl...
WPF ShaderEffect Generator: WPF ShaderEffect Generator 1.5: Visual Studio 2008 and RC 2010 are now supported. Different profiles can now be used to compile the shader with. Changes: Visual Studio RC 2010 is ...

Most Popular Projects: Rawr; WBFS Manager; AJAX Control Toolkit; Microsoft SQL Server Product Samples: Database; Silverlight Toolkit; Windows Presentation Foundation (WPF); Image Resizer Powertoy Clone for Windows; ASP.NET; DotNetNuke® Community Edition; Microsoft SQL Server Community & Samples

Most Active Projects: DinnerNow.net; Rawr; Sharpy; BlogEngine.NET; SharePoint Contrib; jQuery Library for SharePoint Web Services; NB_Store - Free DotNetNuke Ecommerce Catalog Module; patterns & practices – Enterprise Library; PHPExcel; Fluent Ribbon Control Suite

    Read the article

  • CodePlex Daily Summary for Saturday, February 27, 2010

CodePlex Daily Summary for Saturday, February 27, 2010

New Projects

ASP.NET MVC ScriptBehind: Dynamic, developer & designer friendly script inclusion, compression and optimization for ASP.NET MVC
CSLib.Net: CSLib.Net (Common Solutions Library) is yet another library with commonly used utilities, helpers, extensions, etc.
DNN Module - Google Analytic Dashboard: Here is a Google Analytic Dashboard DNN Module which contains the following sub-modules: Visitors Overview; World Map Overlay; Traf...
dotUML: dotUML is a toolkit that enables developers to create and visualize UML diagrams like sequence, use case or component diagrams.
EventRegistration: Event Registration Program
GameStore League Manager: GameStore League Manager makes it easier for gaming store managers to run local leagues for card games, board games and any game where there is a h...
GibberIM: GibberIM (Gibberish IM) is yet another Jabber instant messenger implementation.
HTTP Compression Library for Heavy Load Web Server: Deflater is an HTTP compression library, supporting Deflate (RFC 1950, RFC 1951) and GZip (RFC 1952). It is designed to encode and compress HTML con...
HydroLiDAR: This is a research project intended to explore algorithms and techniques for extracting hydrographic features (rivers, watersheds, ponds, pits, etc...
Lan Party Manager: Lan Party Manager
MAPS SQL Analysis Project: This solution demonstrates how to build a Business Intelligence solution on top of the MAPS database
MMDB Parallax ALM: An open source Application Lifecycle Management (ALM) system, being built by Mike Mooney of MMDB Solutions, as a learning/teaching exercise.
MyColorSprite: This Silverlight app is a color selection tool especially great for creating gradient color brushes for the xaml code. It allows a user create/pic...
PDF Form Bubble Up: Bubble Up takes PDF Forms stored in SharePoint document libraries and "bubbles up" the data in the PDF Form to the library. This means the data tha...
PostBack Blog Engine: A modified Oxite open-source blog engine on top of the DB4O object database engine.
Project Otto: A Silverlight Isometric Rendering Engine and Demo Game
QFrac: A fractal generator written using the Qt framework.
RapidIoC: RapidIoC provides lightning fast IoC capabilities including Dependency Injection & AOP. The modular framework will allow for constructor, property,...
Shatranj: A WPF / Silverlight based frontend to Huo Chess. This project was conceived as a way to learn key WPF / Silverlight concepts. At the release, it...
WHS SkyDrive Backup Add In: This project allows Windows Home Server to back up selected folders to your free 25GB Live SkyDrive. Simply dump the Home Server Add In into y...
Workflow Type Browser for WF4: This Workflow Type Browser displays type information for all arguments and variables in a WF4 workflow. It is designed for use in a rehosted desig...
ZoomBarPlus: Windows Mobile service designed for the HTC Touch Pro 2. Adds additional functionality to the zoom bar at the bottom of the screen. You can map key...

New Releases

BCryptTool: BCryptTool v0.2: The Microsoft .NET Framework 3.5 (SP1) is needed to run this program.
Braintree Client Library: Braintree-1.1.1: Braintree-1.1.1
CC.Utilities: CC.Utilities 1.0.10.226: Minor bug fixes. A few new functions in the Interop namespace. DoubleBufferedGraphics now exposes the underlying memory Image through the Mem...
CC.Votd: CC.Votd 1.0.10.227: This release includes several bug fixes and enhancements. The most notable enhancement is the RssItemCache, which will allow the screensaver to f...
DNN Module - Google Analytic Dashboard: DNN Module - Google Analytic Dashboard: Here is a Google Analytic Dashboard DNN Module which contains the following sub-modules: Visitors Overview; World Map Overlay; Traffic ...
Extend SmallBasic: Teaching Extensions v.008: Fixed message box to appear in front as expected. Added ColorWheel.getRandomColor(). Including recipes and concept slides as part of release
FolderSize: FolderSize.Win32.1.0.5.0: FolderSize.Win32.1.0.5.0. A simple utility intended to be used to scan hard drives for the folders that take up the most space and display this to the user...
Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts 3.0.4 beta Released: Today we are releasing the much awaited zooming feature. In this version of zooming you will be able to zoom/scale the PlotArea of the chart. ...
GameStore League Manager: League Manager 1.0: Rough and ready first version. You will need to have SQL Server Express 2005 or 2008 installed on your machine to use this software. Unzip to a l...
Google Maps API 3 Visual Studio Intellisense: google-maps-3-vs-1-0-vsdoc: google-maps-3-vs-1-0 provides Visual Studio intellisense, in-line API documentation and code completion for Google Maps API V3. Updated 02/25/10. A...
HaoRan_WebCam: HaoRan.WebCam.Web beta2: Following the Silverlight 4 (beta) based webcam application released before the new year, and after a recent round of polishing, beta2 is now available. Besides fixing bugs in the original program, it makes the following changes: 1. Image loading now scales to the original aspect ratio, so the previous independent X/Y-axis adjustment has become a single zoom slider that scales proportionally. ...
IQToolkit Contrib: IQToolkitContrib.zip (v1.0.17.1): Update to DataServiceClientRepository - added ExecuteNonEntity to deal with calling Wcf Data Service methods for Dto classes (as opposed to Entity cla...
kdar: KDAR 0.0.15: KDAR - Kernel Debugger Anti Rootkit - new module checks added - bugs fixed
LogJoint - Log Viewer: logjoint 1.5: Added support for more formats; timeline improvements; Unicode logs and encodings support
MyColorSprite: MyColorSprite: This Silverlight app is a color selection tool especially great for creating gradient color brushes for the xaml code. It allows a ...
OAuthLib: OAuthLib (1.6.0.1): The only difference from 1.6.0.0 is the version number.
Picasa Downloader: PicasaDownloader Setup (41085): Changelog: fixed workitem 10296 (saving at resolutions above 1600px); added experimental support for a modifier of the image download URL (inse...
Prolog.NET: Prolog.NET 1.0 Beta 1.1: Installer includes: primary Prolog.NET assembly; Prolog.NET Workbench; PrologTest console application; all required dependencies. Beta 1.1 in...
QFrac: QFrac 1.0: The first stable QFrac version.
SharepointApplicationFramework: SAF QuickPoll: Release notes: This web part is written in VS2010 beta2 and uses Microsoft Chart Controls. Packaged into a single WSP. This WSP creates a quick po...
Star System Simulator: Star System Simulator 2.2: A minor update to Version 2.1. Changes in this release: user interface enhancements/fixes with toolbar and icons. Features in this release: Mod...
ToDoListReminder: Version 1.0.1.0: Bugs fixed: 10316, 10317. A handler for the "Window Closing" event was added. Error handling for XML parsing was added
VCC: Latest build, v2.1.30226.0: Automatic drop of latest build
Windows Remote Assistance For Skype: Beta 1.0.1: Major changes: 1) Now using Skype4COM to interact with Skype 2) InvitationXML is compressed 3) Showing warning on first run to Allow Access to Skype
Workflow Type Browser for WF4: Release 1.0: There has been much surprise and disappointment expressed by the WF4 developer community since Microsoft made it clear that Intellisense would not...

Most Popular Projects: Data Dictionary Creator; Outlook 2007 Messaging API (MAPI) Code Samples; Common Data Parameters Module; Team System - Work Item Spell Checker (All Languages); Tyrannt Online (Client/Server RPG); Ray Tracer Starter; Meeting Demo; Nick Berardi; Screenslayer; Rawr

Most Active Projects: DinnerNow.net; Rawr; BlogEngine.NET; MapWindow GIS; SLARToolkit - Silverlight Augmented Reality Toolkit; InfoService; SharpMap - Geospatial Application Framework for the CLR; Common Context Adapters; patterns & practices – Enterprise Library; NB_Store - Free DotNetNuke Ecommerce Catalog Module

    Read the article

  • CodePlex Daily Summary for Monday, April 12, 2010

CodePlex Daily Summary for Monday, April 12, 2010

New Projects

3 Hour Game Design Contest: The 3 Hour Game Design Contest is a programming contest for making simple games in 3 hours. 3 hours may not seem like enough time to make a game, b...
BI Monkey SSIS ETL Framework: The BI Monkey SSIS ETL Framework is an ETL execution, control and logging system for ETL projects using SSIS. It is supported by a SQL Server metad...
Blend Sample Data Helpers: Helper behavior classes to generate sample images and data from Internet sources such as Flickr images.
Bold TCP for Delphi 7: Open sourcing the Bold TCP for Delphi 7.
cfThreadingTools: This library project contains classes and extensions which will allow easy handling of multi-threaded UI accesses.
CuBiX_SDL: CuBiX_SDL: CuBiX is a personal project.
Draglets: Draglets makes it easier for editors and CMS developers to move and reorder content at their web sites. It's developed in ASP.NET, C# with WCF and ...
DSQLT - Dynamic SQL Templates: DSQLT - Dynamic SQL Templates. Use stored procedures as templates for dynamic SQL statements. Substitute parameters @0-@9 with values like objectna...
Edtter: Edtter is a sample web application built on the ASP.NET MVC 2 Framework. (Japanese version only)
Forms Based Authentication Management - SharePoint2007FBA: This is my own update to Stacy Draper's FBABasic project for Forms Based Authentication in MOSS 2007. In addition to managing your FBA user's roles,...
Height Map to 3D World at XNA: Height Map to 3D World is an XNA project first developed by Eric Grossinger and then improved by Karadeniz Technical University Computer ...
HouseFly: A simple contact and note taking application
ITM 495 - iPhone Web App: School project
Kaufleute: This will be finished later
LR: This project is about connecting to
PowerShell Integration Services: A set of tools aimed at Extract, Transform and Load tasks. Focused on getting the most common ETL tasks done without SSIS.
Salient: A collection of, hopefully, useful libraries.
Samurai.Validation: Extensible and flexible .Net object validation framework
Samurai.Workflow: Samurai Workflow is a slim, easy-to-use workflow framework for WPF applications.
SharePoint User Management WebPart: SharePoint User Management WebPart
Url shorte(ne)r: A simple URL shortener (like http://tinyurl.com). Currently only the Polish language is supported. Multi-language support will be provided in the futur...
Yasbg: Yasbg (pronounced yas-bug) is Yet Another Static Blog Generator. It is made in C# using MarkdownSharp for markdown. Currently in alpha.

New Releases

.NET Extensions - Extension Methods Library: Release 2010.06: Added a universal approach for grouping extension methods like conversions. Conversions are now available on any data type (it's actually extension...
3 Hour Game Design Contest: 3H-GDC mVII: This is the collection of game files for the 7th 3H-GDC
B&W Port Scanner: Black`n`White Port Scanner 3.0: B&W Port Scanner 3 includes an FTP server detection tool, better stability, optimized memory management, saving & opening result sets ... and more new...
BI Monkey SSIS ETL Framework: Framework v1 Alpha: This Alpha release is not fully tested and some functionality is not operating as intended.
Bluetooth Radar: Version 1.7: UI changes; Device UserControl; randomly placed devices.
BugTracker.NET: BugTracker.NET 3.4.1: For the tasks/time tracking feature, added a way of viewing all the tasks at once, not just the tasks for one bug. Also added a way of exporting a...
cfThreadingTools: cfThreadingTools 0.1.1.8: This is the first publicly available release. The following items are included: BaseTools class, which allows thread-safe setting of properties and callin...
DeepZoom Pivot Constructor: DeepZoom Pivot Constructor v0.1: This is a test release of the library platform - targets .NET 3.5. No samples yet, etc., but it works well :-)
DSQLT - Dynamic SQL Templates: Initial release with License Included: Nothing changed, but the license print procedure is included. The zip file contains a database backup, SQL script and readme
Forms Based Authentication Management - SharePoint2007FBA: SharePoint2007FBA 1.0.0.0: Downloads for the project solution and the WSP package. Please read the Setup Guide. If you are unfamiliar with setting up Forms Based Authenticati...
Foursquare BlogEngine Widget: foursquare widget for BlogEngine.NET version 0.3: To see the changes which have been made, visit http://philippkueng.ch/post/Foursquare-BlogEngineNET-widget-version-03.aspx For installation instruc...
Framework Detector: FrameworkDetect Support .NET 4 v2: FrameworkDetect Support .NET 4
Happy Turtle Plugins for BVI :: Repository Based Versioning for Visual Studio: Happy Turtle 1.0.46860: This is the second beta release of the SVN based version incrementor. Please feel free to create a thread in the discussion tabs and provide feedb...
Height Map to 3D World at XNA: 3DWorld: Just open the .rar file, extract it to any folder, and run Proje2Dto3D.exe.
HTML Ruby: 6.20.2: Removed rubyLineSpace option; improved options panel; fixed ruby text font-size rendering issue with complex ruby annotation; removed more waste...
HTML Ruby: 6.20.3: Removed unused code; temporary partial fix for Firefox 3.7a4pre nightly build
HTML Ruby: 6.21.0: Added support for the current HTML5 ruby annotation format. All ruby annotations are converted to XHTML 1.1 complex ruby annotation.
Kooboo HTML form: Kooboo HTML Form Module for 2.1.0.0: Compatible with Kooboo CMS 2.1.0.0; upgraded to MVC 2
Kooboo Menu: Kooboo CMS Menu for 2.1.0.0: Compatible with Kooboo CMS 2.1.0.0; upgraded to MVC 2
Kooboo Meta: Kooboo Meta Module for 2.1.0.0: Compatible with Kooboo CMS 2.1.0.0; upgraded to MVC 2
Kooboo PageMenu: Kooboo CMS PageMenu for 2.1.0.0: Compatible with Kooboo CMS 2.1.0.0; upgraded to MVC 2
Kooboo Search: Kooboo CMS Search module for 2.1.0.0: Compatible with Kooboo CMS 2.1.0.0; upgraded to MVC 2
Numina Application/Security Framework: Numina.Framework Core 50212: Added bulk import user page. Added General settings page for updating Company Name, Theme, and API Key. Add/Edit application calls full URL to h...
Rawr: Rawr 2.3.14: Rawr3: tons of fixes for Rawr3 compatibility and UI. Significant performance improvements all around. More fixes and improvements to Wowhea...
Rich Ajax empowered Web/Cloud Applications: 6.4 beta 2: The first fully featured version of Visual WebGui, offering a web/cloud development tool that puts all ASP.NET Ajax limits behind with enhanced perfor...
SharePoint User Management WebPart: User Management Web part 1.0: Most organizations have one SharePoint site configured with Windows authentication for internal employees with AD authenti...
SkeinLibManaged: SkeinLibManaged 1.1.0.0 (Beta): This is the compiled DLL with XML documentation, so there should be plenty of context sensitive help and Intellisense. This is the Release version,...
VCC: Latest build, v2.1.30411.0: Automatic drop of latest build
VFPX: Code References 1.1 Beta: Visit the Code References Info Page for complete information about this release.
VisioAutomation: VisioAutomation 2.5.0: VisioAutomation 2.5.0 - general cleanup/bugfixes - many low-level changes to the VisioAutomation extension methods - these are far fewer now - this...
Visual Studio DSite: English To Spanish Translator (Visual C++ 2008): A simple English to Spanish translator made in Visual C++ 2008, using the Google Translate API.
WatchersNET CKEditor™ Provider for DotNetNuke: CKEditor Provider 1.10.00: What's new: File Browser inherits folder permissions from DotNetNuke; updated the editor to version 3.2.1 revision 5372; added CkEditor jQuery adap...
Web/Cloud Applications Development Framework | Visual WebGui: 6.4 beta 2: The first fully featured version of Visual WebGui, offering a web/cloud development tool that puts all ASP.NET Ajax limits behind with unique develope...
WPF Data Virtualization: 1.0.0.0: First release
Yasbg: Yasbg Alpha: Readme: Yet Another Static Blog Generator is a command line utility that generates static HTML files for blogs. Currently, it is NOT feed enabled. I...
異世界の新着動画: Ver. 10-04-12: Updated for the Nico Live specification changes; added a survey time setting

Most Popular Projects: WBFS Manager; Rawr; ASP.NET Ajax Library; Microsoft SQL Server Product Samples: Database; AJAX Control Toolkit; Silverlight Toolkit; Windows Presentation Foundation (WPF); ASP.NET; Microsoft SQL Server Community & Samples; Facebook Developer Toolkit

Most Active Projects: Rawr; nopCommerce. Open Source online shop e-commerce solution; AutoPoco; patterns & practices – Enterprise Library; Shweet: SharePoint 2010 Team Messaging built with Pex; Farseer Physics Engine; NB_Store - Free DotNetNuke Ecommerce Catalog Module; Ionics Isapi Rewrite Filter; BlogEngine.NET; BeanProxy

    Read the article

  • Fitting it together, database, reporting, applications in C#

    - by alvonellos
Introduction

Preamble

I was hesitant to post this, since it's an application whose intricate details are defined elsewhere, and answers may not be helpful to others. Within the past few weeks (I was actually going to write a blog post about this after I finished) I've discovered that the barrier I'm encountering is one that's actually quite common for newer developers. This question is not so much about a specific thing as it is about piecing those things together. I've searched the internet far and wide, and found many tutorials on how to create applications that are kind of similar to what I'm looking for. I've also looked at hiring another, more experienced, developer to help me along, but all I've gotten are unqualified candidates that don't have the experience necessary and won't take care of the client or project like I will. I'd rather have the project never transpire than release a solution that is half-baked. I've asked professors at my school, but they've not turned up answers to my question. I'm an experienced developer, and I've written many applications that are - very abstractly - close to what I'm doing, but my experiences from those applications aren't giving me enough leverage to solve this particular problem. I just hope that posting this article isn't a mistake for me to write.

Project Description

I have a project I'm working on for a client that is a rewrite of an application, originally written in FoxPro 2.6 by someone before me, that performs some analysis (which, sadly, I'm not allowed to disclose as per my employment contract) on financial data. One day, after a long talk between the client and me - where he intimately described his frustrations with all the bugs I've been hacking out of this code for 6 months now - he told me to just rewrite it and gave me a month to write a good 1/8 of this 65k LOC FoxPro monstrosity. It'll take me a good 3 - 6 months to rewrite this software (I know things the original programmer did not, like inheritance) going as I am right now, but I'm quickly discovering that I'm going to need to use databases.

Prior to this contract I didn't even know about FoxPro, and so I've had to learn FoxPro on the fly, write procedures and make modifications to the database. I've actually come to like it, and this project would be rewritten in FoxPro if it were still a supported language, because over the past few months I've come to like the features of FoxPro that make it so easy to develop data-driven applications. I once performed an experiment, comparing C# to FoxPro. What took me 45 minutes in C# took me two in FoxPro, and I knew C# prior to FoxPro. I was hoping to leverage the power of C#, but it intimidates me that in FoxPro, you can have one line of code and be using a database.

Prior to this, I have never written any serious database development from scratch. All the applications that I've written are in a different league. They are either completely data-naive, or data-naive enough that I can get away with not using a database, through serialization or by designing algorithms that work with the data in a manner that is stateless, so there is no need to worry about databases. I've come to realize, very quickly, that serialization and my efficacy with data structures have been my crutch all these years, one that's prevented me from adventuring into databases and has consequently hindered my success in real-world programming.
Sure, I've written some database stuff in Perl and Python, and I've done forms and worked with relational databases and tables; I'm a wizard in Access and Excel (seriously) and can do just about anything, but it just feels unnatural writing SQL code in another language... I don't mind writing SQL. It's that bridge between the database and the program code that drives me absolutely bonkers. I hope I'm not the only one to think this, but it bothers me that I have to create statements like the following:

string sSql = "SELECT * from tablename"

when there's really no reason for that kind of unchecked language binding between two languages and two APIs. Don't get me wrong, SQL is great, but I don't like the idea that, when executing commands on a SQL database, one must intermix database and application software, and there's no database independence, which means that different versions of different databases can break code. This isn't very nice.

The nicest thing about FoxPro is the cohesiveness between the programming language and the database. It's so easy, and FoxPro makes it easy, because the tool just fits the task. I can see why so many developers have created a career with this language, because it lowered the barrier of entry to data-driven applications that so many businesses need. It was wonderful. For my purposes today, with the demands and need for community support, extensibility, and language features, FoxPro isn't a solution that I feel would be the right tool for the job. I'm also worried about working too heavily with the database, because I've seen data-driven .NET applications have issues with database caches, running out of memory, and objects in the database not being collected (memory leaks).

And OH the queries. Which one, how, and why? There is a plethora of different ways that a database can be set up; I think I counted 5 or 6 different kinds of database applications alone that I can choose from. That is a great mountain for me to climb when I don't even know where to begin when it comes to writing data-driven applications. The problem isn't that I don't know SQL or that I don't know C#. I know both and have worked with both extensively. It's making them work together that's the problem, and it's something I've never done in C# before.

Reports

The client likes paper. The data needs to be printed out in a format that is extensible, layered, and easy to use. I have never done reporting before, and so this is a bit of a problem. From the data source comes Crystal Reports, and so there's a dependency on the database, from what I understand.

Code reuse

A large part of the design decisions that I've gone through so far is to break the task of writing a piece of this software into routines and modular DLLs and so forth, such that much of the code can be reused. For example, when I set up this database, I want to be able to reuse the same database code over and over again (a sketch of what I mean appears below). I also want to make sure that when the day comes that another developer is here, he/she will be able to pick up just where I left off. The quicker I develop these applications, the better off I am.
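To make that reuse goal concrete, here is a minimal sketch of the kind of helper I mean, assuming SQL Server and the stock System.Data.SqlClient provider; the Database class name, the connection string, and the DailyPrice table in the usage comment are all invented for illustration, not part of the actual project:

using System;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

// A small reusable data-access helper: all ADO.NET plumbing lives in one
// place, and queries are always parameterized, so user input is never
// concatenated into the SQL text.
public class Database
{
    private readonly string _connectionString;

    public Database(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Runs a parameterized query and maps each row through the supplied
    // delegate, returning the materialized results.
    public List<T> Query<T>(string sql, Func<IDataRecord, T> map,
                            params SqlParameter[] parameters)
    {
        var results = new List<T>();
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddRange(parameters);
            connection.Open();
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    results.Add(map(reader));
                }
            }
        }
        return results;
    }
}

// Example usage (hypothetical schema):
// var db = new Database("Server=.;Database=Finance;Integrated Security=true");
// var closes = db.Query(
//     "SELECT Symbol, ClosePrice FROM DailyPrice WHERE Symbol = @symbol",
//     r => new { Symbol = r.GetString(0), Close = r.GetDecimal(1) },
//     new SqlParameter("@symbol", "MSFT"));

The same shape works against other engines; only the provider classes change, which keeps the bridge between C# and SQL in one small, testable seam.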
Tasks & Goals

In my project, I need to write routines that apply algorithms and look for predefined patterns in financial data. Additionally, I need to simulate trading based on predefined algorithms and data. Then I need to prepare reports on that data. Additionally, I need to have a way to change the code base for this application quickly and effectively, without hacking together some band-aid solution for a problem that really needs a trauma ward.

Special Considerations

The solution must be fast, run quickly on existing hardware, and not be too much of a pain to maintain and write. I understand that anything I write I'm married to - I'm responsible for the things that I write, because my reputation and livelihood depend on it.

Do I really need a database? What about performance? Performance was such a big issue that I hand-wrote a data structure that is capable of performing 2 billion operations, using a total of 4 gigs of memory, in under 1/4 of a second on a standard Core 2 Duo processor. I could not find a similar, pre-written data structure in C# to perform this task.

What setup do I use in terms of database? What about reporting? I'd prefer to have PDFs generated, but I'd like to be able to visually sketch those reports and then just have a ReportFactory of some sort that, when I pass some variables in, just produces the report from that data (a rough sketch of such a factory appears after this post).

About Me

I'm a lone developer for a small business in this area. This is the first time I've done this, and I've never had the breadth and depth of my knowledge tested. I'm incredibly frustrated with this project because I feel incredibly overwhelmed by the task at hand. I'm looking for that entry-level point where I can draw a line and say "this is what I need to do".

Conclusion

I may not have been clear enough in my post. I'm still new to this whole thing, and I've been doing my best to contribute back to the community that I've leached so much knowledge from. I'd be glad to edit my post and add more information if possible. I'm looking for a big-picture solution or design process that helps me get off the ground in this world of data-driven applications, because I have a feeling that it's going to be central to my entire career as a programmer for some time. Specifically, if you didn't get it from the rest of the post (I may not have been clear enough), I really need some guidance as to where to go in terms of the design decisions for this project. Some things that'll be useful will be a pro/con list for the different kinds of database projects available in VS2010. I've tried, but generating that list has been as hard as solving the problem itself... If you could walk a developer writing a data-driven application for the first time in C#, how would you do that? Where would you point them to?
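On the reporting question mentioned above, one low-commitment way to keep the engine swappable is to hide it behind a small factory. The sketch below is purely hypothetical - these are not Crystal Reports or SQL Server Reporting Services APIs - and only illustrates the seam between designing a report visually and feeding it variables at runtime:

using System;
using System.Collections.Generic;

// Hypothetical shapes only; a real implementation would wrap a concrete
// engine (Crystal Reports, SSRS, a PDF library) behind these interfaces.
public interface IReport
{
    // Renders the report to a PDF byte stream using the supplied values.
    byte[] RenderToPdf(IDictionary<string, object> parameters);
}

public interface IReportFactory
{
    // Looks up a visually designed report template by name.
    IReport Create(string reportName);
}

// Example usage:
// IReport report = reportFactory.Create("DailyTradeSummary");
// byte[] pdf = report.RenderToPdf(new Dictionary<string, object>
// {
//     { "TradeDate", DateTime.Today }
// });

Calling code then depends only on the two interfaces, so the reporting engine can change without touching the analysis or simulation routines.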

    Read the article

  • Internationalize WebCenter Portal - Content Presenter

    - by Stefan Krantz
Lately we have been involved in engagements where internationalization has been holding the project back from success. In this post we are going to explain how to get Content Presenter and its editorials to comply with the currently selected locale for the WebCenter Portal session. As you probably know by now, WebCenter Portal leverages the localization support from Java Server Faces (JSF); in this post we will assume that localization is controlled and enforced by switching the current browser's locale between English and Spanish.

There are two main scenarios in internationalizing content-enabled pages. Since Content Presenter offers both presentation and contribution of information, we will look at how to enable seamless integration of the correct localized version of the back-end content file, and how to let the editor/author edit the correct localized version of the file based on the current browser locale.

Solution Scenario 1 - Localization aware content presentation

Due to the number of steps required to implement the enclosed solution proposal, I have decided to share the solution with you in grouped components for each facet of the solution. If you want more details on each step, you can review the enclosed components. This post will guide you through the steps of enabling each component and what it enables/changes in each section of the system.

Enable Content Presenter Customization

By leveraging a predictable naming convention for the data files that hold the content for a Content Presenter instance, we can easily develop a component that dynamically switches the name out before presenting the information. The naming convention we leverage is the industry best practice of a shared identifier as prefix (ContentABC) and a language suffix (_EN, _ES). So the assumption is that each file pair in the above example looks like the following:

- English version - (ContentABC_EN)
- Spanish version - (ContentABC_ES)

Based on the above, we can now easily switch the language out, regardless of the primary version assigned to the Content Presenter instance, by using the localization support from JSF.
The Java bean below (oracle.webcenter.doclib.internal.view.presenter.NLSHelperBean) is enclosed in the customization project available for download at the bottom of the post:

public static final String CP_D_DOCNAME_FORMAT = "%s_%s";
public static final int CP_UNIQUE_ID_INDEX = 0;
private ContentPresenter presenter = null;

public NLSHelperBean() {
    super();
}

/**
 * This method updates the configuration for the pageFlowScope to have the
 * correct datafile for the current Locale
 */
public void initLocaleForDataFile() {
    String dataFile = null;
    // Check that the state of the presenter is present, and make sure the item
    // is eligible for localization by locating the "_" in the name
    if(presenter.getConfiguration().getDatasource() != null &&
       presenter.getConfiguration().getDatasource().isNodeDatasource() &&
       presenter.getConfiguration().getDatasource().getNodeIdDatasource() != null &&
       !presenter.getConfiguration().getDatasource().getNodeIdDatasource().equals("") &&
       presenter.getConfiguration().getDatasource().getNodeIdDatasource().indexOf("_") > 0) {
        dataFile = presenter.getConfiguration().getDatasource().getNodeIdDatasource();
        FacesContext fc = FacesContext.getCurrentInstance();
        // Leverage the current faces context to get the current localization language
        String currentLocale = fc.getViewRoot().getLocale().getLanguage().toUpperCase();
        String newDataFile = dataFile;
        String[] uniqueIdArr = dataFile.split("_");
        if(uniqueIdArr.length > 0) {
            newDataFile = String.format(CP_D_DOCNAME_FORMAT, uniqueIdArr[CP_UNIQUE_ID_INDEX], currentLocale);
        }
        // Replace the current node datasource with the localized datafile
        presenter.getConfiguration().getDatasource().setNodeIdDatasource(newDataFile);
    }
}

With this bean code available to our WebCenter Portal implementation, we can start the next step: overriding the standard behavior in Content Presenter by applying an MDS taskflow customization to the Content Presenter taskflow. The following taskflow customization has been applied in the customization project attached to this post:

- Library: WebCenter Document Library Service View
- Path: oracle.webcenter.doclib.view.jsf.taskflows.presenter
- File: contentPresenter.xml

Changes made in the above customization view:
1. A new method invocation activity has been added (initLocaleForDataFile)
2. The method invocation invokes the new NLSHelperBean
3. The default activity is moved to the new method invocation (initLocaleForDataFile)
4. The outcome from the method invocation goes to determine-navigation (the original default activity)

The above changes conclude the presentation modifications that support a compatible localization scenario for a content-driven page. In addition, this customization does not limit or disable the out-of-the-box capabilities of WebCenter Portal.
Steps to enable the above customization:

1. Start JDeveloper and open your WebCenter Portal Application
2. Select "Open Project" and include the extracted project you downloaded (CPNLSCustomizations.zip)
3. Make sure the build output from the CPNLSCustomizations project is a dependency of your Portal project
4. Deploy your Portal Application to your WC_CustomPortal managed server
5. Make sure your naming convention for the two data files follows the above recommendation

Example result of the solution:

Solution Scenario 2 - Localization aware content creation and authoring

As you could see from Solution Scenario 1, we require the naming convention to be strictly followed; in the hands of a user with limited technology knowledge, this can be one of the weak links in the solution. Therefore I strongly recommend that you also follow this part, since it eliminates that risk and also greatly improves usability for editors and authors.

The current WebCenter Portal architecture leverages WebCenter Content to maintain, publish and manage content, so we need to make a few changes to ensure this part of the architecture is on board with our new naming practice and also simplifies the creation of content for our end users.

As you probably remember, the naming convention requires a common prefix, so I propose we enable a new component that helps auto-name the content item's dDocName (this means that the readable title can still be in a human-readable format). The new component (WCP-LocalizationSupport.zip) built for this scenario enables a couple of things:

1. A new service where a sequential number can be generated on request - service name: GET_WCP_LOCALE_CONTENTID

2. Content Presenter leverages a specific function when launching the content creation wizard from within Content Presenter. The assumption is that users will create the content by clicking the "Create Web Content" button. When clicking the button, the wizard opened is actually running inside WebCenter Content Server; the file executed is (contentwizard.hcsp).
This file uses JSON commands that generate operations in the content server; I have extended this file to create two identical data files instead of one:

- First it creates the English version by leveraging the new service and a global rule to set the dDocName on the original check-in screen; this global rule is available in a configuration package attached to this blog (NLSContentProfileRule.zip)
- Secondly we run a set of JSON JavaScript calls to create the Spanish version with the same details, except for the name, where we replace the suffix with (_ES)
- Then the content creation wizard ends with its out-of-the-box behavior and assigns the Content Presenter instance the English version

See the JavaScript markup below - this can be changed in (WCP-LocalizationSupport.zip/component/WCP-LocalizationSupport/publish/webcenter):

//---------------------------------------A-TEAM---------------------------------------
WCM.ContentWizard.CheckinContentPage.OnCheckinComplete = function(returnParams)
{
    var callback = WCM.ContentWizard.CheckinContentPage.checkinCompleteCallback;
    WCM.ContentWizard.ChooseContentPage.OnSelectionComplete(returnParams, callback);
    // Load latest DOC_INFO_SIMPLE
    var cgiPath = DOCLIB.config.httpCgiPath;
    var jsonBinder = new WCM.Idc.JSONBinder();
    jsonBinder.SetLocalDataValue('IdcService', 'DOC_INFO_SIMPLE');
    jsonBinder.SetLocalDataValue('dID', returnParams.dID);
    jsonBinder.Send(cgiPath, $CB(this, function(http) {
        var ret = http.GetResponseText();
        var binder = new WCM.Idc.JSONBinder(ret);
        var dDocName = binder.GetResultSetValue('DOC_INFO', 'dDocName', 0);
        if(dDocName.indexOf("_") > 0) {
            var ssBinder = new WCM.Idc.JSONBinder();
            ssBinder.SetLocalDataValue('IdcService', 'SS_CHECKIN_NEW');
            // Additional localized dDocName generated
            ssBinder.SetLocalDataValue('dDocName', getLocalizedDocName(dDocName, "es"));
            ssBinder.SetLocalDataValue('primaryFile', 'default.xml');
            ssBinder.SetLocalDataValue('ssDefaultDocumentToken', 'SSContributorDataFile');
            for(var n = 0 ; n < binder.GetResultSetFields('DOC_INFO').length ; n++) {
                var field = binder.GetResultSetFields('DOC_INFO')[n];
                if(field != 'dID' &&
                   field != 'dDocName' &&
                   field != 'dReleaseState' &&
                   field != 'dRevClassID' &&
                   field != 'dRevisionID' &&
                   field != 'dRevLabel') {
                    ssBinder.SetLocalDataValue(field, binder.GetResultSetValue('DOC_INFO', field, 0));
                }
            }
            ssBinder.Send(cgiPath, $CB(this, function(http) {}));
        }
    }));
}

// Support function to create localized dDocNames
function getLocalizedDocName(dDocName, lang) {
    var result = dDocName.replace("_EN", ("_" + lang));
    return result;
}
//---------------------------------------A-TEAM---------------------------------------

3. By applying the enclosed NLSContentProfileRule.zip, the check-in screen for data file creation will have auto-naming enabled with a localization suffix (the default is English). You can change the default language by updating the GlobalNlsRule and assigning your preferred suffix. See the rule markup for the dDocName field below:

<$executeService("GET_WCP_LOCALE_CONTENTID")$><$dprDefaultValue=WCP_LOCALE.LocaleContentId & "_EN"$>

Steps to enable the above extensions and configurations:

1. Install the WebCenter component (WCP-LocalizationSupport.zip) via the AdminServer in the WebCenter Content administration menus
2. Enable the component and restart the content server
3. Apply the configuration bundle to enable the new global rule (GlobalNlsRule), via WebCenter Content Administration/Config Migration Admin

New Content Creation Experience Result

Content Editing

Content editing will by default be enabled for authoring in the currently selected locale, since the content file is selected by (Solution Scenario 1). This means that a user can switch his browser locale and then get an editing experience adapted to the currently selected locale.

Notes

A-Team is planning to post a solution on how to switch the locale of the WebCenter Portal session inline, so the Content Presenter, navigation model and other Faces-related features are localized accordingly.

The Content Presenter examples used in this post are an extension of the following post: https://blogs.oracle.com/ATEAM_WEBCENTER/entry/enable_content_editing_of_iterative

Downloads

CPNLSCustomizations.zip - WebCenter Portal, Content Presenter Customization
https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/CPNLSCustomizations.zip

WCP-LocalizationSupport.zip - WebCenter Content, extension component to enable localized creation of files with compliant auto-naming
https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/WCP-LocalizationSupport.zip

NLSContentProfileRule.zip - WebCenter Content, configuration update bundle to enable the global rule for new check-in naming of data files
https://blogs.oracle.com/ATEAM_WEBCENTER/resource/stefan.krantz/NLSContentProfileRule.zip

    Read the article

  • CodePlex Daily Summary for Tuesday, March 02, 2010

CodePlex Daily Summary for Tuesday, March 02, 2010

New Projects

Acceptance Test Excel Addin: Acceptance Test Excel Addin is a tool to author, execute and analyze acceptance tests in Excel. The tester writes tests using Given-When-Then (Gherk...
Adrastus: Related to managing Issues/Items of any type. To be defined later (TBD)
An opportunity of upgrading Linux operating system: Want to add new software to the Linux operating system? Here is an opportunity. UFS (User Friendly Scheduler) has been designed to make the operating...
AspNetPager: AspNetPager is a free custom paging control for ASP.NET web form applications. It's one of the most popular ASP.NET third party controls used by chi...
AzureRunMe: Run Java, Ruby, Python, [insert language of your choice] applications in Windows Azure. You provide a self contained ZIP file with a runme.bat f...
DiffPlex - a .NET Diff Generator: DiffPlex is a combination of a .NET diffing library with both a Silverlight and HTML diff viewer.
GPA WebClient: GPA project web client.
Maintenance Service: A lot of projects need a Windows service that executes different tasks. I am tired of creating the same service for every project, so he...
Marager Component Framework: Marager Component Framework (mcf)
Pod Thrower: An application that gives a simple way to create podcast RSS feeds from local computer folders.
Rapidshare Episode Downloader: Rapidshare Episode Downloader is a software that enables you (yes, you!) to organize your many episodes of different TV shows in a nice list, prese...
Resxus - Total .net string resource management tool: RESXUS is a resource file management tool which is being created from my job experience of managing multilingual resx files. The first goal of th...
SLFX: SLFX
TheWhiteAmbit: Hybrid scanline-raytracing engine. VisitorPattern based scenegraph written in C++. DirectX9 or DirectX10 rendering is used for primary rays and CUDA...
TSqlMigrations: Yet another migrations platform, right? This is purely SQL based (T-SQL, as it is only for SQL Server at this time). This tool is meant to help mana...
WAFFLE: Windows Authentication Functional Framework (LE): WAFFLE - Windows Authentication Functional Framework (Light Edition) is a .NET library with a COM interface and a Java bridge that provides a worki...
web lib api: This project aims to pull together the major web APIs/web services for each relevant category.
WSDLGenerator: A tool to generate a WSDL file from a C# dll which contains one or more Microsoft WebServices. The project is built using VS2010RC and uses .net Fram...
XNA Re-usable UI Components: The aim of this project is to create a re-usable set of UI game components helping reduce production time for your game. More information can be...
YUI Compressor Custom Tool for Visual Studio: This EXTREMELY simple custom tool is used to automatically generate a *.min.css file from your existing code on save. It is merely a packaged versi...

New Releases

Acceptance Test Excel Addin: 1.0.0.0: How to use: extract AcceptanceTestExcelAddIn-1.0.0.0.zip; run setup.exe; extract PasswordSample.zip; open Excel, and you will see a new tab, "QA To...
An opportunity of upgrading Linux operating system: UFS: The software provided by me is just a basic one that will run in the terminal through the gcc compiler. The developers are therefore requested to make ...
AspNetPager: Demo project: AspNetPager version 7.3.2 demo web site project
Business Framework: Formula Samples: A sample demonstrating the textual language - Formula. It can be used in many business scenarios, allowing end-users to configure or interact with the sy...
Deblector: Deblector 1.1: This build fixes compatibility with .NET Reflector 6.
Desktop Dimmer: March 2010: First release, March 2010
EasyDump: EasyDump 1.0.1: Easy Dump 1.0.1 fixes duplicate output when executed twice
Extensia: Extensia: Extensia is a very large list of extension methods and a few helper types. Many methods have practical utility (e.g. console parsing) whilst some ...
Fluent Assertions: Fluent Assertions release 1.0: The first release of the Fluent Assertions. It contains assertions for the most common types and has several extension points.
FolderSize: FolderSize.Win32.1.0.6.0: FolderSize.Win32.1.0.6.0. A simple utility intended to be used to scan hard drives for the folders that take up the most space and display this to the user...
GamerShots.com Screenshot Capture: GamerShots.com Screenshot Capture: Windows Forms application written in C# ASP.Net. Allows the user to capture a screen by pressing the "Print Scrn" key or by user input. Then uses ...
Jet Login Tool (JetLoginTool): Stopped - 1.5.3713.17328: Fixed: engine will now actually stop when the "Stop" button is hit
Jolt Environment - RuneScape Emulator: Jolt Environment 1.0.6000 GOLD: Features since 1.0.3200: account creation via client; character saving/loading (via MySQL serialization); ground objects; ground items (with m...
jQuery Library for SharePoint Web Services: SPServices 0.5.2: NOTE: While I work on new releases, I post alpha versions. Usually the alpha versions are here to address a particular need. I DO NOT recommend us...
LINQ to XSD: 1.0.0: The LINQ to XSD technology provides .NET developers with support for typed XML programming. LINQ to XSD contributes to the LINQ project (.NET Langu...
Maintenance Service: Alpha Release: Alpha release of the software
MDownloader: MDownloader-0.15.5.56206: Fixed many gathered bugs
MDownloader: MDownloader-0.15.6.56217: Fixed retrieving hotfile data using registered accounts.
MRDS Services for HiTechnic: HiTechnic Controllers: The HiTechnic Controllers package contains services for MRDS that work with the LEGO NXT and TETRIX Servo and Motor Controllers. Initial Release Co...
Open NFe: DANFE 1.9.2: DANFE corrections that will be included in version 1.9.2
PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT: PROGRAMMABLE SOFTWARE DEVELOPMENT ENVIRONMENT-2.4: While testing a standard software parts kit library for strict portability, the following error condition occurred when using the Windows version ...
Rapidshare Episode Downloader: RED 0.8: This is an almost fully working version of the software. What DOESN'T work: showing the list of Rapidshare search results (but the query is being made...
Reusable Library: V1.0.4: A collection of reusable abstractions for enterprise application developers.
SharePoint LogViewer: SharePointLogViewer 1.5.1: The following bugs are fixed: bookmarks deleted on refresh/reload; bookmarks not properly navigated on filtered list; disabled toolbar buttons did n...
SharePoint Taxonomy Extensions: SharePoint Taxonomy Extensions 1.1-1: Some bugfixes. Possibility to switch between alphanumeric and manual sorting
SilverSynth - Digital Audio Synthesis for Silverlight: SilverSynth 1.1: SilverSynth 1.1 is a zip file of the source code and includes an updated version of the demo application including presets.
SQL Server Reporting Services MSBuild Tasks: Release 1.1.14669: New features: new tasks added for Integrated and Native mode: DeleteReportUser task; ReportUserExists task
Tellago DevLabs: BizTalk Data Services v0.2: This release is the first version of the BizTalk Data Services API, a RESTful API for BizTalk Server based on the Open Data (OData) Protocol. The ...
VCC: Latest build, v2.1.30301.0: Automatic drop of latest build
WAFFLE: Windows Authentication Functional Framework (LE): 1.2: Build 1.2.4217.0, initial open-source release. Account lookup locally and in Active Directory. Enumerating Active Directory domains. Returns...
Watermarker: 0.87: 01.03.2010: FIXED: some stability fixes. ADDED: ability to choose any number of pictures and a folder to save them to after the operation completes
WSDLGenerator: WSDLGenerator 0.0.0.1: Initial version
XNA Re-usable UI Components: Re-usable Game Components V1.0: First public release of the source code
YUI Compressor Custom Tool for Visual Studio: YUI Compressor Custom Tool with Installer v0.1a: Initial release with alpha installer - documentation to follow.

Most Popular Projects: MetaSharp; Rawr; WBFS Manager; AJAX Control Toolkit; Microsoft SQL Server Product Samples: Database; Silverlight Toolkit; Windows Presentation Foundation (WPF); Microsoft SQL Server Community & Samples; ASP.NET; Image Resizer Powertoy Clone for Windows

Most Active Projects: Rawr; BlogEngine.NET; MapWindow GIS; patterns & practices – Enterprise Library; jQuery Library for SharePoint Web Services; SharpMap - Geospatial Application Framework for the CLR; Rapid Entity Framework (ORM). CTP 2; DiffPlex - a .NET Diff Generator; MDT Web FrontEnd; WPF Dialogs

    Read the article

  • Oracle Fusion Applications: Changing the Game

    - by kellsey.ruppel(at)oracle.com
Originally posted in the Oracle Profit Magazine, November 2010 Edition.

When the order processing system red-flags a customer's credit status, the IT department doesn't get the customer's call. When a supplier misses a delivery date for a key automotive assembly, it's not the CIO who has to answer for the error. Knowledge workers (known in IT circles as "users") are on the front lines when an exception occurs in an established business process. They're also the ones who study sales trends to decide when to open a new store in an up-and-coming neighborhood, which products are most profitable, how employee skill sets are evolving, and which suppliers are most efficient. In short, knowledge workers are masters of business as unusual.

Traditional enterprise resource planning (ERP) systems and other familiar enterprise applications excel at automating, managing, and executing standard business processes. These programs shine when everything goes as planned. Life gets even trickier when a traditional application needs to be extended with a new service or an extra step is added to a business process when new products are brought to market, divisions are merged, or companies are acquired. Monolithic applications often need the IT department to step in and make the necessary adjustments--incurring additional costs and delays. Until now.

When Oracle unveiled the much-anticipated family of Oracle Fusion Applications at Oracle OpenWorld in September 2010, knowledge workers in particular had a lot to cheer about. Business users will soon have ready access to analytical information and collaboration tools in the context of what they are working on, so they can make better decisions when problems or opportunities arise. Additionally, the Oracle Fusion Applications platform will make it easy for business users to tweak processes, create new capabilities, and find information, often without the need for IT department assistance and while still following company guidelines. And IT leaders will be happy to hear about new deployment options, guided implementation and setup tools, and cost-saving management capabilities.

Just as important, the underlying technologies in Oracle Fusion Applications will allow organizations to choose among their existing investments and next-generation enterprise applications so they can introduce innovations at a pace that makes the most business and financial sense. "Oracle Fusion Applications are architected so you don't have to do rip and replace," says Jim Hayes, managing director of the consulting firm Accenture. "That's very important for creating a business case that will get through the steering committee and be approved by the board. It shows you can drive value and make a difference in the near term."

For these and other reasons, analysts and early adopters are calling Oracle Fusion Applications a game changer for enterprise customers. The differences become apparent in three key areas: the way we innovate, work, and adopt technology.

Game Changer #1: New Standard for Innovation

Change is a constant challenge for most businesses, whether the catalysts are market dynamics, new competition, or the ever-expanding regulatory environment. And, in an ongoing effort to differentiate, business leaders are constantly looking for new ways to do business, serve constituents, and bring new products and services to market. In addition, companies face significant costs to keep their applications up-to-date. For example, when a company adds new suppliers to a procurement system, the IT shop typically has to invest time, effort, and even consulting fees for custom integrations that allow various ERP systems to communicate with each other.

Oracle Fusion Applications were built on Web services and a modular SOA foundation to ease customizations and integration activities among all applications--whether from Oracle or another vendor. Interfaces and updates written in ubiquitous Java, rather than a proprietary coding language, allow organizations to tap into existing in-house technical skills rather than seek expensive outside specialists. And with SOA, organizations can extend a feature set or integrate with other SOA environments by combining Web services such as "look up customer" into a new business process managed by the BPEL orchestration engine.

Flexibility like this has long-term implications. "Because users capture these changes at a higher metadata layer, not in the application's code, changes and additions are protected even as new versions of Oracle Fusion Applications are released," says Steve Miranda, senior vice president of applications development at Oracle. "This is a much more sustainable approach because you don't incur costly customizations that prevent upgrades and other innovations." And changes are easier to make: if one change is made in the metadata, that change is automatically reflected throughout the application interface, business intelligence, business process, and business logic.

Game Changer #2: New Standard for Work

Boosting productivity comes down to doing the basics right: running business processes more efficiently and managing exceptions more effectively, so users can accomplish more in the course of a day or spend more quality time with the most profitable customers. The fastest way to improve process efficiency is to reduce the number of steps it takes to execute common tasks, such as ordering office equipment from an internal procurement system. Oracle Fusion Applications will deliver a complete role-based user experience with business intelligence and collaboration capabilities provided in the context of the work at hand.

"We created every Oracle Fusion Applications screen by asking 'What does the user need to know?' 'What does he or she need to do?' and 'Who do they need to work with to get the job done?'" Miranda explains. So when the sales department heads need new laptops, the self-service procurement screen will not only display a list of approved vendors and configurations, but also a running list of reviews by coworkers who recently purchased the various models. Embedded intelligence may also display prevailing delivery lead times based on actual order histories, not the generic shipping dates vendors may quote.

The pervasive business intelligence serves many other business activities across all areas of the enterprise. For example, a manager considering whether to promote a direct report can see the person's employee profile, with a salary history, appraisal summaries, and a rundown of skills and training. This approach to business intelligence also has implications for supply chain management. "One of the challenges at Ingersoll Rand is lack of visibility in our supply chain," says Mike Macrie, global director of enterprise applications for global industrial firm Ingersoll Rand. "Oracle Fusion Applications are going to provide the embedded intelligence to give us that visibility and give us the ability to analyze those orders at any point in our supply chain."

Oracle Fusion Applications will also create a "role-based user experience" that displays a work list of events that need attention, based on user job function. Role awareness guides users with daily lists of action items and exceptions. So a credit manager may see seven invoices with discounts that are about to expire or 12 suppliers that have been put on hold because credit memos are awaiting approval.

Individualization extends to the search capabilities of Oracle Fusion Applications. The platform uses Web-style search screens powered by an Oracle enterprise search engine, with a security framework that filters search results so individuals will only see the internal information they're authorized to access.

A further aid to productivity is Oracle Fusion Applications' integration with Web 2.0 collaboration and social networking resources for business environments. Hover-over text will reveal relevant contact information whenever the name of a person appears in an Oracle Fusion Application. Users can connect via an online chat, phone call, or instant message without leaving the main application, reducing the time required for an accounts payable staffer to resolve a mismatch between an invoiced charge and the service record, for example. Addresses of suppliers, customers, or partners will also initiate hover-over text to show contact details and Web-based maps.

Finally, Oracle Fusion Applications will promote a new way of working with purpose-driven communities that can bring new efficiencies to everything from cultivating sales leads to managing new projects. As soon as a lead or project materializes, the applications will automatically gather relevant participants into an online community that shares member contact information, schedules, discussion forums, and Wiki pages. "Oracle Fusion Applications will allow us to take it to the next level with embedded Web 2.0 tools and the embedded analytics," says Steve Printz, CIO and vice president, supply chain management, at window-and-door manufacturer Pella. "[This] allows those employees today who are processing transactions to really contribute to the success of the company and become decision-makers."

Game Changer #3: New Standard for Technology Adoption

As IT becomes a dominant component of how businesses run and compete, organizations need to lower the cost of implementing applications and introducing new application features. In the past, rolling out new code often required creating a test bed system, moving beta code to a separate system for user feedback, and--once all the revisions were made--moving version one of the software onto production systems, where business users could finally get the needed new features.

Oracle Fusion Applications will use a dedicated setup manager application to streamline this process. First, the setup manager will help scope out the project, querying users about their requirements. "From those questions and answers we determine the steps and the order of those steps that will enable that task," Miranda says. Next, system utilities will assign tasks to owners, track completion status, and monitor the overall status of a programming effort. Oracle Fusion Applications can then recommend Web services that allow users to migrate setup choices and steps across all the various deployments of the application. Those setup capabilities automate the migration from test systems to production systems, as well as between different business units that may be using the same application. "The self-service ability of the setup manager helps business users change setups with very little intervention from the IT team," says Ravi Kumar, vice president at IT services company Infosys. "That to me is a big difference from how we've viewed enterprise applications before."

For additional flexibility, organizations will be able to adopt Oracle Fusion Applications modules in either of two modes: a single-instance alternative uses one database for all Oracle Fusion Applications, while a "pillar mode" creates separate databases to underpin each application. This means IT departments running any one of Oracle's applications or even third-party applications can plug Oracle Fusion Applications modules into their environment and see additional business value created on top of their existing systems. And Oracle Fusion Applications offer a hybrid approach to deployment. The applications are all software-as-a-service-ready, so customers can choose on-premises, public or private cloud, or a combination of these to suit their business needs.

It's that combination of flexibility and a roadmap for the future that may be the biggest game changer of all. "The Oracle Fusion Applications architecture allows us to migrate our company at a pace that's consistent with our business strategy, whereas before we might have had to do it with a massive upgrade," says Macrie of Ingersoll Rand. "We're looking forward to that architecture to really give us more flexibility in how we migrate over time."

For More Information

User Input Key to the Success of Oracle Fusion Applications
Transforming Coexistence into Strategic Value
Under the Hood
Oracle Fusion Applications
Oracle Service-Oriented Architecture

    Read the article

  • Windows Azure Root CAs and SSL Client Certificates

    - by Your DisplayName here!
    I ran into some problems while trying to make SSL client certificates work for StarterSTS 1.5. In theory you have to do two things (via startup tasks): (1) unlock the SSL section in IIS, and (2) install all the root certificates for the client certs you want to accept. I did that. But it still does not work. While inspecting the event log, I stumbled over a schannel error message that I’ve never seen before: “When asking for client authentication, this server sends a list of trusted certificate authorities to the client. The client uses this list to choose a client certificate that is trusted by the server. Currently, this server trusts so many certificate authorities that the list has grown too long. This list has thus been truncated. The administrator of this machine should review the certificate authorities trusted for client authentication and remove those that do not really need to be trusted.” WTF? And indeed, standard Azure (web role) VMs trust 275 root CAs (see attached list). Including kinda obscure ones. I don’t really know why MS made this design decision. It seems just wrong (including breaking the SSL client cert functionality). Deleting like 60% of them made SSL client certs from my CA work. So I guess I now have to find an automated way to attach CTLs to my site…joy. (A sketch of scripting the cleanup appears after the list.) Exported list of trusted CAs (as of 30th Dec 2010): AC Raíz Certicámara S.A. (4/2/2030 9:42:02 PM) AC RAIZ FNMT-RCM (1/1/2030 12:00:00 AM) A-CERT ADVANCED (10/23/2011 2:14:14 PM) Actalis Authentication CA G1 (6/25/2022 2:06:00 PM) Agence Nationale de Certification Electronique (8/12/2037 9:03:17 AM) Agence Nationale de Certification Electronique (8/12/2037 9:58:14 AM) Agencia Catalana de Certificacio (NIF Q-0801176-I) (1/7/2031 10:59:59 PM) America Online Root Certification Authority 1 (11/19/2037 8:43:00 PM) America Online Root Certification Authority 2 (9/29/2037 2:08:00 PM) ANCERT Certificados CGN (2/11/2024 5:27:12 PM) ANCERT Certificados Notariales (2/11/2024 3:58:26 PM) ANCERT Corporaciones de Derecho Publico (2/11/2024 5:22:45 PM) A-Trust-nQual-01 (11/30/2014 11:00:00 PM) A-Trust-nQual-03 (8/17/2015 10:00:00 PM) A-Trust-Qual-01 (11/30/2014 11:00:00 PM) A-Trust-Qual-02 (12/2/2014 11:00:00 PM) A-Trust-Qual-03a (4/24/2018 10:00:00 PM) Austria Telekom-Control Kommission (9/24/2005 12:40:00 PM) Austrian Society for Data Protection (2/12/2009 11:30:30 AM) Austrian Society for Data Protection GLOBALTRUST Certification Service (9/18/2036 2:12:35 PM) Autoridad Certificadora Raiz de la Secretaria de Economia (5/9/2025 12:00:00 AM) Autoridad de Certificacion de la Abogacia (6/13/2030 10:00:00 PM) Autoridad de Certificacion Firmaprofesional CIF A62634068 (10/24/2013 10:00:00 PM) Autoridade Certificadora Raiz Brasileira (11/30/2011 11:59:00 PM) Baltimore CyberTrust Root (5/12/2025 11:59:00 PM) BIT AdminCA-CD-T01 (1/25/2016 12:36:19 PM) BIT Admin-Root-CA (11/10/2021 7:51:07 AM) Buypass Class 2 CA 1 (10/13/2016 10:25:09 AM) Buypass Class 3 CA 1 (5/9/2015 2:13:03 PM) CA Disig (3/22/2016 1:39:34 AM) CertEurope (3/27/2037 11:00:00 PM) CERTICAMARA S.A. (2/23/2015 5:10:37 PM) Certicámara S.A.
(5/23/2011 10:00:00 PM) Certigna (6/29/2027 3:13:05 PM) Certipost E-Trust Primary Normalised CA (7/26/2020 10:00:00 AM) Certipost E-Trust Primary Qualified CA (7/26/2020 10:00:00 AM) Certipost E-Trust Primary TOP Root CA (7/26/2025 10:00:00 AM) Certisign Autoridade Certificadora AC1S (6/27/2018 12:00:00 AM) Certisign Autoridade Certificadora AC2 (6/27/2018 12:00:00 AM) Certisign Autoridade Certificadora AC3S (7/9/2018 8:56:32 PM) Certisign Autoridade Certificadora AC4 (6/27/2018 12:00:00 AM) CertPlus Class 1 Primary CA (7/6/2020 11:59:59 PM) CertPlus Class 2 Primary CA (7/6/2019 11:59:59 PM) CertPlus Class 3 Primary CA (7/6/2019 11:59:59 PM) CertPlus Class 3P Primary CA (7/6/2019 11:59:59 PM) CertPlus Class 3TS Primary CA (7/6/2019 11:59:59 PM) CertRSA01 (3/3/2010 2:59:59 PM) certSIGN Root CA (7/4/2031 5:20:04 PM) Certum (6/11/2027 10:46:39 AM) Certum Trusted Network CA (12/31/2029 12:07:37 PM) Chambers of Commerce Root - 2008 (7/31/2038 12:29:50 PM) Chambersign Chambers of Commerce Root (9/30/2037 4:13:44 PM) Chambersign Global Root (9/30/2037 4:14:18 PM) Chambersign Public Notary Root (9/30/2037 4:14:49 PM) Chunghwa Telecom Co. Ltd. (12/20/2034 2:31:27 AM) Cisco Systems (5/14/2029 8:25:42 PM) CNNIC Root (4/16/2027 7:09:14 AM) Common Policy (10/15/2027 4:08:00 PM) COMODO (12/31/2028 11:59:59 PM) COMODO (1/18/2038 11:59:59 PM) COMODO (12/31/2029 11:59:59 PM) ComSign Advanced Security CA (3/24/2029 9:55:55 PM) ComSign CA (3/19/2029 3:02:18 PM) ComSign Secured CA (3/16/2029 3:04:56 PM) Correo Uruguayo - Root CA (12/31/2030 2:59:59 AM) Cybertrust Global Root (12/15/2021 8:00:00 AM) DanID (2/11/2037 9:09:30 AM) DanID (4/5/2021 5:03:17 PM) Deutsche Telekom Root CA 2 (7/9/2019 11:59:00 PM) DigiCert (11/10/2031 12:00:00 AM) DigiCert (11/10/2031 12:00:00 AM) DigiCert (11/10/2031 12:00:00 AM) DigiNotar Root CA (3/31/2025 6:19:21 PM) DIRECCION GENERAL DE LA POLICIA (2/8/2036 10:59:59 PM) DST (ABA.ECOM) CA (7/9/2009 5:33:53 PM) DST (ANX Network) CA (12/9/2018 4:16:48 PM) DST (Baltimore EZ) CA (7/3/2009 7:56:53 PM) DST (National Retail Federation) RootCA (12/8/2008 4:14:16 PM) DST (United Parcel Service) RootCA (12/7/2008 12:25:46 AM) DST ACES CA X6 (11/20/2017 9:19:58 PM) DST Root CA X3 (9/30/2021 2:01:15 PM) DST RootCA X1 (11/28/2008 6:18:55 PM) DST RootCA X2 (11/27/2008 10:46:16 PM) DSTCA E1 (12/10/2018 6:40:23 PM) DSTCA E2 (12/9/2018 7:47:26 PM) DST-Entrust GTI CA (12/9/2018 12:32:24 AM) D-TRUST GmbH (5/16/2022 5:20:47 AM) D-TRUST GmbH (6/8/2012 11:47:46 AM) D-TRUST GmbH (5/16/2022 5:20:47 AM) EBG Elektronik Sertifika Hizmet Saglayicisi (8/14/2016 12:31:09 AM) E-Certchile (9/5/2028 7:39:41 PM) Echoworx Root CA2 (10/7/2030 10:49:13 AM) ECRaizEstado (6/23/2030 1:41:27 PM) EDICOM (4/13/2028 4:24:22 PM) E-GÜVEN Elektronik Sertifika Hizmet Saglayicisi (1/4/2017 11:32:48 AM) E-ME SSI (RCA) (5/19/2027 8:48:15 AM) Entrust (11/27/2026 8:53:42 PM) Entrust (5/25/2019 4:39:40 PM) Entrust.net (12/7/2030 5:55:54 PM) Equifax Secure eBusiness CA-1 (6/21/2020 4:00:00 AM) Equifax Secure eBusiness CA-2 (6/23/2019 12:14:45 PM) Equifax Secure Global eBusiness CA-1 (6/21/2020 4:00:00 AM) eSign Australia: eSign Imperito Primary Root CA (5/23/2012 11:59:59 PM) eSign Australia: Gatekeeper Root CA (5/23/2014 11:59:59 PM) eSign Australia: Primary Utility Root CA (5/23/2012 11:59:59 PM) Fabrica Nacional de Moneda y Timbre (3/18/2019 3:26:19 PM) GeoTrust (8/22/2018 4:41:51 PM) GeoTrust (7/16/2036 11:59:59 PM) GeoTrust Global CA (5/21/2022 4:00:00 AM) GeoTrust Global CA 2 (3/4/2019 5:00:00 AM) GeoTrust Primary Certification 
Authority - G2 (1/18/2038 11:59:59 PM) GeoTrust Primary Certification Authority - G3 (12/1/2037 11:59:59 PM) GeoTrust Universal CA (3/4/2029 5:00:00 AM) GeoTrust Universal CA 2 (3/4/2029 5:00:00 AM) Global Chambersign Root - 2008 (7/31/2038 12:31:40 PM) GlobalSign (1/28/2028 12:00:00 PM) GlobalSign (12/15/2021 8:00:00 AM) Go Daddy Class 2 Certification Authority (6/29/2034 5:06:20 PM) GTE CyberTrust Global Root (8/13/2018 11:59:00 PM) GTE CyberTrust Root (4/3/2004 11:59:00 PM) GTE CyberTrust Root (2/23/2006 11:59:00 PM) Halcom CA FO (6/5/2020 10:33:31 AM) Halcom CA PO 2 (2/7/2019 6:33:31 PM) Hongkong Post Root CA (1/16/2010 11:59:00 PM) Hongkong Post Root CA 1 (5/15/2023 4:52:29 AM) I.CA První certifikacní autorita a.s. (4/1/2018 12:00:00 AM) I.CA První certifikacní autorita a.s. (4/1/2018 12:00:00 AM) InfoNotary (3/6/2026 5:33:05 PM) IPS SERVIDORES (12/29/2009 11:21:07 PM) IZENPE S.A. (1/30/2018 11:00:00 PM) Izenpe.com (12/13/2037 8:27:25 AM) Japan Certification Services, Inc. SecureSign RootCA1 (9/15/2020 2:59:59 PM) Japan Certification Services, Inc. SecureSign RootCA11 (4/8/2029 4:56:47 AM) Japan Certification Services, Inc. SecureSign RootCA2 (9/15/2020 2:59:59 PM) Japan Certification Services, Inc. SecureSign RootCA3 (9/15/2020 2:59:59 PM) Japan Local Government PKI Application CA (3/31/2016 2:59:59 PM) Japanese Government ApplicationCA (12/12/2017 3:00:00 PM) Juur-SK AS Sertifitseerimiskeskus (8/26/2016 2:23:01 PM) KamuSM (8/21/2017 11:37:07 AM) KISA RootCA 1 (8/24/2025 8:05:46 AM) KISA RootCA 3 (11/19/2014 6:39:51 AM) Macao Post eSignTrust (1/29/2013 11:59:59 PM) MicroSec e-Szigno Root CA (4/6/2017 12:28:44 PM) Microsoft Authenticode(tm) Root (12/31/1999 11:59:59 PM) Microsoft Root Authority (12/31/2020 7:00:00 AM) Microsoft Root Certificate Authority (5/9/2021 11:28:13 PM) Microsoft Timestamp Root (12/30/1999 11:59:59 PM) MOGAHA Govt of Korea (4/21/2012 9:07:23 AM) MOGAHA Govt of Korea GPKI (3/15/2017 6:00:04 AM) NetLock Arany (Class Gold) Fotanúsítvány (12/6/2028 3:08:21 PM) NetLock Expressz (Class C) Tanusitvanykiado (2/20/2019 2:08:11 PM) NetLock Kozjegyzoi (Class A) Tanusitvanykiado (2/19/2019 11:14:47 PM) NetLock Minositett Kozjegyzoi (Class QA) Tanusitvanykiado (12/15/2022 1:47:11 AM) NetLock Platina (Class Platinum) Fotanúsítvány (12/6/2028 3:12:44 PM) NetLock Uzleti (Class B) Tanusitvanykiado (2/20/2019 2:10:22 PM) Netrust CA1 (3/30/2021 2:57:45 AM) Network Solutions (12/31/2029 11:59:59 PM) NLB Nova Ljubljanska Banka d.d. Ljubljana (5/15/2023 12:22:45 PM) OISTE WISeKey Global Root GA CA (12/11/2037 4:09:51 PM) Post.Trust Root CA (7/5/2022 9:12:33 AM) Post.Trust Root CA (8/20/2010 1:56:21 PM) Posta CA Root (10/20/2028 12:52:08 PM) POSTarCA (2/7/2023 11:06:58 AM) QuoVadis Root CA 2 (11/24/2031 6:23:33 PM) QuoVadis Root CA 3 (11/24/2031 7:06:44 PM) QuoVadis Root Certification Authority (3/17/2021 6:33:33 PM) Root CA Generalitat Valenciana (7/1/2021 3:22:47 PM) RSA Security 2048 V3 (2/22/2026 8:39:23 PM) SECOM Trust Systems CO LTD (6/6/2037 2:12:32 AM) SECOM Trust Systems CO LTD (6/25/2019 10:23:48 PM) SECOM Trust Systems CO LTD (9/30/2023 4:20:49 AM) Secretaria de Economia Mexico (5/8/2025 12:00:00 AM) Secrétariat Général de la Défense Nationale (10/17/2020 2:29:22 PM) SecureNet CA Class B (10/16/2009 9:59:00 AM) Serasa Certificate Authority I (11/21/2024 2:12:45 PM) Serasa Certificate Authority II (11/21/2024 12:44:48 PM) Serasa Certificate Authority III (11/21/2024 1:24:14 PM) SERVICIOS DE CERTIFICACION - A.N.C. 
(3/9/2009 9:08:07 PM) Sigen-CA (6/29/2021 9:57:46 PM) Sigov-CA (1/10/2021 2:22:52 PM) Skaitmeninio sertifikavimo centras (12/28/2026 12:05:04 PM) Skaitmeninio sertifikavimo centras (12/25/2026 12:08:26 PM) Skaitmeninio sertifikavimo centras (12/22/2026 12:11:30 PM) Sonera Class1 CA (4/6/2021 10:49:13 AM) Sonera Class2 CA (4/6/2021 7:29:40 AM) Spanish Property & Commerce Registry CA (4/27/2012 9:39:50 AM) Staat der Nederlanden Root CA (12/16/2015 9:15:38 AM) Staat der Nederlanden Root CA - G2 (3/25/2020 11:03:10 AM) Starfield Class 2 Certification Authority (6/29/2034 5:39:16 PM) Starfield Technologies (6/26/2019 12:19:54 AM) Starfield Technologies Inc. (12/31/2029 11:59:59 PM) StartCom Certification Authority (9/17/2036 7:46:36 PM) S-TRUST Authentication and Encryption Root CA 2005:PN (6/21/2030 11:59:59 PM) Swisscom Root CA 1 (8/18/2025 10:06:20 PM) SwissSign (10/25/2036 8:30:35 AM) SwissSign Platinum G2 Root CA (10/25/2036 8:36:00 AM) SwissSign Silver G2 Root CA (10/25/2036 8:32:46 AM) TC TrustCenter Class 1 CA (1/1/2011 11:59:59 AM) TC TrustCenter Class 2 CA (1/1/2011 11:59:59 AM) TC TrustCenter Class 2 CA II (12/31/2025 10:59:59 PM) TC TrustCenter Class 3 CA (1/1/2011 11:59:59 AM) TC TrustCenter Class 3 CA II (12/31/2025 10:59:59 PM) TC TrustCenter Class 4 CA (1/1/2011 11:59:59 AM) TC TrustCenter Class 4 CA II (12/31/2025 10:59:59 PM) TC TrustCenter Time Stamping CA (1/1/2011 11:59:59 AM) TC TrustCenter Universal CA I (12/31/2025 10:59:59 PM) TC TrustCenter Universal CA II (12/31/2030 10:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte (7/16/2036 11:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte (12/31/2020 11:59:59 PM) thawte Primary Root CA - G2 (1/18/2038 11:59:59 PM) thawte Primary Root CA - G3 (12/1/2037 11:59:59 PM) Thawte Timestamping CA (12/31/2020 11:59:59 PM) Trustis EVS Root CA (1/9/2027 11:56:00 AM) Trustis FPS Root CA (1/21/2024 11:36:54 AM) Trustwave (1/1/2035 5:37:19 AM) Trustwave (12/31/2029 7:40:55 PM) Trustwave (12/31/2029 7:52:06 PM) TURKTRUST Elektronik Islem Hizmetleri (9/16/2015 12:13:05 PM) TURKTRUST Elektronik Islem Hizmetleri (3/22/2015 10:04:51 AM) TURKTRUST Elektronik Sertifika Hizmet Saglayicisi (9/16/2015 10:07:57 AM) TURKTRUST Elektronik Sertifika Hizmet Saglayicisi (3/22/2015 10:27:17 AM) TÜRKTRUST Elektronik Sertifika Hizmet Saglayicisi (12/22/2017 6:37:19 PM) TW Government Root Certification Authority (12/5/2032 1:23:33 PM) TWCA Root Certification Authority 1 (12/31/2030 3:59:59 PM) TWCA Root Certification Authority 2 (12/31/2030 3:59:59 PM) U.S. 
Government FBCA (10/6/2010 6:53:56 PM) UCA Global Root (12/31/2037 12:00:00 AM) UCA Root (12/31/2029 12:00:00 AM) USERTrust (7/9/2019 6:40:36 PM) USERTrust (7/9/2019 5:36:58 PM) USERTrust (6/24/2019 7:06:30 PM) USERTrust (7/9/2019 6:19:22 PM) USERTrust (5/30/2020 10:48:38 AM) UTN - USERFirst-Network Applications (7/9/2019 6:57:49 PM) ValiCert Class 3 Policy Validation Authority (6/26/2019 12:22:33 AM) VAS Latvijas Pasts SSI(RCA) (9/13/2024 9:27:57 AM) VeriSign (5/18/2018 11:59:59 PM) VeriSign (7/16/2036 11:59:59 PM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (12/31/1999 9:37:48 AM) VeriSign (1/7/2004 11:59:59 PM) VeriSign (5/18/2018 11:59:59 PM) VeriSign (1/7/2004 11:59:59 PM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (1/7/2020 11:59:59 PM) VeriSign (12/31/1999 9:35:58 AM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (7/16/2036 11:59:59 PM) VeriSign (1/7/2004 11:59:59 PM) VeriSign (7/16/2036 11:59:59 PM) VeriSign (1/7/2010 11:59:59 PM) VeriSign (5/18/2018 11:59:59 PM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (1/7/2004 11:59:59 PM) VeriSign (7/16/2036 11:59:59 PM) VeriSign (7/16/2036 11:59:59 PM) VeriSign (8/1/2028 11:59:59 PM) VeriSign (5/18/2018 11:59:59 PM) VeriSign Class 3 Public Primary CA (8/1/2028 11:59:59 PM) VeriSign Class 3 Public Primary Certification Authority - G4 (1/18/2038 11:59:59 PM) VeriSign Time Stamping CA (1/7/2004 11:59:59 PM) VeriSign Universal Root Certification Authority (12/1/2037 11:59:59 PM) Visa eCommerce Root (6/24/2022 12:16:12 AM) Visa Information Delivery Root CA (6/29/2025 5:42:42 PM) VRK Gov. Root CA (12/18/2023 1:51:08 PM) Wells Fargo Root Certificate Authority (1/14/2021 4:41:28 PM) WellsSecure Public Certificate Authority (12/14/2022 12:07:54 AM) Xcert EZ by DST (7/11/2009 4:14:18 PM)
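    For anyone hitting the same schannel limit, the cleanup can be scripted instead of done by hand. Below is a minimal sketch (not from the original post) of a console program that removes every machine-level root CA whose thumbprint is not on a keep-list. The keep-list shown is hypothetical; substitute the roots your client certificates actually chain to, run it elevated (e.g. from a startup task), and test on a disposable instance first, since an overly aggressive prune can break other chain validation on the box.

    using System;
    using System.Linq;
    using System.Security.Cryptography.X509Certificates;

    // Sketch: prune the LocalMachine\Root store down to a known keep-list so
    // the trusted-issuer list schannel sends during client-cert negotiation
    // stays below the size limit. Requires elevation.
    class PruneRootStore
    {
        // Hypothetical keep-list: thumbprints of the roots you actually need.
        static readonly string[] KeepThumbprints =
        {
            "D4DE20D05E66FC53FE1A50882C78DB2852CAE474" // e.g. Baltimore CyberTrust Root
        };

        static void Main()
        {
            var store = new X509Store(StoreName.Root, StoreLocation.LocalMachine);
            store.Open(OpenFlags.ReadWrite);

            // Snapshot the collection first; removing while enumerating is unsafe.
            foreach (var cert in store.Certificates.Cast<X509Certificate2>().ToList())
            {
                if (!KeepThumbprints.Contains(cert.Thumbprint, StringComparer.OrdinalIgnoreCase))
                {
                    Console.WriteLine("Removing: " + cert.Subject);
                    store.Remove(cert);
                }
            }

            store.Close();
        }
    }

    The SSL section unlock mentioned at the top of the post is typically a one-liner with appcmd (something like appcmd unlock config /section:system.webServer/security/access); verify the exact section name against your IIS version, as that detail is an assumption rather than something the post spells out.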

    Read the article

  • CodePlex Daily Summary for Saturday, June 12, 2010

    New Projects

    AdverTool (Advertisement tool): AdverTool is an online tool which integrates the most popular advertisement networks (such as Microsoft adCenter, Google AdWords, Yahoo! Search Mar...
    Authentication Configuration Tool for SharePoint: Helpful tools to automatically configure SharePoint 2007 and 2010 for forms based authentication and other authentication mechanisms.
    Bacicworx: A C# .NET 3.5 helper library containing functionality for compression, encryption, hashes, downloading, PayPal API, text analysis and generation, a...
    BlogEngine.Net iPhone Theme: A port of BETouch originally created by soundbbg
    BT UPnP Nat Library: This library makes it extremely simple to add NAT UPnP port forwarding to your .NET applications. Developed in C# using .NET 4.0
    CheckBox & CheckBoxList Validators: These validators fill a much-needed gap in the ASP.NET server controls
    DataFactories: The DataFactories project was created to provide a standardized interface to SSAS and MSSQL data. However, as it is implemented using the Abstract ...
    DVD Swarm: Converts unprotected DVD video & audio streams to H.264 with AAC/Vorbis.
    Frio IM: Frio IM is a cross-protocol instant messenger.
    jiuyuan: jiuyuan management system
    MGM: MyGroupManager is a simple graphical interface written in PowerShell that can be deployed to Active Directory users to simplify the management of grou...
    MGR2010: This is the MA thesis by Witold Stanik & Michał Sereja, PJWSTK.
    Nauplius.ActiveDirectory: Web-based Active Directory management.
    Partial rendering control using JQuery: This article shows a web custom control that allows partial rendering using JQuery
    REG - The Random Entertainment Generator: A simple tool to make your mind up when you can't figure out what you want to do!
    Runes of Magic - Heilerrechner: A healer calculator for the healers of Runes of Magic (www.runes.ofmagic.com)
    Semagsoft Calculator: Basic calculator for Windows XP, Vista and Windows 7.
    SO League Tables: SOLT: Stack Overflow League Tables. A fun little app that lets you compare your stack overflow performance for each month, relative to other member...
    Stacky StackApps .Net Client Library: StackApps is a REST API which provides access to the stackoverflow.com family of websites. Stacky is a .net client for that API. Stacky current...
    TwitterDotNet: TwitterDotNet is a Twitter library for the .NET Framework.
    ValiVIN: VIN (Vehicle Identification Number) validator. Validates VIN numbers.
    WorkLogger: Simple work hour logger in WPF

    New Releases

    AdverTool (Advertisement tool): Official releases: Please visit http://advertool.org to access the complete source code and downloads.
    Authentication Configuration Tool for SharePoint: Auth Config Tool (WSS 3.0, MOSS 2007 version): This tool automates the setup of dual authentication web applications in SharePoint that use Windows Authentication and Forms Based Authentication....
    BlogEngine.Net iPhone Theme: Version 0.1: Original version 0.1 from soundbbg
    Braintree Client Library: Braintree-2.3.0: Return AvsErrorResponseCode, AvsPostalCodeResponseCode, AvsStreetAddressResponseCode, CurrencyIsoCode, CvvResponseCode with Transaction. Return Cr...
    BT UPnP Nat Library: Bt_Upnp Nat Library Alpha: Alpha release of the library
    CNZK Library: Silverlight Behaviors - Deep Zoom Tag Filter: Behavior library for Silverlight 4 containing a Deep Zoom Tag Filter Behavior. Sample at the Expression Gallery http://gallery.expression.microsof...
    Demina: Demina Binaries version 0.2: Updated binaries. This release contains all of the new features, including simple animation transitions.
    DTLoggedExec: 1.0.0.2: Fixed a bug that prevented loading packages from SSIS Package Store. Added support for {filename} placeholder in both Data Flow Profiling and CSV ...
    DVD Swarm: v0.8.10.611: Initial release, mostly stable.
    Exchange 2010 RBAC Editor (RBAC GUI) - updated on 6/11/2010: RBAC Editor 0.9.5.1: Now supports creating and editing Role Assignment Policies; the rest of the stuff is the same - still a long way to go :) Please use email address i...
    Extend SmallBasic: Teaching Extensions v.021: Compatible with SmallBasic v0.9. Lame version of TicTacToe added - more coming later.
    Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.1.1 GA Released: Hi, today we are releasing Visifire 3.1.1 GA with the following features: Logarithmic Axis, ShowIndicator() in Chart, HideIndica...
    Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.5.4 GA Released: Hi, today we are releasing Visifire 3.1.1 GA with the following features: Logarithmic Axis, ShowIndicator() in Chart, HideIndicator() in Chart...
    Keep Focused - an enhanced tool for Time Management using Pomodoro Technique: Release 0.3.1 Alpha: Technical patch. The previous release 0.3 Alpha had some errors and missing features. It was probably not built from the source...
    Mesopotamia Experiment: Mesopotamia 1.2.96: Bug fixes: fixed duplicate cells being added on creating new cells via mutations; fixed bug where organisms without IO synapses were getting ios...
    NLog - Advanced .NET Logging: Nightly Build 2010.06.11.001: Changes since the last build: no changes. Unit test results: Passed 243/243 (100%), Passed 243/243 (100%), Passed 267/267 (100%), Passed 269/269 (100%)...
    Partial rendering control using JQuery: JQuery Web Control V 1.0: This is the first release of the code. It includes the source code and a web application to see how it works
    phpxw: Phpxw2.0: Framework directory layout: ./_mod (module directory), ./phpxw/ (framework core), ./phpxw/common/ (framework core functions), ./phpxw/system/ (framework core base classes), ./phpxw/userlib/ (user extension classes), ./temp...
    Questionable Content Screensaver: Questionable Content Screensaver: Should be pretty self-explanatory; install the appropriate version for your computer (x64 or x86). Features include caching comics for offline viewi...
    Quick Performance Monitor: Version 1.4.1: Added option to change the 'minimum' maximum value visible on the graph at run-time. Also fixed a number of other bugs.
    Refix - .NET dependency management: Refix v0.1.0.82 ALPHA: This has now been run against a real-life project to tease out some of the issues. While this remains alpha software, which you use at your own ris...
    Rhyduino - Arduino and Managed Code: Beta Release (v0.8.2): Contents: Sample Project - demonstrates basic functionality and is flooded with code comments, so it's capable of being used as a learning tool. It d...
    Runes of Magic - Heilerrechner: Rom_Heiler_0.1: First version of "RoM Heilerrechner". The .NET 4.0 Framework is required; you can get it here: http://www.microsoft.com/downloads/details.aspx?...
    Semagsoft Calculator: 2.0: New theme and bug fixes
    Silverlight Reporting: Release 2: Updated to correct an issue in the report footer xaml, and to add support for a calculated report footer.
    Stacky StackApps .Net Client Library: Beta Preview: This is a beta preview to go along with the StackApps beta.
    TwitterDotNet: TwitterDotNet Library: first version
    UnOfficial AW Wrapper dot Net: Aw Wrapper 1.0.0.0 (5.0): New Functions :D
    ValiVIN: ValiVIN first release: First iteration. METHODS: IsValid(string vin) - checks if a string is a valid VIN (returns true or false); GetCheckSumValue(string vin) - returns... (see the check-digit sketch after this summary)
    VCC: Latest build, v2.1.30611.0: Automatic drop of latest build
    ViewModelSupport: ViewModelSupport 1.0: Version 1.0. More information: http://houseofbilz.net/archives/2010/05/08/adventures-in-mvvm-my-viewmodel-base/ http://houseofbilz.net/archives/201...
    VolgaTransTelecomClient: v.1.0.3.0: v.1.0.3.0
    WCF Client Generator: Version 0.9.3.19259: Changed: always generate full type names for parameters and return types
    WCF Client Generator: Version 0.9.3.21153: Fixed: service contracts namespace generation. Added: templates assembly code base read from configuration
    Xen: Graphics API for XNA: Xen 2.0 ALPHA: This is a very early alpha for Xen 2.0. Please note: the documentation for this alpha has not been updated yet. Xen 2.0 is not backwards compatib...
    ZGuideTV.NET: ZGuideTV.NET 0.93: Friday 11 April 2010 (ZGuideTV.NET beta 9 build 0.93) - English below. Added: content classification in the description (legend display if...

    Most Popular Projects: CAML Generator, SharePoint Geographic Data Visualizer, DbIdiom for ADO.NET Core, study, DTSRun Job Runner, XBStudio.asp.net.automation, Silverlight load on demand with MEF, Cloud Business Services, SharePoint 2010 Taxonomy Import Utility, STS Federation Metadata Editor

    Most Active Projects: Rhyduino - Arduino and Managed Code, patterns & practices – Enterprise Library, jQuery Library for SharePoint Web Services, NB_Store - Free DotNetNuke Ecommerce Catalog Module, Community Forums NNTP bridge, Cassandraemon, BlogEngine.NET, MediaCoder.NET, Microsoft Silverlight Media Framework, Andrew's XNA Helpers
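    The ValiVIN entry above only hints at what its two methods do, so here is a hedged sketch of the standard VIN check-digit rule (ISO 3779 / the North American scheme): letters are transliterated to digit values, each of the 17 positions gets a weight, and the weighted sum mod 11 must match position 9, with 10 written as 'X'. The method names mirror the ones quoted in the release notes, but the implementation is an illustration, not the project's actual code.

    using System;
    using System.Collections.Generic;

    // Illustrative VIN check-digit validation (North American VINs).
    static class VinCheck
    {
        // ISO 3779 letter-to-value transliteration; I, O and Q are not allowed.
        static readonly Dictionary<char, int> Values = new Dictionary<char, int>
        {
            {'A',1},{'B',2},{'C',3},{'D',4},{'E',5},{'F',6},{'G',7},{'H',8},
            {'J',1},{'K',2},{'L',3},{'M',4},{'N',5},{'P',7},{'R',9},
            {'S',2},{'T',3},{'U',4},{'V',5},{'W',6},{'X',7},{'Y',8},{'Z',9}
        };

        // Positional weights for the 17 characters; position 9 (the check digit) weighs 0.
        static readonly int[] Weights = { 8, 7, 6, 5, 4, 3, 2, 10, 0, 9, 8, 7, 6, 5, 4, 3, 2 };

        public static int GetCheckSumValue(string vin)
        {
            int sum = 0;
            for (int i = 0; i < 17; i++)
            {
                char c = char.ToUpperInvariant(vin[i]);
                int value;
                if (char.IsDigit(c)) value = c - '0';
                else if (!Values.TryGetValue(c, out value)) return -1; // invalid character
                sum += value * Weights[i];
            }
            return sum % 11; // 0..10, where 10 is written as 'X'
        }

        public static bool IsValid(string vin)
        {
            if (string.IsNullOrEmpty(vin) || vin.Length != 17) return false;
            int check = GetCheckSumValue(vin);
            if (check < 0) return false;
            char expected = check == 10 ? 'X' : (char)('0' + check);
            return char.ToUpperInvariant(vin[8]) == expected;
        }
    }

    As a quick check, the textbook example VIN 1M8GDM9AXKP042788 validates: its weighted sum is 351, 351 mod 11 is 10, and its ninth character is indeed 'X'.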

    Read the article

  • CodePlex Daily Summary for Tuesday, November 15, 2011

    Popular Releases

    THE NVL Maker: The NVL Maker Ver 3.10: [The release notes were in Chinese and are garbled in this copy; the recoverable fragments mention TJS/EXP scripting changes, @if~@elsif~@else~@endif support, FantasyDR's Wizard.exe (http://code.google.com/p/nvlmaker-wizard/), KAGConfigEx2.exe (http://kcddp.keyfc.net/bbs/viewthread.php?tid=1374&extra=page%3D1), skin updates, and changes to macro_map.ks, AnimPlayer.ks and macro.ks.]
    CreateHandouts: Latest Version: Latest version
    SQL Monitor - tracking sql server activities: SQLMon 4.1 alpha2: 1. Improved object search: escapes special characters, supports search histories, and remembers the search option. 2. Allows the user to set a connection timeout. 3. Allows the user to drag & drop SQL text or files into editors.
    SCCM Client Actions Tool: SCCM Client Actions Tool v0.8: SCCM Client Actions Tool v0.8 is currently the latest version. It comes with the following changes since the last version: Added "Wake On LAN" action; WOL.EXE is now included. Added new action "Get all active advertisements" to list all machine-based advertisements on remote computers. Added new action "Get all active user advertisements" to list all user-based advertisements for logged-on users on remote computers. Added config.ini setting "enablePingTest" to control whether the ping test is ru...
    Windows Azure SDK for PHP: Windows Azure SDK for PHP v4.0.4: INSTALLATION: Windows Azure SDK for PHP requires no special installation steps. Simply download the SDK, extract it to the folder you would like to keep it in, and add the library directory to your PHP include_path. INSTALLATION VIA PEAR: Maarten Balliauw provides an unofficial PEAR channel via http://www.pearplex.net. Here's how to use it. New installation: pear channel-discover pear.pearplex.net pear install pearplex/PHPAzure. Or if you've already installed PHPAzure before: pear upgrade p...
    QuickGraph, Graph Data Structures And Algorithms for .Net: 3.6.61116.0: Portable library build that allows QuickGraph to be used in any .NET environment: .NET 4.0, Silverlight 4.0, WP7, Win8 Metro apps.
    Devpad: 4.7: What's new for Devpad 4.7: new export to Rich Text, new export to FlowDocument, minor bug fixes, improvements and speed-ups
    Weapsy: 0.4.1 Alpha: Edit Text bug fixed
    Desktop Google Reader: 1.4.2: This release removes the like and the broadcast buttons as Google Reader stopped supporting them (no, we don't like this decision...). Additionally, and to have at least a small plus: the login window now automatically logs you in if you stored username and password (no more extra click needed). Finally, added WebKit .NET to the about window and removed Awesomium. MD5-Hash: 5fccf25a2fb4fecc1dc77ebabc8d3897 SHA-Hash: d44ff788b123bd33596ad1a75f3b9fa74a862fdb
    Fluent Validation for .NET: 3.2: Changes since 3.1: fixed issue #7084 (NotEmptyValidator does not work with EntityCollection<T>); fixed issue #7087 (AbstractValidator.Custom ignores RuleSets and always runs); removed support for WP7 for now as it doesn't support co/contravariance without crashing. (see the usage sketch after this summary)
    RDRemote: Remote Desktop remote configurator V 1.0.0: Remote Desktop remote configurator V 1.0.0
    Rawr: Rawr 4.2.7: This is the downloadable WPF version of Rawr! For the web-based version see http://elitistjerks.com/rawr.php. You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes. Rawr Addon: We now have a Rawr official addon for in-game exporting and importing of character data, hosted on Curse. The addon does not perform calculations like Rawr; it simply shows your exported Rawr data in WoW tooltips and lets you export your character to Rawr (including bag and bank items) like Char...
    VidCoder: 1.2.2: Updated HandBrake core to svn 4344. Fixed the 6-channel discrete mixdown option not appearing for AAC encoders. Added handling for possible exceptions when copying to the clipboard; added retries and a message when it fails. Fixed issue with audio bitrate UI not appearing sometimes when switching audio encoders. Added extra checks to protect against reported crashes. Added code to upgrade encoding profiles on old queued items.
    Media Companion: MC 3.422b Weekly: Ensure the .NET 4.0 Full Framework is installed (available from http://www.microsoft.com/download/en/details.aspx?id=17718). Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) TV Show Resolutions... Made the TV Shows folder list sorted. Re-enabled 'Manually Add Path' in Root Folders. Sorted the list to process during new TV episode search. Rebuild Movies now processes through folders alphabetically. Fix for issue #208 - Display Missing Episodes is not popu...
    DotSpatial: DotSpatial Release Candidate 1 (1.0.823): Supports loading extensions using System.ComponentModel.Composition. DemoMap compiled as x86 so that GDAL runs on x64 machines. How to: Use an Assembly from the Web: Be aware that your browser may add an identifier to downloaded files which results in "blocked" DLL files. You can follow the following link to learn how to "unblock" files: right-click on the zip file before unzipping, choose Properties, go to the General tab and click the Unblock button. http://msdn.microsoft.com/en-us/library...
    XPath Visualizer: XPathVisualizer v1.3 Latest: This is v1.3.0.6 of XPathVisualizer, an update release for v1.3. These work items have been fixed since v1.3.0.5: 7429, 7432, 7427
    MSBuild Extension Pack: November 2011: Release Blog Post. The MSBuild Extension Pack November 2011 release provides a collection of over 415 MSBuild tasks. A high-level summary of what the tasks currently cover includes the following: System items: Active Directory, Certificates, COM+, Console, Date and Time, Drives, Environment Variables, Event Logs, Files and Folders, FTP, GAC, Network, Performance Counters, Registry, Services, Sound. Code: Assemblies, AsyncExec, CAB Files, Code Signing, DynamicExecute, File Detokenisation, GU...
    Extensions for Reactive Extensions (Rxx): Rxx 1.2: What's New / Related Work Items: Please read the latest release notes for details about what's new. Content summary: Rxx provides the following features; see the Documentation for details. Many IObservable<T> extension methods and IEnumerable<T> extension methods. Many useful types such as ViewModel, CommandSubject, ListSubject, DictionarySubject, ObservableDynamicObject, Either<TLeft, TRight>, Maybe<T> and others. Various interactive labs that illustrate the runtime behavior of the extensio...
    Facebook C# SDK: v5.3.2: This is an RTW release which adds new features and bug fixes to v5.2.1. Query/QueryAsync methods use the Graph API instead of the legacy REST API. Removed the dependency on Code Contracts. Enabled Task Parallel support in .NET 4.0+ (experimental). Added support for an early preview of .NET 4.5 (binaries not distributed on CodePlex or nuget.org; build manually from Facebook-Net45.sln). Added additional method overloads for .NET 4.5 to support IProgress<T> for upload progress. Added ne...
    Delete Inactive TS Ports: List and delete the Inactive TS Ports: UPDATE: Added support for Windows 2003 servers and removed some null reference errors when the registry key was not present. The InactiveTSPortList.EXE accepts command-line arguments; the InactiveTSPortList.Standalone.WithoutPrompt.exe runs as a standalone exe without the need for any command-line arguments.

    New Projects

    AFNC: test
    Arithmetics: arithmetics for silverlight use note pattern by time stream
    Azon.Library: A collection of extensions, static helpers, AOP attributes. More will be added as the project goes on.
    Chat TextBlock Control: A Windows Phone 7.1 control resembling the chat balloon textblocks in the SMS app
    Diamond Framework: A common framework for Diamond Group.
    DNN Social Helpers: DNN Social Helpers
    Dragon: Dragon
    Easy Video Cropper: A simple application to make cropping videos easy for anyone. Automatically detects black lines. Uses FFMPEG.
    Fluent Resource Mapper: This project aims to develop a framework to assist the internationalization of software using the Convention over Configuration paradigm.
    Fully Observable: This project is to create an improved set of observable collections. It provides notifications for when items inside the collection change as well as when the collection itself changes.
    grpcmnq: no summary at all
    MathTool: Math tool for Silverlight. We plan three parts: matrix, differential equations, and equation of locus.
    nopCommerce Buckaroo payment provider plugin: This is a payment provider plugin for the Dutch payment provider BUCKAROO. This plugin is developed and tested for nopCommerce version 2+.
    Phoenix MVVM+C Framework: Phoenix MVVM+C Framework
    PowerLib: PowerLib extends the system .NET library.
    RDRemote: This utility allows enabling Remote Desktop connections from a remote computer using WMI.
    Sencha Touch Mini Workflow Framework: A workflow framework for Sencha Touch mobile apps including automatic component management
    ShWP: helper library for Windows Phone
    Timer, Cronômetro e Despertador: Project developed in the C# extension course at UFSCar Sorocaba
    UtilityLibrary.Ajax: Ajax
    UtilityLibrary.Email: email
    UtilityLibrary.FormBase: UtilityLibrary.FormBase
    UtilityLibrary.Http: UtilityLibrary for HttpWebRequest
    UtilityLibrary.Ormapping: ormapping
    VoiceModel: VoiceModel is a project which makes it easier to develop VoiceXML applications using ASP.Net MVC with Razor. It uses the MVVM (Model-View-VoiceModel) design pattern to abstract the voice application to a higher level. It is developed in C# and Razor.
    WebSite.Request: WebSite.Request launches a web request (via XMLHTTP) against a website. Use it, for example, to make an initial request to a SharePoint URL and escape the "slow first request" problem. (see the warm-up sketch after this summary)
    Where's my lei, man?: Where's my lei, man?
    Zombsquare: Sample application for Windows Phone used in the 2011 Windows Phone Roadshow in Spain. In this solution you can find examples of: Blend design, Bing Maps, geolocation, augmented reality, converters, a mini trivia game, object serialization ... surviving a Zombie apocalypse...
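    Since the Fluent Validation 3.2 notes above mention NotEmptyValidator, AbstractValidator.Custom and RuleSets without context, here is a minimal usage sketch of the library's basic pattern, based on its well-known public API of the 3.x era; the Customer type is hypothetical.

    using System;
    using FluentValidation;

    // Hypothetical model, used only for illustration.
    public class Customer
    {
        public string Name { get; set; }
        public int Age { get; set; }
    }

    // Typical FluentValidation usage: derive from AbstractValidator<T>
    // and declare rules in the constructor.
    public class CustomerValidator : AbstractValidator<Customer>
    {
        public CustomerValidator()
        {
            RuleFor(c => c.Name).NotEmpty();      // backed by the NotEmptyValidator from the notes
            RuleFor(c => c.Age).GreaterThan(0);
        }
    }

    public static class Demo
    {
        public static void Main()
        {
            var result = new CustomerValidator().Validate(new Customer { Name = "", Age = 5 });
            Console.WriteLine(result.IsValid); // false: Name is empty
        }
    }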
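    Likewise, the WebSite.Request description is terse, so here is what such a warm-up amounts to in plain .NET. This is a generic sketch of the idea, not the project's code, and the URL is a placeholder.

    using System;
    using System.Net;

    // Sketch: issue one throwaway GET so an IIS/SharePoint application pool is
    // compiled and warm before the first real user arrives.
    class WarmUp
    {
        static void Main()
        {
            var request = (HttpWebRequest)WebRequest.Create("http://sharepoint.example.com/"); // placeholder URL
            request.Method = "GET";
            request.Timeout = 120000; // the first hit can be slow; allow two minutes

            try
            {
                using (var response = (HttpWebResponse)request.GetResponse())
                {
                    Console.WriteLine("Warmed up, status: " + response.StatusCode);
                }
            }
            catch (WebException ex)
            {
                // Even an error response has done the warm-up work; just log it.
                Console.WriteLine("Warm-up request failed: " + ex.Message);
            }
        }
    }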

    Read the article
