Search Results

Search found 4879 results on 196 pages for 'geeks'.


  • BizTalk 2009 - BizTalk Server Best Practice Analyser

    - by StuartBrierley
    The BizTalk Server Best Practices Analyser allows you to carry out a configuration-level verification of your BizTalk installation, evaluating the deployed configuration but not modifying or tuning anything that it finds. The Best Practices Analyser uses "reading and reporting" to gather data from different sources, such as Windows Management Instrumentation (WMI) classes, SQL Server databases and registry entries.

    When I first ran the analyser I got a number of errors. If you get any errors these should all be acted upon to resolve them; you should then run the scan again and see if anything else is reported that needs acting upon. As you can see in the image above, the initial issue that jumped out at me was that the SQL Server Agent was not started. The reason for this was absent-mindedness - this run was against my development PC and I don't have SQL/BizTalk actively running unless I am using them. Starting the agent service and running the scan again gave me the following results: this resolved most of the issues for me, but the next major issue to look at was that there was no tracking host running. You can also see that I was still getting an error with two of the SQL jobs. The problem here was that I had not yet configured these two SQL jobs. Configuring the backup and purge jobs and then starting the tracking host before running the scan again cleared all the critical issues, but I did still have a number of warnings. For example, on this report I was warned that the BizTalk MessageBox is hosted on the BizTalk Server. While this is known to be less than ideal, it is as I expected on my development environment, where I have installed Visual Studio, SQL and BizTalk on my laptop, and I was happy to ignore this and other similar warnings. In your case you should take a look at any warnings you receive and decide what you want to do about each of them in turn.
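
    As an aside, the kind of WMI check the analyser performs can be reproduced by hand. This is just a rough sketch of the idea, not the tool's actual implementation; it queries the state of the SQL Server Agent service that tripped my first scan (the service name shown is the default instance's and may differ on your machine):

        using System;
        using System.Management;  // requires a reference to System.Management.dll

        class ServiceCheck
        {
            static void Main()
            {
                // WQL query for the SQL Server Agent service of a default instance
                var searcher = new ManagementObjectSearcher(
                    "SELECT Name, State, StartMode FROM Win32_Service WHERE Name = 'SQLSERVERAGENT'");

                foreach (ManagementObject service in searcher.Get())
                {
                    // A best-practices style check: warn when the agent is not running
                    Console.WriteLine("{0}: {1} (start mode: {2})",
                        service["Name"], service["State"], service["StartMode"]);
                }
            }
        }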

    Read the article

  • Speaking at Dog Food Conference 2013

    - by Brian T. Jackett
    Originally posted on: http://geekswithblogs.net/bjackett/archive/2013/10/22/speaking-at-dog-food-conference-2013.aspx

    It has been a couple of years since I last attended / spoke at Dog Food Conference, but on Nov 21-22, 2013 I'll be speaking at Dog Food Conference 2013 here in Columbus, OH. For those of you confused by the name of the conference (no, it's not about dog food), read up on the concept of dogfooding. This conference has a history of great sessions from local and regional speakers and I look forward to being a part of it once again. Registration is now open (registration link) and is expected to sell out quickly. Reserve your spot today.

    Title: The Evolution of Social in SharePoint
    Audience and Level: IT Pro / Architect, Intermediate
    Abstract: Activities, newsfeed, community sites, following... these are just some of the big changes introduced to the social experience in SharePoint 2013. This class will discuss the evolution of the social components since SharePoint 2010, the architecture (distributed cache, microfeed, etc.) that supports the social experience, Yammer integration, and proper planning considerations when deploying social capabilities (personal sites, SkyDrive Pro and distributed cache). This session will include demos of the social newsfeed, community sites, and mentions. Attendees should have an intermediate knowledge of SharePoint 2010 or 2013 administration.

    -Frog Out

    Read the article

  • Copy TFS Build Definitions between Projects and Collections

    - by Jakob Ehn
    Originally posted on: http://geekswithblogs.net/jakob/archive/2014/06/05/copy-tfs-build-definitions-between-projects-and-collections.aspx

    Over the last couple of years it has become apparent that using multiple team projects in TFS is generally a bad idea. There are of course exceptions to this, but there are a lot of things that become much easier when you put all of your projects and teams in the same team project. Fellow ALM MVP Martin Hinshelwood has blogged about this several times, as have other people in the community. In particular, using the backlog and portfolio management tools makes much more sense when everything is located in the same team project.

    Consolidating multiple team projects into one is unfortunately not that easy; it involves migrating source code, work items, reports etc. Another thing that also needs to be migrated is build definitions. It is possible to clone build definitions within the same team project using the TFS power tools, and the Community TFS Build Manager also lets you clone build definitions to other team projects. But there is no tool that allows you to clone/copy a build definition to another collection. So, I whipped up a simple console application that lets you do this. The tool can be downloaded from https://onedrive.live.com/redir?resid=EE034C9F620CD58D!8162&authkey=!ACTr56v1QVowzuE&ithint=file%2c.zip

    Using CopyTFSBuildDefinitions

    You use the tool like this:

        CopyTFSBuildDefinitions SourceCollectionUrl SourceTeamProject BuildDefinitionName DestinationCollectionUrl DestinationTeamProject [NewDefinitionName]

    Arguments

    SourceCollectionUrl - The URL to the TFS collection that contains the team project with the build definition that you want to copy
    SourceTeamProject - The name of the team project that contains the build definition
    BuildDefinitionName - Name of the build definition
    DestinationCollectionUrl - The URL to the TFS collection that contains the team project that you want to copy your build definition to
    DestinationTeamProject - The name of the team project in the destination collection
    NewDefinitionName (Optional) - Use this to override the name of the new build definition. If you don't specify this, the name will be the same as the original one

    Example:

        CopyTFSBuildDefinitions https://jakob.visualstudio.com DemoProject WebApplication.CI https://anotheraccount.visualstudio.com

    Notes

    Since we are (potentially) creating a build definition in a new collection, there is no guarantee that the various paths that are defined in the build definition exist in the new collection. For example, a build definition refers to server paths in TFVC or repos + branches in TFGit. It also refers to build controllers that definitely don't exist in the new collection. So there will be some cleanup to do after you copy your build definitions. You can fix some of these using the Community TFS Build Manager; for example, it is very easy to apply the correct build controller to a set of build definitions.

    The problem stated above also applies to build process templates. However, the tool tries to find a build process template in the new team project with the same file name as the one that existed in the old team project. If it finds one, it will be used for the new build definition. Otherwise it will use the default build template.

    If you want to run the tool for many build definitions, you can use this SQL script, compliments of Mr. Scrum/ALM MVP Richard Hundhausen, to generate the necessary commands:

        USE Tfs_Collection
        GO
        SELECT 'CopyTFSBuildDefinitions.exe http://SERVER:8080/tfs/collection "' + P.ProjectName + '" "' + REPLACE(BD.DefinitionName,'\','') + '" http://NEWSERVER:8080/tfs/COLLECTION TEAMPROJECT'
        FROM tbl_Project P
             INNER JOIN tbl_BuildGroup BG ON BG.TeamProject = P.ProjectUri
             INNER JOIN tbl_BuildDefinition BD ON BD.GroupId = BG.GroupId
        ORDER BY P.ProjectName, BD.DefinitionName

    Hope that helps, let me know if you have any problems with the tool or if you find it useful
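
    For the curious, the core of such a tool is fairly small. The following is only a rough sketch (not the actual CopyTFSBuildDefinitions source) of how the copy could be done with the TFS client object model; error handling and the process-template matching described in the notes are omitted:

        using System;
        using Microsoft.TeamFoundation.Build.Client;
        using Microsoft.TeamFoundation.Client;

        class CopyBuildDefinitionSketch
        {
            static void Main(string[] args)
            {
                // args: sourceUrl sourceProject definitionName destUrl destProject [newName]
                var sourceCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(args[0]));
                var sourceBuildServer = sourceCollection.GetService<IBuildServer>();
                IBuildDefinition source = sourceBuildServer.GetBuildDefinition(args[1], args[2]);

                var destCollection = TfsTeamProjectCollectionFactory.GetTeamProjectCollection(new Uri(args[3]));
                var destBuildServer = destCollection.GetService<IBuildServer>();

                IBuildDefinition copy = destBuildServer.CreateBuildDefinition(args[4]);
                copy.Name = args.Length > 5 ? args[5] : source.Name;
                copy.DefaultDropLocation = source.DefaultDropLocation;
                copy.ProcessParameters = source.ProcessParameters;

                // Naively pick a process template; the real tool matches by file name,
                // and the build controller will typically need fixing up afterwards
                copy.Process = destBuildServer.QueryProcessTemplates(args[4])[0];
                copy.Save();
            }
        }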

    Read the article

  • Changing the BizTalk message output file name

    - by Bill Osuch
    By default, BizTalk creates the filename of the message dropped to a send port as %MessageID%, which is the unique identifier (GUID) of the message. What if you want to create your own filename?

    To start, create a simple schema, and a basic orchestration that will receive the message and send it right back out, like this: If you deploy this and wire up the ports, you can drop an xml file into your receive port and have it come out at your send port named something like {7A63CAF8-317B-49D5-871F-9FD57910C3A0}.xml.

    Now, we'll create a new message with a custom filename. First, create a new orchestration variable called NewFileName, of the type System.String. Next, create a second message using the same schema as the message you're receiving in the Receive shape. Now, drag a Construct Message shape to the orchestration. In the shape's properties, set Messages Constructed to be the new message you just created. Double click the Message Assignment shape (inside the Construct shape...) and paste in the following code:

        Message_2 = Message_1;

        NewFileName = Message_1(FILE.ReceivedFileName);
        NewFileName = NewFileName.Replace(".xml", "_");
        NewFileName = NewFileName + "output_" + System.DateTime.Now.Year.ToString() + "-" + System.DateTime.Now.Month.ToString();

        Message_2(FILE.ReceivedFileName) = NewFileName;

    Here we make a copy of the received message, get its original file name (ReceivedFileName), replace its extension with an underscore, and date-stamp it. Finally, add a Send shape and a Port to the surface, and configure them to send the message you just created. You should wind up with an orchestration like this: Deploy it, and create a new send port. It should be just about identical to the first send port, except this time the file name will be "%SourceFileName%.xml" (without the quotes of course). Fire up the application, drop in a test file, and you should now get both the xml file named with a GUID, and a second file named something along the lines of "MySchemaTestFile_output_2011-6.xml".

    Read the article

  • Download Internet Explorer 9 RTM

    - by Harish Ranganathan
    The much anticipated RTM release of Internet Explorer 9 (IE9) happened today. The IE9 preview release was first showcased at MIX 2010, and after that there were 7-8 Platform Preview releases. IE9 Beta came out in September 2010 with close to 10 million downloads within a month. More recently, the RC version came out with much improved performance. Today marks the launch of IE9 RTM. What this means is that, within a year, the IE team has shipped the stable product, much faster than the earlier cycles for IE8 and IE7. I wanted to clarify a few things (myths) that commonly arise:

    1. I am already using Chrome and it's faster for me, why would I need IE9?
    IE9 uses 100% hardware acceleration, which means you are going to get the best performance compared to any other browser that shipped/will ship in future. With native Windows support, IE9 will outperform all other browsers in terms of performance.

    2. What about standards and security?
    Agreed, IE6 didn't have the best standards support, but why would someone compare against IE6, which was released almost 10 years back? Later, we shipped IE7 and IE8, which had the best standards support in their timeframes, but one would agree that standards and specifications keep getting updated and it's hard to keep pace with them for older browsers. For example, HTML5 support is not there in IE8, but it is very much there in IE9. IE9 supports most of the stable standards of HTML5, and preview releases are planned for the work-in-progress standards.

    3. IE doesn't keep pace with other browsers.
    Agreed! We don't force/release updates on major versions in very short time periods. What we do is provide Windows Update, which delivers security updates/patches and other critical updates for not just IE but the whole of the Windows operating system.

    4. I am running Windows XP, what do I do?
    This is the trickiest part. Windows XP isn't a supported operating system for IE9, and there are various reasons for it. The recommended operating systems are Windows Vista and Windows 7. In the interest of technology and its pace, we had to discontinue Windows XP both from a retail selling perspective as well as from IE9 support. But the past 2 years have seen PCs/laptops shipped only with Windows Vista or Windows 7, so it shouldn't affect them.

    5. Where do I verify IE9's performance/standards support and other information?
    http://samples.msdn.microsoft.com/ietestcenter/. Here below is a snapshot of one of the tests. Clearly IE9 outperforms all other browsers and will continue to outperform them in future.

    You can download IE9 from www.beautyoftheweb.com

    Cheers!!!

    Read the article

  • Paper Gold Rush

    - by Chris G. Williams
    The last few days at the shop have been reminiscent of a marathon of Pawn Stars. Quite a few people have come in wanting to trade for store credit. Most of them have left disappointed. We did pick up a few things here and there (which hopefully I can sell). The problem, in a nutshell, is that people get it in their head that a (YuGiOh) card is worth X amount because they looked it up 2-3 years ago, or someone told them it was valuable... then they play it in their deck for a year without sleeves, and cram it in a binder covered in duct tape. By the time they bring the cards in to me, new sets have come out which often de-value the tournament usefulness of the card from $20 to *maybe* 50 cents, in mint condition. Which means I can offer them about 10-15 cents... only they are almost never in mint condition, which means I usually offer them nothing at all.

    Most of the time, you can watch their smile fade as I start going through their cards. It's kinda sad, really, since I know they think they've spent the last two years walking around with the keys to their own personal gold mine. I don't really enjoy seeing that look on a child's face. I like kids and I remember those moments when perception and reality crashed headlong into each other. It was seldom pretty. So, when I'm talking to a child, I try to take it easy on them and give them some suggestions on how to better preserve their cards. Sometimes though, it's an adult. Depending on the situation, my response to them varies pretty broadly. Most of the time though, I still feel pretty bad when it doesn't go their way.

    Read the article

  • Windows Azure Evolution – Deploy Web Sites (WAWS Part 3)

    - by Shaun
    This is the sixth post of my Windows Azure Evolution series. After talking a bit about the new caching preview feature in the previous one, let's get back to Windows Azure Web Sites (WAWS).

    Git and GitHub Integration

    In the third post I introduced the overall functionality of WAWS and demonstrated how to create a WordPress blog through the built-in application gallery. And in the fourth post I covered how to use the TFS service preview to deploy an ASP.NET MVC application to the web site through the TFS integration. WAWS also has Git integration. I'm not going to talk in much detail about the Git and GitHub integration since there is a bunch of information on the internet you can refer to. To enable Git, just go to the web site item in the developer portal and click "Set up Git publishing". After specifying the username and password, the Windows Azure platform will establish the Git integration and provide some basic guidance. As you can see, you can download the Git binaries, commit the files and then push to the remote repository. Regarding GitHub, since it's built on top of Git it should work as well. Maarten Balliauw has a wonderful post about how to integrate GitHub with Windows Azure Web Sites, which you can find here.

    WebMatrix 2 RC

    WebMatrix is a lightweight web application development tool provided by Microsoft. It utilizes WebDeploy or FTP to deploy the web application to the server. And WebMatrix 2.0 RC added the feature to work with Windows Azure. First of all we need to download the latest WebMatrix 2 through the Web Platform Installer 4.0. Just open the WebPI and search for "WebMatrix", or go to its home page and download its web installer.

    Once we have WebMatrix 2, we need to download the publish file of our WAWS. Let's go to the developer portal, open the web site we want to deploy, and download the publish file from the link on the right hand side. This file contains the necessary information for publishing the web site through WebDeploy and FTP, and it can be used in WebMatrix, Visual Studio, etc.

    Once we have the publish file we can open WebMatrix and click Open Site, Remote Site. This brings up a dialog where we can input the information of the remote site. Since we have our publish file already, we can click "Import publish settings" and select the publish file; the site information will then be populated automatically. Click OK, and WebMatrix will connect to the remote site - the WAWS we had deployed already - and retrieve the folder and file information. We can open files in WebMatrix and modify them. But since WebMatrix is a lightweight web application tool, we cannot update the backend C# code. So in this case, we will modify the frontend home page only.

    After saving our modification, WebMatrix compares the local and remote files and uploads only the modified files to Windows Azure, using the connection information in the publish file. Since it only updates the files which were changed, this minimizes the bandwidth and deployment duration. After a few seconds we go back to the website and the modification has been applied.

    Visual Studio and WebDeploy

    The publish file we downloaded can be used not only in WebMatrix but also in Visual Studio. As we know, in Visual Studio we can publish a web application by clicking the "Publish" item from the project context menu in the solution explorer, and we can specify WebDeploy, FTP or File System as the publish target.

    Now we can use the WAWS publish file to let Visual Studio publish the web application to WAWS. Let's create a new ASP.NET MVC Web Application in Visual Studio 2010 and then click "Publish" in the solution explorer. Once we have the Windows Azure SDK 1.7 installed, it updates the web application publish dialog, so now we can import the publish information from the publish file. Select WebDeploy as the publish method. We could select FTP as well, which is supported by Windows Azure, and the FTP information is in the same publish file.

    In the last step the publish wizard can check the files which will be uploaded to the remote site before the actual publishing. This gives us a chance to review and amend the files. As with WebMatrix, Visual Studio compares the files between local and WAWS and determines which have been changed and need to be published. Finally Visual Studio publishes the web application to Windows Azure through the WebDeploy protocol. Once it finishes we can browse our website.

    FTP Deployment

    The publish file we downloaded contains the connection information for our web site via both WebDeploy and FTP. When using WebMatrix and Visual Studio we can select WebDeploy or FTP. The WebDeploy method can be used very easily from WebMatrix and Visual Studio, with the file compare feature. But FTP gives more flexibility. We can use any FTP client to upload files to Windows Azure regardless of which client and OS we are using.

    Open the publish file in any text editor, and we can find the connection information very easily. As you can see, the publish file is actually an XML file with the WebDeploy and FTP information in plain text attributes. And once we have the FTP URL, username and password, we can connect to the site and upload and download files. For example, I opened FileZilla and connected to my WAWS through FTP. Then I downloaded the files I was interested in, modified them on my local disk, and uploaded them back to Windows Azure through FileZilla. Then I can see the new page.

    Summary

    In this simple and quick post I introduced various approaches to deploy our web application to Windows Azure Web Sites. It supports TFS integration, which I mentioned previously. It also supports Git and GitHub, WebDeploy and FTP as well.

    Hope this helps,
    Shaun

    All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
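
    An addendum on the FTP section: since the publish file is just XML, pulling the FTP endpoint out of it programmatically takes only a few lines. Here is a minimal sketch, assuming the standard .publishsettings layout with one publishProfile element per publish method; the file name used here is made up:

        using System;
        using System.Linq;
        using System.Xml.Linq;

        class PublishSettingsReader
        {
            static void Main()
            {
                // Load the publish file downloaded from the developer portal
                var doc = XDocument.Load("mysite.publishsettings");

                // Pick the profile whose publishMethod attribute is "FTP"
                var ftpProfile = doc.Descendants("publishProfile")
                    .First(p => (string)p.Attribute("publishMethod") == "FTP");

                Console.WriteLine("URL:  {0}", (string)ftpProfile.Attribute("publishUrl"));
                Console.WriteLine("User: {0}", (string)ftpProfile.Attribute("userName"));
                Console.WriteLine("Pwd:  {0}", (string)ftpProfile.Attribute("userPWD"));
            }
        }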

    Read the article

  • Windows Azure Recipe: Mobile Computing

    - by Clint Edmonson
    A while back, mashups were all the rage. The idea was to compose solutions that provided aggregation and integration across applications and services to make information more available, useful, and personal. Mashups ushered in the era of Web 2.0 in all its socially connected goodness. They taught us that to be successful, we needed to add web service APIs to our web applications. Web and client based mashups met with great success and have evolved even further with the introduction of the internet connected smartphone. Nothing is more available, useful, or personal than our smartphones. The current generation of cloud connected mobile computing mashups allows our mobilized workforces to receive, process, and react to information from disparate sources faster than ever before.

    Drivers

    Integration
    Reach
    Time to market

    Solution

    Here's a sketch of a prototypical mobile computing solution using Windows Azure:

    Ingredients

    Web Role – with the phone running a dedicated client application, the web role is responsible for serving up backend web services that implement the solution's core connected functionality.
    Database – used to store core operational and workflow data for the solution's web services.
    Access Control – this service is used to authenticate and manage users' identity, roles, and groups, possibly in conjunction with 3rd party identity providers such as Windows LiveID, Google, Yahoo!, and Facebook.
    Worker Role – this role is used to handle the orchestration of long-running, complex, asynchronous operations. While much of the integration and interaction with other services can be handled directly by the mobile client application, it's possible that the backend may need to integrate with 3rd party services as well. Offloading this work to a worker role better distributes computing resources and keeps the web roles focused on direct client interaction.
    Queues – these provide reliable, persistent messaging between applications and processes. They are an absolute necessity once asynchronous processing is involved. Queues facilitate the flow of distributed events and allow a solution to send push notifications back to mobile devices at appropriate times. (A small code sketch of this ingredient follows at the end of this entry.)

    Training & Resources

    These links point to online Windows Azure training labs and resources where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)

    Windows Azure (16 labs)
    Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.

    SQL Azure (7 labs)
    Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.

    Windows Azure Services (9 labs)
    As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.

    Windows Azure Toolkit for Windows Phone
    The Windows Azure Toolkit for Windows Phone is designed to make it easier for you to build mobile applications that leverage cloud services running in Windows Azure. The toolkit includes Visual Studio project templates for Windows Phone and Windows Azure, class libraries optimized for use on the phone, sample applications, and documentation.

    Windows Azure Toolkit for iOS
    The Windows Azure Toolkit for iOS is a toolkit for developers to make it easy to access Windows Azure storage services from native iOS applications. The toolkit can be used for both iPhone and iPad applications, developed using Objective-C and XCode.

    Windows Azure Toolkit for Android
    The Windows Azure Toolkit for Android is a toolkit for developers to make it easy to work with Windows Azure from native Android applications. The toolkit can be used for native Android applications developed using Eclipse and the Android SDK.

    See my Windows Azure Resource Guide for more guidance on how to get started, including links to web portals, training kits, samples, and blogs related to Windows Azure.
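
    As promised in the Queues ingredient above, here is a minimal sketch of the hand-off between a web role and a worker role, assuming the Windows Azure storage client library; the connection string and queue name are placeholders:

        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        class QueueSketch
        {
            static void Main()
            {
                // Placeholder connection string; use your storage account in a real role
                var account = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
                var queueClient = account.CreateCloudQueueClient();
                var queue = queueClient.GetQueueReference("notifications");
                queue.CreateIfNotExist();

                // Web role side: hand off a long-running notification job
                queue.AddMessage(new CloudQueueMessage("notify:device-123"));

                // Worker role side: poll, process, then delete the message
                var message = queue.GetMessage();
                if (message != null)
                {
                    // ... push the notification to the device here ...
                    queue.DeleteMessage(message);
                }
            }
        }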

    Read the article

  • How to change the Target URL of EditForm.aspx, DispForm.aspx and NewForm.aspx

    - by Jayant Sharma
    Hi All,

    For changing the URL of list forms we have very limited options. On the Internet you will find lots of articles about customizing list forms using SharePoint Designer, but the disadvantage of SharePoint Designer is that you cannot create a WSP of the customization, meaning whatever changes you have made in the UAT environment you need to repeat at Production. This is the main reason I do not like SharePoint Designer much. I always prefer to create a WSP file, and deployment of the WSP file will do all of my work.

    Now, if you want to change the list form URL using Visual Studio, you have to either create a new List Definition or create a new Content Type. I found some very good articles and want to share them:

    http://www.codeproject.com/Articles/223431/Custom-SharePoint-List-Forms
    http://community.bamboosolutions.com/blogs/sharepoint-2010/archive/2011/05/12/sharepoint-2010-cookbook-how-to-create-a-customized-list-edit-form-for-development-in-visual-studio-2010.aspx

    Whenever you create either a List Definition or a Content Type and specify the URL for the list forms, SharePoint will automatically add the ListID and ItemID (no ItemID in the case of NewForm.aspx) to the URL as query string parameters. So if you want to redirect or apply some logic, you can, since you have the ItemID as well as the ListID.
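
    As a quick illustration of that last point, a custom form page could read those query string values back into SharePoint objects. This is only a minimal sketch, assuming the "List" and "ID" query string parameter names that SharePoint commonly appends to its form URLs:

        using System;
        using Microsoft.SharePoint;
        using Microsoft.SharePoint.WebControls;

        public class CustomEditFormPage : LayoutsPageBase
        {
            protected override void OnLoad(EventArgs e)
            {
                base.OnLoad(e);

                // SharePoint appends these to the form URL automatically
                var listId = new Guid(Request.QueryString["List"]);
                int itemId = int.Parse(Request.QueryString["ID"]);

                SPList list = SPContext.Current.Web.Lists[listId];
                SPListItem item = list.GetItemById(itemId);

                // Redirect or apply custom logic based on the item here
            }
        }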

    Read the article

  • Silverlight Cream for February 05, 2011 -- #1041

    - by Dave Campbell
    In this Issue: Peter Kuhn, Mike Ormond(-2-, -3-), WindowsPhoneGeek, Daniel N. Egan, Phil Middlemiss(-2-), Max Paulousky, Michael Washington.

    Above the Fold:
    Silverlight: "Designing for Browser-Zoom: Part 2" Phil Middlemiss
    WP7: "Talking about Converters in WP7 | Coding4fun toolkit converters in depth" WindowsPhoneGeek
    Lightswitch: "LightSwitch: Can We Handle The Truth?" Michael Washington

    Shoutouts:
    András Velvárt has a video up of some awesome changes he has planned for SurfCube, check it out: SurfCube V2 - 3D Web Browser for Windows Phone 7, now with tabs!

    From SilverlightCream.com:

    Silverlight for keyboard junkies
    Peter Kuhn has a post up talking about the issues surrounding trying to use the tab key to navigate between controls... and follows it up with a behavior that resolves it.

    Windows Phone 7 Content On Demand
    Mike Ormond has a batch of WP7 videos up... this first is "Windows Phone 7: A Different Kind of Phone" with Andrej Radinger.

    Windows Phone 7 Content on Demand Pt 2
    Mike Ormond's 2nd WP7 video is "Understanding the Windows Phone 7 Development Tools and Getting Started" with Maarten Struys.

    Windows Phone 7 Content on Demand Pt 3
    Mike Ormond's 3rd WP7 Content on Demand is "Games Programming on Windows Phone 7 with Silverlight and XNA" with Rob Miles.

    Talking about Converters in WP7 | Coding4fun toolkit converters in depth
    WindowsPhoneGeek is discussing value converters in his latest post... value converters for WP7... and the ones in the Coding4Fun toolkit to be exact... everything you wanted to know about them but didn't know to ask :)

    WP7 Developer Tools - Jan Update
    Daniel N. Egan has information up about the new WP7 Developer Tools release.

    Designing for Browser-Zoom: Part 1
    Phil Middlemiss has both parts of a series on Browser Zoom up... this first part covers the zoom and the different pieces involved.

    Designing for Browser-Zoom: Part 2
    Phil Middlemiss's part 2 shows us some design considerations and visual states, including an attached behavior you can use in Blend to respond to the zoom event.

    Windows Phone Copy-Paste: How It Looks and Works
    Max Paulousky has the first post I've seen on WP7 Copy/Paste up... of course it's still in the emulator, but hey... that's better than nothing, right?

    LightSwitch: Can We Handle The Truth?
    Have you been playing with Lightswitch? Well... Michael Washington has, and it's got his interest up far enough that he's waving the flags trying to attract everyone else over there as well... see if you agree.

    Stay in the 'Light!
    Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream
    Join me @ SilverlightCream | Phoenix Silverlight User Group

    Technorati Tags: Silverlight, Silverlight 3, Silverlight 4, Windows Phone, MIX10

    Read the article

  • Web Essentials extension now available for Visual Studio Express

    - by ihaynes
    Originally posted on: http://geekswithblogs.net/ihaynes/archive/2014/08/12/web-essentials-extension-now-available-for-visual-studio-express.aspx

    Visual Studio Express has always had one big issue for me: the inability to install any (or very few) of the normal Visual Studio extensions. One of the best extensions for front-end development and design is the Web Essentials extension by Mads Kristensen on the VS team. It was a hugely welcome surprise to find that this extension is now available, in full, for VS Express, and has been since May, darn it!

    It has a huge number of useful features, not least being able to minify HTML, CSS and JavaScript files, which is almost a prerequisite for responsive sites. It can also bundle them together. There are lots of other features for HTML editing: intellisense and syntax highlighting for robots.txt files; 'surround with tag'; intellisense for meta tags (including iOS, Twitter and Facebook); generating vendor-specific CSS, etc. The full details can be found at the dedicated site: http://vswebessentials.com

    This extension alone makes VS Express far more useful and worth having.

    Read the article

  • Due to the Classes

    - by Ratman21
    Why does it seem that I am always saying sorry (or in Japanese, gomennasai)? Well, I am late again with this blog, as you can see. Part 1 of the CCNA class (also known as CCENT) was, well, more intense than all of the certification classes before it. The teacher was cramming as much as he could into us during the week, and it was hard to come home and do much more than fall into bed (well, I was still doing my job search and checking up on my web sites and groups). But I didn't have much left in the way of blogging (which, by the way, is now on 3 different sites).

    Even though it was hard at times, I really liked the fact that I was getting back to something I like (and I mean really like - in fact, I like Cisco routers more than some people I know). At the class, I got some software that allows me to simulate setting up and troubleshooting LANs or WANs. When we weren't getting facts for the test thrown at us, we were doing labs with this software. It was fun for me to be able to use the Cisco router commands and troubleshoot router issues, even if it was just a sim.

    So now it is study, study, take practice tests and do the labs. I took the weekend and more off after cram-CCENT week, but now I am back at it. Also, I could not keep up with my Love Dare book during the week of the class. No, I did not stop or forget what I already learned; I just put the next dare on hold. Well, the hold is off starting tomorrow, and tonight I think I am going to write a new cover letter. Let's see what else I can get done tonight. Hmm, I think I will try to do a sim of my home wireless LAN and study for the CCENT test in about 3 weeks. So see you tomorrow (I hope).

    Read the article

  • Receiving an MVP Award and Credibility

    - by Joe Mayo
    The post titled, The Problem with MVPs, by Steve Barbour was interesting because it makes you think about the thousands of MVPs around the world and what their value really is. Having been the recipient of multiple MVP awards, it's an opportunity to reflect on and judge my own performance. This is not a dangerous thing to do, but quite the opposite. If a person believes in self improvement, then critical analysis is an important part of that process.

    A lot of MVPs will tell you that they would be doing the same thing regardless of whether they were an MVP or not: helping others in the community, which is also where I prefer to hang my hat. I've never defined myself as an expert and never will; this determination is left to others. In fact, let me just come out and say it: "I don't know everything". Shocked? Sometimes the gap between expectations and reality extends beyond a reasonable measure. Being labeled a technical expert feels good for one's self esteem and is certainly a useful motivational technique. A problem can emerge, though, when an individual believes too much in what they are told. The problem is not with the pat on the back, but with what a person does with the positive reinforcement. Is narcissism too strong a word? How often have you been in a public forum reading a demeaning response to a question that serves only to raise the stature of the person providing the response? Such behavior compromises one's credibility, raises questions about the validity of the MVP award, and is of limited value to the community.

    I'm currently under consideration for another MVP award on April 1st. If it happens, it will be good. Otherwise, I'll keep writing articles, coding open source software, and whatever else I enjoy doing; with the best reward being that people find value in what I do.

    Joe

    Read the article

  • Steps to deploying on Windows Azure

    - by Vincent Grondin
    Alright, these steps might be a little detailed and a few might not be necessary, but it's still a pretty accurate road map to deploying on Azure...

    1) Open your solution.
    2) Rebuild ALL.
    3) Right click on your Azure project and click "Publish".
    4) It should open a Windows Explorer window with your package to be uploaded (.cspkg) and its associated configuration (.cscfg). Keep it open, you'll need that path later on...
    5) It should also open a browser asking you to login to your passport account, please do so.
    6) After this you will be redirected to the Azure Portal, where you will see your Azure project name below the « Project Name » section. Click on it.
    7) Then you should be redirected to a detailed view of your account on Azure, where you will create a new service by clicking the hyperlink in the top right corner.
    8) Choose the right service type for you, most likely the "Hosted Service" type.
    9) Choose a « Label » name and click « Next ».
    10) Choose a name for your service and validate that the name is available in the cloud by clicking the "Check Availability" button.
    11) At the bottom of this same page, you can choose to create a group for your service, use no group, or join an existing group. Creating a group means that all applications that belong to the same group see no cost for exchanging data between other applications of the same group. Most of the time, when you create a single application, creating a group is not necessary. You should choose a region that's close to your own region.
    12) On the next window, you should see a "Production" environment and a "Staging" environment. Beware, because "Staging" and "Production" are two different environments in the cloud, and applications in "Staging", even when not running, continue to rack up charges... Choose an environment and click "Deploy".
    13) In the following window, browse to the path where your .cspkg resides and then do the same thing with your .cscfg file. Choose a name for your label, and click "Deploy"...
    14) From now on, the clock is ticking, and unless you have free Azure hours, your credit card is being billed...
    15) Click on the « Run » button to start your application.
    16) Be patient.... be very patient...
    17) Once your application has finished starting, you should see a GREEN circle on the left side of the screen indicating that your application is READY. Click the URL to test your application, and remember that if your application is a service, you have to hit the "svc" class behind the link you see there. Something along the lines of http://testvince2.cloudapp.net/service1.svc (this is a fictional link)
    18) Hopefully your application will show up, or in the case of a service, you will see your service's WSDL, meaning that everything is working fine.

    Happy cloud computing all!

    Read the article

  • An introductory presentation about testing with MSTest, Visual Studio, and Team Foundation Server 2010

    - by Thomas Weller
    While it was very quiet here on my blog during the last months, this was not at all true for the rest of my professional life. The simple story is that I was too busy to find the time for authoring blog posts (and you might see from my previous ones that they're usually not of the 'Hey, I'm currently reading X' or 'I'm currently thinking about Y' kind...). Anyway. Among the things I did during the last months were setting up a TFS environment (2010) and introducing a development team to the MSTest framework (aka Visual Studio Unit Testing), some additional tools (e.g. Moq, Moles, White), how this is supported in Visual Studio, and how it integrates into the broader context of the then-new TFS environment. After wiping out all the stuff which was directly related to my former customer and reviewing/extending the speaker notes, I thought I'd share this presentation (via Slideshare) with the rest of the world. Hopefully it can be useful to someone else out there...

    Introduction to testing with MSTest, Visual Studio, and Team Foundation Server 2010
    View more presentations from Thomas Weller.

    Be sure to also check out the slide notes (either by viewing the presentation directly on Slideshare or - even better - by downloading it). They contain quite some additional information, hints, and (in my opinion) best practices.
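
    For readers who have not seen MSTest and Moq together before, the combination boils down to something like the following sketch; the IPriceService interface, the Order class and the numbers are made up for illustration:

        using Microsoft.VisualStudio.TestTools.UnitTesting;
        using Moq;

        public interface IPriceService
        {
            decimal GetPrice(string article);
        }

        public class Order
        {
            private readonly IPriceService prices;
            public Order(IPriceService prices) { this.prices = prices; }

            public decimal Total(string article, int quantity)
            {
                return quantity * prices.GetPrice(article);
            }
        }

        [TestClass]
        public class OrderTests
        {
            [TestMethod]
            public void Total_multiplies_the_unit_price()
            {
                // Arrange: fake the dependency instead of calling a real service
                var priceService = new Mock<IPriceService>();
                priceService.Setup(s => s.GetPrice("book")).Returns(12.50m);

                // Act
                var total = new Order(priceService.Object).Total("book", 2);

                // Assert
                Assert.AreEqual(25.00m, total);
                priceService.Verify(s => s.GetPrice("book"), Times.Once());
            }
        }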

    Read the article

  • Ambiguous in the namespace problem

    - by Agha Usman Ahmed
    For the last few days, I was ignoring an error that kept coming up at compile time. I spent some two hours on it before but didn't get it to work. The error is quite confusing and, of course, difficult to manage:

        'ApplicationSettingsBase' is ambiguous in the namespace 'System.Configuration'
        'MailMessage' is ambiguous in the namespace 'System.Net.Mail'

    And there are a couple of other similar errors pointing to ambiguous references in my project. The confusing part is that the MailMessage object throws a similar error when you are importing both the old and the new email namespaces. For example:

        Imports System.Web.Mail
        Imports System.Net.Mail

    So if you are only encountering the ambiguity problem with the MailMessage object, it is most likely that you have imported both namespaces in your code-behind, which confuses the compiler about which object you are referencing. The quick fix for this problem is to remove Imports System.Web.Mail, and it should work smoothly.

    But in my case, I never used the old ASP.NET mail namespace in my project. So I started looking at my references, and luckily I found the problem there. Follow the steps below to investigate the issue:

    1. Go to your project.
    2. Then references.
    3. Right click on "System" and see its properties. It should point to the following path:

        x:\Windows\Microsoft.NET\Framework\v2.0.50727\System.dll

    where x is the name of your operating system drive. This was the problem with my project. I had my operating system installed on the "D" drive, and somehow the reference was pointing to the "C" drive, which was the root cause of this problem. After that I verified all my references, found 5-6 assemblies that were pointing to the wrong path, and got it working. Also note, the problem can occur in any type of project, whether it is a website, web application, etc.
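
    If you genuinely need both namespaces in the same file, a namespace alias resolves the ambiguity without removing either import. The post above is VB, but the same idea in C# looks like this (a made-up file, purely for illustration):

        using System.Net.Mail;
        using System.Web.Mail;   // (requires a reference to System.Web)
        // Alias pins 'MailMessage' to the one we actually mean
        using MailMessage = System.Net.Mail.MailMessage;

        class MailDemo
        {
            static void Main()
            {
                // Without the alias above, this line fails with
                // "'MailMessage' is an ambiguous reference..."
                var message = new MailMessage("from@example.com", "to@example.com",
                                              "Subject", "Body");
                System.Console.WriteLine(message.Subject);
            }
        }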

    Read the article

  • Book Review: Professional ASP.NET Design Patterns by Scott Millett

    - by Sam Abraham
    In the next few lines, I will be providing a brief review of Wrox's Professional ASP.NET Design Patterns by Scott Millett.

    Design patterns have been a hot topic for many years as developers looked to do more with less, re-use as much code as possible by creating common libraries, and make their code easier to understand, extend and collaborate on. Scott Millett's book covers classic and emerging patterns in a practical presentation that demonstrates, with thorough examples, how to put each pattern to use in the context of multi-tiered ASP.NET applications. The author's unique approach and content earned him much kudos in the foreword by Scott Hanselman as well as in online reviews. The book has 14 chapters, of which 5 are dedicated to a comprehensive case study. Patterns covered therein include S.O.L.I.D, Gang of Four (GoF), as well as Martin Fowler's Patterns of Enterprise Applications.

    Many thanks to the Wiley/Wrox User Group Program for their support of our West Palm Beach Developers' Group.

    Best regards,
    --Sam

    You can access my reviews of books I recently read:
    Professional WCF 4.0
    Inside Windows Communication Foundation
    Inside Microsoft SQL Server 2008 series

    Read the article

  • Welcome to www.badapi.net, a REST API with badly-behaved endpoints

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2014/08/14/welcome-to-www.badapi.net-a-rest-api-with-badly-behaved-endpoints.aspx

    I've had a need in a few projects for a REST API that doesn't behave well - takes a long time to respond, or never responds, returns unexpected status codes etc. That can be very useful for testing that clients cope gracefully with unexpected responses. Till now I've always coded a stub API in the project and run it locally, but I've put a few 'misbehaved' endpoints together and published them at www.badapi.net, and the source is on GitHub here: sixeyed/badapi.net.

    You can browse to the home page and see the available endpoints. I'll be adding more as I think of them, and I may give the styling of the help pages a bit more thought... As of today's release, the misbehaving endpoints available to you are:

    GET longrunning?between={between}&and={and} - waits for a (short) random period before returning
    GET verylongrunning?between={between}&and={and} - waits for a (long) random period before returning
    GET internalservererror - returns 500: Internal Server Error
    GET badrequest - returns 400: Bad Request
    GET notfound - returns 404: Not Found
    GET unauthorized - returns 401: Unauthorized
    GET forbidden - returns 403: Forbidden
    GET conflict - returns 409: Conflict
    GET status/{code}?reason={reason} - returns the provided status code

    Go bad.
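
    A client-side test against one of these endpoints might look like the following sketch, using HttpClient from .NET 4.5; the timeout and the between/and values are arbitrary guesses, not documented semantics:

        using System;
        using System.Net.Http;
        using System.Threading.Tasks;

        class BadApiClientTest
        {
            static void Main()
            {
                TestTimeoutHandling().Wait();
            }

            static async Task TestTimeoutHandling()
            {
                using (var client = new HttpClient())
                {
                    // Force a timeout against the deliberately slow endpoint
                    client.Timeout = TimeSpan.FromSeconds(2);
                    try
                    {
                        var response = await client.GetAsync(
                            "http://www.badapi.net/verylongrunning?between=10&and=30");
                        Console.WriteLine("Status: {0}", response.StatusCode);
                    }
                    catch (TaskCanceledException)
                    {
                        // HttpClient surfaces a timeout as a cancelled task -
                        // this is the path the client should handle gracefully
                        Console.WriteLine("Request timed out, as expected.");
                    }
                }
            }
        }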

    Read the article

  • FTP Upload ftpWebRequest Proxy

    - by Rodney Vinyard
    Searchable: FTP Upload, ftpWebRequest, Proxy, "FTP command is not supported when using HTTP proxy"

    In the article below I will cover 2 topics:

    1. C# & Windows command-line FTP upload with no proxy server
    2. C# & Windows command-line FTP upload with a proxy server

    Not covered here: Secure FTP / SFTP

    Sample attributes:

    UploadFilePath = "\\servername\folder\file.name"
    Proxy Server = "ftp://proxy.server/"
    FTP Target Server = ftp.target.com
    FTP User = "User"
    FTP Password = "Password"

    With no proxy server

    Windows command-line:

        > ftp ftp.target.com
        User: User
        Password: Password
        ftp> put \\servername\folder\file.name
        ftp> dir          (result: file.name listed)
        ftp> del file.name
        ftp> dir          (result: file.name deleted)
        ftp> quit

    C#:

        //-----------------
        // Start FTP
        //-----------------
        string relPath = Path.GetFileName(@"\\servername\folder\file.name");
        // result: relPath = "file.name"

        FtpWebRequest ftpWebRequest =
            (FtpWebRequest)WebRequest.Create("ftp://ftp.target.com/" + relPath);
        ftpWebRequest.Method = WebRequestMethods.Ftp.UploadFile;

        //-----------------
        // user - password
        //-----------------
        ftpWebRequest.Credentials = new NetworkCredential("user", "password");

        //-----------------
        // set proxy = null!
        //-----------------
        ftpWebRequest.Proxy = null;

        //-----------------
        // Copy the contents of the file to the request stream.
        //-----------------
        StreamReader sourceStream = new StreamReader(@"\\servername\folder\file.name");
        byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
        sourceStream.Close();
        ftpWebRequest.ContentLength = fileContents.Length;

        //-----------------
        // Transfer the stream.
        //-----------------
        Stream requestStream = ftpWebRequest.GetRequestStream();
        requestStream.Write(fileContents, 0, fileContents.Length);
        requestStream.Close();

        //-----------------
        // Look at the response results
        //-----------------
        FtpWebResponse response = (FtpWebResponse)ftpWebRequest.GetResponse();
        Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);

    With a proxy server

    The trick here is to point the request at the proxy server and carry the real target host in the user name, while still setting the Proxy property to null; this is what avoids the "FTP command is not supported when using HTTP proxy" error.

    Windows command-line:

        > ftp proxy.server
        User: User@ftp.target.com
        Password: Password
        ftp> put \\servername\folder\file.name
        ftp> dir          (result: file.name listed)
        ftp> del file.name
        ftp> dir          (result: file.name deleted)
        ftp> quit

    C#:

        //-----------------
        // Start FTP via _TargetFtpProxy
        //-----------------
        string relPath = Path.GetFileName(@"\\servername\folder\file.name");
        // result: relPath = "file.name"

        FtpWebRequest ftpWebRequest =
            (FtpWebRequest)WebRequest.Create("ftp://proxy.server/" + relPath);
        ftpWebRequest.Method = WebRequestMethods.Ftp.UploadFile;

        //-----------------
        // user - password: the user name carries the real target host
        //-----------------
        ftpWebRequest.Credentials = new NetworkCredential("user@ftp.target.com", "password");

        //-----------------
        // set proxy = null!
        //-----------------
        ftpWebRequest.Proxy = null;

        //-----------------
        // Copy the contents of the file to the request stream.
        //-----------------
        StreamReader sourceStream = new StreamReader(@"\\servername\folder\file.name");
        byte[] fileContents = Encoding.UTF8.GetBytes(sourceStream.ReadToEnd());
        sourceStream.Close();
        ftpWebRequest.ContentLength = fileContents.Length;

        //-----------------
        // Transfer the stream.
        //-----------------
        Stream requestStream = ftpWebRequest.GetRequestStream();
        requestStream.Write(fileContents, 0, fileContents.Length);
        requestStream.Close();

        //-----------------
        // Look at the response results
        //-----------------
        FtpWebResponse response = (FtpWebResponse)ftpWebRequest.GetResponse();
        Console.WriteLine("Upload File Complete, status {0}", response.StatusDescription);

    Read the article

  • Abstracting functionality

    - by Ralf Westphal
    Originally posted on: http://geekswithblogs.net/theArchitectsNapkin/archive/2014/08/22/abstracting-functionality.aspx

    What is more important than data? Functionality. Yes, I strongly believe we should switch to a functionality over data mindset in programming. Or actually switch back to it.

    Focus on functionality

    Functionality once was at the core of software development. Back when algorithms were the first thing you heard about in CS classes. Sure, data structures, too, were important - but always from the point of view of algorithms. (Niklaus Wirth gave one of his books the title "Algorithms + Data Structures" instead of "Data Structures + Algorithms" for a reason.) The reason for the focus on functionality? Firstly, because software was and is about doing stuff. Secondly because sufficient performance was hard to achieve, and only thirdly memory efficiency.

    But then hardware became more powerful. That gave rise to a new mindset: object orientation. And with it functionality was devalued. Data took over its place as the most important aspect. Now discussions revolved around structures motivated by data relationships. (John Beidler gave his book the title "Data Structures and Algorithms: An Object Oriented Approach" instead of the other way around for a reason.) Sure, this data could be embellished with functionality. But nevertheless functionality was second.

    When you look at (domain) object models what you mostly find is (domain) data object models. The common object oriented approach is: data aka structure over functionality. This is true even for the most modern modeling approaches like Domain Driven Design. Look at the literature and what you find is recommendations on how to get data structures right: aggregates, entities, value objects. I´m not saying this is what object orientation was invented for. But I´m saying that´s what I happen to see across many teams now some 25 years after object orientation became mainstream through C++, Delphi, and Java.

    But why should we switch back? Because software development cannot become truly agile with a data focus. The reason for that lies in what customers need first: functionality, behavior, operations. To be clear, that´s not why software is built. The purpose of software is to be more efficient than the alternative. Money mainly is spent to get a certain level of quality (e.g. performance, scalability, security etc.). But without functionality being present, there is nothing to work on the quality of. What customers want is functionality of a certain quality. ASAP. And tomorrow new functionality needs to be added, existing functionality needs to be changed, and quality needs to be increased.

    No customer ever wanted data or structures. Of course data should be processed. Data is there, data gets generated, transformed, stored. But how the data is structured for this to happen efficiently is of no concern to the customer. Ask a customer (or user) whether she likes the data structured this way or that way. She´ll say, "I don´t care." But ask a customer (or user) whether he likes the functionality and its quality this way or that way. He´ll say, "I like it" (or "I don´t like it").

    Build software incrementally

    From this very natural focus of customers and users on functionality and its quality follows we should develop software incrementally. That´s what Agility is about. Deliver small increments quickly and often to get frequent feedback.
    That way less waste is produced, and learning can take place much easier (on the side of the customer as well as on the side of developers). An increment is some added functionality or quality of functionality.[1] So as it turns out, Agility is about functionality over whatever. But software developers' thinking is still stuck in the object oriented mindset of whatever over functionality. Bummer. I guess that (at least partly) explains why Agility always hits a glass ceiling in projects. It´s a clash of mindsets, of cultures. Driving software development by demanding small increases in functionality runs against thinking about software as growing (data) structures sprinkled with functionality. (Excuse me, if this sounds a bit broad-brush. But you get my point.)

    The need for abstraction

    In the end there need to be data structures. Of course. Small and large ones. The phrase functionality over data does not deny that. It´s not functionality instead of data or something. It´s just over, i.e. functionality should be thought of first. It´s a tad more important. It´s what the customer wants.

    That´s why we need a way to design functionality. Small and large. We need to be able to think about functionality before implementing it. We need to be able to reason about it among team members. We need to be able to communicate our mental models of functionality not just by speaking about them, but also on paper. Otherwise reasoning about it does not scale.

    We learned thinking about functionality in the small using flow charts, Nassi-Shneiderman diagrams, pseudo code, or UML sequence diagrams. That´s nice and well. But it does not scale. You can use these tools to describe manageable algorithms. But it does not work for the functionality triggered by pressing the "1-Click Order" on an amazon product page for example. There are several reasons for that, I´d say.

    Firstly, the level of abstraction over code is negligible. It´s essentially non-existent. Drawing a flow chart or writing pseudo code or writing actual code is very, very much alike. All these tools are about control flow like code is.[2] In addition all tools are computationally complete. They are about logic which is expressions and especially control statements. Whatever you code in Java you can fully (!) describe using a flow chart.

    And then there is no data. They are about control flow and leave out the data altogether. Thus data mostly is assumed to be global. That´s shooting yourself in the foot, as I hope you agree. Even if it´s functionality over data that does not mean "don´t think about data". Right to the contrary! Functionality only makes sense with regard to data. So data needs to be in the picture right from the start - but it must not dominate the thinking. The above tools fail on this.

    Bottom line: So far we´re unable to reason in a scalable and abstract manner about functionality. That´s why programmers are so driven to start coding once they are presented with a problem. Programming languages are the only tool they´ve learned to use to reason about functional solutions.

    Or, well, there might be exceptions. Mathematical notation and SQL may have come to your mind already. Indeed they are tools on a higher level of abstraction than flow charts etc. That´s because they are declarative and not computationally complete. They leave out details - in order to deliver higher efficiency in devising overall solutions. We can easily reason about functionality using mathematics and SQL. That´s great.
Except for that they are domain specific languages. They are not general purpose. (And they don´t scale either, I´d say.) Bummer. So to be more precise we need a scalable general purpose tool on a higher than code level of abstraction not neglecting data. Enter: Flow Design. Abstracting functionality using data flows I believe the solution to the problem of abstracting functionality lies in switching from control flow to data flow. Data flow very naturally is not about logic details anymore. There are no expressions and no control statements anymore. There are not even statements anymore. Data flow is declarative by nature. With data flow we get rid of all the limiting traits of former approaches to modeling functionality. In addition, nomen est omen, data flows include data in the functionality picture. With data flows, data is visibly flowing from processing step to processing step. Control is not flowing. Control is wherever it´s needed to process data coming in. That´s a crucial difference and needs some rewiring in your head to be fully appreciated.[2] Since data flows are declarative they are not the right tool to describe algorithms, though, I´d say. With them you don´t design functionality on a low level. During design data flow processing steps are black boxes. They get fleshed out during coding. Data flow design thus is more coarse grained than flow chart design. It starts on a higher level of abstraction - but then is not limited. By nesting data flows indefinitely you can design functionality of any size, without losing sight of your data. Data flows scale very well during design. They can be used on any level of granularity. And they can easily be depicted. Communicating designs using data flows is easy and scales well, too. The result of functional design using data flows is not algorithms (too low level), but processes. Think of data flows as descriptions of industrial production lines. Data as material runs through a number of processing steps to be analyzed, enhances, transformed. On the top level of a data flow design might be just one processing step, e.g. “execute 1-click order”. But below that are arbitrary levels of flows with smaller and smaller steps. That´s not layering as in “layered architecture”, though. Rather it´s a stratified design à la Abelson/Sussman. Refining data flows is not your grandpa´s functional decomposition. That was rooted in control flows. Refining data flows does not suffer from the limits of functional decomposition against which object orientation was supposed to be an antidote. Summary I´ve been working exclusively with data flows for functional design for the past 4 years. It has changed my life as a programmer. What once was difficult is now easy. And, no, I´m not using Clojure or F#. And I´m not a async/parallel execution buff. Designing the functionality of increments using data flows works great with teams. It produces design documentation which can easily be translated into code - in which then the smallest data flow processing steps have to be fleshed out - which is comparatively easy. Using a systematic translation approach code can mirror the data flow design. That way later on the design can easily be reproduced from the code if need be. And finally, data flow designs play well with object orientation. They are a great starting point for class design. But that´s a story for another day. To me data flow design simply is one of the missing links of systematic lightweight software design. 
[1] There are also other artifacts software development can produce to get feedback, e.g. process descriptions and test cases. But customers can be delighted more easily with code-based increments in functionality.

[2] No, I'm not talking about the endless possibilities this opens up for parallel processing. Data flows are useful independently of multi-core processors and Actor-based designs. That's my whole point here. Data flows are good for reasoning and evolvability. So forget about any special frameworks you might need to reap benefits from data flows. None are necessary. Translating data flow designs even into plain old Java is possible.

    Read the article

  • TechEd 2012 - day 3

    - by Stefan Barrett
    The content has become more useful for me as a developer, and I've now seen two things which I think will make a big difference: Fakes in VS2012, which let me stub or fake out libraries, making unit testing easier (or possible at all); and C++ AMP & auto - auto might get me to start using C++ again (it makes code like for-each loops much nicer/easier to write), while AMP is something I want to play with (it moves processing onto the GPU). The food got a little better, while there was less sign of the snacks.

    Read the article

  • AJI Report #12 | Tim Hibbard Talks .NET to iOS Development

    - by Jeff Julian
    In this AJI Report, Jeff and John talk with Tim Hibbard of Engraph Software about making the transition from .NET development to mobile applications on the iOS platform. Tim dives into what each experience was like, from getting into Xcode for the first time to using third-party tools, Apple's design guidelines, and provisioning an app to the App Store. Tim has been a .NET developer since the framework was first released and now has two mobile applications in production.
    Listen to the Show
    Site: http://engraph.com/
    Blog: http://timhibbard.com/blog/
    Twitter: @timhibbard

    Read the article

  • TFS Rant *WARNING* negative opinions are being expressed.

    - by ryanabr
    It has happened several times now that I end up installing TFS "over the shoulder" of the system admin whose job it will be to "own" the server when I am gone. TFS is challenging enough to stand up when doing it myself on a completely open platform, but at these locations networks are locked down, machines are locked down, and the unexpected always seems to pop up. I personally have the tolerance for these things as a software developer, but as we are installing I have to listen to all of the 'colorful' remarks being made: "why is it like this" or "this is a piece of crap".

    Generally the issues center around SharePoint integration. TFS on its own is straightforward, but the last flavor in everyone's mouth is the SharePoint piece. As a product I like SharePoint, but installation is a nightmare. In this particular case we are going to use WSS, since the customer would like this separate from their corporate SharePoint 2010 installations; their dev team is really small (1 developer) and it is being used as a VSS replacement more than as a full-blown ALM tool.

    The server where it is being installed has a Cisco Security Agent on it that seems to block 'suspicious' activity, and as far as I can tell is preventing WSS from installing properly. The most confounding thing is that we can find no meaningful log entries to help diagnose the issue. It didn't help matters that when we tried to contact Microsoft for support, because we mentioned TFS in the list of things we were trying to install, after waiting 2 hours we got a TFS support person, NOT the SharePoint person that we really needed. After another 2 hours, the SharePoint support person we did get managed to corrupt the registry sufficiently with his 'tools' that we ended up starting over from scratch the next day anyway, after going home at midnight.

    My point to this is: the system administrator who is going to own this now thinks it is a piece of crap because SharePoint wouldn't install properly. Perception is everything. Everyone today is conditioned to software that installs and works in a very simple manner. When looking at the different options to install TFS with the different "modes", there is inconsistency in the information being presented, which leads to choices that cause headaches and this bad perception before the product is even installed.

    I am highlighting this because I love TFS as a product, but I HATE installing it, and would like it to install as simply and elegantly as the product operates once it is installed.

    Read the article

  • Extreme Optimization – Curves (Function Mapping) Part 1

    - by JoshReuben
    Overview

    · A curve is a functional mapping between two factors (i.e. a function - however, the word "function" is a reserved word).
    · You can use the EO API to create common types of functions, find zeroes and calculate derivatives - currently supported are constants, lines, quadratic curves, polynomials and Chebyshev approximations.
    · A function basis is a set of functions that can be combined to form a particular class of functions.

    The Curve class

    · The abstract base class from which all other curve classes are derived. It provides the following methods:
    · ValueAt(Double) - evaluates the curve at a specific point.
    · SlopeAt(Double) - evaluates the derivative.
    · Integral(Double, Double) - evaluates the definite integral over a specified interval.
    · TangentAt(Double) - returns a Line curve that is the tangent to the curve at a specific point.
    · FindRoots() - attempts to find all the roots or zeroes of the curve.
    · A particular type of curve is defined by its Parameters property, of type ParameterCollection.

    The GeneralCurve class

    · Defines a curve whose value and, optionally, derivative and integrals are calculated using arbitrary methods. A general curve has no parameters.
    · Constructor params: RealFunction delegates - one for the function, and optionally another two for the derivative and integral.
    · If no derivative or integral function is supplied, they are calculated via the NumericalDifferentiation and AdaptiveIntegrator classes in the Extreme.Mathematics.Calculus namespace.

    // The function is 1/(1+x^2).
    private double f(double x)
    {
        return 1 / (1 + x*x);
    }

    // Its derivative is -2x/(1+x^2)^2.
    private double df(double x)
    {
        double y = 1 + x*x;
        return -2*x / (y*y);
    }

    // The integral of f is Arctan(x), which is available from the Math class.
    var c1 = new GeneralCurve(new RealFunction(f), new RealFunction(df),
        new RealFunction(System.Math.Atan));

    // Find the tangent to this curve at x=1 (the Line class is derived from Curve).
    Line l1 = c1.TangentAt(1);
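    As a rough usage sketch (not from the original post), exercising the Curve methods listed above on the c1 instance from the snippet:

    // Evaluate the curve and its slope at x = 1.
    double value = c1.ValueAt(1);   // 1/(1+1^2) = 0.5
    double slope = c1.SlopeAt(1);   // -2*1/(1+1^2)^2 = -0.5

    // The definite integral of 1/(1+x^2) from 0 to 1 is Arctan(1) = pi/4.
    double area = c1.Integral(0, 1);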

    Read the article

  • Scrum and Team Consolidation

    - by John K. Hines
    I'm still working my way through one of the more painful team consolidations of my career. One thing that's made it hard was my assumption that the use of Agile methods and Scrum would make everything easy. Take three teams, make all work visible, track it, and presto: an efficient, functioning software development team.

    What I've come to realize is that the primary benefit of Scrum is that Scrum brings teams closer to their customers. Frequent meetings, short iterations, and phased deployments are all meant to keep the customer in the loop. It's true that as teams become proficient with Scrum they tend to become more efficient. But I don't think it's true that Scrum automatically helps people work together. Instead, Scrum can point out when teams aren't good at working together. And it really illustrates when teams, especially teams in sustaining mode, are reacting to their customers instead of innovating with them. At the moment we've inherited a huge backlog of tools, processes, and personalities. It's up to us to sort them all out. Unfortunately, after 7½ months we're still sorting.

    What I'd recommend for any blended team is to look at your current product lifecycles and work on a single lifecycle for all work. If you can't objectively come up with one process, that's a good indication that the new team might not be a good fit for being a single unit (which happens all the time in bigger companies). Go ahead and self-organize into sub-teams. Then repeat the process.

    If you can come up with a single process, tackle each piece and standardize all of them. Do this as soon as possible, as it can be uncomfortable. Standardize your requirements gathering and tracking, your exploration and technical analysis, your project planning, development standards, validation and sustaining processes. Standardize all of it. Make this your top priority, get it out of the way, and get back to work.

    Lastly, managers of blended teams should realize that what I'm suggesting is a disruptive process. But you've just reorganized; the team is already disrupted. Don't pull the bandage off slowly and force the team through a prolonged transition phase, lowering their productivity over the long term. You can role-model leadership for your team and drive a true consolidation. Destroy roadblocks, reassure those on your team who are afraid of change, and push forward to create something efficient and beautiful. Then use Scrum to reengage your customers in a way that they'll love.

    Technorati tags: Scrum, Scrum Process

    Read the article

< Previous Page | 51 52 53 54 55 56 57 58 59 60 61 62  | Next Page >