Search Results



  • UDDI vs SO-Aware: Why SO-Aware is the More Efficient and Interoperable Alternative

    - by Vishal
    Hello folks, if you are implementing a service-oriented architecture and are unsure of the best governance approach to follow, then this webinar is a must-attend event for you. We will discuss why SO-Aware is the more efficient and interoperable alternative to traditional UDDI-based SOA governance. Specifically, we will address the differences between UDDI and SO-Aware in terms of service discovery, configuration, and policy resolution. Finally, we will address why the REST/OData-based model implemented by SO-Aware enables the most efficient governance not only for WCF but for BizTalk, Windows Server AppFabric and Windows Azure AppFabric as well. Join us on January 26th at 2:00 ET - to register, click here. Thanks, Vishal

    Read the article

  • MapRedux - PowerShell and Big Data

    - by Dittenhafer Solutions
    MapRedux – #PowerShell and #Big Data

    Have you been hearing about "big data", "map reduce" and other large-scale computing terms over the past couple of years and been curious to dig into more detail? Have you read some of the Apache Hadoop online documentation and unfortunately concluded that it wasn't feasible to set up a "test" Hadoop environment on your machine? More recently, I have read about some of Microsoft's work to enable Hadoop on the Azure cloud. Being a "Microsoft"-leaning technologist, I am more inclined to be successful with experimentation when on the Windows platform. Of course, it is not that I am "religious" about one set of technologies over another, but rather more experienced.

    Anyway, within the past couple of weeks I have been thinking about PowerShell a bit more as the 2012 PowerShell Scripting Games approach, and it occurred to me that PowerShell's support for Windows Remote Management (WinRM) and some other inherent features of PowerShell might lend themselves particularly well to a simple implementation of the MapReduce framework. I fired up my PowerShell ISE and started writing just to see where it would take me. Quite simply, the ScriptBlock feature combined with the ability of Invoke-Command to create remote jobs on networked servers provides much of the plumbing of a distributed computing environment. There are some limiting factors, of course. Microsoft provided some default settings which prevent PowerShell from taking over a network without administrative approval first. But even with just one adjustment, a given Windows-based machine can become a node in a MapReduce-style distributed computing environment.

    Ok, so enough introduction. Let's talk about the code. First, any machine that will participate as a remote "node" will need WinRM enabled for remote access, as shown below. This is not exactly practical for hundreds of intended nodes, but for one (or five) machines in a test environment it does just fine.

    ```
    C:> winrm quickconfig
    WinRM is not set up to receive requests on this machine.
    The following changes must be made:

    Set the WinRM service type to auto start.
    Start the WinRM service.

    Make these changes [y/n]? y
    ```

    Alternatively, you could take the approach described in the Remotely enable PSRemoting post from the TechNet forum and use PowerShell to create remote scheduled tasks that will call Enable-PSRemoting on each intended node.

    Invoke-MapRedux

    Moving on, now that you have one or more remote "nodes" enabled, you can consider the actual Map and Reduce algorithms. Consider the following snippet:

    ```powershell
    $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose
    ```

    Invoke-MapRedux takes an instance of a MapReduceItem which references the Map and Reduce scriptblocks, an array of computer names which are the remote nodes, and the initial data set to be processed. As simple as that, you can start working with the concepts of big data and the MapReduce paradigm. Now, how did we get there? I have published the initial version of my PsMapRedux PowerShell Module on GitHub. The PsMapRedux module provides the Invoke-MapRedux function described above. Feel free to browse the underlying code and even contribute to the project! In a later post, I plan to show some of the inner workings of the module, but for now let's move on to how the Map and Reduce functions are defined.

    Map

    Both the Map and Reduce functions need to follow a prescribed prototype. The prototype for a Map function in the MapRedux module is a simple scriptblock that takes one PsObject parameter and returns a hashtable. It is important to note that the PsObject $dataset parameter is a MapRedux custom object that has a "Data" property which offers an array of data to be processed by the Map function.

    ```powershell
    $aMap =
    {
        Param
        (
            [PsObject] $dataset
        )

        # Indicate the job is running on the remote node.
        Write-Host ($env:computername + "::Map");

        # The hashtable to return
        $list = @{};

        # ... Perform the mapping work and prepare the $list hashtable result with your custom PSObject ...
        # ... The $dataset has a single 'Data' property which contains an array of data rows,
        #     a subset of the originally submitted data set.

        # Return the hashtable (Key, PSObject)
        Write-Output $list;
    }
    ```

    Reduce

    Likewise, the Reduce function must follow a simple prototype which takes a $key and a result $dataset from the MapRedux partitioning function (which joins the Map results by key). Again, the $dataset is a MapRedux custom object that has a "Data" property as described in the Map section.

    ```powershell
    $aReduce =
    {
        Param
        (
            [object] $key,
            [PSObject] $dataset
        )

        Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count)

        # The hashtable to return
        $redux = @{};

        # Return
        Write-Output $redux;
    }
    ```

    All Together Now

    When everything is put together in a short example script, you implement your Map and Reduce functions, query for some starting data, build the MapReduxItem via New-MapReduxItem and call Invoke-MapRedux to get the process started:

    ```powershell
    # Import the MapRedux and SQL Server providers
    Import-Module "MapRedux"
    Import-Module "sqlps" -DisableNameChecking

    # Query the database for a dataset
    Set-Location SQLSERVER:\sql\dbserver1\default\databases\myDb
    $query = "SELECT MyKey, Date, Value1 FROM BigData ORDER BY MyKey";
    Write-Host "Query: $query"
    $dataset = Invoke-SqlCmd -query $query

    # Build the Map function: accumulate a per-key sum and count
    $MyMap =
    {
        Param
        (
            [PsObject] $dataset
        )

        Write-Host ($env:computername + "::Map");
        $list = @{};
        foreach($row in $dataset.Data)
        {
            # Write-Host ("Key: " + $row.MyKey.ToString());
            if($list.ContainsKey($row.MyKey) -eq $true)
            {
                $s = $list.Item($row.MyKey);
                $s.Sum += $row.Value1;
                $s.Count++;
            }
            else
            {
                $s = New-Object PSObject;
                $s | Add-Member -Type NoteProperty -Name MyKey -Value $row.MyKey;
                $s | Add-Member -Type NoteProperty -Name Sum -Value $row.Value1;
                $s | Add-Member -Type NoteProperty -Name Count -Value 1; # initialize Count so $s.Count++ works on later rows
                $list.Add($row.MyKey, $s);
            }
        }
        Write-Output $list;
    }

    # Build the Reduce function: average the per-node sums for each key
    $MyReduce =
    {
        Param
        (
            [object] $key,
            [PSObject] $dataset
        )

        Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count)
        $redux = @{};
        $sum = 0;
        $count = 0;
        foreach($s in $dataset.Data)
        {
            $sum += $s.Sum;
            $count += 1;
        }

        # Reduce, keyed by the $key passed in by the partitioner
        $redux.Add($key, $sum / $count);

        # Return
        Write-Output $redux;
    }

    # Create the item data
    $Mr = New-MapReduxItem "My Test MapReduce Job" $MyMap $MyReduce

    # Array of processing nodes...
    $MyNodes = ("node1", "node2", "node3", "node4", "localhost")

    # Run the Map Reduce routine...
    $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose

    # Show the results
    Set-Location C:\
    $MyMrResults | Out-GridView
    ```

    Conclusion

    I hope you have seen through this article that PowerShell has a significant infrastructure available for distributed computing. While it does take some code to expose a MapReduce-style framework, much of the work is already done, and PowerShell could prove to be the easiest platform to develop and run big data jobs in your corporate data center, potentially in the Azure cloud, or certainly as an academic exercise at home or school.
    Follow me on Twitter to stay up to date on the continuing progress of my PowerShell MapRedux module, and thanks for reading! Daniel

    Read the article

  • HTML5 data-* (custom data attribute)

    - by Renso
    Goal: store custom data with the data attribute on any DOM element and retrieve it.

    Previously, under HTML4, we used classes to store custom data, something to the effect of:

    ```html
    <input class="account void limit-5000 over-4999" />
    ```

    and then had to parse the data out of the class. In his 2007 book ppk on JavaScript, Peter-Paul Koch explains why and how to use custom attributes to make data more accessible to JavaScript, using name-value pairs. Accessing a custom attribute like account-limit=5000 is much easier and more intuitive than trying to parse it out of a class. Plus, what if a class name such as "color-5" has a representative class definition in a CSS stylesheet that hides the element away, or worse, some JavaScript plugin that automatically adds 5000 to it, or something crazy like that, just because it is a valid class name? As you can see, there are quite a few reasons why using classes for data is bad design, and why it was important to define custom data attributes in HTML5.

    Syntax: you define a data attribute by simply prefixing any data item you want to store on any HTML element with "data-". For example, to store our customer's account data with a hidden input element:

    ```html
    <input type="hidden" data-account="void" data-limit=5000 data-over=4999 />
    ```

    How to access the data:

    account - element.dataset.account
    limit - element.dataset.limit

    You can also access it by using the more traditional get/setAttribute methods, or, if using jQuery:

    ```javascript
    $('#element').attr('data-account', 'void')
    ```

    Browser support: all except IE. There is an IE hack around this at http://gist.github.com/362081.

    Special note: be AWARE, do not use upper-case when defining your data elements, as it is all converted to lower-case when reading it. So data-myAccount="A1234" will not be found when you read it with element.dataset.myAccount; use only lowercase, so element.dataset.myaccount will work.
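    For anyone who wants to try this end to end, here is a minimal sketch (the element id and attribute values are my own, invented for illustration):

    ```javascript
    // Markup assumed: <input id="acct" type="hidden" data-account="void" data-limit="5000" />
    var el = document.getElementById('acct');

    // Reading via the dataset API (data-account -> dataset.account)
    console.log(el.dataset.account);              // "void"
    console.log(el.dataset.limit);                // "5000" (always a string)

    // The equivalent, more widely supported getAttribute form
    console.log(el.getAttribute('data-account')); // "void"

    // Writing works the same way in reverse
    el.dataset.account = 'open';
    el.setAttribute('data-limit', '7500');
    ```

    Note that dataset values always come back as strings, so numeric data like the limit needs an explicit parseInt before any arithmetic.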

    Read the article

  • NDC Oslo

    - by Alan Smith
    Originally posted on: http://geekswithblogs.net/asmith/archive/2013/06/14/153136.aspx

    2013 has been a hectic year for conference presentations so far. NDC in Oslo has been the 6th conference I have attended, and my session there was my 11th conference presentation this year. I have been meaning to make the short trip over from Stockholm to NDC for a few years, and this was the first time I made it. I have heard a lot of great things about the event, and was impressed with the location, the sessions, and most of all the atmosphere around the event booths and during the party on Thursday evening. The session I was delivering was my "Grid Computing with 256 Windows Azure Worker Roles & Kinect" demo, which I have delivered at many events over the past 12 months. The demo went fine. I'm always a little nervous when I try to scale out the application to 256 worker roles; it almost always works well and the application will scale in minutes, but very occasionally there can be a longer delay due to the provisioning process in the Windows Azure data centers. This would not be an issue for many scenarios, but when standing on stage in front of a room full of developers you really want things to run smoothly. A number of people have suggested that I should pre-provision an environment so that it is guaranteed to be there when I run the demo during a session. For me the aim has always been to show the rapid scalability of cloud-based platforms live on stage. Pre-provisioning an environment may make for a more reliable demo, but to me that would be cheating, and not half as much fun!

    Read the article

  • If not now, then when?

    - by Chris Gardner
    Originally posted on: http://geekswithblogs.net/freestylecoding/archive/2013/10/25/if-not-now-then-when.aspx

    The time has been flying by this year. It seems like only yesterday that I mentioned the gorillagator, a simple construct of confusion to try to draw attention to my message. In reality, that message was sent over a month ago. During that time, the hours slipped to days and days to weeks. Many exciting things have happened to me; I'm sure many exciting things have happened to you. I'm also sure that many terrifying things have happened to children and their families. 62 children enter treatment at a Children's Miracle Network Hospital every minute. That's nearly 60,000 children since I sent the last email. To put that number in perspective, that is more than the population of Greenland. If we expand that to the past year, there have been nearly 550,000 children treated. That is almost the population of Huntsville, Decatur, and all their suburbs combined.

    Over the past 4 years, I have raised a little more than $3,000 for Children's Hospital of Alabama. As a result, I received a call from the organizers of Extra Life thanking me for my dedicated work and informing me that I was the top supporter for Children's Hospital of Alabama ... with my measly three grand. We can do much better than that.

    It may sound like I'm trying to have fun by playing games for 24 hours. It is more than that. It is me using my time and body as a catalyst. It is me putting my passion to work for a cause. It is me turning my love into something tangible. I have been campaigning and fighting to give these children a chance for years. I have been asking you to help me support these children and families. I've been putting in countless hours of talking to people, impassioned emails, and carefully constructed tweets. I have been fighting with cutting-edge, and sometimes expensive, technology to try to provide live streams of my marathons. I yearly put my body through 24 (and, this year, 25) hours of no sleep. I do this to represent the countless hours these families sit awake at their children's side. All I ask is a few minutes on a website and a few dollars. These few minutes and few dollars go a long way toward helping people that are experiencing circumstances that only occur in our nightmares.

    I also ask that you take one extra step. Forward this plea to those that you know. I can only reach a small fraction of a percentage of the people that may be able to help. Together, we can reach the world. I raise money for Children's Hospital of Alabama. As this message branches out, people may wish to support a hospital closer to their area. I have included a link to the list of people that have dedicated their time and have received no donations. Find someone on the list supporting your local hospital and give them a donation. Let them know that their time and effort are appreciated.

    Together, we can do something great. Together, we can make a difference. Together, we all stand tall. Thank you.

    You can get more information at http://www.extra-life.org and http://childrensmiraclenetworkhospitals.org/
    My donation page is http://www.extra-life.org/participant/cgardner
    The list of participants without donations is http://www.extra-life.org/index.cfm?fuseaction=donorDrive.eventParticipantList&page=629&eventID=512

    Read the article

  • APress Deal of the Day 19/Oct/2013 - Software Project Secrets: Why Software Projects Fail

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2013/10/19/apress-deal-of-the-day-19oct2013---software-projects-secrets.aspx

    Today's $10 deal of the day from APress at http://www.apress.com/9781430251019 is Software Project Secrets: Why Software Projects Fail. "Software Project Secrets: Why Software Projects Fail airs dirty laundry about the software industry—how putting project management's priorities above all else is the root cause of problems in software development projects. This book offers solutions to integrate project management with agile methodologies that really work for software development."

    Read the article

  • Notes from a short presentation on NodeJs

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2014/05/30/notes-from-a-short-presentation-on-nodejs.aspx

    I volunteered myself to give a short 30 minute presentation at a work lunch and learn on NodeJs. With my limited experience I see using Node as a great tool for build process improvement, scaffolding with yeoman, and running tests with Karma. I haven't looked into using it as a full server or development stack. I guess I'm too stuck on IIS and Visual Studio :-). Here are my notes. They aren't very well formatted, but I wanted to share them anyway.

    What is it?
    "Node.js is a platform built on Chrome's JavaScript runtime for easily building fast, scalable network applications. Node.js uses an event-driven, non-blocking I/O model that makes it lightweight and efficient, perfect for data-intensive real-time applications that run across distributed devices."

    Why should you be interested?
    - another popular tool that can help you get the job done
    - you can use the command prompt!
    - can be run at build or release time to automate tasks

    What are some uses?
    - https://www.npmjs.org/ - NuGet for Node packages
    - http://bower.io/ - NuGet for UI JavaScript libraries (jQuery, Bootstrap, Angular, etc)
    - http://yeoman.io/ - "Our workflow is comprised of three tools for improving your productivity and satisfaction when building a web app: yo (the scaffolding tool), grunt (the build tool) and bower (for package management)." Yeoman asks which components you want. An alternative: http://joakimbeng.eu01.aws.af.cm/slush-replacing-yeoman-with-gulp/
    - https://www.npmjs.org/package/generator-cg-angular - PhantomJS, LESS

    Git is needed for bower (http://git-scm.com/): run the installer in Windows before you can use bower, select "Run Git from the Windows Command Prompt" in the installer, and reboot afterwards (see http://stackoverflow.com/questions/20069297/bower-git-not-in-the-path-error). A typical setup session then looks like this:

    ```
    npm install -g yo
    npm install -g generator-cg-angular
    mkdir myapp
    cd myapp
    yo cg-angular
    npm install -g bower
    npm install -g grunt-cli
    yo bower
    grunt serve
    grunt test
    grunt build
    ```

    There are many generators (generator-angular is another one). I like the NuGet HotTowel-Angular from John Papa myself. I needed IIS Node for Express (prompted from WebMatrix).

    Other uses:
    - a bat file to start up Karma - see below
    - image compression - https://www.npmjs.org/search?q=optimize+images, https://github.com/heldr/node-smushit - do it from the command line
    - LESS compiling
    - JS and CSS combining and minification at build time with Gulp for RequireJS apps
    - quick lightweight HTTP server - "Express"
    - build pipeline with Grunt or Gulp (http://www.johnpapa.net/gulp-and-grunt-at-anglebrackets/). Gulp is the newer of the two and supposed to be easier to use, but Grunt is more established. https://github.com/johnpapa/ng-demos/tree/master/grunt-gulp
    - https://github.com/assetgraph/assetgraph-builder does a lot of the minimizing, combining, image optimization etc. using Node. Looks interesting.

    More links:
    - http://nodejs.org
    - http://nodeschool.io/
    - http://sub.watchmecode.net/getting-started-with-nodejs-installing-and-writing-your-first-code/
    - https://stormpath.com/blog/build-a-killer-node-dot-js-client-for-your-rest-plus-json-api/
    - https://codio.com/
    - http://www.hanselman.com/blog/ItsJustASoftwareIssueEdgejsBringsNodeAndNETTogetherOnThreePlatforms.aspx

    Run unit tests with Karma from MSBuild. karma-start.bat:

    ```bat
    @echo off
    cd %~dp0\..
    REM 604800 is to make sure we only update once every 7 days
    call npm install --cache-min 604800 -g grunt-cli
    call npm install --cache-min 604800
    call npm install --cache-min 604800 -g karma-cli
    karma start UnitTests\karma.conf.js
    REM karma start UnitTests\karma.conf.js --single-run
    REM see karma-start.bat and karma.conf.js
    REM jsHint comes from NuGet
    ```
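    Since the bat file assumes a UnitTests\karma.conf.js, here is a minimal sketch of what such a config might look like (the paths and frameworks are my assumptions, not from the original notes):

    ```javascript
    // UnitTests/karma.conf.js - minimal Karma configuration sketch
    module.exports = function (config) {
      config.set({
        basePath: '..',                      // resolve files relative to the project root
        frameworks: ['jasmine'],             // test framework to load
        files: [
          'Scripts/angular.js',              // libraries under test come first
          'Scripts/angular-mocks.js',
          'App/**/*.js',                     // application code
          'UnitTests/**/*.spec.js'           // the specs themselves
        ],
        browsers: ['PhantomJS'],             // headless browser for CI runs
        singleRun: false                     // keep watching; use --single-run in CI
      });
    };
    ```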

    Read the article

  • New Horizons arrives at Neptune on a 25-year anniversary!

    - by TATWORTH
    Originally posted on: http://geekswithblogs.net/TATWORTH/archive/2014/08/25/new-horizons-arrives-at-neptune-on-a-25-year-anniversary.aspx

    Today the New Horizons probe to the planet Pluto crosses the orbit of the planet Neptune. By a "cosmic coincidence", this is exactly 25 years since Voyager 2 took close-up pictures of Neptune and its satellite Triton. For more, see http://pluto.jhuapl.edu/mission/passingplanets/passingPlanets_current.php New Horizons' current position is shown at http://pluto.jhuapl.edu/mission/whereis_nh.php

    Read the article

  • Where's my start button?

    - by Dennis Vroegop
    I have to be honest here for a moment. The one thing people most complain about when they talk about Windows 8 is that they miss the Start button. You know, that dreaded thing that everybody hated when it was introduced… I usually don't go into these kinds of discussions unless I am personally involved, but this one I cannot let go. Why are people doing this? Windows 8 is a great OS. They have changed, updated and perfected so many things, so there is enough to talk or write about. Yet all articles or discussions come down to "Where's my Start button?"

    In order to save myself from having to explain this every single time, I wrote this post, and from now on I will simply refer to this blog when I get asked that question. Here it is. Your Start menu is there. It's right in front of your nose. It's two-dimensional, and it's got huge buttons (although they are more than just buttons; they're alive and therefore called Live Tiles). Just go through those tiles and click whatever you want to start up. Don't want to look for an item? Just start typing. Really, it is that simple. When you are on the Start screen, just start typing (part of) the name of the program you want and you'll find it. As you see in the attached example, I started typing "word" and it found Word, Wordfeud, Wordament etc. If you want to find something else besides a program (say you want to change the region you're in), just click on Settings (it will already show you how many hits there are in that section).

    People, my request is: dive into something before you complain about it. Look around. This feature is so much easier to use than the old stuff. But you have to know about it. So. I won't get into this discussion anymore.

    Read the article

  • Why are so many DBCC commands undocumented?

    - by DBA
    Paul Randal of SQLskills.com does a great job of answering the question of why there are so many undocumented DBCC commands in his post Why are so many DBCC commands undocumented? I would like to go on to say that this applies not only to the DBCC commands but, in some respects, to all parts of SQL, other servers, IDEs, operating systems, just about everywhere. There is always something that just does not make it into the official documentation. And, as Paul points out, probably never will make it. That could be why there are so many "Tips & Tricks" types of books, blog posts, etc. everywhere you look. And I also agree with Janos's comment on Paul's post, which was "I'm fine with them undocumented. All of us who need to use these commands know where to find 'documentation' and whom to ask". Till later,

    Read the article

  • Tell Us Once – Guardian Innovation Award Winner

    - by BizTalk Visionary
    Yesterday the Tell Us Once project received its latest accolade. My partner in crime in the execution of the delivery of software for this project, Mark Usher, reports:

    It's always great to receive recognition for the effort you put in when working on a project. It's no secret that here at Solidsoft we are extremely proud of our association with the Government's Tell Us Once (TUO) programme. Having already been selected by Microsoft as Worldwide Partner Conference (WPC) 2011 Award Winners for Application Integration, we are very pleased that the TUO programme as a whole has been recognised and has won the Guardian newspaper's Innovation Nation Award for Frontline Services (http://www.guardian.co.uk/innovation-nation-awards). The TUO entry was judged the winner over three other shortlisted solutions from Dyfed Powys Police, North Yorkshire County Council and Staffordshire County Council.

    Innovation Nation is a partnership between Virgin Media Business and the Guardian, an initiative to uncover the most innovative businesses, public sector organisations and charities in the UK today. Its aim is to showcase the ideas, the endeavour and the energy that are making things better in the areas of customer service, unique working practices, frontline government services and collaboration.

    Solidsoft have been involved with the Tell Us Once programme since its inception in 2007 and worked closely with the Department for Work and Pensions (DWP) to produce a business case for the programme. Teaming up with Atos (who host the application), Solidsoft delivered the first national solution in 2011 and a second phase in April 2012. Whilst currently restricted to distributing citizen data to central government organisations and local government authorities, DWP is now actively engaging with the private sector to see if TUO data can be disclosed to private sector organisations such as banks and building societies. Solidsoft welcome this expansion into the private sector where even more efficiencies will be realised.

    Mark Usher - Solidsoft Sales and Marketing Director

    For my part I'd like to say a big thank you to the Solidsoft, Atos and DWP teams that made it happen.

    Read the article

  • How to undo a changeset using tf.exe rollback

    - by Tarun Arora
    Technorati Tags: Team Foundation Server 2010, Team Foundation Utilities, TFS2010

    Oh no! Did you just check in a changeset to TFS and realize that you need to roll back the changeset because the changes were supposed to go in a different branch? Or did you just accidentally merge a wrong changeset into your release branch? There are several ways to undo the damage.

    Manual: Yes, we all just hate this word, but for the record you could manually roll back the changes. Get Specific Version on the branch and choose the changeset prior to the one you checked in. After that, check out all the files in the changeset and check them in. During the check-in you will receive a conflict. At this point choose 'Keep local changes' in the conflict resolution window and check in the files.

    Automated: Yes, we just love it! TFS comes with a very powerful command line utility, 'tf.exe', that gives you the ability to roll back the effects of one or more changesets to one or more version-controlled items. This command does not remove the changesets from an item's version history. Instead, it creates in your workspace a set of pending changes that negate the effects of the changesets that you specify.

    Syntax:

    ```
    tf rollback /toversion:VersionSpec ItemSpec [/recursive]
       [/lock:none|checkin|checkout] [/version:versionspec]
       [/keepmergehistory] [/login:username,[password]] [/noprompt]

    tf rollback /changeset:ChangesetFrom~ChangesetTo [ItemSpec] [/recursive]
       [/lock:none|checkin|checkout] [/version:VersionSpec]
       [/keepmergehistory] [/noprompt] [/login:username,[password]]
    ```

    I'll explain this with an example. Your workspace is at the location C:\Workspace\MyBranch and you want to roll back changeset # 145621:

    ```
    C:\Workspace\MyBranch>tf.exe rollback /changeset:145621 /recursive
    ```

    How do I roll back a series of changesets? You can roll back a range of changesets by using the following:

    ```
    C:\Workspace\MyBranch>tf.exe rollback /changeset:145601~145621 /recursive
    ```

    This will check out the files in version control and you should be able to see them in the pending changes. Go on, check them in to undo the specific changeset that you just rolled back.

    Do you completely want to get rid of the changeset from all future merges between the two branches? /KeepMergeHistory: this option has an effect only if one or more of the changesets that you are rolling back include a branch or merge change. Specify this option if you want future merges between the same source and the same target to exclude the changes that you are rolling back.

    Errors: if you get the message "Unable to determine the workspace. You may be able to correct this by running 'tf workspaces /collection:TeamProjectCollectionUrl'", you are in the wrong directory. Make sure that you run the 'tf rollback' command from the directory of your workspace.

    Exit codes:

    Exit code 0   - The operation rolled back all items successfully.
    Exit code 1   - The operation rolled back at least one item successfully but could not roll back one or more items.
    Exit code 100 - The operation could not roll back any items.

    To use the command you must have the Read, Check Out, and Check In permissions set to Allow. So, have you been in a rollback undo situation before?
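    As a worked example, here is what a complete rollback-and-check-in session might look like from the workspace directory (the changeset number and check-in comment are placeholders; the check-in step could equally be done from the Pending Changes window in Visual Studio):

    ```
    REM Roll back changeset 145621; this creates pending changes that negate it
    C:\Workspace\MyBranch>tf.exe rollback /changeset:145621 /recursive

    REM Review what the rollback pended before committing it
    C:\Workspace\MyBranch>tf.exe status

    REM Check in the pending changes to complete the undo
    C:\Workspace\MyBranch>tf.exe checkin /comment:"Rollback of changeset 145621" /recursive
    ```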

    Read the article

  • Yippy – the F# MVVM Pattern

    - by MarkPearl
    I did a recent post on implementing WPF with F#. Today I would like to expand on that posting to give a simple implementation of the MVVM pattern in F#. A good read about this topic can also be found on Dean Chalk's blog, although my example of the pattern is possibly simpler.

    With the MVVM pattern one typically has 3 segments: the view, viewmodel and model. With the beauty of WPF binding one is able to link the state-based viewmodel to the view. In my implementation I have kept the same principles. I have a view (MainView.xaml) and a ViewModel (MainViewModel.fs).

    What I would really like to illustrate in this posting is the binding between the View and the ViewModel, so I am going to jump to that… In Program.fs I have the following code:

    ```fsharp
    module Program

    open System
    open System.Windows
    open System.Windows.Controls
    open System.Windows.Markup
    open myViewModels

    // Create the View and bind it to the View Model
    let myView = Application.LoadComponent(
                    new System.Uri("/FSharpWPF;component/MainView.xaml", System.UriKind.Relative)) :?> Window
    myView.DataContext <- new MainViewModel() :> obj

    // Application Entry point
    [<STAThread>]
    [<EntryPoint>]
    let main(_) =
        (new Application()).Run(myView)
    ```

    You can see that I have simply created the view (myView), created an instance of my viewmodel (MainViewModel), and then bound it to the data context with the code:

    ```fsharp
    myView.DataContext <- new MainViewModel() :> obj
    ```

    If I have a look at my viewmodel (MainViewModel) it looks like this:

    ```fsharp
    module myViewModels

    open System
    open System.Windows
    open System.Windows.Input
    open System.ComponentModel
    open ViewModelBase

    type MainViewModel() =
        // private variables
        let mutable _title = "Bound Data to Textbox"

        // public properties
        member x.Title
            with get() = _title
            and set(v) = _title <- v

        // public commands
        member x.MyCommand =
            new FuncCommand(
                (fun d -> true),
                (fun e -> x.ShowMessage))

        // public methods
        member public x.ShowMessage =
            let msg = MessageBox.Show(x.Title)
            ()
    ```

    I have exposed a few things, namely a property called Title that is mutable, a command, and a method called ShowMessage that simply pops up a message box when called. If I then look at my view, which I have created in XAML (MainView.xaml), it looks as follows:

    ```xml
    <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            Title="F# WPF MVVM" Height="350" Width="525">
        <Grid>
            <Grid.RowDefinitions>
                <RowDefinition Height="Auto"/>
                <RowDefinition Height="Auto"/>
                <RowDefinition Height="*"/>
            </Grid.RowDefinitions>
            <TextBox Text="{Binding Path=Title, Mode=TwoWay}" Grid.Row="0"/>
            <Button Command="{Binding MyCommand}" Grid.Row="1">
                <TextBlock Text="Click Me"/>
            </Button>
        </Grid>
    </Window>
    ```

    It is also very simple. It has a button whose command is bound to MyCommand, and a textbox that has its text bound to the Title property. One other module that I have created is my ViewModelBase. Right now it is used to store my commanding function, but I would look to expand on it at a later stage to implement other commonly used functions:

    ```fsharp
    module ViewModelBase

    open System
    open System.Windows
    open System.Windows.Input
    open System.ComponentModel

    type FuncCommand (canExec:(obj -> bool), doExec:(obj -> unit)) =
        let cecEvent = new DelegateEvent<EventHandler>()
        interface ICommand with
            [<CLIEvent>]
            member x.CanExecuteChanged = cecEvent.Publish
            member x.CanExecute arg = canExec(arg)
            member x.Execute arg = doExec(arg)
    ```

    Put this all together and you have a basic project that implements the MVVM pattern in F#.
    For me this is quite exciting, as it turned out to be a lot simpler to do than I originally thought possible. Also, because I have my view in XAML, I can use the XAML designer to design forms in F#, which I believe is a much cleaner way to go than implementing it all in code. Finally, if I look at my viewmodel code, it is actually quite clean and compact…

    Read the article

  • BI Presentation for SharePoint 2010

    - by Leonard Mwangi
    I have uploaded the BI presentation slides for SharePoint 2010 that I presented at the Kansas City SharePoint Users Group. The slides can be downloaded from http://www.etekglobalinc.com/Portals/0/blogs/Business%20intelligence%20with%20SharePoint%202010.ppt

    Read the article

  • Why is NDA so hard to understand?

    - by Dave Campbell
    Maybe this concept is simpler for me because of all the jobs I've been on over the years requiring security clearances. I've signed quite a few NDA forms. Some for big companies, some for small, but the meaning of "NDA" remains constant: Non-Disclosure Agreement. To me, that takes no further explanation, but apparently it's confusing to some people, and I don't understand how you can be confused. The papers I signed with the U.S. Army in 1970 read "10 years and $10,000" for a violation... can't imagine what it's up to now, but THAT is a strict NDA :)

    So those things I've been told, I cannot talk about, period. Even if the entire world knows about them, I cannot speak about them until the information goes off NDA. An example was a Silverlight release a while back. It might have been Silverlight 3, I don't remember. Everyone was anxiously awaiting the release so they could post their material. Of course the entire world knew it was coming out, and imminently so. Some enterprising folks had even found the bits on a server before the official announcement. So then the situation became: everyone knew about it, some were even coding with it and blogging about it, and yet we couldn't talk about it. Scott Guthrie's posting about it opened the flood gates and then it went off NDA, but up until that moment, we were locked.

    Sitting out on the edge you're uninstalling and re-installing all the time, and you get frustrated when things that used to work don't, but hey... those bits were still warm when you got 'em, and that's the fun. But that fun comes at a price, and the price is the NDA. Awkward yes, confusing no... See you at MIX10, and Stay in the 'Light!

    MIX10

    Read the article

  • My First Dive into Ocean of SharePoint

    - by DipeshBhanani
    Hello guys, I am Dipesh Bhanani, an IT consultant at an MNC. I have worked with many clients as a SharePoint consultant over the years. I have been on various successful engagements deploying Project Server 2003/2007, SharePoint 2003, MOSS 2007 and InfoPath 2007. People have asked me for years why I don't start blogging. I have come across many technical hurdles in the ocean of SharePoint and resolved them passionately, so I thought I should share my knowledge to alleviate SharePoint troubles. Wish me luck on my ride in the world of SharePoint, and please share the good and the bad with me right here on my blog! More to come soon!

    Read the article

  • Have you used nDepend?

    - by Nick Harrison
    Have you used NDepend? I have often wanted to use it, but never spent the money on it. I have developed many tools that try to do pieces of what NDepend does, but never with as much success as it achieves. Put simply, it is a tool that will allow you to understand and monitor the architecture of your software, and it does it in some pretty amazing ways. One of the most impressive features is something they call Code Query Language (CQL). It allows you to write queries very similar to SQL to track various software metrics and use them to identify areas that are out of compliance with your standards and architecture. For instance, once you have analyzed your project, you can write queries such as: SELECT METHODS WHERE IsPublic AND CouldBePrivate. You can also set up such queries to provide warnings if there are records returned, and you can incorporate this into your daily build and compare build against build. There are over 82 metrics included to allow you to view your code from a variety of angles. I have often advocated for a "Code Inventory" database to track the state of software and the ROI on software investments. This tool alone will take you about 90% of the way there. If you are not using it yet, I strongly recommend that you do!
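    To give a flavour of the kind of rules you can express, here are a few more CQL-style queries (sketched from memory against the CQL of that era, so treat the exact metric names and syntax as assumptions rather than gospel):

    ```
    // Flag methods that have grown too large
    WARN IF Count > 0 IN SELECT METHODS WHERE NbLinesOfCode > 30

    // Find complex methods that are poorly commented
    WARN IF Count > 0 IN SELECT METHODS WHERE CyclomaticComplexity > 15 AND PercentageComment < 10

    // List the types with the most methods, biggest first
    SELECT TYPES WHERE NbMethods > 20 ORDER BY NbMethods DESC
    ```

    The WARN IF form is what makes the daily-build scenario work: a query that returns rows fails the quality gate, so architectural drift shows up build against build instead of months later.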

    Read the article

  • F# Simple Twitter Update

    - by mroberts
    A short while ago I posted some code for a C# Twitter update. I decided to move the same functionality / logic to F#. Here is what I came up with:

    ```fsharp
    namespace Server.Actions

    open System
    open System.IO
    open System.Net
    open System.Text

    type public TwitterUpdate() =

        //member variables
        [<DefaultValue>] val mutable _body : string
        [<DefaultValue>] val mutable _userName : string
        [<DefaultValue>] val mutable _password : string

        //Properties
        member this.Body with get() = this._body and set(value) = this._body <- value
        member this.UserName with get() = this._userName and set(value) = this._userName <- value
        member this.Password with get() = this._password and set(value) = this._password <- value

        //Methods
        member this.Execute() =
            let login = String.Format("{0}:{1}", this._userName, this._password)
            let creds = Convert.ToBase64String(Encoding.ASCII.GetBytes(login))
            let tweet = Encoding.ASCII.GetBytes(String.Format("status={0}", this._body))
            let request = WebRequest.Create("http://twitter.com/statuses/update.xml") :?> HttpWebRequest

            request.Method <- "POST"
            request.ServicePoint.Expect100Continue <- false
            request.Headers.Add("Authorization", String.Format("Basic {0}", creds))
            request.ContentType <- "application/x-www-form-urlencoded"
            request.ContentLength <- int64 tweet.Length

            let reqStream = request.GetRequestStream()
            reqStream.Write(tweet, 0, tweet.Length)
            reqStream.Close()

            let response = request.GetResponse() :?> HttpWebResponse

            match response.StatusCode with
            | HttpStatusCode.OK -> true
            | _ -> false
    ```

    While the above seems to work, it feels to me like it is not taking advantage of some functional concepts. I'd love to get some feedback on how to make the above more "functional" in nature. For example, I don't like the mutable properties.
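    As one possible answer to the author's own question, here is a sketch of a more functional shape for the same logic: the mutable properties disappear and the whole thing becomes a single function taking its inputs as arguments (this refactoring is my suggestion, not code from the original post):

    ```fsharp
    open System
    open System.Net
    open System.Text

    /// Posts a status update and returns true on HTTP 200.
    /// Credentials and body are plain arguments instead of mutable state.
    let updateStatus (userName: string) (password: string) (body: string) =
        let creds = Convert.ToBase64String(Encoding.ASCII.GetBytes(sprintf "%s:%s" userName password))
        let tweet = Encoding.ASCII.GetBytes(sprintf "status=%s" body)
        let request = WebRequest.Create("http://twitter.com/statuses/update.xml") :?> HttpWebRequest
        request.Method <- "POST"
        request.ServicePoint.Expect100Continue <- false
        request.Headers.Add("Authorization", sprintf "Basic %s" creds)
        request.ContentType <- "application/x-www-form-urlencoded"
        request.ContentLength <- int64 tweet.Length
        // 'use' bindings dispose the stream and response deterministically
        use reqStream = request.GetRequestStream()
        reqStream.Write(tweet, 0, tweet.Length)
        use response = request.GetResponse() :?> HttpWebResponse
        response.StatusCode = HttpStatusCode.OK
    ```

    Partial application then gives back the convenience of a pre-configured object: `let tweetAsMe = updateStatus "myUser" "myPass"`.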

    Read the article

  • Top 25 security issues for developers of web sites

    - by BizTalk Visionary
    Sourced from: CWE

    This is a brief listing of the Top 25 items, using the general ranking. NOTE: 16 other weaknesses were considered for inclusion in the Top 25, but their general scores were not high enough. They are listed in the On the Cusp focus profile.

    Rank  Score  ID       Name
    [1]   346    CWE-79   Failure to Preserve Web Page Structure ('Cross-site Scripting')
    [2]   330    CWE-89   Improper Sanitization of Special Elements used in an SQL Command ('SQL Injection')
    [3]   273    CWE-120  Buffer Copy without Checking Size of Input ('Classic Buffer Overflow')
    [4]   261    CWE-352  Cross-Site Request Forgery (CSRF)
    [5]   219    CWE-285  Improper Access Control (Authorization)
    [6]   202    CWE-807  Reliance on Untrusted Inputs in a Security Decision
    [7]   197    CWE-22   Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
    [8]   194    CWE-434  Unrestricted Upload of File with Dangerous Type
    [9]   188    CWE-78   Improper Sanitization of Special Elements used in an OS Command ('OS Command Injection')
    [10]  188    CWE-311  Missing Encryption of Sensitive Data
    [11]  176    CWE-798  Use of Hard-coded Credentials
    [12]  158    CWE-805  Buffer Access with Incorrect Length Value
    [13]  157    CWE-98   Improper Control of Filename for Include/Require Statement in PHP Program ('PHP File Inclusion')
    [14]  156    CWE-129  Improper Validation of Array Index
    [15]  155    CWE-754  Improper Check for Unusual or Exceptional Conditions
    [16]  154    CWE-209  Information Exposure Through an Error Message
    [17]  154    CWE-190  Integer Overflow or Wraparound
    [18]  153    CWE-131  Incorrect Calculation of Buffer Size
    [19]  147    CWE-306  Missing Authentication for Critical Function
    [20]  146    CWE-494  Download of Code Without Integrity Check
    [21]  145    CWE-732  Incorrect Permission Assignment for Critical Resource
    [22]  145    CWE-770  Allocation of Resources Without Limits or Throttling
    [23]  142    CWE-601  URL Redirection to Untrusted Site ('Open Redirect')
    [24]  141    CWE-327  Use of a Broken or Risky Cryptographic Algorithm
    [25]  138    CWE-362  Race Condition

    Cross-site scripting and SQL injection are the 1-2 punch of security weaknesses in 2010. Even when a software package doesn't primarily run on the web, there's a good chance that it has a web-based management interface or HTML-based output formats that allow cross-site scripting. For data-rich software applications, SQL injection is the means to steal the keys to the kingdom. The classic buffer overflow comes in third, while more complex buffer overflow variants are sprinkled in the rest of the Top 25.
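    Since SQL injection (CWE-89) sits at the top of the list, a one-screen illustration of the standard fix is worth having: never concatenate user input into SQL text, always bind it as a parameter. A minimal C# sketch, where the table and column names are invented for illustration:

    ```csharp
    using System.Data.SqlClient;

    static class UserLookup
    {
        // Returns true if a user with the given name exists.
        // 'connectionString' is assumed to come from configuration.
        public static bool UserExists(string connectionString, string name)
        {
            // VULNERABLE (CWE-89): building the query by concatenation
            //   "SELECT COUNT(*) FROM Users WHERE Name = '" + name + "'"
            // SAFE: the value travels as a bound parameter, never as SQL text
            using (var conn = new SqlConnection(connectionString))
            using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Users WHERE Name = @name", conn))
            {
                cmd.Parameters.AddWithValue("@name", name);
                conn.Open();
                return (int)cmd.ExecuteScalar() > 0;
            }
        }
    }
    ```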

    Read the article

  • Penne alla MVP

    - by Valter Minute
    I'm sorry for the long silence on this blog and the long delay in replying to the friends that commented on my articles. I've been quite busy in the last weeks and I spent a lot of time traveling around Italy (not for pleasure!). In the meantime I've been renewed as an MVP on April the 1st (nice date to renew someone with such a bad sense of humor…). I decided to celebrate my MVP award with a new recipe (to be honest, I celebrated by eating the results of this recipe!) and I decided to call it "penne alla MVP"… just because I'm not good at finding nice names for my recipes.

    Ingredients (for 4 people):
    - 360g pasta (penne or other short pasta)
    - 300g small shrimps
    - 1 cup of whipped cream
    - 2 tablespoons of olive oil
    - 1 small leek
    - 1 glass of beer (I used Hoegaarden Belgian white beer… but just because I like it, and I finished the rest of the bottle while cooking)
    - Chives
    - Salt, pepper

    Prepare the pasta by boiling it in salted water, as usual. In the meantime, chop the leek into very small bits, heat the oil in a pan, and when the oil is hot, drop in the leek chops and let them cook for a few minutes. Add the shrimps and the glass of beer. Let them cook in the beer until they are done (if you used pre-cooked shrimps, a couple of minutes is enough to heat them and give them the flavour of beer). Add the whipped cream and mix it well with the shrimps and the sauce. Drain the pasta, drop the sauce on top of it, and then add the finely chopped chives.

    Read the article

  • Invitation for the ArcSig Meeting on May 18th! New Presentation!

    - by Rainer Habermann
    The Fort Lauderdale ArcSig May meeting will be on 05/18/2010 at 6:30 PM at the Microsoft office in Fort Lauderdale. Jeff Barnes, Microsoft Architect Evangelist for the Gulf States, presents: Developing with New Parallel Computing Technologies in VS 2010 and .NET 4.0. Register at http://www.fladotnet.com/ - free pizza, soft drinks and small talk at 6:00 PM! I am looking forward to seeing you at the meeting! Rainer Habermann, ArcSig Site Director

    Read the article

  • Information on upgrading Kinect Applications to MS SDK Beta 2.

    - by mbcrump
    Introduction

    Microsoft recently released the Kinect for Windows SDK Beta 2. It contains many enhancements and fixes that can be found here. The only problem with it is that a lot of current demo applications no longer function properly. Today, I'm going to walk you through a typical scenario of upgrading a Kinect application built with Beta 1 to Beta 2. Note: this tutorial covers WPF, but you can use the same techniques for WinForms.

    1) Fix the references

    Let's start with a fairly popular Kinect demo called Kinect User Interface Demo. This project uses the Beta 1 version of Microsoft.Research.Kinect.dll and version 1.0.0.0 of Coding4Fun's Kinect library. After you download the source code and extract the zip, you will see the references in Visual Studio 2010. Pay attention to the following references, as these are the .dlls that you will have to update:

    - Coding4Fun.Kinect.Wpf
    - Microsoft.Research.Kinect

    If you click on the Coding4Fun.Kinect.Wpf file you will see the version information (v1.0.0.0). This needs to be upgraded to the Coding4Fun Kinect library built against Beta 2, so head over to http://c4fkinect.codeplex.com/, hit download, and you will have the new files. Go ahead and hit the Delete key on your keyboard to remove the Coding4Fun.Kinect.Wpf.dll file from your project. Select "Add Reference", navigate out to the folder where you extracted the files, and select Coding4Fun.Kinect.Wpf.dll. If you click on the Coding4Fun.Kinect.Wpf.dll file and check properties, it should be listed at 1.1.0.0.

    Fix Microsoft.Research.Kinect.dll

    The official SDK Beta 2 released a new .dll that you will need to reference in your application. Go ahead and select Microsoft.Research.Kinect.dll in your application and hit the Delete key on your keyboard. Select Add Reference again and select Microsoft.Research.Kinect.dll from the .NET tab. Double check and make sure the version number is 1.0.0.45.

    References fixed – Runtime needs to be updated

    So we have fixed the references in a typical Kinect application that uses Microsoft's SDK and C4F Kinect libraries. Now we will need to update the runtime. All Beta 1 Kinect applications instantiate the Runtime with the following code:

    ```csharp
    readonly Runtime _runtime = new Runtime();
    ```

    Can you see that it is now marked with [Deprecated]? That means we need to update it before Microsoft decides to remove it from future versions of the SDK. We can fix this very easily by replacing that line with:

    ```csharp
    Microsoft.Research.Kinect.Nui.Runtime _nui;
    ```

    and adding similar code to our Loaded event as shown below:

    ```csharp
    public MainWindow()
    {
        InitializeComponent();
        Loaded += new RoutedEventHandler(MainWindow_Loaded);
    }

    void MainWindow_Loaded(object sender, RoutedEventArgs e)
    {
        if (Runtime.Kinects.Count == 0)
        {
            txtInfo.Text = "Missing Kinect";
        }
        else
        {
            _nui = Runtime.Kinects[0];
            _nui.Initialize(RuntimeOptions.UseColor);

            // Video Frame Ready Event can happen now!!!
            //_nui.VideoFrameReady += new EventHandler<ImageFrameReadyEventArgs>(_nui_VideoFrameReady);

            _nui.VideoStream.Open(ImageStreamType.Video, 2, ImageResolution.Resolution640x480, ImageType.Color);
        }
    }
    ```

    In this sample, I am testing to see if a Kinect is detected, and if it is, then I initialize the runtime with my first Kinect by using Runtime.Kinects[0]. You can also specify other Kinect devices here. The rest of the code is standard code that you simply modify however you wish (i.e. Skeletal, Depth, etc.) depending on what type of video feed you want.

    Conclusion

    As you can see, it really wasn't that painful to upgrade your project to Beta 2. I would recommend that you go ahead and upgrade, as future versions of the SDK will use these methods. Thanks for reading. Subscribe to my feed

    Read the article
