Search Results

Search found 13788 results on 552 pages for 'instance caging'.


  • CodePlex Daily Summary for Friday, June 07, 2013

    CodePlex Daily Summary for Friday, June 07, 2013Popular ReleasesASP.NET MVC Forum: MVCForum v1.3.5: This is a bug release version, with a couple of small usability features and UI changes. All the small amount of bugs reported in v1.3 have been fixed, no upgrade needed just overwrite the files and everything should just work.Json.NET: Json.NET 5.0 Release 6: New feature - Added serialized/deserialized JSON to verbose tracing New feature - Added support for using type name handling with ISerializable content Fix - Fixed not using default serializer settings with primitive values and JToken.ToObject Fix - Fixed error writing BigIntegers with JsonWriter.WriteToken Fix - Fixed serializing and deserializing flag enums with EnumMember attribute Fix - Fixed error deserializing interfaces with a valid type converter Fix - Fixed error deser...Christoc's DotNetNuke Module Development Template: DotNetNuke 7 Project Templates V2.3 for VS2012: V2.3 - Release Date 6/5/2013 Items addressed in this 2.3 release Fixed bad namespace for BusinessController in one of the C# templates. Updated documentation in all templates. Setting up your DotNetNuke Module Development Environment Installing Christoc's DotNetNuke Module Development Templates Customizing the latest DotNetNuke Module Development Project TemplatesPulse: Pulse 0.6.7.0: A number of small bug fixes to stabilize the previous Beta. Sorry about the never ending "New Version" bug!ZXMAK2: Version 2.7.5.3: - debugger: add LPC indicator (last executed opcode pc) - add host joystick support (written by Eltaron) - change file extension for CMOS PENTEVO to "cmos" - add hardware value monitor (see Memory Map for PENTEVO/ATM/PROFI)QlikView Extension - Animated Scatter Chart: Animated Scatter Chart - v1.0: Version 1.0 including Source Code qar File Example QlikView application Tested With: Browser Firefox 20 (x64) Google Chrome 27 (x64) Internet Explorer 9 QlikView QlikView Desktop 11 - SR2 (x64) QlikView Desktop 11.2 - SR1 (x64) QlikView Ajax Client 11.2 - SR2 (based on x64)BarbaTunnel: BarbaTunnel 7.2: Warning: HTTP Tunnel is not compatible with version 6.x and prior, HTTP packet format has been changed. Check Version History for more information about this release.SuperWebSocket, a .NET WebSocket Server: SuperWebSocket 0.8: This release includes these changes below: Upgrade SuperSocket to 1.5.3 which is much more stable Added handshake request validating api (WebSocketServer.ValidateHandshake(TWebSocketSession session, string origin)) Fixed a bug that the m_Filters in the SubCommandBase can be null if the command's method LoadSubCommandFilters(IEnumerable<SubCommandFilterAttribute> globalFilters) is not invoked Fixed the compatibility issue on Origin getting in the different version protocols Marked ISub...BlackJumboDog: Ver5.9.0: 2013.06.04 Ver5.9.0 (1) ?????????????????????????????????($Remote.ini Tmp.ini) (2) ThreadBaseTest?? (3) ????POP3??????SMTP???????????????? (4) Web???????、?????????URL??????????????? (5) Ftp???????、LIST?????????????? (6) ?????????????????????Media Companion: Media Companion MC3.569b: New* Movies - Autoscrape/Batch Rescrape extra fanart and or extra thumbs. * Movies - Alternative editor can add manually actors. * TV - Batch Rescraper, AutoScrape extrafanart, if option enabled. Fixed* Movies - Slow performance switching to movie tab by adding option 'Disable "Not Matching Rename Pattern"' to Movie Preferences - General. 
* Movies - Fixed only actors with images were scraped and added to nfo * Movies - Fixed filter reset if selected tab was above Home Movies. * Updated Medi...Nearforums - ASP.NET MVC forum engine: Nearforums v9.0: Version 9.0 of Nearforums with great new features for users and developers: SQL Azure support Admin UI for Forum Categories Avoid html validation for certain roles Improve profile picture moderation and support Warn, suspend, and ban users Web administration of site settings Extensions support Visit the Roadmap for more details. Webdeploy package sha1 checksum: 9.0.0.0: e687ee0438cd2b1df1d3e95ecb9d66e7c538293b Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.93: Added -esc:BOOL switch (CodeSettings.AlwaysEscapeNonAscii property) to always force non-ASCII character (ch > 0x7f) to be escaped as the JavaScript \uXXXX sequence. This switch should be used if creating a Symbol Map and outputting the result to the a text encoding other than UTF-8 or UTF-16 (ASCII, for instance). Fixed a bug where a complex comma operation is the operand of a return statement, and it was looking at the wrong variable for possible optimization of = to just .VG-Ripper & PG-Ripper: VG-Ripper 2.9.42: changes NEW: Added Support for "GatASexyCity.com" links NEW: Added Support for "ImgCloud.co" links NEW: Added Support for "ImGirl.info" links NEW: Added Support for "SexyImg.com" links FIXED: "ImageBam.com" linksDocument.Editor: 2013.22: What's new for Document.Editor 2013.22: Improved Bullet List support Improved Number List support Minor Bug Fix's, improvements and speed upsCarrotCake, an ASP.Net WebForms CMS: Binaries and PDFs - Zip Archive (v. 4.3 20130528): Features include a content management system and a robust featured blogging engine. This includes configurable date based blog post URLs, blog post content association with categories and tags, assignment/customization of category and tag URL patterns, simple blog post feedback collection and review, blog post pagination/indexes, designation of default blog page (required to make search, category links, or tag links function), URL date formatting patterns, RSS feed support for posts and pages...PHPExcel: PHPExcel 1.7.9: See Change Log for details of the new features and bugfixes included in this release, and methods that are now deprecated.Droid Explorer: Droid Explorer 0.8.8.10 Beta: Fixed issue with some people having a folder called "android-4.2.2" in their build-tools path. - 16223 Magick.NET: Magick.NET 6.8.5.402: Magick.NET compiled against ImageMagick 6.8.5.4. These zip files are also available as a NuGet package: https://nuget.org/profiles/dlemstra/patterns & practices: Data Access Guidance: Data Access Guidance Drop3 2013.05.31: Drop 3DotNet.Highcharts: DotNet.Highcharts 2.0 with Examples: DotNet.Highcharts 2.0 Tested and adapted to the latest version of Highcharts 3.0.1 Added new chart types: Arearange, Areasplinerange, Columnrange, Gauge, Boxplot, Waterfall, Funnel and Bubble Added new type PercentageOrPixel which represents value of number or number with percentage. Used for sizes, width, height, length, etc. Removed inheritances in YAxis option classes. Closed issues: 682: Missing property - XAxisPlotLinesLabel.Text 688: backgroundColor and plotBackgroundColor are...New ProjectsAccountingTest: just to learn asp.net mvc 3 Agile Poker Cards for Windows Mobile: During a scrum or other agile processes, you have to estimate the size of a user story during a planning session. 
With the Agile Poker Cards program there is no need for using real cards anymore!Buildinator: Buildinator generates TFS Build definitions from an XML file, enabling canonical "templates" that make it easy to add or copy build definitions.Clipboard Capture Plugin: Captures an image in the clipboard and gives you more options to insert the image into Live WriterComercial HS: Commercial hsCommonExtranet: CommonExtranet is a basis for an Extranet web site with a user authentication mechanism that incorporates password aging and various features expected on a domain LogOnDataVeryLite: DataVeryLite is a lightweight *Persistence Framework*. DataVeryLite???????*?????*. ??????Nhibernate?????,??Linq to sql???????,?????DataVeryLite.daydayup: snd\realdamon_cpDNN Extension Url Providers: The DNN Extension Url Providers project contains installable extensions for extending DNN URL functionality.DotNetNuke Kitchen Sink: A sample module project for DotNetNuke with a variety of different scenarios covered.Football Team Management: Manage team, player, match and staffFreePiano: Play piano using your computer keyboard.GIF animator: Dev in progessI'm Feeling Lucky Plugin: Lets you put a link in that acts as though doing an I'm Feeling Lucky search.Insert Video Jnr: This is a baby version of my Video plugin, it is intended for Hosted Wordpress blogs only and shouldn't be used with other blog providers.jabbrmercurial: 22Kax.WebControls.RadioButtonList: Web Custom Control that extend RadioButtonList to allow uncheckable state.Kinect Screen Aware: Kinect Screen Aware uses a Kinect to detect touch, hover, gestures, and voice on a standard television display. It's designed to be low cost and easy to setuplppbop: Aplikasi Laporan Bantuan Operasional PendidikanmobiSms: mobismsnga: National Geography of AzerothRadminPassword: ????????? ??? ??????????????? ????? ??????? ? ????????? ????????? ?????????? ?????????? ?? Radmin. A program to automatically enter the passwords in the famous PC remote control software Radmin.Rx Heat: Rx Heat is a library of helper classes that complements the Reactive Extensions Library with additional features. Schema Generator: The basic idea behind this utility is to emit the database schema from an existing SQL Server database. From a developer perspective, it is sometimes very much handy to quickly take a printout of the database structure for creating the UI layout.SharePoint Packager: Perform the instalation, upgrade and retraction of Ms Sharepoint Applications fast, easy and efficientsmartTouch: :-)SpotifyLync: A small tray application that reports your Spotify status to your Microsoft Lync client. Alos contains additional Spotify / Lync features.Syngine: A simple to use game framework using MonoGame and Farseer Physicstest060601CM: testtestMC053003: testToSic.Eav: A powerfull EAV (Entity-Attribute-Value) system created by 2sic Internet Solutions in Switzerland. It's currently mainly used inside 2SexyContent for DotNetNukeTraceLight: <project name> TraceLight ray tracer </project name> <programming language> C# </programming language>trakr: minimalist webtracking software written in python and twistedTwitterXML: A .NET wrapper library for the Twitter REST API. Currently, all of the methods return an XMLDocument. 
Also included are classes for Users, Statuses, and Direct Messages that use XML serialization for converting the XML responses to objects with a Deserialize() call.Universal Parking Centre: Universal Parking Centre is a website-based software developed by Center Code to help you in organizing your parking business.Velocity OS: Be fast, Be strong. It's Velocity.WinKeGen Code Samples: This project will allow beginning developers a close look at some code samples and variations of how to use those samples in their own code.WinRT Synth lib: Project Description this project aims to provide an easy-to-use API, for sound synthesis under winrt, in c#. It use the XAudio2 api for the playback of the sounWpfCollaborative3D: WpfCollaborative3DX-Parking: Our online parking sites , try at : x-parking.pemrogramaninternet.infoYnote Plugins: Ynote Classic Plugins which help in transforming Ynote Classic into a powerful HTML / XML Editor or an IDE.


  • Checking who is connected to your server, with PowerShell.

    - by Fatherjack
    There are many occasions when, as a DBA, you want to see who is connected to your SQL Server, along with how they are connecting and what sort of activities they are carrying out. I’m going to look at a couple of ways of getting this information and compare the effort required and the results achieved by each. SQL Server comes with a couple of stored procedures to help with this sort of task – sp_who and its undocumented counterpart sp_who2. There is also the pumped-up version of these, sp_whoisactive, written by Adam Machanic, which does far more than these procedures. I wholly recommend you try it out if you don’t already know how it works; when it comes to serious interrogation of your SQL Server activity it is absolutely indispensable. Anyway, back to the point of this blog: we are going to look at getting the information from sp_who2 for a remote server. I wrote this PowerShell script a week or so ago and was quietly happy with it for a while. I’m relatively new to PowerShell, so forgive both my rather low threshold for entertainment and the fact that something so simple is a moderate achievement for me.

        $Server = 'SERVERNAME'
        $SMOServer = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
        # connection and query stuff
        $ConnectionStr = "Server=$Server;Database=Master;Integrated Security=True"
        $Query = "EXEC sp_who2"
        $Connection = New-Object System.Data.SqlClient.SqlConnection
        $Table = New-Object "System.Data.DataTable"
        $Connection.ConnectionString = $ConnectionStr
        try
        {
            $Connection.Open()
            $Command = $Connection.CreateCommand()
            $Command.CommandText = $Query
            $result = $Command.ExecuteReader()
            $Table.Load($result)
        }
        catch
        {
            # Show error
            $error[0] | Format-List -Force
        }
        $Title = "Data access processes (" + $Table.Rows.Count + ")"
        $Table | Out-GridView -Title $Title
        $Connection.Close()

    So this is pretty straightforward: create an SMO object that represents our chosen server, define a connection to the database and a table object for the results, execute our query over the connection, load the results into our table object and then, if everything is error free, display these results in the PowerShell grid viewer. The query simply gets the results of ‘EXEC sp_who2’ for us. How many connections there are will influence how long the query runs. The grid viewer lets me sort and search the results, so it can be a pretty handy way to locate troublesome connections. Like I say, I was quite pleased with this; it seems a pretty simple script and was working well for me, and I have added a few parameters to control the output and give me more specific details. But then I saw a script that uses the $SMOServer object itself to provide the process information, which saves having to define the connection object and query specifications.

        $Server = 'SERVERNAME'
        $SMOServer = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
        $Processes = $SMOServer.EnumProcesses()
        $Title = "SMO processes (" + $Processes.Rows.Count + ")"
        $Processes | Out-GridView -Title $Title

    Create the SMO object of our server and then call the EnumProcesses method to get all the process information from the server. Staggeringly simple! The results are a little different though. Some columns are the same and we can see the same basic information, so my first thought was to find out which runs faster – so that I can get my results more quickly and also so that I place less stress on my server(s). PowerShell comes with a great way of testing this – the Measure-Command function.
    All you have to do is wrap your piece of code in Measure-Command {[your code here]} and it will spit out the time taken to execute the code. So, I placed both of the above methods of getting SQL Server process connections in two Measure-Command wrappers and pressed F5! The PowerShell console goes blank for a while as the code is executed internally when Measure-Command is used, but the grid viewer windows appear and the console shows the timing output. You can take the output from Measure-Command and format it for easier reading, but in a simple comparison like this we can simply cross-refer the TotalMilliseconds values from the two result sets to see how the two methods performed. The query execution method (running EXEC sp_who2) is the first set of timings and the SMO EnumProcesses is the second. I have run these on a variety of servers and, while the results vary from execution to execution, I have never seen the SMO version slower than the other. The difference has varied, and the time for both has ranged from sub-second to almost 5 seconds depending on the system. This difference, I would suggest, is partly due to the cost overhead of having to construct the data connection and so on, whereas the SMO EnumProcesses method has the connection to the server already in place and just needs to call back the process information. There is also the difference in the data sets to consider. Let’s take a look at what we get and where the two methods differ.

        sp_who2 column   SMO EnumProcesses     Description
        -                Urn                   What looks like an XML or JSON representation of the server name and the process ID
        SPID             Spid                  The process ID
        Status           Status                The status of the process
        Login            Login                 The login name of the user executing the command
        HostName         Host                  The name of the computer where the process originated
        BlkBy            BlockingSpid          The SPID of a process that is blocking this one
        DBName           Database              The database that this process is connected to
        Command          Command               The type of command that is executing
        CPUTime          Cpu                   The CPU activity related to this process
        DiskIO           -                     The Disk IO activity related to this process
        LastBatch        -                     The time the last batch was executed from this process
        ProgramName      Program               The application that is facilitating the process connection to the SQL Server
        SPID1            -                     In my experience this is always the same value as SPID
        REQUESTID        -                     In my experience this is always 0
        -                Name                  In my experience this is always the same value as SPID, and so could be seen as analogous to SPID1 from sp_who2
        -                MemUsage              An indication of the memory used by this process, but I don’t know what it is measured in (bytes, Kb, Mb…)
        -                IsSystem              True or False depending on whether the process is internal to the SQL Server instance or has been created by an external connection requesting data
        -                ExecutionContextID    In my experience this is always 0, so it could be analogous to REQUESTID from sp_who2

    Please note, these are my own very brief descriptions of these columns; detail can be found on MSDN for the columns in the sp_who results here: http://msdn.microsoft.com/en-GB/library/ms174313.aspx. Where the columns are common I would use that description; in other cases the information returned is purely for interpretation by the reader. Rather annoyingly, both result sets have useful information that the other doesn’t. sp_who2 returns Disk IO and LastBatch information, which is really useful, but the SMO processes method gives you IsSystem and MemUsage, which have their place in fault diagnosis too. So which is better?
    On reflection I think I prefer to use the sp_who2 method primarily, but knowing that the SMO EnumProcesses method is there when I need it is really useful and I’m sure I’ll use it regularly. I’m OK with the fact that it is the slower method, because Measure-Command has shown me how close it is to the other option and that the margin really isn’t large enough to matter.
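
    For reference, a minimal sketch of the timing comparison described above. This is only an illustration, not the author’s original script: it assumes the SMO assemblies are already loaded (for example by the sqlps module) and that 'SERVERNAME' is replaced with a real instance name.

        # Hedged sketch: time the two approaches with Measure-Command and compare TotalMilliseconds.
        $Server = 'SERVERNAME'

        $queryTime = Measure-Command {
            # sp_who2 over an explicit ADO.NET connection
            $Connection = New-Object System.Data.SqlClient.SqlConnection
            $Connection.ConnectionString = "Server=$Server;Database=master;Integrated Security=True"
            $Table = New-Object System.Data.DataTable
            $Connection.Open()
            $Command = $Connection.CreateCommand()
            $Command.CommandText = "EXEC sp_who2"
            $Table.Load($Command.ExecuteReader())
            $Connection.Close()
        }

        $smoTime = Measure-Command {
            # SMO EnumProcesses over the server object's own connection
            $SMOServer = New-Object Microsoft.SqlServer.Management.Smo.Server $Server
            $Processes = $SMOServer.EnumProcesses()
        }

        "sp_who2 query    : {0:N0} ms" -f $queryTime.TotalMilliseconds
        "SMO EnumProcesses: {0:N0} ms" -f $smoTime.TotalMilliseconds

    The grid viewer calls are left out of the timed blocks here so that only the data retrieval itself is measured.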


  • Log Blog

    - by PointsToShare
    © 2011 By: Dov Trietsch. All rights reserved

    Logging – A log blog

    In another blog (Missing Fields and Defaults) I spoke about not doing a blog about log files, but then I looked at it again and realized that this is a nice opportunity to show a simple yet powerful tool and also to deal with static variables and functions in C#. My log had to be able to answer a few simple logging rules:

    To log or not to log? That is the question – Always log! That is the answer.

    Do we share a log? Even when a file is opened with a minimal lock, it does not share well and performance greatly suffers, so sharing a log is not a good idea. Also, when sharing, it is harder to find your particular entries and you have to establish rules about retention. My recommendation – Do Not Share!

    How verbose? Your log can be very verbose – a good thing when testing; very terse – a good thing in day-to-day runs; or somewhere in between. You must be the judge. In my blog, I elect to always report a run with start and end times, and always report errors. I normally use 5 levels of logging: 4 – write all, 3 – write more, 2 – write some, 1 – write errors and timing, 0 – write none. The code sample below is more general than that. It uses the config file to set the max log level, and each call to the log assigns a level to the call itself. If the level is above the .config highest level, the line will not be written. Programmers decide which log entry belongs to which level, and thus we can set the .config differently for production and testing.

    Where do I keep the log? If your career is important to you, discuss this with the boss and with the system admin. We keep logs on the L: drive of our server and make sure that we have a directory for each app that needs a log. When adding a new app, add a new directory. The default location for the log is also found in the .config file.

    Print one or many? There are two options here:

    1. Print many, open but once – you start the stream and close it only when the program ends. This is what you can do when you perform in “batch” mode, like in a console app or a stsadm extension. The advantage of this is that starting and closing a stream is expensive and time consuming, and because we use a unique file, keeping it open for a long time does not cause contention problems.

    2. Print one entry at a time, or open many – every time you write a line, you start the stream, write to it and close it. This works for event receivers, feature receivers, and web parts. Here scalability requires us to create objects on the fly and get rid of them as soon as possible.

    A default value of oneOrMany resides in the .config. All of the above applies to any Windows or web application, not just SharePoint. So as usual, here is a routine that does it all, and a few simple functions that call it for a variety of purposes.
    So without further ado, here is app.config:

        <?xml version="1.0" encoding="utf-8" ?>
        <configuration>
            <configSections>
                <sectionGroup name="applicationSettings" type="System.Configuration.ApplicationSettingsGroup, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" >
                    <section name="statics.Properties.Settings" type="System.Configuration.ClientSettingsSection, System, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" requirePermission="false" />
                </sectionGroup>
            </configSections>
            <applicationSettings>
                <statics.Properties.Settings>
                    <setting name="oneOrMany" serializeAs="String">
                        <value>False</value>
                    </setting>
                    <setting name="logURI" serializeAs="String">
                        <value>C:\staticLog.txt</value>
                    </setting>
                    <setting name="highestLevel" serializeAs="String">
                        <value>2</value>
                    </setting>
                </statics.Properties.Settings>
            </applicationSettings>
        </configuration>

    And now the code. In order to persist the variables between calls, and also to be able to persist (or not to persist) the log file itself, I created an EventLog class with static variables and functions. Static functions do not need an instance of the class in order to work. If you ever wondered why our Main function is static, the answer is that something needs to run before instantiation so that other objects may be instantiated, and this is what the “static” Main does. The various logging functions and variables are created as static because they do not need instantiation and, as a fringe benefit, they remain un-destroyed between calls. The Main function here is just used for testing. Note that it does not instantiate anything, it just uses the log functions. This is possible because the functions are static. Also note that the function calls are of the form Class.Function.

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using System.IO;

        namespace statics
        {
            class Program
            {
                static void Main(string[] args)
                {
                    // write a single line
                    EventLog.LogEvents("ha ha", 3, "C:\\hahafile.txt", 4, true, false);
                    // this single line will not be written because the msgLevel is too high
                    EventLog.LogEvents("baba", 3, "C:\\babafile.txt", 2, true, false);
                    // the next 4 lines will be written in succession - no closing
                    EventLog.LogLine("blah blah", 1);
                    EventLog.LogLine("da da", 1);
                    EventLog.LogLine("ma ma", 1);
                    EventLog.LogLine("lah lah", 1);
                    EventLog.CloseLog(); // log will close
                    // now with specific functions
                    EventLog.LogSingleLine("one line", 1);
                    // this is just a test, the log is already closed
                    EventLog.CloseLog();
                }
            }

            public class EventLog
            {
                public static string logURI = Properties.Settings.Default.logURI;
                public static bool isOneLine = Properties.Settings.Default.oneOrMany;
                public static bool isOpen = false;
                public static int highestLevel = Properties.Settings.Default.highestLevel;
                public static StreamWriter sw;

                /// <summary>
                /// The program will "print" the msg into the log
                /// unless msgLevel is > msgLimit.
                /// oneOrMany is true when "once" - the program will open the log,
                /// print the msg and close the log. False when "many" - the program will
                /// keep the log open until close = true.
                /// Normally all the arguments will come from the app.config.
                /// Called by the overloads of LogLine.
                /// </summary>
                public static void LogEvents(string msg, int msgLevel, string logFileName, int msgLimit, bool oneOrMany, bool close)
                {
                    // to print or not to print
                    if (msgLevel <= msgLimit)
                    {
                        // open the file, from the argument (logFileName) or from the config (logURI)
                        if (!isOpen)
                        {
                            string logFile = logFileName;
                            if (logFileName == "")
                            {
                                logFile = logURI;
                            }
                            sw = new StreamWriter(logFile, true);
                            sw.WriteLine("Started At: " + DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss"));
                            isOpen = true;
                        }
                        // print
                        sw.WriteLine(msg);
                    }
                    // close when instructed
                    if (close || oneOrMany)
                    {
                        if (isOpen)
                        {
                            sw.WriteLine("Ended At: " + DateTime.Now.ToString("yyyy-MM-dd HH:mm:ss"));
                            sw.Close();
                            isOpen = false;
                        }
                    }
                }

                /// <summary>
                /// The simplest, just msg and level.
                /// </summary>
                public static void LogLine(string msg, int msgLevel)
                {
                    // use the given msg and msgLevel; all others are defaults
                    LogEvents(msg, msgLevel, "", highestLevel, isOneLine, false);
                }

                /// <summary>
                /// One line at a time - open, print, close.
                /// </summary>
                public static void LogSingleLine(string msg, int msgLevel)
                {
                    LogEvents(msg, msgLevel, "", highestLevel, true, true);
                }

                /// <summary>
                /// Used to close: high level, low limit, once and close are set.
                /// </summary>
                public static void CloseLog()
                {
                    LogEvents("", 15, "", 1, true, true);
                }
            }
        }

    That’s all folks!
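
    One thing the post leaves implicit is the “always report errors” rule. A tiny helper along the following lines could make it explicit; this is only a hedged sketch built on the EventLog class above, not part of the original code, and it assumes that level 1 remains the “errors and timing” level.

        using System;

        namespace statics
        {
            // Hypothetical helper, not in the original post: logs an exception as an error line.
            // Because it logs at level 1, the entry is written whenever the .config
            // highestLevel is at least 1, i.e. even with a terse production setting.
            public static class EventLogErrors
            {
                public static void LogError(string message, Exception ex)
                {
                    // open-print-close so the entry is flushed even if the process dies right after
                    EventLog.LogSingleLine("ERROR: " + message + " - " + ex.Message, 1);
                }
            }
        }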


  • CodePlex Daily Summary for Sunday, January 09, 2011

    CodePlex Daily Summary for Sunday, January 09, 2011Popular ReleasesEnhSim: EnhSim 2.2.10 BETA: 2.2.10 BETAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Removed the part...TweetSharp: TweetSharp v2.0.0.0 - Preview 7: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Note: This code is currently preview quality. Preview 7 ChangesFixes the regression issue in OAuth from Preview 6 Preview 6 ChangesMaintenance release with user reported fixes Preview 5 ChangesMaintenance release with user reported fixes Third Party Library VersionsHammock v1.0.6: http://hammock.codeplex.com Json.NET 3.5 Release 8: http://json.codeplex.comExtended WPF Toolkit: Extended WPF Toolkit - 1.3.0: What's in the 1.3.0 Release?BusyIndicator ButtonSpinner ChildWindow ColorPicker - Updated (Breaking Changes) DateTimeUpDown - New Control Magnifier - New Control MaskedTextBox - New Control MessageBox NumericUpDown RichTextBox RichTextBoxFormatBar - Updated .NET 3.5 binaries and SourcePlease note: The Extended WPF Toolkit 3.5 is dependent on .NET Framework 3.5 and the WPFToolkit. You must install .NET Framework 3.5 and the WPFToolkit in order to use any features in the To...sNPCedit: sNPCedit v0.9d: added elementclient coordinate catcher to catch coordinates select a target (ingame) i.e. your char, npc or monster than click the button and coordinates+direction will be transfered to the selected row in the table corrected labels from Rot to Direction (because it is a vector)AutoLoL: AutoLoL v1.5.2: Implemented the Auto Updater Fix: Your settings will no longer be cleared with new releases of AutoLoL The mastery Editor and Browser now have their own tabs instead of nested tabs The Browser tab will only show the masteries matching ALL filters instead of just one Added a 'Browse' button in the Mastery Editor tab to open the Masteries Directory The Browser tab now shows a message when there are no mastery files in the Masteries Directory Fix: Fixed the Save As dialog again, for ...Ionics Isapi Rewrite Filter: 2.1 latest stable: V2.1 is stable, and is in maintenance mode. This is v2.1.1.25. It is a bug-fix release. There are no new features. 28629 29172 28722 27626 28074 29164 27659 27900 many documentation updates and fixes proper x64 build environment. This release includes x64 binaries in zip form, but no x64 MSI file. You'll have to manually install x64 servers, following the instructions in the documentation.StyleCop for ReSharper: StyleCop for ReSharper 5.1.14980.000: A considerable amount of work has gone into this release: Huge focus on performance around the violation scanning subsystem: - caching added to reduce IO operations around reading and merging of settings files - caching added to reduce creation of expensive objects Users should notice condsiderable perf boost and a decrease in memory usage. 
Bug Fixes: - StyleCop's new ObjectBasedEnvironment object does not resolve the StyleCop installation path, thus it does not return the correct path ...VivoSocial: VivoSocial 7.4.1: New release with bug fixes and updates for performance.SSH.NET Library: 2011.1.6: Fixes CommandTimeout default value is fixed to infinite. Port Forwarding feature improvements Memory leaks fixes New Features Add ErrorOccurred event to handle errors that occurred on different thread New and improve SFTP features SftpFile now has more attributes and some operations Most standard operations now available Allow specify encoding for command execution KeyboardInteractiveConnectionInfo class added for "keyboard-interactive" authentication. Add ability to specify bo....NET Extensions - Extension Methods Library for C# and VB.NET: Release 2011.03: Added lot's of new extensions and new projects for MVC and Entity Framework. object.FindTypeByRecursion Int32.InRange String.RemoveAllSpecialCharacters String.IsEmptyOrWhiteSpace String.IsNotEmptyOrWhiteSpace String.IfEmptyOrWhiteSpace String.ToUpperFirstLetter String.GetBytes String.ToTitleCase String.ToPlural DateTime.GetDaysInYear DateTime.GetPeriodOfDay IEnumberable.RemoveAll IEnumberable.Distinct ICollection.RemoveAll IList.Join IList.Match IList.Cast Array.IsNullOrEmpty Array.W...EFMVC - ASP.NET MVC 3 and EF Code First: EFMVC 0.5- ASP.NET MVC 3 and EF Code First: Demo web app ASP.NET MVC 3, Razor and EF Code FirstVidCoder: 0.8.0: Added x64 version. Made the audio output preview more detailed and accurate. If the chosen encoder or mixdown is incompatible with the source, the fallback that will be used is displayed. Added "Auto" to the audio mixdown choices. Reworked non-anamorphic size calculation to work better with non-standard pixel aspect ratios and cropping. Reworked Custom anamorphic to be more intuitive and allow display width to be set automatically (Thanks, Statick). Allowing higher bitrates for 6-ch....NET Voice Recorder: Auto-Tune Release: This is the source code and binaries to accompany the article on the Coding 4 Fun website. It is the Auto Tuner release of the .NET Voice Recorder application.BloodSim: BloodSim - 1.3.2.0: - Simulation Log is now automatically disabled and hidden when running 10 or more iterations - Hit and Expertise are now entered by Rating, and include option for a Racial Expertise bonus - Added option for boss to use a periodic magic ability (Dragon Breath) - Added option for boss to periodically Enrage, gaining a Damage/Attack Speed buffAllNewsManager.NET: AllNewsManager.NET 1.2.1: AllNewsManager.NET 1.2.1 It is a minor update from version 1.2xUnit.net - Unit Testing for .NET: xUnit.net 1.7 Beta: xUnit.net release 1.7 betaBuild #1533 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. 
This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision)...Json.NET: Json.NET 4.0 Release 1: New feature - Added Windows Phone 7 project New feature - Added dynamic support to LINQ to JSON New feature - Added dynamic support to serializer New feature - Added INotifyCollectionChanged to JContainer in .NET 4 build New feature - Added ReadAsDateTimeOffset to JsonReader New feature - Added ReadAsDecimal to JsonReader New feature - Added covariance to IJEnumerable type parameter New feature - Added XmlSerializer style Specified property support New feature - Added ...ASP .NET MVC CMS (Content Management System): Atomic CMS 2.1.2: Atomic CMS 2.1.2 release notes Atomic CMS installation guide N2 CMS: 2.1: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) All-round improvements and bugfixes File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the template packs (above) and open the proj...Mobile Device Detection and Redirection: 0.1.11.10: IMPORTANT CHANGESThis release changes the way some WURFL capabilities and attributes are exposed to .NET developers. If you cast MobileCapabilities to return some values then please read the Release Note before implementing this release. The following code snippet can be used to access any WURFL capability. For instance, if the device is a tablet: string capability = Request.Browser["is_tablet"]; SummaryNew attributes have been added to the redirect section: originalUrlAsQueryString If se...New Projects[OOBL] Projekt: Projekt iz kolegija Objektno Oblikovanje na Fakultetu Elektrotehnike i Racunarstva u Zagrebu.Aikido Glossary Reader: The Aikido Glossary Reader application makes it easier for Aikido students to search for Japanese terms used in martial arts training.AsyncFunc: AsyncFunc makes it easy to implement Event-based Asynchronous Pattern in .NET.BeijingAgricultureScience: ???????????BizTalk 5010 999 Generation: Create 5010 999 acknowledgement from the default 997 Generated by BizTalk ServerCheck if Knowledge Base fix is installed script: A handy script that checks if a knowledge base fix is installed or not.Coproject - rich project management: Coproject is a sample Silverlight application built on WCF RIA Services and Caliburn.Micro framework. It should demonstrate typical scenarios in business applications.DeskNote: DeskNote, makes taking and displaying a notes a lot easier. Have your notes on your desktop, write your notes in a text file and it will be automatically displayed on the desktop. DriveBackup: Are you tired of losing data? And do you sometimes forget to backup your data? Then this program is right for you, one-time setup with each new removable disk and after that Drive Backup instantly backs up your data in the background, without you pushing a button.ELSEngine XNA Game OS Engine: XNA Game Operating System Engine, designed to handle and make easy to use Interface, Interface assets, game screens, menus, controls, and more. Designed to be adaptable to XBOX or Windows, and to ANY style game.FermaProject: Team Ferma ProjectHalfNetworkNET: HalfNetworkNET makes it easier for <target user group> to <activity>. You'll no longer have to <activity>. 
It's developed in C++/CLILFS Record: LFS Record is a camera animation and video recording tool for Live for Speed users. It doesn't replace programs like Fraps 100%, but in most cases you'll find LFS Record allows a lot more freedom and control. It's developed in C# using WPF .net 4.MPC HC Web remote: A Web interface for Media Player Classic Home Cinema. Designed especialy for smart phones. Tested on Nokia 5800, but in theory it works on all smart phones with wifi & a browser.MSMS2CMP: Very simple application to swap particular strings (MSMS and CMPD number) in .mgf files (results from Mascot search engine used to identify proteins by mass spectrometry data). This might be a kind of example of using Regex and Text File I/OMy Now Playing to Twitter: A Simple WinForm Program that can get what user is playing in Winamp and send a status to Twitter. It's wrote in VB.Net but some source code is port of MiniTwitter(http://minitwitter.codeplex.com)and uses tagLib-sharp library(https://github.com/mono/taglib-sharp)RPM Header for .NET: RPM Header for .NET allows a developer to access the headers of RPM Package Manager files. It's developed in C#.SilverMenu: SilverMenu brings some of the menu functionality of Silverlight for Windows Phone 7 to the XNA world.Smartgrid Smartmeter GRYD: The GRYD smartgrid and Smartmeter project shows features and capabilities like Demand Response, Smart Home Appliance Load Curtailments, DMS integration with SCADA etc. GRYD, Gryd.org is to teach you the technology, this is only for personal use and not commercial use.WakeMeQt: WakeMeQt is a smart alarm for the Nokia n810 (and probably n800) internet tablets. Using a Nintendo Wii remote as the sensor it monitors your movements during sleep and chooses the optimal time to wake you. It's developed in C++ using the Qt framework.WPF SplitButton & MenuButton: This WPF SplitButton and MenuButton implementation aims to be more robust and visual attractive than the other WPF split buttons available on CodePlex and elsewhere. The code is based on David Anson's implementation made available on his blog.Y2XMas: Merry Chirstmas - Happy New Year 2011 software


  • CodePlex Daily Summary for Tuesday, October 08, 2013

    CodePlex Daily Summary for Tuesday, October 08, 2013Popular ReleasesTimeWatcher: Time Watcher Beta Stable Release: First release. For x86 & x64 computers. Tested on Windows 7 & Windows 8. The alarm functionality is not yet implemented !!Keepass2Android: 0.9 preview: Support for Dropbox (read/write/sync) Integrated custom file browserLayered Architecture Sample for Azure: Leave Sample - October 2013 (for Azure): Thank You for downloading Layered Architecture Sample. It is important that you read the accompanying README.txt file for setup and installation instructions. Please download and install the Windows Azure SDK before opening these samples. This is a set of revised samples that illustrates how the layered architecture pattern can be applied to web applications that are deployed to Windows Azure. This version is only supported on Visual Studio 2012. This set contains: LeaveSample-WebRoles - AS...NuGet: NuGet 2.7.1: Released October 07, 2013. Release notes: http://docs.nuget.org/docs/release-notes/nuget-2.7.1 Important note: After downloading the signed build of NuGet.exe, if you perform an update using the "nuget.exe update -self" command, it will revert back to the unsigned build.DNNParallax: DNNParallax_01.00.03_Install: This software package includes the DNNParallax skin and the DNNParallax container extensions. Version 01.00.03 Added Parallax JS Script Version 01.00.02 Using local scripts only Version 01.00.01 Fixed a minor packaging issueMugen MVVM Toolkit: Mugen MVVM Toolkit 2.0: IntroductionMugen MVVM Toolkit makes it easier to develop Silverlight, WPF, WinRT and WP applications using the Model-View-ViewModel design pattern. The purpose of the toolkit is to provide a simple framework and set of tools for getting up to speed quickly with applications based on the MVVM design pattern. The core of Toolkit contains a navigation system, windows management system, models, validation, etc. Mugen MVVM Toolkit contains all the MVVM classes such as ViewModelBase, RelayCommand,...Office Ribbon Project (under active development): Ribbon (07. Oct. 2013): Fixed Scrollbar Bug if DropDown Button is bigger than screen Added Office 2013 Theme Fixed closing the Ribbon caused a null reference exception in the RibbonButton.Dispose if the DropDown was not created yet Fixed Memory leak fix (unhooked events after Dispose) Fixed ToolStrip Selected Text 2013 and 2007 for Blue and Standard themesDynamics NAV Application Profiler: Dynamics NAV Application Profiler: Dynamics NAV Application ProfilerGhostscript.NET: Ghostscript.NET v.1.1.0.: v.1.1.0. added GhostscriptViewer state handling (SaveState, RestoreState) GhostscriptRasterizer constructor is extended in order to support usage of the existing GhostscriptViewer instance. fixed problem while using a 32-bit assembly with 32-bit version of Ghostscript on 64-bit Windows: It couldn't find a registry key of installed Ghostscript. Reported and fixed by "r0land". v.1.0.9. implemented EPS (Encapsulated PostScript) support for the GhostscriptViewer. added GhostscriptRasterize...Ghostscript Studio: Ghostscript.Studio v.1.0.2: Ghostscript Studio is easy to use Ghostscript IDE, a tool that facilitates the use of the Ghostscript interpreter by providing you with a graphical interface for postscript editing and file conversions. Ghostscript Studio allows you to preview postscript files, edit the code and execute them in order to convert PDF documents and other formats. 
The program allows you to convert between PDF, Postscript, EPS, TIFF, JPG and PNG by using the Ghostscript.NET Processor. v.1.0.2. added custom -c s...cmdradio: v0.1.1 binary: Default download in win32. For other OS see here. This is alpha version. Please report all bugs.Squiggle - A free open source LAN Messenger: Squiggle 3.3 Alpha: Allow using environment variables in configuration file (history db connection string, download folder location, display name, group and message) Fix for history viewer to show the correct history entries History saved with UTC timestamp This is alpha release and not recommended for use in productionMedia Companion: Media Companion MC3.579b: Fixed IMDB actor scraping for Movies and TV. Note: there are a couple of new functions that are not active, as this release needed to be done due to IMDB change. New* TV - context menu for rescrapeing Poster image/Banner image Fixed* Both - Fixed actor scraping from IMDB * Movie - Fixed Tableview if Movie's movieset was not in MC's list of moviesets * Movie - Rename with mediainfo now lowercase. * Both - added ignore "A " in titles. Separate option in General Preferences. * Tv - Changing ...VidCoder: 1.5.7 Beta: Updated HandBrake core to SVN 4819. About dialog now pulls down HandBrake version from the DLL. Added a confirmation dialog to Stop if the encode has been going on for more than 5 minutes. Fixed handling of unicode characters for input and output filenames. We now encode to UTF-8 before passing to HandBrake. Fixed a crash in the queue multiple titles dialog. Added code to rescue tool windows which get placed outside of the visible screen area.Wsus Package Publisher: Release v1.3.1310.05: Enhance the "Reboot Remote Computers", by adding a timer before the reboot occure. So that remote users can save their documents and close applications. You can also add a message to be display. In 'Tools'->'Settings'-> Misc Tab, you can set a default message. Enhance the "Compare Computers against AD", by choosing OUs to include in the comparison.Pulse: Pulse 0.6.7.3: Pulse is now accepting donations. To donate by Bitcoin or PayPal see https://pulse.codeplex.com/wikipage?title=Donations Lots of updates in v0.6.7.3: (Feature) New option allows you to disable wallpaper changing when a full screen application is running. This way Pulse doesn't slow down/lag your videos and games :) (Fix) Some users were getting Wallbase errors when logging in. This has been fixed. (Feature) Right click a provider and you can now make a copy of it by selecting the "Dupl...MoreTerra (Terraria World Viewer): MoreTerra 1.11.1: Release 1.11.1 =========== =Bug Fixes= =========== Added more tile blocks (Clouds, crimstone) Added items (binoculars, rope, Pirahna Gun) Added ores (Lead, Tin) Chests now work, I broke them yesterday. =============== =Known Issues= =============== I am having trouble with new background walls. So you will see a red outline for crimson then a pink inside. 
Same with where I think the queen bee lives.VG-Ripper & PG-Ripper: PG-Ripper 1.4.19: NEW: Added Option to login as Guest NEW: Added Menu Option to delete an Forum Account NEW: Added Support for "ImageTeam.org links FIXED: Fixed Ripping of http://forum.babeunion.com ForumsSimpleExcelReportMaker: Serm 0.03: SourceCode and Sample .Net Framework 3.5 AnyCPU compile.Application Architecture Guidelines: App Architecture Guidelines 3.0.8: This document is an overview of software qualities, principles, patterns, practices, tools and libraries.New ProjectsAutomating windows Azure SQL DB Backup using Worker role: This tool is used for backup functionality on SQL Azure database and tables in a periodical timeline.Billard Management System: Billard Management SystemCRM 2013 Duplicate Detection: Add client side duplicate detection back into Dynamics CRM 2013Demo2: Demo2Enough HttpClientExtensions: Enough HttpClientCachingExtensions allows you to store the response of a HttpClient call if caching is allowed.GameEngine: AInvisibleShortcuts: InvisibleShortcuts, create shortcuts to launch applications, visit folders and websites, manipulate files, execute system command lines and do anything!jean108jabbr: 1MyProject2: This is projectSeries Manager: With this tool it is easy to name all your series with the right season, episode and the title of the specific episode. Meta information is provided by the TVDBsport: sportolTechResearch: Technical researchTestProjectTodor: A test project for playing with TFS


  • Real tortoises keep it slow and steady. How about the backups?

    - by Maria Zakourdaev
    … Four tortoises were playing in the backyard when they decided they needed hibiscus flower snacks. They pooled their money and sent the smallest tortoise out to fetch the snacks. Two days passed and there was no sign of the tortoise. “You know, she is taking a lot of time”, said one of the tortoises. A little voice from just outside the fence said, “If you are going to talk that way about me I won’t go.”

    Is it too much to ask of a quite expensive 3rd party backup tool that it be way faster than the SQL Server native backup? Or at least that it save a respectable amount of storage by producing really smaller backup files? By saying “really smaller”, I mean at least getting a file half the size. After Googling the internet in an attempt to understand what other “SQL people” are using for database backups, I see that most people are using one of three tools, which are the main players in the SQL backup area:

    LiteSpeed by Quest
    SQL Backup by Red Gate
    SQL Safe by Idera

    The feedback about those tools is truly emotional and happy. However, while reading the forums and blogs I have wondered whether many are simply accustomed to using the above tools since SQL 2000 and 2005. This can easily be understood: a 300GB database backup, for instance, using a regular SQL 2005 backup statement would have run for about 3 hours and produced a ~150GB file (depending on the content, of course). Then you take a 3rd party tool which performs the same backup in 30 minutes, resulting in a 30GB file and leaving you speechless; you run to management to persuade them to buy it because it is definitely worth the price. In addition to the increased speed and disk space savings you would also get backup file encryption and virtual restore – features that are still missing from the SQL Server. But if you, like me, don’t need these additional features and only want a tool that performs a full backup MUCH faster AND produces a far smaller backup file (like the gain you observed back in the SQL 2005 days), you will be quite disappointed. SQL Server’s backup compression feature has totally changed the market picture.

    Medium size database. Check out how my SQL Server 2008 R2 compares to the other tools when backing up a 300GB database. It appears that, when talking about backup speed, SQL 2008 R2 compresses and performs the backup in similar overall times to all three other tools. The 3rd party tools’ maximum compression levels take twice as long. The backup file gain is not that impressive, except at the highest compression levels, but the price you pay is very high CPU load and much longer run time. Only SQL Safe by Idera was quite fast with its maximum compression level, but for most of the run time it used 95% CPU on the server. Note that I have used two types of destination storage, SATA 11 disks and FC 53 disks, and, obviously, on the faster storage I got my backup ready in half the time. Looking at the above results, should we spend money and bother with another layer of complexity and a software middle-man for medium sized databases? I’m definitely not going to do so.

    Very large database. As the next phase of this benchmark, I moved to a 6 terabyte database, which was actually my main backup target. Note how using multiple files enables the SQL Server backup operation to use parallel I/O and remarkably increases its speed, especially when the backup device is heavily striped.
    SQL Server supports a maximum of 64 backup devices for a single backup operation, but the most speed is gained when using one file per CPU – in the case above, 8 files for a 2 Quad CPU server. The impact of additional files is minimal. However, SQLsafe doesn’t show any speed improvement between 4 files and 8 files. Of course, with such huge databases every half percent of compression translates into noticeable numbers. Saving almost 470GB of space may turn the backup tool into quite a valuable purchase. Still, the backup speed and the high CPU load are variables that should be taken into consideration. As for us, the backup speed is more critical than the storage, and we cannot allow a production server to sustain 95% CPU for such a long time. Bottom line, 3rd party backup tool developers, we are waiting for some breakthrough release. There are a few unanswered questions, like the restore speed comparison between the different tools and the impact of multiple backup files on the restore operation. Stay tuned for the next benchmarks.

    Benchmark server: SQL Server 2008 R2 SP1, 2 Quad CPU
    Database location: NetApp FC 15K aggregate, 53 discs

    Backup statements: No matter how good the UI is, we need to run the backup tasks from inside SQL Server Agent to make sure they are covered by our monitoring systems. I have used the extended stored procedures (command line execution is also an option; I haven’t noticed any impact on the backup performance).

    SQL backup:

        backup database <DBNAME>
        to disk= '\\<networkpath>\par1.bak',
           disk= '\\<networkpath>\par2.bak',
           disk= '\\<networkpath>\par3.bak'
        with format, compression

    LiteSpeed:

        EXECUTE master.dbo.xp_backup_database
            @database = N'<DBName>',
            @backupname = N'<DBName> full backup',
            @desc = N'Test',
            @compressionlevel = 8,
            @filename = N'\\<networkpath>\par1.bak',
            @filename = N'\\<networkpath>\par2.bak',
            @filename = N'\\<networkpath>\par3.bak',
            @init = 1

    SQL Backup:

        EXECUTE master.dbo.sqlbackup
            '-SQL "BACKUP DATABASE <DBNAME>
            TO DISK= ''\\<networkpath>\par1.sqb'',
               DISK= ''\\<networkpath>\par2.sqb'',
               DISK= ''\\<networkpath>\par3.sqb''
            WITH DISKRETRYINTERVAL = 30, DISKRETRYCOUNT = 10, COMPRESSION = 4, INIT"'

    SQL safe:

        EXECUTE master.dbo.xp_ss_backup
            @database = 'UCMSDB',
            @filename = '\\<networkpath>\par1.bak',
            @backuptype = 'Full',
            @compressionlevel = 4,
            @backupfile = '\\<networkpath>\par2.bak',
            @backupfile = '\\<networkpath>\par3.bak'

    If you still insist on using 3rd party tools for the backups in your production environment at maximum compression level, you will definitely need to consider limiting CPU usage, which will increase the backup operation time even more:

    RedGate: use the THREADPRIORITY option (values 0 – 6)
    LiteSpeed: use @throttle (percentage, like 70%)
    SQL safe: the only thing I have found was the @Threads option.

    Yours, Maria
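
    As a concrete illustration of the “one file per CPU” point above, a native striped, compressed backup for an 8-core server might look like the sketch below. This is not one of the benchmark statements from the post; the database name and UNC paths are the same placeholders used above.

        -- Hedged sketch: native backup striped across 8 files (one per CPU), with compression.
        -- <DBNAME> and \\<networkpath> are placeholders, as in the statements above.
        BACKUP DATABASE <DBNAME>
        TO  DISK = '\\<networkpath>\par1.bak',
            DISK = '\\<networkpath>\par2.bak',
            DISK = '\\<networkpath>\par3.bak',
            DISK = '\\<networkpath>\par4.bak',
            DISK = '\\<networkpath>\par5.bak',
            DISK = '\\<networkpath>\par6.bak',
            DISK = '\\<networkpath>\par7.bak',
            DISK = '\\<networkpath>\par8.bak'
        WITH FORMAT, COMPRESSION, STATS = 5;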


  • CodePlex Daily Summary for Monday, May 26, 2014

    CodePlex Daily Summary for Monday, May 26, 2014Popular ReleasesClosedXML - The easy way to OpenXML: ClosedXML 0.71.1: More performance improvements. It's faster and consumes less memory.Role Based Views in Microsoft Dynamics CRM 2011: Role Based Views in CRM 2011 and 2013 - 1.1.0.0: Issues fixed in this build: 1. Works for CRM 2013 2. Lookup view not getting blockedSimCityPak: SimCityPak 0.3.1.0: Main New Features: Fixed Importing of Instance Names (get rid of the Dutch translations) Added advanced editor for Decal Dictionaries Added possibility to import .PNG to generate new decals Added advanced editor for Path display entriesSimple Connect To Db: SimpleConnectToDb_v1: SimpleConnectToDb_v1CRM 2011 / CRM 2013 Form Helper: v2014.05.25: v2014.05.25 Added PhoneFormat & PhoneFormatAreaCode v2014.05.24 Initial ReleaseCreate Word documents without MS Word: Release 3.0: Add support for Sections, Sections Headers and Footers and right to left languages.Corporate News App for SharePoint 2013: CorporateNewsApp v1.6.2.0: Important note This version contains a major bug fix about the generic error "Request failed. Unexpected response data from server null" This error occurs on SharePoint Online only, following an update of the Javascript API after May 2014. If you have installed this application manually in your applications company catalog, you can download the CorporateNewsApp.app file in the zip archive and update it manually. If you have installed this application directly from the SharePoint Store, it ...DevOS: DevOS: Plugin-system added Including:DevOS.exe DevOS API.dll Files must be in the some folderTiny Deduplicator: Tiny Deduplicator 1.0.1.0: Increased version number to 1.0.1.0 Moved all options to a separate 'Options' dialog window. Allows the user to specify a selection strategy which will help when dealing with large numbers of duplicate files. Available options are "None," "Keep First," and "Keep Last"C64 Studio: 3.5: Add: BASIC renumber function Add: !PET pseudo op Add: elseif for !if, } else { pseudo op Add: !TRACE pseudo op Add: Watches are saved/restored with a solution Add: Ctrl-A works now in export assembly controls Add: Preliminary graphic import dialog (not fully functional yet) Add: range and block selection in sprite/charset editor (Shift-Click = range, Alt-Click = block) Fix: Expression evaluator could miscalculate when both division and multiplication were in an expression without parenthesisSEToolbox: SEToolbox 01.031.009 Release 1: Added mirroring of ConveyorTubeCurved. Updated Ship cube rotation to rotate ship back to original location (cubes are reoriented but ship appears no different to outsider), and to rotate Grouped items. Repair now fixes the loss of Grouped controls due to changes in Space Engineers 01.030. Added export asteroids. Rejoin ships will merge grouping and conveyor systems (even though broken ships currently only maintain the Grouping on one part of the ship). Installation of this version wi...Player Framework by Microsoft: Player Framework for Windows and WP v2.0: Support for new Universal and Windows Phone 8.1 projects for both Xaml and JavaScript projects. 
See a detailed list of improvements, breaking changes and a general overview of version 2 ADDITIONAL DOWNLOADSSmooth Streaming Client SDK for Windows 8 Applications Smooth Streaming Client SDK for Windows 8.1 Applications Smooth Streaming Client SDK for Windows Phone 8.1 Applications Microsoft PlayReady Client SDK for Windows 8 Applications Microsoft PlayReady Client SDK for Windows 8.1 Applicat...TerraMap (Terraria World Map Viewer): TerraMap 1.0.6: Added support for the new Terraria v1.2.4 update. New items, walls, and tiles Added the ability to select multiple highlighted block types. Added a dynamic, interactive highlight opacity slider, making it easier to find highlighted tiles with dark colors (and fixed blurriness from 1.0.5 alpha). Added ability to find Enchanted Swords (in the stone) and Water Bolt books Fixed Issue 35206: Hightlight/Find doesn't work for Demon Altars Fixed finding Demon Hearts/Shadow Orbs Fixed inst...DotNet.Highcharts: DotNet.Highcharts 4.0 with Examples: DotNet.Highcharts 4.0 Tested and adapted to the latest version of Highcharts 4.0.1 Added new chart type: Heatmap Added new type PointPlacement which represents enumeration or number for the padding of the X axis. Changed target framework from .NET Framework 4 to .NET Framework 4.5. Closed issues: 974: Add 'overflow' property to PlotOptionsColumnDataLabels class 997: Split container from JS 1006: Series/Categories with numeric names don't render DotNet.Highcharts.Samples Updated s...ConEmu - Windows console with tabs: ConEmu 140523 [Alpha]: ConEmu - developer build x86 and x64 versions. Written in C++, no additional packages required. Run "ConEmu.exe" or "ConEmu64.exe". Some useful information you may found: http://superuser.com/questions/tagged/conemu http://code.google.com/p/conemu-maximus5/wiki/ConEmuFAQ http://code.google.com/p/conemu-maximus5/wiki/TableOfContents If you want to use ConEmu in portable mode, just create empty "ConEmu.xml" file near to "ConEmu.exe" Aspose for Apache POI: Missing Features of Apache POI SL - v 1.1: Release contain the Missing Features in Apache POI SL SDK in Comparison with Aspose.Slides for dealing with Microsoft Power Point. What's New ?Following Examples: Managing Slide Transitions Manage Smart Art Adding Media Player Adding Audio Frame to Slide Feedback and Suggestions Many more examples are yet to come here. Keep visiting us. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.PowerShell App Deployment Toolkit: PowerShell App Deployment Toolkit v3.1.3: Added CompressLogs option to the config file. Each Install / Uninstall creates a timestamped zip file with all MSI and PSAppDeployToolkit logs contained within Added variable expansion to all paths in the configuration file Added documentation for each of the Toolkit internal variables that can be used Changed Install-MSUpdates to continue if any errors are encountered when installing updates Implement /Force parameter on Update-GroupPolicy (ensure that any logoff message is ignored) ...WordMat: WordMat v. 1.07: A quick fix because scientific notation was broken in v. 1.06 read more at http://wordmat.blogspot.com????: 《????》: 《????》(c???)??“????”???????,???????????????C?????????。???????,???????????????????????. ??????????????????????????????????;????????????????????????????。Mini SQL Query: Mini SQL Query (1.0.72.457): Apologies for the previous update! 
FK issue fixed and also a template data cache issue.New ProjectsASP.Net MCV4 Simplified Code Samples: This project intended to simplify the same. In this project each task is implemented with minimum lines of code to reduces complicity.Calvin: net???CodeLatino by Latinosoft: A Modified version for codeShow -- Probably taking more than a month.freeasyBackup: A free and easy to use Backup Tool for everyone. Without any cloud restrictions. freeasyExplorer: A free and easy to use File Explorer for everyone.openPDFspeedreader: #spritz #pdfreader #speedreader PDF Editor to Edit PDF Files in your ASP.NET Applications: This sample application allows the users to edit PDF files online using Aspose.Pdf for .NET.SharePoint World Cup 2013: world cup 2014SSAS Long Running Query Performance Helper: This utility helps investigate long running multidimensional or mining queries in discovery, de-parameterization and re-parameterization back to source format.

    Read the article

  • General Purpose ASP.NET Data Source Control

    - by Ricardo Peres
    OK, you already know about the ObjectDataSource control, so what’s wrong with it? Well, for once, it doesn’t pass any context to the SelectMethod, you only get the parameters supplied on the SelectParameters plus the desired ordering, starting page and maximum number of rows to display. Also, you must have two separate methods, one for actually retrieving the data, and the other for getting the total number of records (SelectCountMethod). Finally, you don’t get a chance to alter the supplied data before you bind it to the target control. I wanted something simple to use, and more similar to ASP.NET 4.5, where you can have the select method on the page itself, so I came up with CustomDataSource. Here’s how to use it (I chose a GridView, but it works equally well with any regular data-bound control): 1: <web:CustomDataSourceControl runat="server" ID="datasource" PageSize="10" OnData="OnData" /> 2: <asp:GridView runat="server" ID="grid" DataSourceID="datasource" DataKeyNames="Id" PageSize="10" AllowPaging="true" AllowSorting="true" /> The OnData event handler receives a DataEventArgs instance, which contains some properties that describe the desired paging location and size, and it’s where you return the data plus the total record count. Here’s a quick example: 1: protected void OnData(object sender, DataEventArgs e) 2: { 3: //just return some data 4: var data = Enumerable.Range(e.StartRowIndex, e.PageSize).Select(x => new { Id = x, Value = x.ToString(), IsPair = ((x % 2) == 0) }); 5: e.Data = data; 6: //the total number of records 7: e.TotalRowCount = 100; 8: } Here’s the code for the DataEventArgs: 1: [Serializable] 2: public class DataEventArgs : EventArgs 3: { 4: public DataEventArgs(Int32 pageSize, Int32 startRowIndex, String sortExpression, IOrderedDictionary parameters) 5: { 6: this.PageSize = pageSize; 7: this.StartRowIndex = startRowIndex; 8: this.SortExpression = sortExpression; 9: this.Parameters = parameters; 10: } 11:  12: public IEnumerable Data 13: { 14: get; 15: set; 16: } 17:  18: public IOrderedDictionary Parameters 19: { 20: get; 21: private set; 22: } 23:  24: public String SortExpression 25: { 26: get; 27: private set; 28: } 29:  30: public Int32 StartRowIndex 31: { 32: get; 33: private set; 34: } 35:  36: public Int32 PageSize 37: { 38: get; 39: private set; 40: } 41:  42: public Int32 TotalRowCount 43: { 44: get; 45: set; 46: } 47: } As you can guess, the StartRowIndex and PageSize receive the starting row and the desired page size, where the page size comes from the PageSize property on the markup. There’s also a SortExpression, which gets passed the sorted-by column and direction (if descending) and a dictionary containing all the values coming from the SelectParameters collection, if any. All of these are read only, and it is your responsibility to fill in the Data and TotalRowCount. 
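To make that concrete, here is a slightly fuller OnData handler than the Enumerable.Range one above. It is only a sketch: the Product class and the in-memory AllProducts list are hypothetical stand-ins for whatever repository you actually query, but it shows the SortExpression, StartRowIndex and PageSize being honored before Data and TotalRowCount are filled in.

    // A sketch only: Product and AllProducts are hypothetical stand-ins for your real data source.
    // Requires using System, System.Collections.Generic and System.Linq.
    public class Product
    {
        public int Id { get; set; }
        public string Name { get; set; }
        public decimal Price { get; set; }
    }

    private static readonly List<Product> AllProducts = Enumerable.Range(1, 100)
        .Select(x => new Product { Id = x, Name = "Product " + x, Price = x * 1.5m })
        .ToList();

    protected void OnData(object sender, DataEventArgs e)
    {
        IEnumerable<Product> query = AllProducts;

        // SortExpression arrives as "Name" or "Name DESC", depending on the column header clicked.
        if (!String.IsNullOrEmpty(e.SortExpression))
        {
            bool descending = e.SortExpression.EndsWith(" DESC", StringComparison.OrdinalIgnoreCase);
            string column = descending ? e.SortExpression.Substring(0, e.SortExpression.Length - 5) : e.SortExpression;

            Func<Product, object> key;
            if (column == "Price") { key = p => p.Price; }
            else if (column == "Id") { key = p => p.Id; }
            else { key = p => p.Name; }

            query = descending ? query.OrderByDescending(key) : query.OrderBy(key);
        }

        // Page after sorting, using the values supplied by the data source view.
        e.Data = query.Skip(e.StartRowIndex).Take(e.PageSize).ToList();
        e.TotalRowCount = AllProducts.Count;
    }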
The code for the CustomDataSource is very simple: 1: [NonVisualControl] 2: public class CustomDataSourceControl : DataSourceControl 3: { 4: public CustomDataSourceControl() 5: { 6: this.SelectParameters = new ParameterCollection(); 7: } 8:  9: protected override DataSourceView GetView(String viewName) 10: { 11: return (new CustomDataSourceView(this, viewName)); 12: } 13:  14: internal void GetData(DataEventArgs args) 15: { 16: this.OnData(args); 17: } 18:  19: protected virtual void OnData(DataEventArgs args) 20: { 21: EventHandler<DataEventArgs> data = this.Data; 22:  23: if (data != null) 24: { 25: data(this, args); 26: } 27: } 28:  29: [Browsable(false)] 30: [DesignerSerializationVisibility(DesignerSerializationVisibility.Visible)] 31: [PersistenceMode(PersistenceMode.InnerProperty)] 32: public ParameterCollection SelectParameters 33: { 34: get; 35: private set; 36: } 37:  38: public event EventHandler<DataEventArgs> Data; 39:  40: public Int32 PageSize 41: { 42: get; 43: set; 44: } 45: } Also, the code for the accompanying internal – as there is no need to use it from outside of its declaring assembly - data source view: 1: sealed class CustomDataSourceView : DataSourceView 2: { 3: private readonly CustomDataSourceControl dataSourceControl = null; 4:  5: public CustomDataSourceView(CustomDataSourceControl dataSourceControl, String viewName) : base(dataSourceControl, viewName) 6: { 7: this.dataSourceControl = dataSourceControl; 8: } 9:  10: public override Boolean CanPage 11: { 12: get 13: { 14: return (true); 15: } 16: } 17:  18: public override Boolean CanRetrieveTotalRowCount 19: { 20: get 21: { 22: return (true); 23: } 24: } 25:  26: public override Boolean CanSort 27: { 28: get 29: { 30: return (true); 31: } 32: } 33:  34: protected override IEnumerable ExecuteSelect(DataSourceSelectArguments arguments) 35: { 36: IOrderedDictionary parameters = this.dataSourceControl.SelectParameters.GetValues(HttpContext.Current, this.dataSourceControl); 37: DataEventArgs args = new DataEventArgs(this.dataSourceControl.PageSize, arguments.StartRowIndex, arguments.SortExpression, parameters); 38:  39: this.dataSourceControl.GetData(args); 40:  41: arguments.TotalRowCount = args.TotalRowCount; 42: arguments.MaximumRows = this.dataSourceControl.PageSize; 43: arguments.AddSupportedCapabilities(DataSourceCapabilities.Page | DataSourceCapabilities.Sort | DataSourceCapabilities.RetrieveTotalRowCount); 44: arguments.RetrieveTotalRowCount = true; 45:  46: if (!(args.Data is ICollection)) 47: { 48: return (args.Data.OfType<Object>().ToList()); 49: } 50: else 51: { 52: return (args.Data); 53: } 54: } 55: } As always, looking forward to hearing from you!

    Read the article

  • Why Is Vertical Resolution Monitor Resolution so Often a Multiple of 360?

    - by Jason Fitzpatrick
    Stare at a list of monitor resolutions long enough and you might notice a pattern: many of the vertical resolutions, especially those of gaming or multimedia displays, are multiples of 360 (720, 1080, 1440, etc.) But why exactly is this the case? Is it arbitrary or is there something more at work? Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites. The Question SuperUser reader Trojandestroy recently noticed something about his display interface and needs answers: YouTube recently added 1440p functionality, and for the first time I realized that all (most?) vertical resolutions are multiples of 360. Is this just because the smallest common resolution is 480×360, and it’s convenient to use multiples? (Not doubting that multiples are convenient.) And/or was that the first viewable/conveniently sized resolution, so hardware (TVs, monitors, etc) grew with 360 in mind? Taking it further, why not have a square resolution? Or something else unusual? (Assuming it’s usual enough that it’s viewable). Is it merely a pleasing-the-eye situation? So why have the display be a multiple of 360? The Answer SuperUser contributor User26129 offers us not just an answer as to why the numerical pattern exists but a history of screen design in the process: Alright, there are a couple of questions and a lot of factors here. Resolutions are a really interesting field of psychooptics meeting marketing. First of all, why are the vertical resolutions on youtube multiples of 360. This is of course just arbitrary, there is no real reason this is the case. The reason is that resolution here is not the limiting factor for Youtube videos – bandwidth is. Youtube has to re-encode every video that is uploaded a couple of times, and tries to use as little re-encoding formats/bitrates/resolutions as possible to cover all the different use cases. For low-res mobile devices they have 360×240, for higher res mobile there’s 480p, and for the computer crowd there is 360p for 2xISDN/multiuser landlines, 720p for DSL and 1080p for higher speed internet. For a while there were some other codecs than h.264, but these are slowly being phased out with h.264 having essentially ‘won’ the format war and all computers being outfitted with hardware codecs for this. Now, there is some interesting psychooptics going on as well. As I said: resolution isn’t everything. 720p with really strong compression can and will look worse than 240p at a very high bitrate. But on the other side of the spectrum: throwing more bits at a certain resolution doesn’t magically make it better beyond some point. There is an optimum here, which of course depends on both resolution and codec. In general: the optimal bitrate is actually proportional to the resolution. So the next question is: what kind of resolution steps make sense? Apparently, people need about a 2x increase in resolution to really see (and prefer) a marked difference. Anything less than that and many people will simply not bother with the higher bitrates, they’d rather use their bandwidth for other stuff. This has been researched quite a long time ago and is the big reason why we went from 720×576 (415kpix) to 1280×720 (922kpix), and then again from 1280×720 to 1920×1080 (2MP). Stuff in between is not a viable optimization target. And again, 1440P is about 3.7MP, another ~2x increase over HD. You will see a difference there. 4K is the next step after that. 
Next up is that magical number of 360 vertical pixels. Actually, the magic number is 120 or 128. All resolutions are some kind of multiple of 120 pixels nowadays, back in the day they used to be multiples of 128. This is something that just grew out of LCD panel industry. LCD panels use what are called line drivers, little chips that sit on the sides of your LCD screen that control how bright each subpixel is. Because historically, for reasons I don’t really know for sure, probably memory constraints, these multiple-of-128 or multiple-of-120 resolutions already existed, the industry standard line drivers became drivers with 360 line outputs (1 per subpixel). If you would tear down your 1920×1080 screen, I would be putting money on there being 16 line drivers on the top/bottom and 9 on one of the sides. Oh hey, that’s 16:9. Guess how obvious that resolution choice was back when 16:9 was ‘invented’. Then there’s the issue of aspect ratio. This is really a completely different field of psychology, but it boils down to: historically, people have believed and measured that we have a sort of wide-screen view of the world. Naturally, people believed that the most natural representation of data on a screen would be in a wide-screen view, and this is where the great anamorphic revolution of the ’60s came from when films were shot in ever wider aspect ratios. Since then, this kind of knowledge has been refined and mostly debunked. Yes, we do have a wide-angle view, but the area where we can actually see sharply – the center of our vision – is fairly round. Slightly elliptical and squashed, but not really more than about 4:3 or 3:2. So for detailed viewing, for instance for reading text on a screen, you can utilize most of your detail vision by employing an almost-square screen, a bit like the screens up to the mid-2000s. However, again this is not how marketing took it. Computers in ye olden days were used mostly for productivity and detailed work, but as they commoditized and as the computer as media consumption device evolved, people didn’t necessarily use their computer for work most of the time. They used it to watch media content: movies, television series and photos. And for that kind of viewing, you get the most ‘immersion factor’ if the screen fills as much of your vision (including your peripheral vision) as possible. Which means widescreen. But there’s more marketing still. When detail work was still an important factor, people cared about resolution. As many pixels as possible on the screen. SGI was selling almost-4K CRTs! The most optimal way to get the maximum amount of pixels out of a glass substrate is to cut it as square as possible. 1:1 or 4:3 screens have the most pixels per diagonal inch. But with displays becoming more consumery, inch-size became more important, not amount of pixels. And this is a completely different optimization target. To get the most diagonal inches out of a substrate, you want to make the screen as wide as possible. First we got 16:10, then 16:9 and there have been moderately successful panel manufacturers making 22:9 and 2:1 screens (like Philips). Even though pixel density and absolute resolution went down for a couple of years, inch-sizes went up and that’s what sold. Why buy a 19″ 1280×1024 when you can buy a 21″ 1366×768? Eh… I think that about covers all the major aspects here. 
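To put a rough number on the "most pixels per diagonal inch" point (my own arithmetic, not part of the original answer): for a fixed diagonal d and an aspect ratio of a:b, the panel area is

    Area = d^2 \cdot ab / (a^2 + b^2)

which gives 0.480 d^2 for 4:3, about 0.427 d^2 for 16:9, and 0.400 d^2 for 2:1. At the same pixel pitch, a 4:3 panel therefore carries roughly 12% more pixels than a 16:9 panel of the same diagonal.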
There’s more, of course: bandwidth limits of HDMI, DVI, DP and of course VGA played a role, and if you go back to the pre-2000s, graphics memory, in-computer bandwidth and simply the limits of commercially available RAMDACs played an important role. But for today’s considerations, this is about all you need to know. Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.

    Read the article

  • MapRedux - PowerShell and Big Data

    - by Dittenhafer Solutions
MapRedux – #PowerShell and #Big Data Have you been hearing about “big data”, “map reduce” and other large scale computing terms over the past couple of years and been curious to dig into more detail? Have you read some of the Apache Hadoop online documentation and unfortunately concluded that it wasn't feasible to set up a “test” Hadoop environment on your machine? More recently, I have read about some of Microsoft’s work to enable Hadoop on the Azure cloud. Being a "Microsoft"-leaning technologist, I am more inclined to be successful with experimentation when on the Windows platform. Of course, it is not that I am "religious" about one set of technologies over another, but rather more experienced. Anyway, within the past couple of weeks I have been thinking about PowerShell a bit more as the 2012 PowerShell Scripting Games approach, and it occurred to me that PowerShell's support for Windows Remote Management (WinRM), and some other inherent features of PowerShell, might lend themselves particularly well to a simple implementation of the MapReduce framework. I fired up my PowerShell ISE and started writing just to see where it would take me. Quite simply, the ScriptBlock feature combined with the ability of Invoke-Command to create remote jobs on networked servers provides much of the plumbing of a distributed computing environment. There are some limiting factors, of course. Microsoft provided some default settings which prevent PowerShell from taking over a network without administrative approval first. But even with just one adjustment, a given Windows-based machine can become a node in a MapReduce-style distributed computing environment. OK, so enough introduction. Let's talk about the code. First, any machine that will participate as a remote "node" will need WinRM enabled for remote access, as shown below. This is not exactly practical for hundreds of intended nodes, but for one (or five) machines in a test environment it does just fine. C:> winrm quickconfig WinRM is not set up to receive requests on this machine. The following changes must be made: Set the WinRM service type to auto start. Start the WinRM service. Make these changes [y/n]? y Alternatively, you could take the approach described in the Remotely enable PSRemoting post from the TechNet forum and use PowerShell to create remote scheduled tasks that will call Enable-PSRemoting on each intended node. Invoke-MapRedux Moving on, now that you have one or more remote "nodes" enabled, you can consider the actual Map and Reduce algorithms. Consider the following snippet: $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose Invoke-MapRedux takes an instance of a MapReduceItem which references the Map and Reduce scriptblocks, an array of computer names which are the remote nodes, and the initial data set to be processed. As simple as that, you can start working with the concepts of big data and the MapReduce paradigm. Now, how did we get there? I have published the initial version of my PsMapRedux PowerShell Module on GitHub. The PsMapRedux module provides the Invoke-MapRedux function described above. Feel free to browse the underlying code and even contribute to the project! In a later post, I plan to show some of the inner workings of the module, but for now let's move on to how the Map and Reduce functions are defined. Map Both the Map and Reduce functions need to follow a prescribed prototype. The prototype for a Map function in the MapRedux module is as follows.
A simple scriptblock that takes one PsObject parameter and returns a hashtable. It is important to note that the PsObject $dataset parameter is a MapRedux custom object that has a "Data" property which offers an array of data to be processed by the Map function. $aMap = { Param ( [PsObject] $dataset ) # Indicate the job is running on the remote node. Write-Host ($env:computername + "::Map"); # The hashtable to return $list = @{}; # ... Perform the mapping work and prepare the $list hashtable result with your custom PSObject... # ... The $dataset has a single 'Data' property which contains an array of data rows # which is a subset of the originally submitted data set. # Return the hashtable (Key, PSObject) Write-Output $list; } Reduce Likewise, with the Reduce function a simple prototype must be followed which takes a $key and a result $dataset from the MapRedux's partitioning function (which joins the Map results by key). Again, the $dataset is a MapRedux custom object that has a "Data" property as described in the Map section. $aReduce = { Param ( [object] $key, [PSObject] $dataset ) Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count) # The hashtable to return $redux = @{}; # Return Write-Output $redux; } All Together Now When everything is put together in a short example script, you implement your Map and Reduce functions, query for some starting data, build the MapReduxItem via New-MapReduxItem and call Invoke-MapRedux to get the process started: # Import the MapRedux and SQL Server providers Import-Module "MapRedux" Import-Module “sqlps” -DisableNameChecking # Query the database for a dataset Set-Location SQLSERVER:\sql\dbserver1\default\databases\myDb $query = "SELECT MyKey, Date, Value1 FROM BigData ORDER BY MyKey"; Write-Host "Query: $query" $dataset = Invoke-SqlCmd -query $query # Build the Map function $MyMap = { Param ( [PsObject] $dataset ) Write-Host ($env:computername + "::Map"); $list = @{}; foreach($row in $dataset.Data) { # Write-Host ("Key: " + $row.MyKey.ToString()); if($list.ContainsKey($row.MyKey) -eq $true) { $s = $list.Item($row.MyKey); $s.Sum += $row.Value1; $s.Count++; } else { $s = New-Object PSObject; $s | Add-Member -Type NoteProperty -Name MyKey -Value $row.MyKey; $s | Add-Member -type NoteProperty -Name Sum -Value $row.Value1; $list.Add($row.MyKey, $s); } } Write-Output $list; } $MyReduce = { Param ( [object] $key, [PSObject] $dataset ) Write-Host ($env:computername + "::Reduce - Count: " + $dataset.Data.Count) $redux = @{}; $count = 0; foreach($s in $dataset.Data) { $sum += $s.Sum; $count += 1; } # Reduce $redux.Add($s.MyKey, $sum / $count); # Return Write-Output $redux; } # Create the item data $Mr = New-MapReduxItem "My Test MapReduce Job" $MyMap $MyReduce # Array of processing nodes... $MyNodes = ("node1", "node2", "node3", "node4", "localhost") # Run the Map Reduce routine... $MyMrResults = Invoke-MapRedux -MapReduceItem $Mr -ComputerName $MyNodes -DataSet $dataset -Verbose # Show the results Set-Location C:\ $MyMrResults | Out-GridView Conclusion I hope you have seen through this article that PowerShell has a significant infrastructure available for distributed computing. While it does take some code to expose a MapReduce-style framework, much of the work is already done and PowerShell could prove to be the the easiest platform to develop and run big data jobs in your corporate data center, potentially in the Azure cloud, or certainly as an academic excerise at home or school. 
Follow me on Twitter to stay up to date on the continuing progress of my PowerShell MapRedux module, and thanks for reading! Daniel

    Read the article

  • Attaching a Command to the WP7 Application Bar.

    - by mbcrump
One of the biggest problems that I’ve seen with people creating WP7 applications is how to bind the application bar to a RelayCommand. If you're using MVVM then this is particularly important. Let’s examine the code that one might add to start with.  <phone:PhoneApplicationPage.ApplicationBar> <shell:ApplicationBar IsVisible="True" IsMenuEnabled="True"> <shell:ApplicationBarIconButton x:Name="appbar_button1" IconUri="/icons/appbar.questionmark.rest.png" Text="About"> <i:Interaction.Triggers> <i:EventTrigger EventName="Click"> <GalaSoft_MvvmLight_Command:EventToCommand Command="{Binding DisplayAbout, Mode=OneWay}" /> </i:EventTrigger> </i:Interaction.Triggers> </shell:ApplicationBarIconButton> <shell:ApplicationBar.MenuItems> <shell:ApplicationBarMenuItem x:Name="menuItem1" Text="MenuItem 1"></shell:ApplicationBarMenuItem> <shell:ApplicationBarMenuItem x:Name="menuItem2" Text="MenuItem 2"></shell:ApplicationBarMenuItem> </shell:ApplicationBar.MenuItems> </shell:ApplicationBar> </phone:PhoneApplicationPage.ApplicationBar> Everything looks right, but we quickly notice that we have a squiggly line under our Interaction.Triggers. The problem is that the ApplicationBar is not a FrameworkElement, so triggers and bindings cannot be attached to it. This same code would have worked perfectly if this were a normal button. OK, point proven. Let’s make the ApplicationBar support commands. So, go ahead and create a new project using MVVM Light. If you want to check out the source and work alongside this tutorial then click here.  7 Easy Steps to have binding on the Application Bar using MVVM Light (I might add that you don’t have to use MVVM Light to get this functionality, I just prefer it.) 1) Download MVVM Light if you don’t already have it and install the project templates. It is available at http://mvvmlight.codeplex.com/. 2) Click File > New Project and navigate to Silverlight for Windows Phone. Make sure you use the MVVM Light (WP7) template. 3) Now that we have our project set up and ready to go, let’s download a wrapper created by Nicolas Humann here; it is called Phone7.Fx. After you download it, extract it somewhere you can find it. This wrapper will make our application bar/menu items bindable. 4) Right-click References inside your WP7 project and add the .dll file to your project. 5) In your MainPage.xaml you will need to add the proper namespace. Don’t forget to build your project afterwards. xmlns:Preview="clr-namespace:Phone7.Fx.Preview;assembly=Phone7.Fx.Preview" 6) Now you can add the BindableApplicationBar to your MainPage.xaml with a few lines of code.  <Preview:BindableApplicationBar x:Name="AppBar" BarOpacity="1.0" > <Preview:BindableApplicationBarIconButton Command="{Binding DisplayAbout}" IconUri="/icons/appbar.questionmark.rest.png" Text="About" /> <Preview:BindableApplicationBar.MenuItems> <Preview:BindableApplicationBarMenuItem Text="Settings" Command="{Binding InputBox}" /> </Preview:BindableApplicationBar.MenuItems> </Preview:BindableApplicationBar> So your final MainPage.xaml will look similar to this (note: the AppBar will be located inside the Grid when using this wrapper):
<!--LayoutRoot contains the root grid where all other page content is placed--> <Grid x:Name="LayoutRoot" Background="Transparent"> <Grid.RowDefinitions> <RowDefinition Height="Auto" /> <RowDefinition Height="*" /> </Grid.RowDefinitions> <!--TitlePanel contains the name of the application and page title--> <StackPanel x:Name="TitlePanel" Grid.Row="0" Margin="24,24,0,12"> <TextBlock x:Name="ApplicationTitle" Text="{Binding ApplicationTitle}" Style="{StaticResource PhoneTextNormalStyle}" /> <TextBlock x:Name="PageTitle" Text="{Binding PageName}" Margin="-3,-8,0,0" Style="{StaticResource PhoneTextTitle1Style}" /> </StackPanel> <!--ContentPanel - place additional content here--> <Grid x:Name="ContentGrid" Grid.Row="1"> <TextBlock Text="{Binding Welcome}" Style="{StaticResource PhoneTextNormalStyle}" HorizontalAlignment="Center" VerticalAlignment="Center" FontSize="40" /> </Grid> <Preview:BindableApplicationBar x:Name="AppBar" BarOpacity="1.0" > <Preview:BindableApplicationBarIconButton Command="{Binding DisplayAbout}" IconUri="/icons/appbar.questionmark.rest.png" Text="About" /> <Preview:BindableApplicationBar.MenuItems> <Preview:BindableApplicationBarMenuItem Text="Settings" Command="{Binding InputBox}" /> </Preview:BindableApplicationBar.MenuItems> </Preview:BindableApplicationBar> </Grid> 7) Let’s go ahead and create the RelayCommands and write them up to a MessageBox by editing our MainViewModel.cs file. public class MainViewModel : ViewModelBase { public string ApplicationTitle { get { return "MVVM LIGHT"; } } public string PageName { get { return "My page:"; } } public string Welcome { get { return "Welcome to MVVM Light"; } } public RelayCommand DisplayAbout { get; private set; } public RelayCommand InputBox { get; private set; } /// <summary> /// Initializes a new instance of the MainViewModel class. /// </summary> public MainViewModel() { if (IsInDesignMode) { // Code runs in Blend --> create design time data. } else { DisplayAbout = new RelayCommand(() => { MessageBox.Show("About box called!"); }); InputBox = new RelayCommand(() => { MessageBox.Show("settings button called"); }); } } If you run the project now you should get something similar to this (notice the AppBar at the bottom):  Now if you hit the question mark then you will get the following MessageBox: The MenuItem works as well so for Settings: As you can see, its pretty easy to add a Command to the ApplicationBar/MenuItem. If you want to look through the full source code then click here.   Subscribe to my feed
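One follow-up the post does not cover: RelayCommand also accepts a CanExecute delegate, which is the natural way to disable an application bar action while the view model is busy. The sketch below is my addition, not part of the original article, and whether the button visually greys out depends on the Phone7.Fx wrapper honoring ICommand.CanExecuteChanged, which I have not verified.

    public class MainViewModel : ViewModelBase
    {
        private bool _isBusy;

        public RelayCommand DisplayAbout { get; private set; }

        public MainViewModel()
        {
            // The second constructor argument is the CanExecute predicate.
            DisplayAbout = new RelayCommand(
                () => MessageBox.Show("About box called!"),
                () => !_isBusy);
        }

        public bool IsBusy
        {
            get { return _isBusy; }
            set
            {
                _isBusy = value;
                RaisePropertyChanged("IsBusy");
                // Ask bound buttons to re-query CanExecute.
                DisplayAbout.RaiseCanExecuteChanged();
            }
        }
    }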

    Read the article

  • Creating a Document Library with Content Type in code

    - by David Jacobus
Originally posted on: http://geekswithblogs.net/djacobus/archive/2013/10/15/154360.aspx In the past, I have shown how to create a list content type and add the content type to a list in code.  As developers, many of the artifacts we create are widgets which have a list or document library as the back end.   We need to be able to create our applications (web parts, etc.) without having the user involved except to enter the list item data.  Today, I will show you how to do the same with a document library.    A summary of what we will do is as follows:
1. Create an Empty SharePoint Project in Visual Studio
2. Add a Code Folder in the solution and drag and drop the Utilities and Extensions libraries into the solution
3. Create a new Feature and add an event receiver; all the code will be in the event receiver
4. Add the fields which will extend the built-in Document content type
5. If the Content Type does not exist, create it
6. If the Document Library does not exist, create it with the new Content Type inherited from the Document Content Type
7. Delete the Document Content Type from the Library (as we have a new one which inherited from it)
8. Add the fields which we want to be visible from the fields added to the new Content Type
Here we go:   Create an Empty SharePoint Project in Visual Studio.      Add a Code Folder in the solution and drag and drop the Utilities and Extensions libraries into the solution.       The Utilities and Extensions library will be part of this project; I will provide a download link at the end of this post.  Drag and drop them into your project.  If dragged and dropped from Windows Explorer you will need to show all files and then include them in your project.  Change the namespace to agree with your project.   Create a new Feature and add an event receiver; all the code will be in the event receiver.  Here we added a new Feature called “CreateDocLib” and then right-clicked to add an Event Receiver. All of our code will be in this Event Receiver.  For this demo I will only be using the Feature Activated event.      From this point on we will be looking at code!    We are adding two constants for later use: columnGroup (how we want SharePoint to group the columns, usually the company name) and ctName (the content type name).  using System; using System.Runtime.InteropServices; using System.Security.Permissions; using Microsoft.SharePoint; namespace CreateDocLib.Features.CreateDocLib { /// <summary> /// This class handles events raised during feature activation, deactivation, installation, uninstallation, and upgrade. /// </summary> /// <remarks> /// The GUID attached to this class may be used during packaging and should not be modified. /// </remarks> [Guid("56e6897c-97c4-41ac-bc5b-5cd2c04f2dd1")] public class CreateDocLibEventReceiver : SPFeatureReceiver { const string columnGroup = "DJ"; const string ctName = "DJDocLib"; } }     Here we are creating the Feature Activated event: adding the new fields (site columns), checking whether the Content Type exists and adding it if not, and checking whether the document library exists and creating it if not.
#region DocLib public override void FeatureActivated(SPFeatureReceiverProperties properties) { using (SPWeb spWeb = properties.GetWeb() as SPWeb) { //add the fields addFields(spWeb); //add content type SPContentType testCT = spWeb.ContentTypes[ctName]; // we will not create the content type if it exists if (testCT == null) { //the content type does not exist, add it addContentType(spWeb, ctName); } if ((spWeb.Lists.TryGetList("MyDocuments") == null)) { //create the list if it doesn't exist CreateDocLib(spWeb); } } } #endregion The addFields method uses the Utilities library to add site columns to the site. We can add as many fields within this method as we like. Here we are adding one for demonstration purposes: Icon, as a URL type.  public void addFields(SPWeb spWeb) { Utilities.addField(spWeb, "Icon", SPFieldType.URL, false, columnGroup); } The addContentType method adds the new Content Type to the site Content Types. We have already checked that it does not exist. In addition, here is where we add the linkages from our previously created site columns to our new Content Type.   private static void addContentType(SPWeb spWeb, string name) { SPContentType myContentType = new SPContentType(spWeb.ContentTypes["Document"], spWeb.ContentTypes, name) { Group = columnGroup }; spWeb.ContentTypes.Add(myContentType); addContentTypeLinkages(spWeb, myContentType); myContentType.Update(); } Here we are adding just one linkage, as we only have one additional field in our Content Type. public static void addContentTypeLinkages(SPWeb spWeb, SPContentType ct) { Utilities.addContentTypeLink(spWeb, "Icon", ct); } Next we add the logic to create our new document library, which we have already verified does not exist.  We create the document library and turn on content types, add the new content type, and then delete the old "Document" content type.   private void CreateDocLib(SPWeb web) { using (var site = new SPSite(web.Url)) { var web1 = site.RootWeb; var listId = web1.Lists.Add("MyDocuments", string.Empty, SPListTemplateType.DocumentLibrary); var lib = web1.Lists[listId] as SPDocumentLibrary; lib.ContentTypesEnabled = true; var docType = web.ContentTypes[ctName]; lib.ContentTypes.Add(docType); lib.ContentTypes.Delete(lib.ContentTypes["Document"].Id); lib.Update(); AddLibrarySettings(web1, lib); } }  Finally, we set some document library settings on our new document library with the AddLibrarySettings method. We then ensure that the new site column is visible when viewed in the browser.  private void AddLibrarySettings(SPWeb web, SPDocumentLibrary lib) { lib.OnQuickLaunch = true; lib.ForceCheckout = true; lib.EnableVersioning = true; lib.MajorVersionLimit = 5; lib.EnableMinorVersions = true; lib.MajorWithMinorVersionsLimit = 5; lib.Update(); var view = lib.DefaultView; view.ViewFields.Add("Icon"); view.Update(); } Okay, what's cool here: in a few lines of code, we have created site columns, a content type, and a document library. As a developer, I use this functionality all the time. For instance, I could now just add a web part to this same solution which uses this document library. I love SharePoint! Here is the complete solution: Create Document Library Code
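The Utilities.addField and Utilities.addContentTypeLink helpers used above come from the downloadable Utilities library and are not shown in the post. For readers who just want the idea, here is a minimal sketch of what such helpers might look like; this is my reconstruction, not the author's actual code.

    // My reconstruction of the missing helpers, not the author's actual code (requires Microsoft.SharePoint).
    public static class Utilities
    {
        // Adds a site column if it does not already exist and files it under the given group.
        public static void addField(SPWeb web, string displayName, SPFieldType type, bool required, string group)
        {
            if (web.Fields.ContainsField(displayName))
                return;

            string internalName = web.Fields.Add(displayName, type, required);
            SPField field = web.Fields.GetFieldByInternalName(internalName);
            field.Group = group;
            field.Update();
        }

        // Links an existing site column to a content type; the caller updates the content type afterwards
        // (the post's addContentType method calls myContentType.Update()).
        public static void addContentTypeLink(SPWeb web, string fieldName, SPContentType contentType)
        {
            SPField field = web.Fields[fieldName];
            contentType.FieldLinks.Add(new SPFieldLink(field));
        }
    }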

    Read the article

  • CodePlex Daily Summary for Tuesday, July 30, 2013

    CodePlex Daily Summary for Tuesday, July 30, 2013Popular ReleasesnopCommerce. Open source shopping cart (ASP.NET MVC): nopCommerce 3.10: Highlight features & improvements: • Performance optimization. • New more user-friendly product/product-variant logic. Now we'll have only products (simple and grouped). • Bundle products support added. • Allow a store owner to associate product image for product variant attribute values. To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).Small Tools: Helpers 1.01: Fix params count issue Fix STAThread issue Add support for exe.config filesExtJS based ASP.NET Controls: FineUI v3.3.1: ??FineUI ?? ExtJS ??? ASP.NET ???。 FineUI??? ?? No JavaScript,No CSS,No UpdatePanel,No ViewState,No WebServices ???????。 ?????? IE 7.0、Firefox 3.6、Chrome 3.0、Opera 10.5、Safari 3.0+ ???? Apache License v2.0 ?:ExtJS ?? GPL v3 ?????(http://www.sencha.com/license)。 ???? ??:http://fineui.com/bbs/ ??:http://fineui.com/demo/ ??:http://fineui.com/doc/ ??:http://fineui.codeplex.com/ FineUI ???? ExtJS ????????,???? ExtJS ?,???????????ExtJS?: 1. ????? FineUI ? ExtJS ?:http://fineui.com/bbs/fo...AutoNLayered - Domain Oriented N-Layered .NET 4.5: AutoNLayered v1.0.5: - Fix Dtos. Abstract collections replaced by concrete (correct serialization WCF). - OrderBy in navigation properties. - Unit Test with Fakes. - Map of entities/dto moved to application services. - Libraries updated. Warning using Fakes: http://connect.microsoft.com/VisualStudio/feedback/details/782031/visual-studio-2012-add-fakes-assembly-does-not-add-all-needed-referencesPath Copy Copy: 11.1: Minor release with two new features: Submenu's contextual menu item now has an icon next to it Added reference to JavaScript regular expression format in Settings application Since this release does not have any glaring bug fixes, it is more of an optional update for existing users. It depends on whether you want to be able to spot the Path Copy Copy submenu more easily. I recommend you install it to see if the icon makes sense. As always, please don't hesitate to leave feedback via Discus...CMake Tools for Visual Studio: CMake Tools for Visual Studio 1.0 RC3: This is the third release candidate of CMake Tools for Visual Studio 1.0, which contains the following bug fixes: Opening a CMake file from Windows Explorer while Visual Studio is already open will no start a new instance of Visual Studio. Typing a symbol while the IntelliSense list box is visible and the text typed so far does not match any item in the list will dismiss the list box and insert the symbol typed.R.NET: R.NET 1.5: The major changes in v1.5 are: Initialize method must be called before using R. Settings should be passed to the method. EagerEvaluate method renamed to Evaluate (use Defer method when you want old version of Evaluate).Media Companion: Media Companion MC3.574b: Some good bug fixes been going on with the new XBMC-Link function. Thanks to all who were able to do testing and gave feedback. New:* Added some adhoc extra General movie filters, one of which is Plot = Outline (see fixes above). To see the filters, add the following line to your config.xml: <ShowExtraMovieFilters>True</ShowExtraMovieFilters>. The others are: Imdb in folder name, Imdb in not folder name & Imdb not in folder name & year mismatch. 
* Movie - display <tag> list on browser tab ...OfflineBrowser: Preview Release with Search: I've added search to this release.VG-Ripper & PG-Ripper: VG-Ripper 2.9.46: changes FIXED LoginMath.NET Numerics: Math.NET Numerics v2.6.0: What's New in Math.NET Numerics 2.6 - Announcement, Explanations and Sample Code. New: Linear Curve Fitting Linear least-squares fitting (regression) to lines, polynomials and linear combinations of arbitrary functions. Multi-dimensional fitting. Also works well in F# with the F# extensions. New: Root Finding Brent's method. ~Candy Chiu, Alexander Täschner Bisection method. ~Scott Stephens, Alexander Täschner Broyden's method, for multi-dimensional functions. ~Alexander Täschner ...AJAX Control Toolkit: July 2013 Release: AJAX Control Toolkit Release Notes - July 2013 Release Version 7.0725July 2013 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - Instructions for using the AJAX Control Toolkit with ASP.NET 4.5 can be found at...MJP's DirectX 11 Samples: Specular Antialiasing Sample: Sample code to complement my presentation that's part of the Physically Based Shading in Theory and Practice course at SIGGRAPH 2013, entitled "Crafting a Next-Gen Material Pipeline for The Order: 1886". Demonstrates various methods of preventing aliasing from specular BRDF's when using high-frequency normal maps. The zip file contains source code as well as a pre-compiled x64 binary.EXCEL??、??、????????:DataPie(??MSSQL 2008、ORACLE、ACCESS 2007): DataPieV3.6.1: ????csv????,???sql??,??csv????Qibla Compass for Windows Phone: Qibla Compass for Windows Phone: This release is in open beta version. You can always download and provide your feedback. Since it was just developed to give users an idea of Qibla Direction and its mapping therefore you might not see major releases in future.Event Scavenger: Version 5: I've decided to do a full (recommended) release of version 5. I've been running it myself for months and did not have any issues with it yet. This release just contains the installs. The web site's documentation has not been updated yet and reflects the previous version details. If you have an issue with this version then you can happily switch back to 4.x. Version 5 can run side-by-side with earlier versions (service) as it has a new service and database.wpadk: WPadk_WP8???: ???:V1.1 ??wp???????????????wp8???????StockSharp: StockSharp 4.1.16: ?????? ????????? - http://stocksharp.com/forum/yaf_postsm28239_S--API-4-1.aspx#post28239GeoTransformer: GeoTransformer 4.5: Extensions can now be installed and uninstalled from the application. The extensions update the same way as the application - silently and automatically. Added ability to search for caches by pressing CTRL+F in the table views. (Thanks to JanisU for implementing this request) Added ability to remove edited customizations for multiple caches at once (use SHIFT or CTRL to select multiple lines in the table). 
A new experimental version for Windows 8 RT (on ARM processor) is also made availa...Kartris E-commerce: Kartris v2.5003: This fixes an issue where search engines appear to identify as IE and so trigger the noIE page if there is not a non-responsive skin specified.New ProjectsBus Booking System: Bus Booking systemC#??????: ????C#??????????????。Cotizav 2.0: Este proyecto es para el soporte de Cotizaciones.DeferredShading: deferred shading rendererIVR Junction: IVR Junction connects an Interactive Voice Response (IVR) system to cloud services such as YouTube, Facebook and other social media.Mac Address Changer: It's a quite and easy tool to change your mac addressmotokraft user control: user control for motokraftSingle Reference JavaScript Pattern: This is very simple pattern. In here you need to only refer one script in a page. I'm sure it is saving your development time as well as maintenance timeSocial_Life_Time: This is social network that people can communicate with each otherThe Ironic Text Based MMORPG: Modern MMORPGs have become highly interactive, complex systems of skills, stats, and action combat. This game introduces a new level of text based immersion!Timeline Year Control: Timeline Year Control An ASP.Net year indicator timeline control.winrtsock: winsock façade for Windows Runtime for porting bsd socket code to Windows RuntimeZker: No summary?????: C#?????

    Read the article

  • Solving Big Problems with Oracle R Enterprise, Part I

    - by dbayard
Abstract: This blog post will show how we used Oracle R Enterprise to tackle a customer’s big calculation problem across a big data set. Overview: Databases are great for managing large amounts of data in a central place with rigorous enterprise-level controls.  R is great for doing advanced computations.  Sometimes you need to do advanced computations on large amounts of data, subject to rigorous enterprise-level concerns.  This blog post shows how Oracle R Enterprise (R plus the Oracle Database) enabled us to do some pretty sophisticated calculations across 1 million accounts (each with many detailed records) in minutes. The problem: A financial services customer of mine has a need to calculate the historical internal rate of return (IRR) for its customers’ portfolios.  This information is needed for customer statements and the online web application.  In the past, they had solved this with a home-grown application that pulled trade and account data out of their data warehouse and ran the calculations.  But this home-grown application was not able to do this fast enough, plus it was a challenge for them to write and maintain the code that did the IRR calculation. IRR - a problem that R is good at solving: Internal Rate of Return is an interesting calculation in that in most real-world scenarios it is impractical to calculate exactly.  Rather, IRR is a calculation where approximation techniques need to be used.  In this blog post, we will discuss calculating the "money weighted rate of return", but in the actual customer proof of concept we used R to calculate both money-weighted and time-weighted rates of return.  You can learn more about the money-weighted rate of return here: http://www.wikinvest.com/wiki/Money-weighted_return First Steps - Calculating IRR in R We will start with calculating the IRR in standalone/desktop R.  In our second post, we will show how to take this desktop R function, deploy it to an Oracle Database, and make it work at real-world scale.  The first step was to get some sample data.  For a historical IRR calculation, you need balances and cash flows.  In our case, the customer provided us with several accounts' worth of sample data in Microsoft Excel.      The above figure shows part of the spreadsheet of sample data.  The data provides balances and cash flows for a sample account (BMV=beginning market value. FLOW=cash flow in/out of account. EMV=ending market value). Once we had the sample spreadsheet, the next step was to read the Excel data into R.  This is something that R does well.  R offers multiple ways to work with spreadsheet data.  For instance, one could save the spreadsheet as a .csv file.  In our case, the customer provided a spreadsheet file containing multiple sheets where each sheet provided data for a different sample account.  To handle this easily, we took advantage of the RODBC package, which allowed us to read the Excel data sheet-by-sheet without having to create individual .csv files.  We wrote ourselves a little helper function called getsheet() around the RODBC package.  Then we loaded all of the sample accounts into a data.frame called SimpleMWRRData. Writing the IRR function At this point, it was time to write the money weighted rate of return (MWRR) function itself.  The definition of MWRR is easily found on the internet, or if you are old school you can look in an investment performance textbook.
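In one common form (my restatement of that definition; conventions vary on signs and day-count), the money-weighted return r is the rate that satisfies

    NPV(r) = bmv (1+r)^{t_end} + \sum_i cf_i (1+r)^{t_end - t_i} - emv = 0

where bmv is the beginning market value, cf_i the cash flows, t_i their times measured from the start, emv the ending market value, and t_end the end of the period; the R function described next searches for the value of r that drives |NPV(r)| to zero.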
In the customer proof, we based our calculations off the ones defined in the The Handbook of Investment Performance: A User’s Guide by David Spaulding since this is the reference book used by the customer.  (One of the nice things we found during the course of this proof-of-concept is that by using R to write our IRR functions we could easily incorporate the specific variations and business rules of the customer into the calculation.) The key thing with calculating IRR is the need to solve a complex equation with a numerical approximation technique.  For IRR, you need to find the value of the rate of return (r) that sets the Net Present Value of all the flows in and out of the account to zero.  With R, we solve this by defining our NPV function: where bmv is the beginning market value, cf is a vector of cash flows, t is a vector of time (relative to the beginning), emv is the ending market value, and tend is the ending time. Since solving for r is a one-dimensional optimization problem, we decided to take advantage of R’s optimize method (http://stat.ethz.ch/R-manual/R-patched/library/stats/html/optimize.html). The optimize method can be used to find a minimum or maximum; to find the value of r where our npv function is closest to zero, we wrapped our npv function inside the abs function and asked optimize to find the minimum.  Here is an example of using optimize: where low and high are scalars that indicate the range to search for an answer.   To test this out, we need to set values for bmv, cf, t, emv, tend, low, and high.  We will set low and high to some reasonable defaults. For example, this account had a negative 2.2% money weighted rate of return. Enhancing and Packaging the IRR function With numerical approximation methods like optimize, sometimes you will not be able to find an answer with your initial set of inputs.  To account for this, our approach was to first try to find an answer for r within a narrow range, then if we did not find an answer, try calling optimize() again with a broader range.  See the R help page on optimize()  for more details about the search range and its algorithm. At this point, we can now write a simplified version of our MWRR function.  (Our real-world version is  more sophisticated in that it calculates rate of returns for 5 different time periods [since inception, last quarter, year-to-date, last year, year before last year] in a single invocation.  In our actual customer proof, we also defined time-weighted rate of return calculations.  The beauty of R is that it was very easy to add these enhancements and additional calculations to our IRR package.)To simplify code deployment, we then created a new package of our IRR functions and sample data.  For this blog post, we only need to include our SimpleMWRR function and our SimpleMWRRData sample data.  We created the shell of the package by calling: To turn this package skeleton into something usable, at a minimum you need to edit the SimpleMWRR.Rd and SimpleMWRRData.Rd files in the \man subdirectory.  In those files, you need to at least provide a value for the “title” section. 
Once that is done, you can change directory to the IRR directory and type at the command-line: The myIRR package for this blog post (which has both SimpleMWRR source and SimpleMWRRData sample data) is downloadable from here: myIRR package Testing the myIRR package Here is an example of testing our IRR function once it was converted to an installable package: Calculating IRR for All the Accounts So far, we have shown how to calculate IRR for a single account.  The real-world issue is how do you calculate IRR for all of the accounts?This is the kind of situation where we can leverage the “Split-Apply-Combine” approach (see http://www.cscs.umich.edu/~crshalizi/weblog/815.html).  Given that our sample data can fit in memory, one easy approach is to use R’s “by” function.  (Other approaches to Split-Apply-Combine such as plyr can also be used.  See http://4dpiecharts.com/2011/12/16/a-quick-primer-on-split-apply-combine-problems/). Here is an example showing the use of “by” to calculate the money weighted rate of return for each account in our sample data set.  Recap and Next Steps At this point, you’ve seen the power of R being used to calculate IRR.  There were several good things: R could easily work with the spreadsheets of sample data we were given R’s optimize() function provided a nice way to solve for IRR- it was both fast and allowed us to avoid having to code our own iterative approximation algorithm R was a convenient language to express the customer-specific variations, business-rules, and exceptions that often occur in real-world calculations- these could be easily added to our IRR functions The Split-Apply-Combine technique can be used to perform calculations of IRR for multiple accounts at once. However, there are several challenges yet to be conquered at this point in our story: The actual data that needs to be used lives in a database, not in a spreadsheet The actual data is much, much bigger- too big to fit into the normal R memory space and too big to want to move across the network The overall process needs to run fast- much faster than a single processor The actual data needs to be kept secured- another reason to not want to move it from the database and across the network And the process of calculating the IRR needs to be integrated together with other database ETL activities, so that IRR’s can be calculated as part of the data warehouse refresh processes In our next blog post in this series, we will show you how Oracle R Enterprise solved these challenges.
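Not from the original post: for readers who want to sanity-check the NPV root-finding outside R, here is a minimal C# sketch under the same assumptions, namely the NPV form given earlier and a low/high range that brackets a sign change. The sample account values are hypothetical.

    using System;

    static class Mwrr
    {
        // Net value at rate r: beginning balance and flows grown to the end of the period, minus the ending balance.
        static double Npv(double r, double bmv, double[] cf, double[] t, double emv, double tend)
        {
            double value = bmv * Math.Pow(1 + r, tend) - emv;
            for (int i = 0; i < cf.Length; i++)
                value += cf[i] * Math.Pow(1 + r, tend - t[i]);
            return value;
        }

        // Simple bisection; assumes Npv(low) and Npv(high) bracket a root.
        static double Solve(double bmv, double[] cf, double[] t, double emv, double tend,
                            double low = -0.99, double high = 10.0)
        {
            for (int i = 0; i < 200; i++)
            {
                double mid = (low + high) / 2;
                if (Math.Sign(Npv(mid, bmv, cf, t, emv, tend)) == Math.Sign(Npv(low, bmv, cf, t, emv, tend)))
                    low = mid;
                else
                    high = mid;
            }
            return (low + high) / 2;
        }

        static void Main()
        {
            // Hypothetical account: start at 100, deposit 10 a quarter of the way through, end at 105.
            double r = Solve(bmv: 100, cf: new[] { 10.0 }, t: new[] { 0.25 }, emv: 105, tend: 1.0);
            Console.WriteLine("Money-weighted return: {0:P2}", r);
        }
    }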

    Read the article

  • Matrix Multiplication with C++ AMP

    - by Daniel Moth
    As part of our API tour of C++ AMP, we looked recently at parallel_for_each. I ended that post by saying we would revisit parallel_for_each after introducing array and array_view. Now is the time, so this is part 2 of parallel_for_each, and also a post that brings together everything we've seen until now. The code for serial and accelerated Consider a naïve (or brute force) serial implementation of matrix multiplication  0: void MatrixMultiplySerial(std::vector<float>& vC, const std::vector<float>& vA, const std::vector<float>& vB, int M, int N, int W) 1: { 2: for (int row = 0; row < M; row++) 3: { 4: for (int col = 0; col < N; col++) 5: { 6: float sum = 0.0f; 7: for(int i = 0; i < W; i++) 8: sum += vA[row * W + i] * vB[i * N + col]; 9: vC[row * N + col] = sum; 10: } 11: } 12: } We notice that each loop iteration is independent from each other and so can be parallelized. If in addition we have really large amounts of data, then this is a good candidate to offload to an accelerator. First, I'll just show you an example of what that code may look like with C++ AMP, and then we'll analyze it. It is assumed that you included at the top of your file #include <amp.h> 13: void MatrixMultiplySimple(std::vector<float>& vC, const std::vector<float>& vA, const std::vector<float>& vB, int M, int N, int W) 14: { 15: concurrency::array_view<const float,2> a(M, W, vA); 16: concurrency::array_view<const float,2> b(W, N, vB); 17: concurrency::array_view<concurrency::writeonly<float>,2> c(M, N, vC); 18: concurrency::parallel_for_each(c.grid, 19: [=](concurrency::index<2> idx) restrict(direct3d) { 20: int row = idx[0]; int col = idx[1]; 21: float sum = 0.0f; 22: for(int i = 0; i < W; i++) 23: sum += a(row, i) * b(i, col); 24: c[idx] = sum; 25: }); 26: } First a visual comparison, just for fun: The beginning and end is the same, i.e. lines 0,1,12 are identical to lines 13,14,26. The double nested loop (lines 2,3,4,5 and 10,11) has been transformed into a parallel_for_each call (18,19,20 and 25). The core algorithm (lines 6,7,8,9) is essentially the same (lines 21,22,23,24). We have extra lines in the C++ AMP version (15,16,17). Now let's dig in deeper. Using array_view and extent When we decided to convert this function to run on an accelerator, we knew we couldn't use the std::vector objects in the restrict(direct3d) function. So we had a choice of copying the data to the the concurrency::array<T,N> object, or wrapping the vector container (and hence its data) with a concurrency::array_view<T,N> object from amp.h – here we used the latter (lines 15,16,17). Now we can access the same data through the array_view objects (a and b) instead of the vector objects (vA and vB), and the added benefit is that we can capture the array_view objects in the lambda (lines 19-25) that we pass to the parallel_for_each call (line 18) and the data will get copied on demand for us to the accelerator. Note that line 15 (and ditto for 16 and 17) could have been written as two lines instead of one: extent<2> e(M, W); array_view<const float, 2> a(e, vA); In other words, we could have explicitly created the extent object instead of letting the array_view create it for us under the covers through the constructor overload we chose. The benefit of the extent object in this instance is that we can express that the data is indeed two dimensional, i.e a matrix. When we were using a vector object we could not do that, and instead we had to track via additional unrelated variables the dimensions of the matrix (i.e. 
with the integers M and W) – aren't you loving C++ AMP already? Note that the const before the float when creating a and b, will result in the underling data only being copied to the accelerator and not be copied back – a nice optimization. A similar thing is happening on line 17 when creating array_view c, where we have indicated that we do not need to copy the data to the accelerator, only copy it back. The kernel dispatch On line 18 we make the call to the C++ AMP entry point (parallel_for_each) to invoke our parallel loop or, as some may say, dispatch our kernel. The first argument we need to pass describes how many threads we want for this computation. For this algorithm we decided that we want exactly the same number of threads as the number of elements in the output matrix, i.e. in array_view c which will eventually update the vector vC. So each thread will compute exactly one result. Since the elements in c are organized in a 2-dimensional manner we can organize our threads in a two-dimensional manner too. We don't have to think too much about how to create the first argument (a grid) since the array_view object helpfully exposes that as a property. Note that instead of c.grid we could have written grid<2>(c.extent) or grid<2>(extent<2>(M, N)) – the result is the same in that we have specified M*N threads to execute our lambda. The second argument is a restrict(direct3d) lambda that accepts an index object. Since we elected to use a two-dimensional extent as the first argument of parallel_for_each, the index will also be two-dimensional and as covered in the previous posts it represents the thread ID, which in our case maps perfectly to the index of each element in the resulting array_view. The kernel itself The lambda body (lines 20-24), or as some may say, the kernel, is the code that will actually execute on the accelerator. It will be called by M*N threads and we can use those threads to index into the two input array_views (a,b) and write results into the output array_view ( c ). The four lines (21-24) are essentially identical to the four lines of the serial algorithm (6-9). The only difference is how we index into a,b,c versus how we index into vA,vB,vC. The code we wrote with C++ AMP is much nicer in its indexing, because the dimensionality is a first class concept, so you don't have to do funny arithmetic calculating the index of where the next row starts, which you have to do when working with vectors directly (since they store all the data in a flat manner). I skipped over describing line 20. Note that we didn't really need to read the two components of the index into temporary local variables. This mostly reflects my personal choice, in some algorithms to break down the index into local variables with names that make sense for the algorithm, i.e. in this case row and col. In other cases it may i,j,k or x,y,z, or M,N or whatever. Also note that we could have written line 24 as: c(idx[0], idx[1])=sum  or  c(row, col)=sum instead of the simpler c[idx]=sum Targeting a specific accelerator Imagine that we had more than one hardware accelerator on a system and we wanted to pick a specific one to execute this parallel loop on. 
So there would be some code like this anywhere before line 18: vector<accelerator> accs = MyFunctionThatChoosesSuitableAccelerators(); accelerator acc = accs[0]; …and then we would modify line 18 so we would be calling another overload of parallel_for_each that accepts an accelerator_view as the first argument, so it would become: concurrency::parallel_for_each(acc.default_view, c.grid, ...and the rest of your code remains the same… how simple is that? Comments about this post by Daniel Moth welcome at the original blog.
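    For reference, both versions compute the standard matrix product; with A of size M×W, B of size W×N and C of size M×N, all stored row-major in flat vectors, the index arithmetic in the serial loop corresponds to:
        C_{row,col} = \sum_{i=0}^{W-1} A_{row,i} \, B_{i,col}, \qquad vC[row \cdot N + col] = \sum_{i=0}^{W-1} vA[row \cdot W + i] \cdot vB[i \cdot N + col]
    This is exactly the "funny arithmetic" that disappears once the two-dimensional extent carries the dimensionality for you.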

    Read the article

  • We've completed the first iteration

    - by CliveT
    There are a lot of features in C# that are implemented by the compiler and not by the underlying platform. One such feature is the lambda expression. Since local variables cannot be accessed once the current method activation finishes, the compiler has to go out of its way to generate a new class which acts as a home for any variable whose lifetime needs to be extended past the activation of the procedure. Take the following example:     Random generator = new Random();     Func<int> func = () => generator.Next(10); In this case, the compiler generates a new class called c__DisplayClass1 which is marked with the CompilerGenerated attribute. [CompilerGenerated] private sealed class c__DisplayClass1 {     // Fields     public Random generator;     // Methods     public int b__0()     {         return this.generator.Next(10);     } } Two quick comments on this: (i)    A display was the means by which compilers for languages like Algol recorded the various lexical contours of the nested procedure activations on the stack. I imagine that this is what has led to the name. (ii)    It is a shame that the same attribute is used to mark all compiler generated classes, as it makes it hard to figure out what they are being used for. Indeed, you could imagine optimisations that the runtime could perform if it knew that classes corresponded to certain high level concepts. We can see that the local variable generator has been turned into a field in the class, and the body of the lambda expression has been turned into a method of the new class. The code that builds the Func object simply constructs an instance of this class and initialises the fields to their initial values.     c__DisplayClass1 class2 = new c__DisplayClass1();     class2.generator = new Random();     Func<int> func = new Func<int>(class2.b__0); Reflector already contains code to spot this pattern of code and reproduce the form containing the lambda expression, so this example is correctly decompiled. The use of compiler generated code is even more spectacular in the case of iterators. C# introduced the idea of a method that could automatically store its state between calls, so that it can pick up where it left off. The code can express the logical flow with yield return and yield break denoting places where the method should return a particular value and be prepared to resume.         {             yield return 1;             yield return 2;             yield return 3;         } Of course, there was already a .NET pattern for expressing the idea of returning a sequence of values with the computation proceeding lazily (in the sense that the work for the next value is executed on demand). This is expressed by the IEnumerator interface with its Current property for fetching the current value and the MoveNext method for forcing the computation of the next value. The sequence is terminated when this method returns false. The C# compiler links these two ideas together so that an IEnumerable- or IEnumerator-returning method using the yield keyword causes the compiler to produce the implementation of an iterator. Take the following piece of code.         IEnumerable GetItems()         {             yield return 1;             yield return 2;             yield return 3;         } The compiler implements this by defining a new class that implements a state machine. This has an integer state that records which yield point we should go to if we are resumed. It also has a field that records the Current value of the enumerator and a field for recording the thread.
This latter value is used for optimising the creation of iterator instances. [CompilerGenerated] private sealed class d__0 : IEnumerable, IEnumerable, IEnumerator, IEnumerator, IDisposable {     // Fields     private int 1__state;     private int 2__current;     public Program 4__this;     private int l__initialThreadId; The body gets converted into the code to construct and initialize this new class. private IEnumerable GetItems() {     d__0 d__ = new d__0(-2);     d__.4__this = this;     return d__; } When the class is constructed we set the state, which was passed through as -2 and the current thread. public d__0(int 1__state) {     this.1__state = 1__state;     this.l__initialThreadId = Thread.CurrentThread.ManagedThreadId; } The state needs to be set to 0 to represent a valid enumerator and this is done in the GetEnumerator method which optimises for the usual case where the returned enumerator is only used once. IEnumerator IEnumerable.GetEnumerator() {     if ((Thread.CurrentThread.ManagedThreadId == this.l__initialThreadId)               && (this.1__state == -2))     {         this.1__state = 0;         return this;     } The state machine itself is implemented inside the MoveNext method. private bool MoveNext() {     switch (this.1__state)     {         case 0:             this.1__state = -1;             this.2__current = 1;             this.1__state = 1;             return true;         case 1:             this.1__state = -1;             this.2__current = 2;             this.1__state = 2;             return true;         case 2:             this.1__state = -1;             this.2__current = 3;             this.1__state = 3;             return true;         case 3:             this.1__state = -1;             break;     }     return false; } At each stage, the current value of the state is used to determine how far we got, and then we generate the next value which we return after recording the next state. Finally we return false from the MoveNext to signify the end of the sequence. Of course, that example was really simple. The original method body didn't have any local variables. Any local variables need to live between the calls to MoveNext and so they need to be transformed into fields in much the same way that we did in the case of the lambda expression. More complicated MoveNext methods are required to deal with resources that need to be disposed when the iterator finishes, and sometimes the compiler uses a temporary variable to hold the return value. Why all of this explanation? We've implemented the de-compilation of iterators in the current EAP version of Reflector (7). This contrasts with previous version where all you could do was look at the MoveNext method and try to figure out the control flow. There's a fair amount of things we have to do. We have to spot the use of a CompilerGenerated class which implements the Enumerator pattern. We need to go to the class and figure out the fields corresponding to the local variables. We then need to go to the MoveNext method and try to break it into the various possible states and spot the state transitions. We can then take these pieces and put them back together into an object model that uses yield return to show the transition points. After that Reflector can carry on optimising using its usual optimisations. The pattern matching is currently a little too sensitive to changes in the code generation, and we only do a limited analysis of the MoveNext method to determine use of the compiler generated fields. 
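    To see at source level what such a MoveNext state machine implements, here is a minimal self-contained sketch (the Console.WriteLine calls are not part of the original example; they are only added to make the lazy, resumable behaviour visible):
        using System;
        using System.Collections.Generic;

        class IteratorDemo
        {
            // Same shape as the GetItems example above; the compiler rewrites this
            // body into a class with an integer state and a MoveNext method.
            static IEnumerable<int> GetItems()
            {
                Console.WriteLine("producing 1"); yield return 1;
                Console.WriteLine("producing 2"); yield return 2;
                Console.WriteLine("producing 3"); yield return 3;
            }

            static void Main()
            {
                // Nothing in GetItems runs until foreach calls MoveNext; each
                // iteration resumes the state machine exactly where it left off.
                foreach (int item in GetItems())
                    Console.WriteLine("consumed " + item);
            }
        }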
In some ways, it is a pity that iterators are compiled away and there is no metadata that reflects the original intent. Without it, we are always going to be dependent on our knowledge of the compiler's implementation. For example, we have noticed that the Async CTP changes the way that iterators are code-generated, so we'll have to do some more work to support that. However, with that warning in place, we seem to do a reasonable job of decompiling the iterators that are built into the framework. Hopefully, the EAP will give us a chance to find examples where we don't spot the pattern correctly or regenerate the wrong code, and we can improve things. Please give it a go, and report any problems.
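    Going back to the lambda example at the start of the post, a small sketch (using a hypothetical counter variable rather than the Random from the post) shows why the captured local has to be hoisted into a display-class field: the local and the delegate share one storage location, so later writes to the variable are visible to the delegate.
        using System;

        class ClosureDemo
        {
            static void Main()
            {
                // counter is captured by the lambda, so the compiler hoists it into a
                // compiler-generated class; this method and the delegate read and
                // write the same field.
                int counter = 0;
                Func<int> next = () => ++counter;

                Console.WriteLine(next()); // 1
                Console.WriteLine(next()); // 2

                counter = 10;              // writing the "local" writes the shared field
                Console.WriteLine(next()); // 11
            }
        }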

    Read the article

  • OpenVPN Server Ethernet Bridging Question

    - by Hooplad
    Hello All, I am having a difficult time properly configuring an ethernet bridge using OpenVPN 2.0.9 install on CentOS 5 ( VPN server ). The goal that I am trying to complete is to connect a VM ( instance running on the same CentOS machine ) acting as a Microsoft Business Contact Manager server. I would then like this "BCM server" to serve Windows XP clients on 192.168.1.0/24 network as well as clients connecting from VPN ( 10.8.0.0/24 ). The setup as it is now was based off a known working configuration. The problem with the working configuration was that it would allow to the client to connect and access everything running on the VPN server ( SVN, Samba, VM Server ) but not any computers on the 192.168.1.0/24 network. I must disclose that the VPN server is behind a router/firewall. Ports are being forwarded correctly ( again, clients were able to connect to the VPN server with no problem. netcat confirms the udp port is open as well ). current ifconfig output br0 Link encap:Ethernet HWaddr 00:21:5E:4D:3A:C2 inet addr:192.168.1.169 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::221:5eff:fe4d:3ac2/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:846890 errors:0 dropped:0 overruns:0 frame:0 TX packets:3072351 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:42686842 (40.7 MiB) TX bytes:4540654180 (4.2 GiB) eth0 Link encap:Ethernet HWaddr 00:21:5E:4D:3A:C2 UP BROADCAST RUNNING SLAVE MULTICAST MTU:1500 Metric:1 RX packets:882641 errors:0 dropped:0 overruns:0 frame:0 TX packets:1781383 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:82342803 (78.5 MiB) TX bytes:2614727660 (2.4 GiB) Interrupt:169 eth1 Link encap:Ethernet HWaddr 00:21:5E:4D:3A:C3 UP BROADCAST RUNNING SLAVE MULTICAST MTU:1500 Metric:1 RX packets:650 errors:0 dropped:0 overruns:0 frame:0 TX packets:1347223 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:67403 (65.8 KiB) TX bytes:1959529142 (1.8 GiB) Interrupt:233 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:17452058 errors:0 dropped:0 overruns:0 frame:0 TX packets:17452058 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:94020256229 (87.5 GiB) TX bytes:94020256229 (87.5 GiB) tap0 Link encap:Ethernet HWaddr DE:18:C6:D7:01:63 inet6 addr: fe80::dc18:c6ff:fed7:163/64 Scope:Link UP BROADCAST RUNNING PROMISC MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:3086 errors:0 dropped:166 overruns:0 carrier:0 collisions:0 txqueuelen:100 RX bytes:0 (0.0 b) TX bytes:315099 (307.7 KiB) vmnet1 Link encap:Ethernet HWaddr 00:50:56:C0:00:01 inet addr:192.168.177.1 Bcast:192.168.177.255 Mask:255.255.255.0 inet6 addr: fe80::250:56ff:fec0:1/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:4224 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 b) TX bytes:0 (0.0 b) vmnet8 Link encap:Ethernet HWaddr 00:50:56:C0:00:08 inet addr:192.168.55.1 Bcast:192.168.55.255 Mask:255.255.255.0 inet6 addr: fe80::250:56ff:fec0:8/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:0 errors:0 dropped:0 overruns:0 frame:0 TX packets:4226 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:0 (0.0 b) TX bytes:0 (0.0 b) current route table Kernel IP routing table Destination Gateway 
Genmask Flags Metric Ref Use Iface 192.168.55.0 * 255.255.255.0 U 0 0 0 vmnet8 192.168.177.0 * 255.255.255.0 U 0 0 0 vmnet1 192.168.1.0 * 255.255.255.0 U 0 0 0 br0 current iptables output Chain INPUT (policy ACCEPT) target prot opt source destination ACCEPT all -- anywhere anywhere ACCEPT all -- anywhere anywhere Chain FORWARD (policy ACCEPT) target prot opt source destination ACCEPT all -- anywhere anywhere Chain OUTPUT (policy ACCEPT) target prot opt source destination server_known_working.conf local banshee port 1194 proto udp dev tap0 ca ca.crt cert banshee_server.crt key banshee_server.key dh dh1024.pem server 10.8.0.0 255.255.255.0 ifconfig-pool-persist ipp.txt push "route 192.168.1.0 255.255.255.0" client-to-client keepalive 10 120 tls-auth ta.key 0 user nobody group nobody persist-key persist-tun status openvpn-status.log verb 4 The following is the current CentOS server config file. server_ethernet_bridged.conf ( current ) local 192.168.1.169 port 1194 proto udp dev tap0 ca ca.crt cert server.crt key server.key dh dh1024.pem ifconfig-pool-persist ipp.txt server-bridge 192.168.1.169 255.255.255.0 192.168.1.200 192.168.1.210 push "route 192.168.1.0 255.255.255.0 192.168.1.1" client-to-client keepalive 10 120 tls-auth ta.key 0 user nobody group nobody persist-key persist-tun status openvpn-status.log verb 6 The following is one of the client's config file that was used with the known working configuration. client.opvn client dev tap proto udp remote XXX.XXX.XXX 1194 resolv-retry infinite nobind persist-key persist-tun ca client.crt cert client.crt key client.key tls-auth client.key 1 verb 3 I have tried the HOWTO provided by OpenVPN as well as others http://www.thebakershome.net/openvpn%5Ftutorial?page=1 with no success. Any help or suggestions would be appreciated.

    Read the article

  • Silverlight IConvertible TypeConverter

    - by codingbloke
    I recently answered the following question on stackoverflow:  Silverlight 3 custom control: only ‘int’ as numeric type for a property? [e.g. long or int64 seems to break] I quickly knocked up the class ConvertibleTypeConverter<T> that I posted in the question (listed later here as well). Afterward I fully expected to find that one of the usual clever “bods who blog” would have covered this, probably with a better solution than mine. So far, though, I’ve not found one, so I thought I’d blog it myself. The Problem Here is a classic gotcha I’ve seen asked more than once on stackoverflow :- public class MyClass {     public float SomeValue { get; set; } } <local:MyClass SomeValue="45.15" /> This fails with the error “Failed to create a 'System.Single' from the text '45.15'” and results in much premature hair loss. Fortunately this is SL4; in SL3 the error message is almost meaningless. So what gives? How can it be that this fails when we can see other very similar values parsing happily all over the place? It comes down to the fact that the Xaml parser only handles a few of the primitive data types, namely: bool, int, string and double. Since the parser has no idea how to convert a string to a float, we get the above error. The Solution The sensible solution is “use double not float”, but let's not dwell on that; there are bound to be occasions where such an answer isn’t acceptable. In order to achieve parsing of other types we need an implementation of TypeConverter for the type of the property, and then we need to use the TypeConverterAttribute to decorate the property. As an example, the Silverlight SDK provides one for DateTime, the DateTimeTypeConverter (yes, I know DateTime isn’t really a primitive). The following class will parse in Xaml:- public class MyClass {     [TypeConverter(typeof(DateTimeTypeConverter))]     public DateTime SomeValue { get; set; } } So far, though, we would need to create a TypeConverter for each primitive type we are using; what if I had the following mad class to support in Xaml:- public class StrangePrimitives {     public Boolean BooleanProp { get; set; }     public Byte ByteProp { get; set; }     public Char CharProp { get; set; }     public DateTime DateTimeProp { get; set; }     public Decimal DecimalProp { get; set; }     public Double DoubleProp { get; set; }     public Int16 Int16Prop { get; set; }     public Int32 Int32Prop { get; set; }     public Int64 Int64Prop { get; set; }     public SByte SByteProp { get; set; }     public Single SingleProp { get; set; }     public String StringProp { get; set; }     public UInt16 UInt16Prop { get; set; }     public UInt32 UInt32Prop { get; set; }     public UInt64 UInt64Prop { get; set; } } Then I want to fill an instance of StrangePrimitives with the following Xaml, which of course fails. <local:StrangePrimitives x:Key="MyStrangePrimitives" BooleanProp="True" ByteProp="156" CharProp="A" DateTimeProp="06 Jun 2010" DecimalProp="123.56" DoubleProp="8372.937803" Int16Prop="16532" Int32Prop="73738248" Int64Prop="12345678909298" SByteProp="-123" SingleProp="39.0" StringProp="Hello, World!"
UInt16Prop="40000"                          UInt32Prop="4294967295"                          UInt64Prop="18446744073709551615"      /> I got to thinking, though, one thing all these primitive types have in common is that they all implement IConvertible so it should be possible to write just one converter to handle them all.  Here it is:- The ConvertibleTypeConverter public class ConvertibleTypeConverter<T> : TypeConverter where T : IConvertible {     public override bool CanConvertFrom(ITypeDescriptorContext context, Type sourceType)     {         return sourceType.GetInterface("IConvertible", false) != null;     }     public override bool CanConvertTo(ITypeDescriptorContext context, Type destinationType)     {         return destinationType.GetInterface("IConvertible", false) != null;     }     public override object ConvertFrom(ITypeDescriptorContext context, System.Globalization.CultureInfo culture, object value)     {         return ((IConvertible)value).ToType(typeof(T), culture);     }     public override object ConvertTo(ITypeDescriptorContext context, System.Globalization.CultureInfo culture, object value, Type destinationType)     {         return ((IConvertible)value).ToType(destinationType, culture);     } } I won’t bore you with an explanation of how it works, it simply adapts one existing interface (the IConvertible) and exposes it as another (the TypeConverter).   With that in place the previous strange primitives class can be modified as:- public class StrangePrimitives {     public Boolean BooleanProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Byte>))]     public Byte ByteProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Char>))]     public Char CharProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<DateTime>))]     public DateTime DateTimeProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Decimal>))]     public Decimal DecimalProp { get; set; }     public Double DoubleProp {get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Int16>))]     public Int16 Int16Prop { get; set; }     public Int32 Int32Prop { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Int64>))]     public Int64 Int64Prop { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<SByte>))]     public SByte SByteProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<Single>))]     public Single SingleProp { get; set; }     public String StringProp { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<UInt16>))]     public UInt16 UInt16Prop { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<UInt32>))]     public UInt32 UInt32Prop { get; set; }     [TypeConverter(typeof(ConvertibleTypeConverter<UInt64>))]     public UInt64 UInt64Prop { get; set; } } This results in the previous Xaml parsing happily.  Now it seems such an obvious thing to do that one may wonder why such a class doesn’t already existing in Silverlight or at least in the SDK.   I would not be surprised if there were some very good reasons hence use the ConvertibleTypeConverter with caution.  It does seem to me to be a useful little class to have lying around in the toolbox for the odd occasion where it may be needed.

    Read the article

  • D2K to OA Framework Transition

    - by PRajkumar
    What is the difference between D2K form and OA Framework? It is a very innocent but important question for someone that desires to make transition from D2K to OA Framework. I hope you have already read and implemented OA Framework Getting Started. I will re-visit my own experience of implementing HelloWorld program in "OA Framework". When I implemented HelloWorld a year ago, I had no clue as to what I was doing & why I was doing those steps. I merely copied the steps from Oracle Tutorial without understanding them. Hence in this blog, I will try to explain in simple manner the meaning of OA Framework HelloWorld Program and compare the steps to D2K form [where possible]. To keep things simple, only basics will be discussed. Following key Steps were needed for HelloWorld Step 1 Create a new Workspace and a new Project as dictated by Oracle's tutorial. When defining project, you will specify a default package, which in this case was oracle.apps.ak.hello This means the following: - ak is the short name of the Application in Oracle           [means fnd_applications.short_name] hello is the name of your project Step 2 Next, you will create a OA Page within hello project Think OA Page as the fmx file itself in D2K. I am saying so because this page gets attached to the form function. This page will be created within hello project, hence the package name oracle.apps.ak.hello.webui Note the webui, it is a convention to have page in webui, means this page represents the Web User Interface You will assign the default AM [OAApplicationModule]. Think of AM "Connection Manager" and "Transaction State Manager" for your page          I can't co-relate this to anything in D2k, as there is no concept of Connection Pooling and that D2k is not stateless. Reason being that as soon as you kick off a D2K Form, it connects to a single session of Oracle and sticks to that single Oracle database session. So is not the case in OAF, hence AM is needed. Step 3 You create Region within the Page. ·         Region is what will store your fields. Text input fields will be of type messageTextInput. Think of Canvas in D2K. You can have nested regions. Stacked Canvas in D2K comes the closest to this component of OA Framework Step 4 Add a button to one of the nested regions The itemStyle should be submitButton, in case you want the page to be submitted when this button is clicked There is no WHEN-BUTTON-PRESSED trigger in OAF. In Framework, you will add a controller java code to handle events like Form Submit button clicks. JDeveloper generates the default code for you. Primarily two functions [should I call methods] will be created processRequest [for UI Rendering Handling] and processFormRequest          Think of processRequest as WHEN-NEW-FORM-INSTANCE, though processRequest is very restrictive. Note What is the difference between processRequest and processFormRequest? These two methods are available in the Default Controller class that gets created. processFormRequest This method is commonly used to react/respond to the event that has taken place, for example click of a button. Some examples are if(oapagecontext.getParameter("Cancel") != null) (Do your processing for Cancellation/ Rollback) if(oapagecontext.getParameter("Submit") != null) (Do your validations and commit here) if(oapagecontext.getParameter("Update") != null) (Do your validations and commit here) In the above three examples, you could be calling oapagecontext.forwardImmediately to re-direct the page navigation to some other page if needed. 
processRequest In this method, page rendering related code is usually written. Effectively, each GUI component is a bean that gets initialised during processRequest. For those who are familiar with D2K forms, something like pre-query may be written in this method. Step 5 In the controller, to access the value in the field "HelloName" the command is String userContent = pageContext.getParameter("HelloName"); In D2k, we used :block.field. In OAFramework, at submission of the page, all the field values get passed into the OAPageContext object. Use getParameter to access the field value. To set the value of the field, use OAMessageTextInputBean: OAMessageTextInputBean fieldHelloName = (OAMessageTextInputBean)webBean.findChildRecursive("HelloName"); fieldHelloName.setText(pageContext,"Setting the default value" ); Note when setting field value in controller: Note 1. Do not set the value in processFormRequest Note 2. If the field comes from a View Object, then do not use setText in the controller Note 3. For control fields [that are not based on View Objects], you can use setText to assign values in the processRequest method Let's take some notes to expand beyond the HelloWorld Project Note 1 In D2K-forms we sort of created a Window, attached to a Canvas, and then fields within that Canvas. However in OA Framework, think of Page being fmx/Window, think of Region being a Canvas, and fields being within Regions. This is not a formal/accurate understanding of the analogy between D2k and Framework, but is close to being logical. Note 2 In D2k, your Forms fmb file was compiled to fmx. It was the fmx file that was deployed on the mid-tier. In the case of OAF, your OA Page is nothing but an XML file. We call this MDS [meta data]. Whatever name you give to "Page" in OAF, an XML file of the same name gets created. This xml file must then be loaded into the database by using the XML Importer command. Note 3 Apart from the MDS XML file, almost everything else is merely deployed to your mid-tier. Usually this is underneath $JAVA_TOP/oracle/apps/../.. All java files will go underneath $JAVA_TOP/oracle/apps/../.. etc. Note 4 When building the tutorial, ignore the steps for setting "Attribute Sets". These are not mandatory. Oracle might just have developed their tutorials without including these. Think of these like the Visual Attributes of D2K forms Note 5 The Controller is where you will write any java code in OA Framework. You can create a Controller per Page or have a different Controller for each of the Regions within the same Page. Note 6 In the method processFormRequest of the Controller, you can access the values of the page by using the notation pageContext.getParameter("<fieldname here>"). This method processFormRequest is executed when the OAF Screen/Page is submitted by the click of a button. Note 7 Inside the controller, all the database related interactions, for example interaction with View Objects, happen via the Application Module. But why so? Because the Application Module manages the transaction state of the application. OAApplicationModuleImpl oaapplicationmoduleimpl = (OAApplicationModuleImpl)oapagecontext.getApplicationModule(oawebbean); OADBTransaction oadbtransaction = (OADBTransaction)oaapplicationmoduleimpl.getDBTransaction(); Note 8 In D2K, we have a control block or a block based on a database view. Similarly, in OA Framework, if the field does not have a view Object attached, then it is like a control field. Hence in the HelloWorld example, field HelloName is a control field [in D2K terminology]. A view Object can either be based on a view/table, a synonym or on a SQL statement.
Note 9 I wish to access the fields in a multi-record block that is based on a view Object. Can I do this in the Controller? Sure you can. To traverse through those records, do the following: · Get a reference to the View Object using (OAViewObject)oapagecontext.getApplicationModule(oawebbean).findViewObject("VO Name Here") · Loop through the records in the View Object using the count returned from oaviewobject.getFetchedRowCount() · For each record, fetch the value of the fields within the loop as oracle.jbo.Row row = oaviewobject.getRowAtRangeIndex(loop index here); (String)row.getAttribute("Column name of VO here");

    Read the article

  • CodePlex Daily Summary for Sunday, June 09, 2013

    CodePlex Daily Summary for Sunday, June 09, 2013Popular ReleasesZXMAK2: Version 2.7.5.5: - several fixes for joystick scanVG-Ripper & PG-Ripper: PG-Ripper 1.4.13: changes NEW: Added Support for "ImageJumbo.com" links FIXED: Ripping of Threads with multiple pagesCKEditor™ Provider for DotNetNuke®: CKEditor Provider 2.00.05: Whats New Updated to CKEditor 4.1.1 Added Auto Save Function (autosave plugin) {Delay can be defined in the Config - Default is 25} New Setting to set the Default Link Type (Editor Config Tab) Added CodeMirror Plugin Settings to the Editor Config Tab Added WordCount Plugin Settings to the Editor Config Tab Added Maximum Upload File Size Info to the Upload Dialog Added Check for Maximum Upload Size on Quick Upload and File Browser Upload changes File-Browser: Fixed an Issue with S...Property Framework: Property Framework (binaries) Latest: Latest stable 6/8/2013xFunc: xFunc (2.2.0.0): Added: user functions;PHP Vulnerability Hunter: PHP Vulnerability Hunter 1.4.0.20 Alpha: PHP Vulnerability Hunter 1.4.0.20 AlphaXomega Framework: Xomega.Framework 1.4: Adding support for Visual Studio 2012 and .Net framework 4.5. Minor bug fixes and enhancements.sb0t v.5: sb0t 5.14: Stability fix in script engine. Avatar.exists property fixed in scripting. cb0t custom font protocol re-added and updated to support new Ares.ASP.NET MVC Forum: MVCForum v1.3.5: This is a bug release version, with a couple of small usability features and UI changes. All the small amount of bugs reported in v1.3 have been fixed, no upgrade needed just overwrite the files and everything should just work.Json.NET: Json.NET 5.0 Release 6: New feature - Added serialized/deserialized JSON to verbose tracing New feature - Added support for using type name handling with ISerializable content Fix - Fixed not using default serializer settings with primitive values and JToken.ToObject Fix - Fixed error writing BigIntegers with JsonWriter.WriteToken Fix - Fixed serializing and deserializing flag enums with EnumMember attribute Fix - Fixed error deserializing interfaces with a valid type converter Fix - Fixed error deser...Christoc's DotNetNuke Module Development Template: DotNetNuke 7 Project Templates V2.3 for VS2012: V2.3 - Release Date 6/5/2013 Items addressed in this 2.3 release Fixed bad namespace for BusinessController in one of the C# templates. Updated documentation in all templates. Setting up your DotNetNuke Module Development Environment Installing Christoc's DotNetNuke Module Development Templates Customizing the latest DotNetNuke Module Development Project TemplatesPulse: Pulse 0.6.7.0: A number of small bug fixes to stabilize the previous Beta. Sorry about the never ending "New Version" bug!QlikView Extension - Animated Scatter Chart: Animated Scatter Chart - v1.0: Version 1.0 including Source Code qar File Example QlikView application Tested With: Browser Firefox 20 (x64) Google Chrome 27 (x64) Internet Explorer 9 QlikView QlikView Desktop 11 - SR2 (x64) QlikView Desktop 11.2 - SR1 (x64) QlikView Ajax Client 11.2 - SR2 (based on x64)BarbaTunnel: BarbaTunnel 7.2: Warning: HTTP Tunnel is not compatible with version 6.x and prior, HTTP packet format has been changed. 
Check Version History for more information about this release.SuperWebSocket, a .NET WebSocket Server: SuperWebSocket 0.8: This release includes these changes below: Upgrade SuperSocket to 1.5.3 which is much more stable Added handshake request validating api (WebSocketServer.ValidateHandshake(TWebSocketSession session, string origin)) Fixed a bug that the m_Filters in the SubCommandBase can be null if the command's method LoadSubCommandFilters(IEnumerable<SubCommandFilterAttribute> globalFilters) is not invoked Fixed the compatibility issue on Origin getting in the different version protocols Marked ISub...BlackJumboDog: Ver5.9.0: 2013.06.04 Ver5.9.0 (1) ?????????????????????????????????($Remote.ini Tmp.ini) (2) ThreadBaseTest?? (3) ????POP3??????SMTP???????????????? (4) Web???????、?????????URL??????????????? (5) Ftp???????、LIST?????????????? (6) ?????????????????????Media Companion: Media Companion MC3.569b: New* Movies - Autoscrape/Batch Rescrape extra fanart and or extra thumbs. * Movies - Alternative editor can add manually actors. * TV - Batch Rescraper, AutoScrape extrafanart, if option enabled. Fixed* Movies - Slow performance switching to movie tab by adding option 'Disable "Not Matching Rename Pattern"' to Movie Preferences - General. * Movies - Fixed only actors with images were scraped and added to nfo * Movies - Fixed filter reset if selected tab was above Home Movies. * Updated Medi...Nearforums - ASP.NET MVC forum engine: Nearforums v9.0: Version 9.0 of Nearforums with great new features for users and developers: SQL Azure support Admin UI for Forum Categories Avoid html validation for certain roles Improve profile picture moderation and support Warn, suspend, and ban users Web administration of site settings Extensions support Visit the Roadmap for more details. Webdeploy package sha1 checksum: 9.0.0.0: e687ee0438cd2b1df1d3e95ecb9d66e7c538293b Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.93: Added -esc:BOOL switch (CodeSettings.AlwaysEscapeNonAscii property) to always force non-ASCII character (ch > 0x7f) to be escaped as the JavaScript \uXXXX sequence. This switch should be used if creating a Symbol Map and outputting the result to the a text encoding other than UTF-8 or UTF-16 (ASCII, for instance). Fixed a bug where a complex comma operation is the operand of a return statement, and it was looking at the wrong variable for possible optimization of = to just .Document.Editor: 2013.22: What's new for Document.Editor 2013.22: Improved Bullet List support Improved Number List support Minor Bug Fix's, improvements and speed upsNew ProjectsAcer 1420p Leaky Handle Fix: Fixes leaking handles on the Acer 1420p laptop given out at PDC09.Akismet Spam Filter for Community Server 2008.5: Akismet Spam Filter for Community Server 2008.5 Atom Timer: Atom Timer is a thread based time that allows schedules to be created using events.BRICK CMS: These Days,I am tired to listen that: .NET is going down and JAVA/Ruby/Python will replace it. yes,they have been growing up while .NET's going down. do or die?DataTestFramework: ???????&ORM??????Date/Time Interval: The Date Time Interval allows for different types of interval to be created. The class will enumerate the defined interval support LINQ statements. More informaDimas.Net: .net infrastructure to create a web/service server from scratch. 
it includes n-tier , log , policy injection , mapper , MVC best prictice and etc.Gannet: Gannet is an operating system for us (the target developers) to learn about how an Operating System is put together and what components are needed.Image Resize For Android: Android????????????LightBlog: LightBlog?????Node.js,Express??,Mongodb???markdown??????????Memory: Live artistic interaction using KinectNestedHtmlWriter: This is a helper class library for writing simple HTML document, by using statement in C#.Operation Sneak Peek: Windows Phone game that includes stealth+logic gameplay. Player has to look for hidden letters to discover a secret word and use it to defuse a bomb.Orchard DarkStripes Theme: Orchard theme based on Octopress DarkStripesPath copy from context menu: ????????????????????????????Phantomas: mouhouhahahahaSE1: NO SUMMARY ! SiteLinks DNN Module: The SiteLinks DNN module is a module for displaying a list of existing links on your DNN website. This module works in similarly to the DNN "Links" skin object.sql to object maping: SqlString CodeMapTCP/IP Communication Framework: TCP/IP Communication Framework (TCP/IP CF) is a library that wraps the .NET Socket class and defines several classes for developing communication applications..UTorrentClient Api: UTorrentClient Api is an extensible set of classes that use WebUI to manipulate µTorrent remotely.Visual Studio Spell Checker: A Visual Studio editor extension that checks the spelling of comments, strings, and plain text as you type. Supports configuration and various languages.zjsru_xyw: this is a test projectZTrans: ztrans is language for embedded software development???: test?????????: ??????????? ????:VS2012+SQL2012 ????:ASP.NET(.NET 4.0) ????:MVC3+EF5 ????: ?????,??,?? ???????,??,?? ????????,??,?? ????DIV+CSS?? Jquery??1.6.4 ??Ajax??????

    Read the article

  • CodePlex Daily Summary for Saturday, September 15, 2012

    CodePlex Daily Summary for Saturday, September 15, 2012Popular ReleasesMCEBuddy 2.x: MCEBuddy 2.2.15: Changelog for 2.2.15 (32bit and 64bit) 1. Added support for %originalfilepath% to get the source file full path. Used for custom commands only. 2. Added support for better parsing of Media Portal XML files to extract ShowName and Episode Name and download additional details from TVDB (like Season No, Episode No etc). 3. Added support for TVDB seriesID in metadata 4. Added support for eMail non blocking UI testCrashReporter.NET : Exception reporting library for C# and VB.NET: CrashReporter.NET 1.2: *Added html mail format which shows hierarchical exception report for better understanding.VCC: Latest build, v2.3.00914.0: Automatic drop of latest buildScarlet Road: Scarlet Road Test Build 007: Playable game. Includes source.DotNetNuke Search Engine Sitemaps Provider: Version 02.00.00: New release of the Search Engine Sitemap Providers New version - not backwards compatible with 1.x versions New sandboxing to prevent exceptions in module providers interfering with main provider Now installable using the Host->Extensions page New sitemaps available for Active Forums and Ventrian Property Agent Now derived from DotNetNuke Provider base for better framework integration DotNetNuke minimum compatibility raised to DNN 5.2, .NET to 3.5PDF Viewer Web part: PDF Viewer Web Part: PDF Viewer Web PartChris on SharePoint Solutions: View Grid Banding - v1.0: Initial release of the View Creation and Management Page Column Selector Banding solution.$linq - A Javascript LINQ library: Version 1.0: Version 1.0 Initial releasePowerConverter: PowerConverter Beta: This is the first release of PowerConverter. Allows for converting PE code to Power code.NetView Control for Microsoft Access: DevVersion 19852 - More Databinding and Resizing: NetView Renamed event GotFocus to Clicked Added events Clicked/DoubleClicked Added event BackgroundDoubleClicked Changed nomenclature for world coordinates to (Position1, Position2)|Extent1xExtent2 Renamed Locked -> Readonly Added properties Minimum1/Maximum1 and Minimum2/Maximum2 Removed NetView.DeviceDefinitionArea, obsolete properties Support for resizing Added properties BackColor and BorderColor NetView Properties form added binding field for BusinessId propagate erro...Runtime Dynamic Data Model Builder: Main Library Version 1.0.0.0: Main Library Version 1.0.0.0Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.67: Fix issue #18629 - incorrectly handling null characters in string literals and not throwing an error when outside string literals. update for Issue #18600 - forgot to make the ///#DEBUG= directive also set a known-global for the given debug namespace. removed the kill-switch for disregarding preprocessor define-comments (///#IF and the like) and created a separate CodeSettings.IgnorePreprocessorDefines property for those who really need to turn that off. Some people had been setting -kil...Lakana - WPF Framework: Lakana V2: Lakana V2 contains : - Lakana WPF Forms (with sample project) - Lakana WPF Navigation (with sample project)Microsoft SQL Server Product Samples: Database: OData QueryFeed workflow activity: The OData QueryFeed sample activity shows how to create a workflow activity that consumes an OData resource, and renders entity properties in a Microsoft Excel 2010 worksheet or Microsoft Word 2010 document. Using the sample QueryFeed activity, you can consume any OData resource. 
The sample activity uses LINQ to project OData metadata into activity designer expression items. By setting activity expressions, a fully qualified OData query string is constructed consisting of Resource, Filter, Or...Arduino for Visual Studio: Arduino 1.x for Visual Studio 2012, 2010 and 2008: Register for the visualmicro.com forum for more news and updates Version 1209.10 includes support for VS2012 and minor fixes for the Arduino debugger beta test team. Version 1208.19 is considered stable for visual studio 2010 and 2008. If you are upgrading from an older release of Visual Micro and encounter a problem then uninstall "Visual Micro for Arduino" using "Control Panel>Add and Remove Programs" and then run the install again. Key Features of 1209.10 Support for Visual Studio 2...Social Network Importer for NodeXL: SocialNetImporter(v.1.5): This new version includes: - Fixed the "resource limit" bug caused by Facebook - Bug fixes To use the new graph data provider, do the following: Unzip the Zip file into the "PlugIns" folder that can be found in the NodeXL installation folder (i.e "C:\Program Files\Social Media Research Foundation\NodeXL Excel Template\PlugIns") Open NodeXL template and you can access the new importer from the "Import" menuAcDown????? - AcDown Downloader Framework: AcDown????? v4.1: ??●AcDown??????????、??、??、???????。????,????,?????????????????????????。???????????Acfun、????(Bilibili)、??、??、YouTube、??、???、??????、SF????、????????????。 ●??????AcPlay?????,??????、????????????????。 ● AcDown??????????????????,????????????????????????????。 ● AcDown???????C#??,????.NET Framework 2.0??。?????"Acfun?????"。 ????32??64? Windows XP/Vista/7/8 ???? 32??64? ???Linux ????(1)????????Windows XP???,?????????.NET Framework 2.0???(x86),?????"?????????"??? (2)???????????Linux???,????????Mono?? ??...Move Mouse: Move Mouse 2.5.2: FIXED - Minor fixes and improvements.MVC Controls Toolkit: Mvc Controls Toolkit 2.3: Added The new release is compatible with Mvc4 RTM. Support for handling Time Zones in dates. Specifically added helper methods to convert to UTC or local time all DateTimes contained in a model received by a controller, and helper methods to handle date only fileds. This together with a detailed documentation on how TimeZones are handled in all situations by the Asp.net Mvc framework, will contribute to mitigate the nightmare of dates and timezones. Multiple Templates, and more options to...DNN Metro7 style Skin package: Metro7 style Skin for DotNetNuke 06.02.00: Maintenance Release Changes on Metro7 06.02.00 Fixed width and height on the jQuery popup for the Editor. Navigation Provider changed to DDR menu Added menu files and scripts Changed skins to Doctype HTML Changed manifest to dnn6 manifest file Changed License to HTML view Fixed issue on Metro7/PinkTitle.ascx with double registering of the Actions Changed source folder structure and start folder, so the project works with the default DNN structure on developing Added VS 20...New ProjectsBizTalk Zombie Management: A powerful tool to handle zombie. As a service you can monitor all zombie instance and process them. For the moment only file is supporting.bxkw8: oooooooooooh long johnsonCellularSolver: The main idea of a this project - create cellular automation (CA) simulation system. We try to reduce ODE/PDE/Integral Equations models to CA-modelEAWebService: EAWebService is web service that executes parallel evolutionary algorithm. 
Finite Element Method Samples with C#: Finite Element Method Samples with C# Game Jolt C# Trophy API: The Game Jolt Trophy API provides dotNET developers with access to the Game Jolt services including Trophies, High Scores, Data Storage and many more.GNSystem: GNSystem is a simple (yet, no so elegant) Web-Application which contains a Forum system and a CMS\Blog system. GNSystem is written in ASP.Net MVC 4 using C#Hospital Management System (HMS): HMS is a software basically working to make the hospital management much easier and fasterInfinity - WPF.MVC: Framework for WPF/SL/WinFormsKindle: Kindle PublisherMetroCash: A personal finance management programmetroCIS: metroCIS - Eine open-source Anwendung für Windows8 Verwalte dein Studium an der FH Technikum Wien mit dieser App und erleichter dir damit dein Studentenleben.MTAC: MTAC, for My Tfs Administration Center, is a centralized administration tool for TFSMyStart: Create an Open Source implementation of the Windows Start Menu (based initially on Windows 7), to be used on Windows 8.NLite Data Framework: NLite Linq ORM frameworkPDF Viewer Web part: Here now presenting PDF Viewer web part solution with code. Project91405: dfgfdgfdrProject91407: awqwqProject91407M: 111Purchasesales(??????): a simple Sales Manage Project.QueryOver Specification: A simple implementation of the Specification Pattern using NHibernate QueryOver.Shopping Analytics: Esta aplicacion muestra como aprovechar diversas caracteristicas de la plataforma Windows Phone.simbo: Simbo is a simple, fun app for sharing small notes with friends where many of the concepts in your note can be represented by a symbols.SISLOG: El sistema de logística SISLOG es un software que cual será capaz de automatizar y optimizar los procesos que se llevan a cabo en el área de logística.SQL Server Scripts - A RSSUG CodePlex Project: The SQL Server Scripts project is dedicated to supplying high quality scripts to help with the maintenance and development of SQL Server in every environment.Talqum.League: Talqum.League is a League organisator and statistics app.The Pratoriate Foundation: used for all software dev projects for the non profit Pratoriate Foundation.

    Read the article

  • Oracle Solaris Zones Physical to virtual (P2V)

    - by user939057
    Introduction This document describes the process of creating a Solaris 10 image built from a physical system and migrating it into a virtualized operating system environment using the Oracle Solaris 10 Zones Physical-to-Virtual (P2V) capability. Using an example and various scenarios, this paper describes how to take advantage of the Oracle Solaris 10 Zones Physical-to-Virtual (P2V) capability together with other Oracle Solaris features: optimizing performance with Solaris 10 resource management, advanced storage management with Solaris ZFS, and improved operating system visibility with Solaris DTrace. The most common use for this tool is consolidation of existing systems onto virtualization-enabled platforms. In addition, the Physical-to-Virtual (P2V) capability can be used for other tasks, for example backing up a physical system and moving it into a virtualized operating system environment hosted on a Disaster Recovery (DR) site; another option is building an Oracle Solaris 10 image repository with various configurations and different software packages in order to reduce provisioning time. Oracle Solaris Zones Oracle Solaris Zones is a virtualization and partitioning technology supported on Oracle Sun servers powered by SPARC and Intel processors. This technology provides an isolated and secure environment for running applications. A zone is a virtualized operating system environment created within a single instance of the Solaris 10 Operating System. Each virtual system is called a zone and runs a unique and distinct copy of the Solaris 10 operating system. Oracle Solaris Zones Physical-to-Virtual (P2V) A new feature for Solaris 10 9/10. This feature provides the ability to build a Solaris 10 image from a physical system and migrate it into a virtualized operating system environment. There are three main steps when using this tool: 1. Image creation on the source system; this image includes the operating system and optionally the software we want to include within the image. 2. Preparing the target system by configuring a new zone that will host the new image. 3. Image installation on the target system using the image created in step 1. The host where the image is built is referred to as the source system, and the host where the image is installed is referred to as the target system. Benefits of Oracle Solaris Zones Physical-to-Virtual (P2V) Here are some benefits of this new feature: Simple - easy build process using Oracle Solaris 10 built-in commands. Robust - based on Oracle Solaris Zones, a robust and well-known virtualization technology. Flexible - supports migration from V-series servers to T-series or M-series systems. For the latest server information, refer to the Sun Servers web page. Prerequisites The target Oracle Solaris system should be running the latest recommended patch cluster, and the minimum Solaris version on the target system should be Solaris 10 9/10. Refer to the latest Administration Guide for Oracle Solaris for a complete procedure on how to download and install Oracle Solaris. NOTE: If the source system used to build the image runs an older version than the target system, then during the process the operating system will be upgraded to Solaris 10 9/10 (update on attach). Creating the Image Used to Distribute the Software We will create an image on the source machine.
We can create the image on the local file system and then transfer it to the target machine, or build it on NFS shared storage and mount the NFS file system from the target machine. Optionally, before creating the image we need to complete the installation of the software that we want to include with the Solaris 10 image. An image is created by using the flarcreate command:
Source # flarcreate -S -n s10-system -L cpio /var/tmp/solaris_10_up9.flar
The command does the following: -S specifies that we skip the disk space check and do not write archive size data to the archive (faster). -n specifies the image name. -L specifies the archive format (i.e. cpio). Optionally, we can add descriptions to the archive identification section, which can help to identify the archive later.
Source # flarcreate -S -n s10-system -e "Oracle Solaris with Oracle DB10.2.0.4" -a "oracle" -L cpio /var/tmp/solaris_10_up9.flar
You can see an example of the archive identification section in Appendix A: archive identification section. We can compress the flar image using the gzip command or by adding the -c option to the flarcreate command.
Source # gzip /var/tmp/solaris_10_up9.flar
An md5 checksum can be created for the image in order to ensure no data tampering.
Source # digest -v -a md5 /var/tmp/solaris_10_up9.flar
Moving the image into the target system. If we created the image on the local file system, we need to transfer the flar archive from the source machine to the target machine.
Source # scp /var/tmp/solaris_10_up9.flar target:/var/tmp
Configuring the Zone on the target system. After copying the software to the target machine, we need to configure a new zone that will host the new image. To install the new zone on the target machine, first we need to configure the zone (for the full zone creation options see the following link: http://docs.oracle.com/cd/E18752_01/html/817-1592/index.html ). ZFS integration. A flash archive can be created on a system that is running a UFS or a ZFS root file system. NOTE: If you create a Solaris Flash archive of a Solaris 10 system that has a ZFS root, then by default the flar will actually be a ZFS send stream, which can be used to recreate the root pool. This image cannot be used to install a zone. You must create the flar with an explicit cpio or pax archive when the system has a ZFS root. Use the flarcreate command with the -L archiver option, specifying cpio or pax as the method to archive the files.
(For example, see Step 1 in the previous section.) Optionally, on the target system you can create the zone root folder on a ZFS file system in order to benefit from the ZFS features (clones, snapshots, etc.).
Target # zpool create zones c2t2d0
Create the zone root folder:
Target # chmod 700 /zones
Target # zonecfg -z solaris10-up9-zone
solaris10-up9-zone: No such zone configured
Use 'create' to begin configuring a new zone.
zonecfg:solaris10-up9-zone> create
zonecfg:solaris10-up9-zone> set zonepath=/zones
zonecfg:solaris10-up9-zone> set autoboot=true
zonecfg:solaris10-up9-zone> add net
zonecfg:solaris10-up9-zone:net> set address=192.168.0.1
zonecfg:solaris10-up9-zone:net> set physical=nxge0
zonecfg:solaris10-up9-zone:net> end
zonecfg:solaris10-up9-zone> verify
zonecfg:solaris10-up9-zone> commit
zonecfg:solaris10-up9-zone> exit
Installing the Zone on the target system using the image. Install the configured zone solaris10-up9-zone by using the zoneadm command with the install -a option and the path to the archive. The following example shows how to install the image and sys-unconfig the zone.
Target # zoneadm -z solaris10-up9-zone install -u -a /var/tmp/solaris_10_up9.flar
Log File: /var/tmp/solaris10-up9-zone.install_log.AJaGve
Installing: This may take several minutes...
The following example shows how we can preserve the system identity.
Target # zoneadm -z solaris10-up9-zone install -p -a /var/tmp/solaris_10_up9.flar
Resource management. Some applications are sensitive to the number of CPUs on the target Zone. You need to match the number of CPUs on the Zone using the zonecfg command:
zonecfg:solaris10-up9-zone> add dedicated-cpu
zonecfg:solaris10-up9-zone> set ncpus=16
DTrace integration. Some applications might need to be analyzed using DTrace on the target zone; you can add DTrace support on the zone using the zonecfg command:
zonecfg:solaris10-up9-zone> set limitpriv="default,dtrace_proc,dtrace_user"
Exclusive IP stack. An Oracle Solaris Container running in Oracle Solaris 10 can have a shared IP stack with the global zone, or it can have an exclusive IP stack (which was released in Oracle Solaris 10 8/07). An exclusive IP stack provides a complete, tunable, manageable and independent networking stack to each zone. A zone with an exclusive IP stack can configure Scalable TCP (STCP), IP routing, IP multipathing, or IPsec. For an example of how to configure an Oracle Solaris zone with an exclusive IP stack, see the following example:
zonecfg:solaris10-up9-zone> set ip-type=exclusive
zonecfg:solaris10-up9-zone> add net
zonecfg:solaris10-up9-zone> set physical=nxge0
When the installation completes, use the zoneadm list -i -v options to list the installed zones and verify the status.
Target # zoneadm list -i -v
See that the new Zone status is installed:
ID NAME STATUS PATH BRAND IP
0 global running / native shared
- solaris10-up9-zone installed /zones native shared
Now boot the Zone:
Target # zoneadm -z solaris10-up9-zone boot
We need to log in to the Zone in order to complete the zone set up, or insert a sysidcfg file before booting the zone for the first time; see an example sysidcfg file in Appendix B: sysidcfg file section.
Target # zlogin -C solaris10-up9-zone
Troubleshooting. If an installation fails, review the log file. On success, the log file is in /var/log inside the zone. On failure, the log file is in /var/tmp in the global zone. If a zone installation is interrupted or fails, the zone is left in the incomplete state.
Troubleshooting
If an installation fails, review the log file. On success, the log file is in /var/log inside the zone. On failure, the log file is in /var/tmp in the global zone. If a zone installation is interrupted or fails, the zone is left in the incomplete state. Use uninstall -F to reset the zone to the configured state:

Target # zoneadm -z solaris10-up9-zone uninstall -F
Target # zonecfg -z solaris10-up9-zone delete -F

Conclusion
The Oracle Solaris Zones P2V tool provides the flexibility to build pre-configured images with different software configurations for faster deployment and server consolidation. In this document, I demonstrated how to build and install images and how to integrate the images with other Oracle Solaris features like ZFS and DTrace.

Appendix A: archive identification section
We can use the head -n 20 /var/tmp/solaris_10_up9.flar command in order to access the identification section that contains the detailed description:

Target # head -n 20 /var/tmp/solaris_10_up9.flar
FlAsH-aRcHiVe-2.0
section_begin=identification
archive_id=e4469ee97c3f30699d608b20a36011be
files_archived_method=cpio
creation_date=20100901160827
creation_master=mdet5140-1
content_name=s10-system
creation_node=mdet5140-1
creation_hardware_class=sun4v
creation_platform=SUNW,T5140
creation_processor=sparc
creation_release=5.10
creation_os_name=SunOS
creation_os_version=Generic_142909-16
files_compressed_method=none
content_architectures=sun4v
type=FULL
section_end=identification
section_begin=predeployment
begin 755 predeployment.cpio.Z

Appendix B: sysidcfg file section
Target # cat sysidcfg
system_locale=C
timezone=US/Pacific
terminal=xterms
security_policy=NONE
root_password=HsABA7Dt/0sXX
timeserver=localhost
name_service=NONE
network_interface=primary {
  hostname=solaris10-up9-zone
  netmask=255.255.255.0
  protocol_ipv6=no
  default_route=192.168.0.1
}
name_service=NONE
nfs4_domain=dynamic

We need to copy this file before booting the zone:

Target # cp sysidcfg /zones/solaris10-up9-zone/root/etc/
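Tying Appendix A back to the md5 checksum created on the source system, a quick sanity check on the target before installing the archive might look like the following sketch; the .md5 companion file is an assumption carried over from the earlier script sketch, not something the tools create for you.

# Sketch: confirm the transferred archive is a cpio-format flar and that
# its checksum matches the one recorded on the source system.
Target # head -n 20 /var/tmp/solaris_10_up9.flar | grep files_archived_method
files_archived_method=cpio
Target # digest -v -a md5 /var/tmp/solaris_10_up9.flar
Target # cat /var/tmp/solaris_10_up9.flar.md5    # compare with the line above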

    Read the article

  • CodePlex Daily Summary for Saturday, March 24, 2012

    CodePlex Daily Summary for Saturday, March 24, 2012Popular Releasesmenu4web: menu4web 0.0.3: menu4web 0.0.3Craig's Utility Library: Craig's Utility Library 3.1: This update adds about 60 new extension methods, a couple of new classes, and a number of fixes including: Additions Added DateSpan class Added GenericDelimited class Random additions Added static thread friendly version of Random.Next called ThreadSafeNext. AOP Manager additions Added Destroy function to AOPManager (clears out all data so system can be recreated. Really only useful for testing...) ORM additions Added PagedCommand and PageCount functions to ObjectBaseClass (same as M...SQL Monitor - managing sql server performance: SQLMon 4.2 alpha 14: 1. improved accuracy of logic fault checking in analysisMapWindow 6 Desktop GIS: MapWindow 6.1.1: MapWindow 6 Desktop GIS is an open source desktop GIS for Microsoft Windows that is built upon the DotSpatial Library. This release requires .Net 4 (Client Profile). Are you a software developer?Instead of downloading MapWindow for development purposes, get started with with the DotSpatial templateDotSpatial: DotSpatial 1.1: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components are available as NuGet pa...Indiefreaks Game Framework: 0.9.2.0: Feature: Added SunBurn engine v2.0.18.7 support (doesn't support versions below). Feature: Added GammaCorrection Post processor to allow developers or even players to tweak the Gamma of the game depending on their screen (courtesy of bamyazi) Feature: Added Windows, Xbox 360 & WP7 enabled StorageManager (based on Nick Gravelyn's EasyStorage) to read/write files for player or game data. Feature: Added VirtualGamePad feature for WP7 allowing developers to define Touch areas on screen and mapped...Code for Rapid C# Windows Development eBook + LINQPad and Data Tools: LLBLGen LINQPad Data Context Driver Version 2.1: Sixth release of a LLBLGen Pro Typed Data Context Driver for LINQPad. For LLBLGen Pro versions 3.1 and 3.5(coming). New features:When you switch the query language to SQL, LINQPad updates the Schema Explorer to show SQL column names rather than CLR property names Connection dialog unloads assemblies when it has closed down so they are no longer locked - this allows them to be rebuilt while LINQPad is still open Connection dialog includes a button to quickly add assemblies needed for the...People's Note: People's Note 0.40: Version 0.40 adds an option to compact the database from the profile screen. Compacting a database can make it smaller and faster by removing empty spaces left over by editing, moving, and deleting notes. 
To install: copy the appropriate CAB file onto your WM device and run it.Microsoft All-In-One Code Framework - a centralized code sample library: C++, .NET Coding Guideline: Microsoft All-In-One Code Framework Coding Guideline This document describes the coding style guideline for native C++ and .NET (C# and VB.NET) programming used by the Microsoft All-In-One Code Framework project team.WebDAV for WHS: Version 1.0.67: - Added: Check whether the Remote Web Access is turned on or not; - Added: Check for Add-In updates;Image 3D Viewer: Image 3D Viewer: WPF .Net 3.5 .Net 4 .Net 4.5Phalanger - The PHP Language Compiler for the .NET Framework: 3.0 (March 2012) for .NET 4.0: March release of Phalanger 3.0 significantly enhances performance, adds new features and fixes many issues. See following for the list of main improvements: New features: Phalanger Tools installable for Visual Studio 2011 Beta "filter" extension with several most used filters implemented DomDocument HTML parser, loadHTML() method mail() PHP compatible function PHP 5.4 T_CALLABLE token PHP 5.4 "callable" type hint PCRE: UTF32 characters in range support configuration supports <c...Nearforums - ASP.NET MVC forum engine: Nearforums v8.0: Version 8.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: Internationalization Custom authentication provider Access control list for forums and threads Webdeploy package checksum: abc62990189cf0d488ef915d4a55e4b14169bc01 Visit Roadmap for more details.BIDS Helper: BIDS Helper 1.6: This beta release is the first to support SQL Server 2012 (in addition to SQL Server 2005, 2008, and 2008 R2). Since it is marked as a beta release, we are looking for bug reports in the next few months as you use BIDS Helper on real projects. In addition to getting all existing BIDS Helper functionality working appropriately in SQL Server 2012 (SSDT), the following features are new... Analysis Services Tabular Smart Diff Tabular Actions Editor Tabular HideMemberIf Tabular Pre-Build ...Json.NET: Json.NET 4.5 Release 1: New feature - Windows 8 Metro build New feature - JsonTextReader automatically reads ISO strings as dates New feature - Added DateFormatHandling to control whether dates are written in the MS format or ISO format, with ISO as the default New feature - Added DateTimeZoneHandling to control reading and writing DateTime time zone details New feature - Added async serialize/deserialize methods to JsonConvert New feature - Added Path to JsonReader/JsonWriter/ErrorContext and exceptions w...SCCM Client Actions Tool: SCCM Client Actions Tool v1.11: SCCM Client Actions Tool v1.11 is the latest version. It comes with following changes since last version: Fixed a bug when ping and cmd.exe kept running in endless loop after action progress was finished. Fixed update checking from Codeplex RSS feed. The tool is downloadable as a ZIP file that contains four files: ClientActionsTool.hta – The tool itself. Cmdkey.exe – command line tool for managing cached credentials. 
This is needed for alternate credentials feature when running the HTA...WebSocket4Net: WebSocket4Net 0.5: Changes in this release fixed the wss's default port bug improved JsonWebSocket supported set client access policy protocol for silverlight fixed a handshake issue in Silverlight fixed a bug that "Host" field in handshake hadn't contained port if the port is not default supported passing in Origin parameter for handshaking supported reacting pings from server side fixed a bug in data sending fixed the bug sending a closing handshake with no message which would cause an excepti...SuperWebSocket, a .NET WebSocket Server: SuperWebSocket 0.5: Changes included in this release: supported closing handshake queue checking improved JSON subprotocol supported sending ping from server to client fixed a bug about sending a closing handshake with no message refactored the code to improve protocol compatibility fixed a bug about sub protocol configuration loading in Mono improved BasicSubProtocol added JsonWebSocketSessionSurvey™ - web survey & form engine: Survey™ 2.0: The new stable Survey™ Project 2.0.0.1 version contains many new features like: Technical changes: - Use of Jquery, ASTreeview, Tabs, Tooltips and new menuprovider Features & Bugfixes: Survey list and search function Folder structure for surveys New Menustructure Library list New Library fields User list and search functions Layout options for a survey with CSS, page header and footer New IP filter security feature Enhanced Token Management New Question fields as ID, Alias...Speed up Printer migration using PrintBrm and it's configuration files: BRMC.EXE: Run the tool from the extracted directory of the printbrm backup. You can use the following command to extract a backup file to a directory - PRINTBRM.EXE -R -D C:\TEMP\EXPAND -F C:\TEMP\PRINTERBACKUP.PRINTEREXPORTNew ProjectsAsp.NET Url Router: 1.Url rewritting. 2.Provider regex matcher 3.Support custom url validate handler.BC-Web: ch projectCape: Dynamically generates Capistrano recipes for Rake tasks.cstgamebgs: Project for wp7GCalculator: GCalculator for performing basic arithmetic operations. Windows Sidebar Gadget invacc: Invacc- for inventory and Account Onlineirgsh-node: Worker nodes of BlankOn Package Factory - http://irgsh.blankonlinux.or.id/irgsh-repo: Repository manager node of BlankOn Package Factory - http://irgsh.blankonlinux.or.id/irgsh-web: Web interface and task manager of BlankOn Package Factory - http://irgsh.blankonlinux.or.id/Kinect Explorer For SharePoint 2010: Kinect Explorer for SharePoint is a tool which provide Natural User Interface to browse through SharePoint sites. Use body gestures to browse, read, move, copy documents. Use Speech services to read-out the files.MCU: mcu devMVC3ShellCode: MVC3ShellCode MVC3ShellCode MVC3ShellCode MVC3ShellCode MVC3ShellCode MVC3ShellCode NetWatch: NetWatch - network watchdog Small application primary designed for network connectivity monitoring. You can configure set of network tests (ping, http, ...) and time plan for this tests. Application is running in windows notification area and notife you each problem. NMortgage: The goal of this project will be to give a prospective home buyer or an existing home owner the insight they need to explore effects of different repayment strategies or different mortgage structures. Nucleo.NET MVP: The Nucleo MVP framework provides a Model-View-Presenter approach that isn't obtrusive, can be utilized in multiple environments, and is versatile. 
Providing a lot of features you see in other frameworks, the Nucleo MVP framework provides many extensibility points, pretty much allowing you to rewrite most of the framework. It features dynamic injection support, presenter and view initializers (like what you see in ASP.NET MVC), model property injection, attribute- and convention-based vie...P2PShare: This project is to build a new and moden System for p2p file shearing supporting downloads from HTTP, HTTPS, FTP support for P2Pshare client list servers so files can point to a server or a host only file so no servers are used and only p2p is usedPipeLayer: proyecto de sistemas inteligentespython-irgsh: Python library for BlankOn Package Factory - http://irgsh.blankonlinux.or.id/RamGec XNA Controls - Window Elements Library for XNA Solutions: Lightweight, ultra-high performance and flexible library for displaying and managing Window Controls for XNA system. Features its own Window Designer for creating custom windows and controls.RPG Character Generators and Tools: Various tools for pen and paper style role playing games.Screen scraper: A program that can be used to download public domain MP3 and other media such as pdf documents.SharePoint Bdc request library: The given set of classes simplifies an access to the external data, which can be reached through BDC. The library allows to make simple requests for values from external data source, using a BDC Entity Instance Identifier(s) or a value of a certain BDC Entity field. Developed to interact with Business Data Connectivity of SharePoint 2010.testtom03232012git01: testtom03232012git01testtom03232012git02: testtom03232012git02the north star uc: University projectTyphon: Typhon is a role playing simulation management application, much like Nova, but written in MVC/C#.VRE LabTrove-SharePoint connector: The VRE LabTrove-SharePoint Connector provides a means of integrating the ability to view, post to, and edit posts stored in a LabTrove electronic laboratory notebook from within the familiar environment of Microsoft SharePoint. Once installed and configured, these Web Parts give SharePoint users a straightforward way to interact with any LabTrove installation that they wish to use. They also facilitate users to attach data that is stored in a SharePoint Document Library to the LabTrove posts...

    Read the article

  • CodePlex Daily Summary for Monday, October 14, 2013

    CodePlex Daily Summary for Monday, October 14, 2013Popular ReleasesAD ACL Scanner: 1.3.2: Minor bug fixed: Powershell 4.0 will report: Select—Object: Parameter cannot be processed because the parameter name p is ambiguous.Json.NET: Json.NET 5.0 Release 7: New feature - Added support for Immutable Collections New feature - Added WriteData and ReadData settings to DataExtensionAttribute New feature - Added reference and type name handling support to extension data New feature - Added default value and required support to constructor deserialization Change - Extension data is now written when serializing Fix - Added missing casts to JToken Fix - Fixed parsing large floating point numbers Fix - Fixed not parsing some ISO date ...Fast YouTube Downloader: YouTube Downloader 2.2.0: YouTube Downloader 2.2.0VidCoder: 1.5.8 Beta: Added hardware acceleration options: Bicubic OpenCL scaling algorithm, QSV decoding/encoding and DXVA decoding. Updated HandBrake core to SVN 5834. Updated VidCoder setup icon. Fixed crash when choosing the mp4v2 container on x86 and opening on x64. Warning: the hardware acceleration features require specific hardware or file types to work correctly: QSV: Need an Intel processor that supports Quick Sync Video encoding, with a monitor hooked up to the Intel HD Graphics output and the lat...ASP.net MVC Awesome - jQuery Ajax Helpers: 3.5.2: version 3.5.2 - fix for setting single value to multivalue controls - datepicker min max date offset fix - html encoding for keys fix - enable Column.ClientFormatFunc to be a function call that will return a function version 3.5.1 - fixed html attributes rendering - fixed loading animation rendering - css improvements version 3.5 ========================== - autosize for all popups ( can be turned off by calling in js awe.autoSize = false ) - added Parent, Paremeter extensions ...Coevery - Free ASP.NET CRM: Coevery_WebApp: it just for publish to microsoft web app gellaryWsus Package Publisher: Release v1.3.1310.12: Allow the Update Creation Wizard to be set in full screen mode. Fix a bug which prevent WPP to Reset Remote Sus Client ID. Change the behavior of links in the Update Detail Viewer. Left-Click to open, Right-Click to copy to the Clipboard.TerrariViewer: TerrariViewer v7 [Terraria Inventory Editor]: This is a complete overhaul but has the same core style. I hope you enjoy it. This version is compatible with 1.2.0.3 Please send issues to my Twitter or https://github.com/TJChap2840WDTVHubGen - Adds Metadata, thumbnails and subtitles to WDTV Live Hubs: WDTVHubGen.v2.1.6.maint: I think this covers all of the issues. new additions: fixed the thumbnail problem for backgrounds. general clean up and error checking. need to get this put through the wringer and all feedback is welcome.BIDS Helper: BIDS Helper 1.6.4: This BIDS Helper release brings the following new features and fixes: New Features: A new Bus Matrix style report option when you run the Printer Friendly Dimension Usage report for an SSAS cube. The Biml engine is now fully in sync with the supported subset of Varigence Mist 3.4. This includes a large number of language enhancements, bugfixes, and project deployment support. 
Fixed Issues: Fixed Biml execution for project connections fixing a bug with Tabular Translations Editor not a...Free language translator and file converter: Free Language Translator 3.4: fixe for new version look up.MoreTerra (Terraria World Viewer): MoreTerra 1.11.3: =========== =New Features= =========== New Markers added for Plantera's Bulb, Heart Fruits and Gold Cache. Markers now correctly display for the gems found in rock debris on the floor. =========== =Compatibility= =========== Fixed header changes found in Terraria 1.0.3.1Dynamics AX 2012 R2 Kitting: First Beta release of Kitting: First Beta release of Kitting Install by using XPO or Models.C# Intellisense for Notepad++: Release v1.0.8.0: - fixed document formatting artifacts To avoid the DLLs getting locked by OS use MSI file for the installation.CS-Script for Notepad++ (C# intellisense and code execution): Release v1.0.8.0: - fixed document formatting artifacts To avoid the DLLs getting locked by OS use MSI file for the installation.Generic Unit of Work and Repositories Framework: v2.0: Async methods for Repostiories - Ivan (@ifarkas) OData Async - Ivan (@ifarkas) Glimpse MVC4 workig with MVC5 Glimpse EF6 Northwind.Repostiory Project (layer) best practices for extending the Repositories Northwind.Services Project (layer), best practices for implementing business facade Live Demo: http://longle.azurewebsites.net/Spa/Product#/list Documentation: http://blog.longle.net/2013/10/09/upgrading-to-async-with-entity-framework-mvc-odata-asyncentitysetcontroller-kendo-ui-gli...Media Companion: Media Companion MC3.581b: Fix in place for TVDB xml issue. New* Movie - General Preferences, allow saving of ignored 'The' or 'A' to end of movie title, stored in sorttitle field. * Movie - New Way for Cropping Posters. Fixed* Movie - Rename of folders/filename. caught error message. * Movie - Fixed Bug in Save Cropped image, only saving in Pre-Frodo format if Both model selected. * Movie - Fixed Cropped image didn't take zoomed ratio into effect. * Movie - Separated Folder Renaming and File Renaming fuctions durin...(Party) DJ Player: DJP.124.12: 124.12 (Feature implementation completed): Changed datatype of HistoryDateInfo from string to DateTime New: HistoryDateInfoConverter for the listbox Improved: HistoryDateInfoDeleter, HistoryDateInfoLoader, HistoryDateInfoAsynchronizer, HistoryItemLoader, HistoryItemsToHistoryDateInfoConverterSmartStore.NET - Free ASP.NET MVC Ecommerce Shopping Cart Solution: SmartStore.NET 1.2.0: HighlightsMulti-store support "Trusted Shops" plugins Highly improved SmartStore.biz Importer plugin Add custom HTML content to pages Performance optimization New FeaturesMulti-store-support: now multiple stores can be managed within a single application instance (e.g. for building different catalogs, brands, landing pages etc.) Added 3 new Trusted Shops plugins: Seal, Buyer Protection, Store Reviews Added Display as HTML Widget to CMS Topics (store owner now can add arbitrary HT...NuGet: NuGet 2.7.1: Released October 07, 2013. Release notes: http://docs.nuget.org/docs/release-notes/nuget-2.7.1 Important note: After downloading the signed build of NuGet.exe, if you perform an update using the "nuget.exe update -self" command, it will revert back to the unsigned build.New Projectsaigiongai: aigiongaiArtificial LIfe BEing - ALIBE: An Autonomous Arduino Robot based on a 4 wheel electric RC style vehicle equipped w/ various sensors and reactionary devices powered by Arduino Mega. 
BH Auto-injector: This ia a BH hack injector developed for r/SlashDiablo. For more information visit: http://www.reddit.com/r/slashdiablo BloxServer: A server for a game currently in DevBugNet Premium: BugNet Premium is extended version of open source defect tracking tool BugNetCASP Editor: CASP Editor hosted at SimlogicalGrace: Grace is an Dependency Injection Container as well as a couple other services like Data Transformation Service, Validation Service, and Reflection Service.Halogy v1.2 CE1.0: Halogy v1.2 Community Edition v1.0Hot Likes: Simple web Application Like Facebook and TwitterHyper-V Backup.NET: TestIndexedList: This library helps you to create lists with large amount of data and do high speed searches on your data. Kinect Skeleton Stream to BVH Converter: The program converts the skeleton data stream into BVH.Mama Goose's Birthday & Anniversary Calendar: This is a simple calendar that allows entering birthdays and anniversaries and to print the calendar for the year or month.MQL to FDK converter with migration toolkit: "MQL to FDK" project provides an easy and convenient way for conversion of MQL advisers to C# and launching them on FDK.Produzr: Produzr is a very simple CMS without a high learning curve.PSWFWeb - PowerShell Workflow WebConsole: A PSWorkflow job managersacm: naSave Cleaner: Sims 3 save cleanerSimple AES file appender: Simple AES file appenderSims 3 Package Viewer: Sims 3 ProjectsxBot Framework: xBot Framework is a dot Net framework for building a bot for use with Reddit.com.

    Read the article

  • SOA Suite Integration: Part 3: Loading files

    - by Anthony Shorten
One of the most common scenarios in SOA integration is the loading of a file into the product from an external source. In Oracle SOA Suite there is a File Adapter that can process many file types into your BPEL process. For this example I will use the File Adapter to load a file of users and emails to update the user object within the Oracle Utilities Application Framework. Remember, you can repeat this process with other objects and other file types. Again, I am illustrating the ease of integration.
The first thing is to create an empty BPEL process that will hold our flow. In Oracle JDeveloper this can be achieved by specifying the Define Service Later template (the other templates have predefined inputs and outputs, and in this case we want to specify those ourselves). So I will create a simpleFileLoad process to house our flow.
You will start with an empty canvas, so you need to first specify the load part of the process using the File Adapter. Select the File Adapter from the Component Palette under BPEL Services and drag and drop it to the left-side Partner Links (left is input). You name the service; in this case I chose LoadFile. Press Next. We will define the interface as part of the wizard, so select Define from operation and schema (specified later). Press Next. We are going to choose Read File to denote that we will read the file, and specify the default Operation Name of Read. Press Next.
The next step is to tell the adapter the location of the files, how to process them and what to do with them after they have been processed. I am using hardcoded locations in this example, but you can use logical locations as well. Press Next. I am now going to tell the adapter how to recognize the files I want to load. In my case I am using CSV files and, more importantly, I am telling the adapter to run the process for each record in the file it encounters. Press Next. Now I tell the adapter how often I want to poll for the files. I have taken the defaults. Press Next.
At this stage I have no explanation of the format of the input, so I am going to invoke the Native Format Wizard, which will guide me through the process of creating the file input format. Clicking the purple cog icon will start the wizard. After an introduction screen (not shown), you specify the format of the input file. The File Adapter supports multiple format types; for this example I will use Delimited, as I am going to load a CSV file. Press Next. The best way for the wizard to work is with a sample. I have a sample file, and the wizard will ask how much of the file to use as a template. I will use the defaults. Note: If your data uses a character set other than US-ASCII, it is at this point that you specify the character set to use. Press Next.
The sample contains multiple instances of a single record type (the wizard supports complex types as well), so we will use the appropriate setting for our file. Press Next. You have to specify the file element and the record element. This will be used by the wizard to translate the CSV data into an XML structure (this will make sense later). I am using LoadUsers as my file element (root element) and User Record as my record root element. Press Next. As the file is CSV, the delimiter is ",", and I will also specify that the End Of Line (EOL) indicator marks the end of a record. Press Next. Up until this point you have not given the columns their names. In my case my sample includes the column names in the first record.
This is not always the case, but you can specify the names and formats of the columns in this dialog (not shown). Press Next. The wizard now generates the schema for the input file. You can specify a name for the schema; I have used userupdate.xsd. We want to verify the schema, so press Test. You can test the schema by specifying an input sample and pressing the green play button. You will see the delimiters you specified earlier for the file and the records. Press Ok to continue. A confirmation screen will be displayed showing you the location of the schema in your project. Press Finish to return to the File Adapter configuration. You will now see the schema and elements prepopulated from the wizard. Press Next. The File Adapter configuration is now complete. Press Finish.
Now you need to receive the input from the LoadFile component, so place a Receive node in the BPEL process by dragging and dropping the Receive component from the Component Palette under BPEL Constructs onto the BPEL process. We link the Receive node with the LoadFile component by dragging the left-most connect node of the Receive node to the LoadFile component. Once the link is established, you need to name the Receive node appropriately and, as in the previous post in this series, generate input variables for the BPEL process to hold the input records.
You now need to add the product Web Service. The process is the same as described in the previous post in this series: you drop the Web Service BPEL Service onto the right side of the process and fill in the details of the WSDL URL. You also have to add an Invoke node to call the service and generate the input and output variables for the call in the Invoke node.
Now, to get the inputs from the file to the service, you have to use a Transform (you can use an Assign action, but a Transform action is more flexible). You drag and drop the Transform component from the Component Palette under Oracle Extensions and place it between the Receive and Invoke nodes. We name the Transform node Mapper File, associate the source of the mapping with the schema from the Receive node, and set the output to the input variable from the Invoke node. We now build the transform. We first map the user and email attributes by dragging and dropping the elements from the left to the right.
The reason we needed to use the transform is that we will be telling the AS-User service that we want to issue an update action. Remember that when we registered the service we actually used Read as the default. If we do not otherwise inform the service to use the Update action, it will use the Read action instead (which is not desired). To specify the update action you need to click on the transactionType node on the right and select Set Text to set the action. You need to specify a transactionType of UPD (for update). The mapping is now complete.
The final BPEL process is ready for deployment. You then deploy the BPEL process to the server and test the service by simply dropping a file, with the same pattern/name as you specified, in the directory you specified in the File Adapter. You will see each record as a separate instance entry in the Fusion Middleware Control console. You can now load files into the product, and you can repeat this process for each type of file to process. While this was a simple example, it illustrates how loading data can be achieved using SOA Suite in conjunction with our products.
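To make the test step concrete, here is a hypothetical example of the kind of CSV file the File Adapter would pick up. The polling directory, file name and column names below are assumptions for illustration only (the article uses hardcoded locations and a sample whose first record carries the column names); the UPD transaction type is set inside the transform, not in the file itself.

$ cat > /polled/users/users.csv <<'EOF'    # hypothetical polling directory and file name
USER_ID,EMAIL
FRED01,fred.smith@example.com
WILMA01,wilma.jones@example.com
EOF
# Once the adapter's polling interval elapses, one BPEL instance is created per
# record, and each user's email is updated through the AS-User service via the
# Mapper File transform described above.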

    Read the article
