Search Results


  • Install NPM Packages Automatically for Node.js on Windows Azure Web Site

    - by Shaun
In one of my previous posts I described and demonstrated how to use NPM packages in Node.js on Windows Azure Web Site (WAWS). In that post I used the NPM command to install packages, then used Git for Windows to commit my changes and sync them to the WAWS git repository, which triggered a new deployment to host my Node.js application. You may have noticed that an NPM package can contain many files and be fairly large. For example, the “azure” package, which is the Windows Azure SDK for Node.js, is about 6MB. Another popular package, “express”, a rich MVC framework for Node.js, is about 1MB. When I first push my code to Windows Azure, all of these files must be uploaded to the cloud. Is it possible to let Windows Azure download and install these packages for us? In this post I will show how to make WAWS install all required packages when deploying.
Let’s Start with the Demo
A demo is the most straightforward way to show this. Let’s create a new WAWS and clone it to the local disk, then drag the folder into Git for Windows so that it can help us commit and push. Please refer to this post if you are not familiar with how to use Windows Azure Web Site, Git deployment, git clone and Git for Windows. Then open a command window and install a package in our code folder. Let’s say I want to install “express”. Next, create a new Node.js file named “server.js” and paste in the code below. 1: var express = require("express"); 2: var app = express(); 3: 4: app.get("/", function(req, res) { 5: res.send("Hello Node.js and Express."); 6: }); 7: 8: console.log("Web application opened."); 9: app.listen(process.env.PORT); If we switch to Git for Windows right now we will find that it has detected the changes we made, which include “server.js” and all files under the “node_modules” folder. Only our source code should need to be uploaded, but the huge package files would have to be uploaded as well. Now I will show you how to exclude them and let Windows Azure install the packages in the cloud. First we need to add a special file named “.gitignore”. This cannot easily be done from the file explorer, since the file name consists only of an extension, so we do it from the command line. Navigate to the local repository folder and execute the command below to create an empty file named “.gitignore”. If the command window asks for input, just press Enter. 1: echo > .gitignore Now open this file, copy in the content below and save. 1: node_modules If we switch to Git for Windows we will find that the packages under “node_modules” are no longer in the change list. So if we commit and push now, the “express” package will not be uploaded to Windows Azure. Second, let’s tell Windows Azure which packages it needs to install when deploying. Create another file named “package.json”, copy the content below into it and save. 1: { 2: "name": "npmdemo", 3: "version": "1.0.0", 4: "dependencies": { 5: "express": "*" 6: } 7: } Now back in Git for Windows, commit our changes and push them to WAWS. If we then open the WAWS in the developer portal, we will see that a new deployment has finished. Clicking the arrow on the right side of this deployment shows how WAWS handled it; in particular, we can see that WAWS executed NPM. And if we open the log we can review which commands WAWS executed to install the packages, along with the installation output messages. As you can see, WAWS installed “express” for me on the cloud side, so I don’t need to upload the whole package to Azure. 
Open the website and we can see the result, which proves that “express” was installed successfully.
What’s Happened Under the Hood
Now let’s explain a bit about what “.gitignore” and “package.json” mean. The “.gitignore” file is an ignore configuration file for a git repository. All files and folders listed in “.gitignore” are skipped by git push. In the example above I put “node_modules” into this file in my local repository, which means: do not track or upload any files under the “node_modules” folder. So by using “.gitignore” I excluded all packages from being uploaded to Windows Azure. “.gitignore” can contain files and folders, and it can also list files and folders that we do NOT want to ignore. In the next section we will see how to use the un-ignore syntax to include the SQL package. The “package.json” file is the package definition file for a Node.js application. We can define the application name, version, description, author, etc. in it in JSON format, and we can also list the dependent packages to indicate which packages this Node.js application needs. In WAWS, the name and version are required. When a deployment happens, WAWS looks into this file, finds the dependent packages, and executes the NPM command to install them one by one. So in the demo above I put “express” into this file so that WAWS would install it for me automatically. I updated the dependencies section of the “package.json” file manually, but this can be done partly automatically. If we have a valid “package.json” in our local repository, then when installing a package we can specify the “--save” parameter on the “npm install” command so that NPM updates the dependencies section for us. For example, when I wanted to install the “azure” package I would execute the command as below. Note the “--save” at the end. 1: npm install azure --save Once it finishes, my “package.json” is updated automatically. Each dependent package is listed there: the JSON key is the package name while the value is the version range. Below is a brief list of the version range formats; for more information about “package.json” please refer here.
- version: must match the version exactly, e.g. "azure": "0.6.7"
- >=version: must be equal to or greater than the version, e.g. "azure": ">=0.6.0"
- 1.2.x: the version number must start with the supplied digits, but any digit may be used in place of the x, e.g. "azure": "0.6.x"
- ~version: the version must be at least as high as the range, and it must be less than the next major revision above the range, e.g. "azure": "~0.6.7"
- *: matches any version, e.g. "azure": "*"
WAWS will install the proper version of each package based on what you define here. The process of WAWS git deployment and NPM installation looks like this.
But Some Packages…
As we know, when we specify dependencies in “package.json”, WAWS downloads and installs them in the cloud. For most packages this works very well, but some special packages may not. If the package installation has special environment requirements, it may fail. For example, the SQL Server Driver for Node.js package needs “node-gyp”, Python and C++ 2010 installed on the target machine during the NPM installation. If we just put “msnodesql” in the “package.json” file and push it to WAWS, the deployment will fail since there’s no “node-gyp”, Python or C++ 2010 on the WAWS virtual machine. For example, the “server.js” file. 
1: var express = require("express"); 2: var app = express(); 3: 4: app.get("/", function(req, res) { 5: res.send("Hello Node.js and Express."); 6: }); 7:  8: var sql = require("msnodesql"); 9: var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:tqy4c0isfr.database.windows.net,1433;Database=msteched2012;Uid=shaunxu@tqy4c0isfr;Pwd=P@ssw0rd123;Encrypt=yes;Connection Timeout=30;"; 10: app.get("/sql", function (req, res) { 11: sql.open(connectionString, function (err, conn) { 12: if (err) { 13: console.log(err); 14: res.send(500, "Cannot open connection."); 15: } 16: else { 17: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 18: if (err) { 19: console.log(err); 20: res.send(500, "Cannot retrieve records."); 21: } 22: else { 23: res.json(results); 24: } 25: }); 26: } 27: }); 28: }); 29: 30: console.log("Web application opened."); 31: app.listen(process.env.PORT); And the “package.json” file. 1: { 2: "name": "npmdemo", 3: "version": "1.0.0", 4: "dependencies": { 5: "express": "*", 6: "msnodesql": "*" 7: } 8: } This fails to deploy to WAWS. From the NPM log we can see it’s because “msnodesql” cannot be installed on WAWS. The solution is to ignore all packages in the “.gitignore” file except “msnodesql”, and upload that package ourselves. This can be done with the content below: first we un-ignore the “node_modules” folder itself, then we ignore everything under it, and finally we un-ignore the one subfolder named “msnodesql”, which is the SQL Server Node.js driver. 1: !node_modules/ 2:  3: node_modules/* 4: !node_modules/msnodesql For more information about the syntax of “.gitignore” please refer to this thread. Now if we go to Git for Windows we will find that “msnodesql” is included in the uncommitted set while “express” is not. I also need to remove the “msnodesql” dependency from “package.json”. Commit and push to WAWS, and we can see the deployment complete successfully. We can then use Windows Azure SQL Database from our Node.js application through the “msnodesql” package we uploaded.
Summary
In this post I demonstrated how to leverage the deployment process of Windows Azure Web Site to install NPM packages during the publish action. With the “.gitignore” and “package.json” files we can exclude the dependent packages from our Node.js repository and let Windows Azure Web Site download and install them during deployment. Special packages that cannot be installed by Windows Azure Web Site, such as “msnodesql”, can be included in the publish payload instead. The combination of Windows Azure Web Site, Node.js and NPM makes it even easier and quicker to develop and deploy our Node.js applications to the cloud.   Hope this helps, Shaun All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
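For reference, here is a hedged sketch of what “package.json” might look like after running “npm install azure --save” as described earlier and then removing the “msnodesql” dependency by hand. The exact version range npm writes for “azure” depends on the npm and package versions at install time, so the "~0.6.7" below is only an assumed example:

{
  "name": "npmdemo",
  "version": "1.0.0",
  "dependencies": {
    "express": "*",
    "azure": "~0.6.7"
  }
}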

    Read the article

  • Oracle on Oracle: Is that all?

    - by Darin Pendergraft
    On October 17th, I posted a short blog and a podcast interview with Chirag Andani, talking about how Oracle IT uses its own IDM products. Blog link here. In response, I received a comment from reader Jaime Cardoso ([email protected]) who posted: “- You could have talked about how by deploying Oracle's Open standards base technology you were able to integrate any new system in your infrastructure in days. - You could have talked about how by deploying federation you were enabling the business side to keep all their options open in terms of companies to buy and sell while maintaining perfect employee and customer's single view. - You could have talked about how you are now able to cut response times to your audit and security teams into 1/10th of your former times Instead you spent 6 minutes talking about single sign on and self provisioning? If I didn't knew your IDM offer so well I would now be wondering what its differences from Microsoft's offer was. Sorry for not giving a positive comment here but, please your IDM suite is very good and, you simply aren't promoting it well enough” So I decided to send Jaime a note asking him about his experience, and to get his perspective on what makes the Oracle products great. What I found out is that Jaime is a very experienced IDM Architect with several major projects under his belt. Darin Pendergraft: Can you tell me a bit about your experience? How long have you worked in IT, and what is your IDM experience? Jaime Cardoso: I started working in "serious" IT in 1998 when I became Netscape's technical specialist in Portugal. Netscape Portugal didn't exist so, I was working for their VAR here. Most of my work at the time was with Netscape's mail server and LDAP server. Since that time I've been bouncing between the system's side like Sun resellers, Solaris stuff and even worked with Sun's Engineering in the making of an Hierarchical Storage Product (Sun CIS if you know it) and the application's side, mostly in LDAP and IDM. Over the years I've been doing support, service delivery and pre-sales / architecture design of IDM solutions in most big customers in Portugal, to name a few projects: - The first European deployment of Sun Access Manager (SAPO – Portugal Telecom) - The identity repository of 5/5 of the Biggest Portuguese banks - The Portuguese government federation of services project DP: OK, in your blog response, you mentioned 3 topics: 1. Using Oracle's standards based architecture; (you) were able to integrate any new system in days: can you give an example? What systems, how long did it take, number of apps/users/accounts/roles etc. JC: It's relatively easy to design a user management strategy for a static environment, or if you simply assume that you're an <insert vendor here> shop and all your systems will bow to that vendor's will. We've all seen that path, the use of proprietary technologies in interoperability solutions but, then reality kicks in. As an ISP I recall that I made the technical decision to use Active Directory as a central authentication system for the entire IT infrastructure. Clients, systems, apps, everything was there. As a good part of the systems and apps were running on UNIX, then a connector became needed in order to have UNIX boxes to authenticate against AD. And, that strategy worked but, each new machine required the component to be installed, monitoring had to be made for that component and each new app had to be independently certified. 
A self-care user portal was an ongoing project; AD access assumes the client is inside the domain, something the ISP's customers (and UNIX boxes) weren't, nor had any intention of ever being. When the Windows 2008 rollout was done, Microsoft changed the Active Directory interface. The Windows administrators didn't have enough know-how about directories and the way systems outside the MS world behaved so, on the go-live, things weren't properly tested and a general outage followed. Several hours and one rollback later, everything was working again. But the ISP still had to change all of its applications to work with the new access methods and reset the effort spent on the self-service user portal. To keep with the same strategy, they would also have to trust Microsoft not to change interfaces again. Simply by putting up an Oracle LDAP server in the middle and replicating the user info from AD into LDAP, most of the problems went away. Even systems for which no AD connector existed had PAM in them, so integration was made at the OS level, fully supported by the OS supplier. Sun Identity Manager already had a self-care portal combined with a user workflow, so all the clearances had to be given before the account was created or updated. Adding a new system as a client for these authentication services was simply a new checkbox in the OS installer, and even Tru64 systems were, for the first time, integrated with five minutes of work by a junior system admin. True, all the Windows clients and MS apps still went to AD for their authentication needs, so from the start everybody knew they weren't 100% free of migration pains, but now they had a single point of problems to look at. If you're looking for numbers: - 500K directory entries (users) - 200-300 systems After the initial setup, I personally integrated about 20 systems / apps against LDAP in 1 day while being watched by the different IT teams. The internal IT staff did the rest. DP: 2. Using Federation allows the business to keep options open for buying and selling companies, and yet maintain a single view for both employee and customer. What do you mean by this? Can you give an example? JC: The market is dynamic. The company that's being bought today will be sold again tomorrow. Companies that spread across different markets may see the regulator forcing a sale of part of the company for monopoly reasons, and companies that operate in multiple countries have to comply with different legislation. Our job as IT architects, while addressing customer and employee authentication services, is quite hard and quite contradictory. On one hand, we need to give all of our employees access to the relevant systems, apps and resources, and we already have marketing talking with us, trying to find out who is a customer of the bought company but not yet of ours, so they can be addressed. On the other hand, we have to do that while keeping in mind that we may have to break up all that effort, and that different countries' legislation may become a problem for a full integration plan. That's a job for user federation. You don't want to be the one telling your President that he will sell that business unit without its customer database (making the deal worth a lot less), or that the buyer will take with him a copy of your entire customer database. Federation enables you to start controlling permissions for users outside of your traditional authentication realm. So what if the people of that company you just bought are keeping their old logins? 
Do you want, because of that, to have a dedicated system for their expenses reports? And do you want to keep their sales (and pre-sales) people out of the loop in terms of your group's path? Control the information flow, establish a Federation trust circle and give access to your apps to users that haven't (yet?) been brought into your internal login systems. You can still see your users in a unified view, you obviously control if a user has access to any particular application, either that user is in your local database or stored in a directory on the other side of the world. DP: 3. Cut response times of audit and security teams to 1/10. Is this a real number? Can you give an example? JC: No, I don't have any backing for this number. One of the companies I did system Administration for has a SOX compliance policy in place (I remind you that I live in Portugal so, this definition of SOX may be somewhat different from what you're used to) and, every time the audit team says they'll do another audit, we have to negotiate with them the size of the sample and we spend about 15 man/days gathering all the required info they ask. I did some work with Sun's Identity auditor and, from what I've been seeing, Oracle's product is even better and, I've seen that most of the information they ask would have been provided in a few hours with the help of this tool. I do stand by what I said here but, to be honest, someone from Identity Auditor team would do a much better job than me explaining this time savings. Jaime is right: the Oracle IDM products have a lot of business value, and Oracle IT is using them for a lot more than I was able to cover in the short podcast that I posted. I want to thank Jaime for his comments and perspective. We want these blog posts to be informative and honest – so if you have feedback for the Oracle IDM team on any topic discussed here, please post your comments below.

    Read the article

  • Top 5 Developer Enabling Nuggets in MySQL 5.6

    - by Rob Young
MySQL 5.6 is truly a better MySQL and reflects Oracle's commitment to the evolution of the most popular and widely used open source database on the planet.  The feature-complete 5.6 release candidate was announced at MySQL Connect in late September, and the production-ready, generally available ("GA") product should be available in early 2013.  While the messaging around 5.6 has focused mainly on broadly appealing, advanced topics like performance/scale, high availability, and self-healing replication clusters, MySQL 5.6 also provides many developer-friendly nuggets designed to enable those who are building the next generation of web-based and embedded applications and services. Boiling the 5.6 feature set down to a smaller set of simple, easy-to-use goodies designed with developer agility in mind, these things deserve a quick look:
Subquery Optimizations
Using semi-JOINs and late materialization, the MySQL 5.6 Optimizer delivers greatly improved subquery performance. Specifically, the optimizer is now more efficient in handling subqueries in the FROM clause; materialization of subqueries in the FROM clause is now postponed until their contents are needed during execution. Additionally, the optimizer may add an index to derived tables during execution to speed up row retrieval. Internal tests run using the DBT-3 benchmark Query #13, shown below, demonstrate an order of magnitude improvement in execution times (from days to seconds) over previous versions.
select c_name, c_custkey, o_orderkey, o_orderdate, o_totalprice, sum(l_quantity)
from customer, orders, lineitem
where o_orderkey in (
    select l_orderkey
    from lineitem
    group by l_orderkey
    having sum(l_quantity) > 313
)
and c_custkey = o_custkey
and o_orderkey = l_orderkey
group by c_name, c_custkey, o_orderkey, o_orderdate, o_totalprice
order by o_totalprice desc, o_orderdate
LIMIT 100;
What does this mean for developers?  For starters, simplified subqueries can now be coded instead of complex joins for cross-table lookups: SELECT title FROM film WHERE film_id IN (SELECT film_id FROM film_actor GROUP BY film_id HAVING count(*) > 12); And even more importantly, subqueries embedded in packaged applications no longer need to be re-written into joins.  This is good news for both ISVs and their customers who have access to the underlying queries and who have spent development cycles writing, testing and maintaining their own versions of re-written queries across updated versions of a packaged app. The details are in the MySQL 5.6 docs.
Online DDL Operations
Today's web-based applications are designed to rapidly evolve and adapt to meet business and revenue-generation requirements. As a result, development SLAs are now most often measured in minutes rather than days or weeks. For example, when an application must quickly support new product lines or new products within existing product lines, the backend database schema must adapt in kind, most commonly while the application remains available for normal business operations.  MySQL 5.6 supports this level of online schema flexibility and agility by providing the following new ALTER TABLE online DDL syntax additions:
- CREATE INDEX
- DROP INDEX
- Change AUTO_INCREMENT value for a column
- ADD/DROP FOREIGN KEY
- Rename COLUMN
- Change ROW_FORMAT, KEY_BLOCK_SIZE for a table
- Change COLUMN NULL, NOT NULL
- Add, drop, reorder COLUMN
Again, the details are in the MySQL 5.6 docs. 
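To make the "online" part concrete, here is a hedged sketch of issuing one of these DDL statements from a Node.js script with the community "mysql" package; the connection details, table and column names are placeholders, and the ALGORITHM=INPLACE / LOCK=NONE clauses simply ask the server to fail fast if the operation cannot be performed online:

// Hedged sketch: MySQL 5.6 online DDL from Node.js (names and credentials are placeholders).
var mysql = require('mysql');
var connection = mysql.createConnection({
  host: 'localhost',
  user: 'app',
  password: 'secret',
  database: 'store'
});

// Build a secondary index without blocking concurrent reads and writes;
// if the server cannot do it in place, the statement errors out instead of locking the table.
connection.query(
  'ALTER TABLE orders ADD INDEX idx_customer (customer_id), ALGORITHM=INPLACE, LOCK=NONE',
  function (err) {
    if (err) {
      console.error('Online DDL was not possible:', err.message);
    } else {
      console.log('Index added while the table stayed available for DML.');
    }
    connection.end();
  }
);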
Key-value access to InnoDB via Memcached API
Many of the next generation of web, cloud, social and mobile applications require fast operations against simple Key/Value pairs. At the same time, they must retain the ability to run complex queries against the same data, as well as ensure the data is protected with ACID guarantees. With the new NoSQL API for InnoDB, developers have all the benefits of a transactional RDBMS, coupled with the performance capabilities of a Key/Value store. MySQL 5.6 provides simple, key-value interaction with InnoDB data via the familiar Memcached API.  Implemented via a new Memcached daemon plug-in to mysqld, the new Memcached protocol is mapped directly to the native InnoDB API and enables developers to use existing Memcached clients to bypass the expense of query parsing and go directly to InnoDB data for lookups and transactionally compliant updates.  The API makes it possible to re-use standard Memcached libraries and clients, while extending Memcached functionality by integrating a persistent, crash-safe, transactional database back-end. So does this option provide a performance benefit over SQL?  Internal performance benchmarks using a customized Java application and test harness show some very promising results, with a 9X improvement in overall throughput for SET/INSERT operations. You can follow the InnoDB team blog for the methodology, implementation and internal test cases that generated these results here. How to get started with the Memcached API to InnoDB is here.
New Instrumentation in Performance Schema
The MySQL Performance Schema was introduced in MySQL 5.5 and is designed to provide point-in-time metrics for key performance indicators.  MySQL 5.6 improves the Performance Schema in answer to the most common DBA and developer problems.  New instrumentation includes:
- Statements/Stages: What are my most resource-intensive queries? Where do they spend time?
- Table/Index I/O, Table Locks: Which application tables/indexes cause the most load or contention?
- Users/Hosts/Accounts: Which application users, hosts and accounts are consuming the most resources?
- Network I/O: What is the network load like? How long do sessions idle?
- Summaries: Aggregated statistics grouped by statement, thread, user, host, account or object.
The MySQL 5.6 Performance Schema is now enabled by default in the my.cnf file with optimized and auto-tuned settings that minimize overhead (< 5%, but mileage will vary), so using the Performance Schema on a production server to monitor the most common application use cases is less of an issue.  In addition, new atomic levels of instrumentation enable the capture of granular levels of resource consumption by users, hosts, accounts, applications, etc. for billing and chargeback purposes in cloud computing environments. The MySQL docs are an excellent resource for all that is available and all that can be done with the 5.6 Performance Schema.
Better Condition Handling - GET DIAGNOSTICS
MySQL 5.6 enables developers to easily check for error conditions and code for exceptions by introducing the new MySQL Diagnostics Area and the corresponding GET DIAGNOSTICS interface command. 
The Diagnostics Area can be populated via multiple options and provides two kinds of information:
- Statement information: the affected row count and the number of conditions that occurred
- Condition information: the error codes and messages for all conditions that were returned by a previous operation
The addressable items for each are detailed in the MySQL docs. The new GET DIAGNOSTICS command provides a standard interface into the Diagnostics Area and can be used via the CLI or from within application code to easily retrieve and handle the results of the most recent statement execution.  An example of how it is used might be:
mysql> DROP TABLE test.no_such_table;
ERROR 1051 (42S02): Unknown table 'test.no_such_table'
mysql> GET DIAGNOSTICS CONDITION 1
    -> @p1 = RETURNED_SQLSTATE, @p2 = MESSAGE_TEXT;
mysql> SELECT @p1, @p2;
+-------+------------------------------------+
| @p1   | @p2                                |
+-------+------------------------------------+
| 42S02 | Unknown table 'test.no_such_table' |
+-------+------------------------------------+
Options for leveraging the MySQL Diagnostics Area and GET DIAGNOSTICS are detailed in the MySQL docs. While the above is a summary of some of the key developer-enabling 5.6 features, it is by no means exhaustive. You can dig deeper into what MySQL 5.6 has to offer by reading this developer zone article or checking out "What's New in MySQL 5.6" in the MySQL docs.
BONUS ALERT!  If you are developing on Windows or are considering MySQL as an alternative to SQL Server for your next project, application or shipping product, you should check out the MySQL Installer for Windows.  The installer includes the MySQL 5.6 RC database, all drivers, Visual Studio and Excel plugins, tray monitor and development tools, all in a single download and GUI installer.   So what are your next steps?
- Register for the Dec. 13 "MySQL 5.6: Building the Next Generation of Web-Based Applications and Services" live web event.  Hurry!  Seats are limited.
- Download the MySQL 5.6 Release Candidate (look under the Development Releases tab)
- Provide Feedback <link to http://bugs.mysql.com/>
- Join the Developer discussion on the MySQL Forums
- Explore all MySQL Products and Developer Tools
As always, thanks for your continued support of MySQL!
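As a footnote to the Memcached API section above, here is a hedged sketch of key-value access to InnoDB from Node.js using the community "memcached" client. It assumes the daemon_memcached plugin is enabled on the default port 11211 and mapped to a table through an innodb_memcache container; the key name is a placeholder:

// Hedged sketch: read/write InnoDB rows through the MySQL 5.6 memcached plugin.
var Memcached = require('memcached');
var store = new Memcached('127.0.0.1:11211');

// set(key, value, lifetime-in-seconds, callback) writes through to the mapped InnoDB table.
store.set('user:42', 'online', 0, function (err) {
  if (err) { return console.error(err); }
  store.get('user:42', function (err, value) {
    if (err) { return console.error(err); }
    console.log('Value read back from InnoDB:', value);
    store.end();
  });
});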

    Read the article

  • How To Make Hundreds of Complex Photo Edits in Seconds With Photoshop Actions

    - by Eric Z Goodnight
Have a huge folder of images needing tweaks? A few hundred adjustments may seem like a big, time-consuming job, but read on to see how Photoshop can do repetitive tasks automatically, even if you don’t know how to program! Photoshop Actions are a simple way to record routines in Photoshop and a great time saver, allowing you to re-run the same steps over and over and saving you minutes or hours, depending on the job you have to work on. See how a whole folder of images, and even some fairly complicated photo tweaking, can be processed automatically, even hundreds of images at once.
When Can I Use Photoshop Actions?
Photoshop Actions are a way of recording the tools, menus, and keys pressed while using the program. Each time you use a tool, adjust a color, or use the brush, it can be recorded and played back over any file Photoshop can open. While it isn’t perfect and can get very confused if not set up correctly, it can automate editing hundreds of images, saving you hours and hours if you have big jobs with complex edits. The image illustrated above is a template for a Polaroid-style picture frame. If you had several hundred images, it would actually be a simple matter to use Photoshop Actions to create hundreds of new images inside the frame in almost no time at all. Let’s take a look at how a simple folder of images and some image-editing automation can turn lots of work into a simple and easy job.
Creating a New Action
The Actions panel is a default part of the “Essentials” panel set Photoshop starts with. If you can’t see the panel button under the “History” button, you can find Actions by going to Window > Actions or pressing Alt + F9. Click the icon in the Actions panel, pictured in the previous illustration on the left. Choose to create a “New Set” in order to begin creating your own custom Actions. Name your action set whatever you want; the name itself is not important, you’ll simply want to make it obvious that you created it. Click OK. Look back in the Actions panel. You’ll see your new set of actions has been added to the list. Click it to highlight it before going on. Click the icon again to create a “New Action” in your new set. If you care to name your action, go ahead. Name it after whatever it is you’re hoping to do: change the canvas size, tint all your pictures blue, send your image to the printer in high quality, or run multiple filters on images. The name is for your own usage, so do what suits you best. Note that you can simplify your process by creating shortcut keys for your actions. If you plan to do hundreds of edits with your actions, this might be a good idea. If you plan to record an action to use every time you use Photoshop, this might even be an invaluable step. When you create a new Action, Photoshop automatically begins recording everything you do. It does not record the time in between steps, but rather only the data from each step. So take your time when recording and make sure you create your actions the way you want them. The square button stops recording, and the circle button starts recording again. With these basics ready, we can take a look at a sample Action.
Recording a Sample Action
Photoshop will remember everything you input into it when it is recording, even the specific photographs you open. So begin recording your action when your first photo is already open. Once your first image is open, click the record button. If you’re already recording, continue on. Using the File > Place command to insert the Polaroid image can be easier for Actions to deal with. 
Photoshop can record with multiple open files, but it often gets confused when you try it. Keep your recordings as simple as possible to ensure your success. When the image is placed in, simply press enter to render it. Select your background layer in your layers panel. Your recording should be following along with no trouble. Double click this layer. Double clicking your background layer will create a new layer from it. Allow it to be renamed “Layer 0” and press OK. Move the “polaroid” layer to the bottom by selecting it and dragging it down below “Layer 0” in the layers panel. Right click “Layer 0” and select “Create Clipping Mask.” The JPG image is cropped to the layer below it. Coincidentally, all actions described here are being recorded perfectly, and are reproducible. Cursor actions, like the eraser, brush, or bucket fill don’t record well, because the computer uses your mouse movements and coordinates, which may need to change from photo to photo. Click the to set your Photograph layer to a “Screen” blending mode. This will make the image disappear when it runs over the white parts of the polaroid image. With your image layer (Layer 0) still selected, navigate to Edit > Transform > Scale. You can use the mouse to resize your Layer 0, but Actions work better with absolute numbers. Visit the Width and Height adjustments in the top options panel. Click the chain icon to link them together, and adjust them numerically. Depending on your needs, you may need to use more or less than 30%. Your image will resize to your specifications. Press enter to render, or click the check box in the top right of your application. + Click on your bottom layer, or “polaroid” in this case. This creates a selection of the bottom layer. Navigate to Image > Crop in order to crop down to your bottom layer selection Your image is now resized to your bottommost layer, and Photoshop is still recording to that effect. For additional effect, we can navigate to Image > Image Rotation > Arbitrary to rotate our image by a small tilt. Choosing 3 degrees clockwise , we click OK to render our choice. Our image is rotated, and this step is recorded. Photoshop will even record when you save your files. With your recording still going, find File > Save As. You can easily tell Photoshop to save in a new folder, other than the one you have been working in, so that your files aren’t overwritten. Navigate to any folder you wish, but do not change the filename. If you change the filename, Photoshop will record that name, and save all your images under whatever you type. However, you can change your filetype without recording an absolute filename. Use the pulldown tab and select a different filetype—in this instance, PNG. Simply click “Save” to create a new PNG based on your actions. Photoshop will record the destination and the change in filetype. If you didn’t edit the name of your file, it will always use the variable filename of any image you open. (This is very important if you want to edit hundreds of images at once!) Click File > Close or the red “X” in the corner to close your filetype. Photoshop can record that as well. Since we have already saved our image as a JPG, click “NO” to not overwrite your original image. Photoshop will also record your choice of “NO” for subsequent images. In your Actions panel, click the stop button to complete your action. You can always click the record button to add more steps later, if you want. This is how your new action looks with its steps expanded. 
Curious how to put it into effect? Read on to see how simple it is to use the recording you just made.
Editing Lots of Images with Your New Action
Open a large number of images, as many as you care to work with. Your action should work immediately with every image on screen, although you may have to test and re-record, depending on how you did. Actions don’t require any programming knowledge, but they often can get confused or work in a counter-intuitive way. Record your action until it is perfect. If it works once without errors, it’s likely to work again and again! Find the “Play” button in your Actions panel. With your custom action selected, click “Play” and your routine will edit, save, and close each file for you. Keep bashing “Play” for each open file, and it will keep saving and creating new files until you run out of work to do. And in mere moments, a complicated stack of work is done. Photoshop Actions can be very complicated, far beyond what is illustrated here, and can even be combined with scripts and other actions, automating the creation of potentially very complex files or applying filters to an entire portfolio of digital photos. Have questions or comments concerning Graphics, Photos, Filetypes, or Photoshop? Send your questions to [email protected], and they may be featured in a future How-To Geek Graphics article. Image Credits: All images copyright Stephanie Pragnell and author Eric Z Goodnight, protected under Creative Commons.
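The closing paragraph above mentions combining Actions with scripts. As a hedged ExtendScript sketch (Photoshop's JavaScript dialect, run from File > Scripts; the folder path, action name and action-set name are placeholders), replaying a recorded action over every JPEG in a folder could look like the snippet below, which is essentially what File > Automate > Batch does through the UI:

// Replay the action "Polaroid Frame" from the set "My Actions" on every JPEG in a folder.
var files = Folder("~/Desktop/photos").getFiles("*.jpg");
for (var i = 0; i < files.length; i++) {
    var doc = app.open(files[i]);
    app.doAction("Polaroid Frame", "My Actions");
    // The recorded action already performs Save As, so discard any further changes here.
    doc.close(SaveOptions.DONOTSAVECHANGES);
}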

    Read the article

  • Data Integration 12c Raising the Big Data Roof at Oracle OpenWorld

    - by Tanu Sood
Author: Dain Hansen, Director, Oracle
It was an exciting OpenWorld 2013 for us in the Data Integration track. Our theme this year was all about ‘being future ready’ - previewing one of our biggest releases this year: Oracle Data Integration 12c. Just this week we followed up on this preview by announcing the general availability of the 12c release for Oracle’s key data integration products: Oracle Data Integrator 12c and Oracle GoldenGate 12c. The new release delivers extreme performance, increases IT productivity, and simplifies deployment, while helping IT organizations keep pace with new data-oriented technology trends including cloud computing, big data analytics, and real-time business intelligence. Mark Hurd's keynote on day one set the tone for the Data Integration sessions. Mark focused on big data analytics and changing consumer expectations. Real-time insight in particular is a key theme for Oracle overall and for the data integration products. In Mark Hurd's keynote we heard from key customers, such as Airbus and Thomson Reuters, how real-time analysis of operational data, including machine data, creates value and in some cases even saves lives. Thomas Kurian gave a deeper look into Oracle's big data and fast data solutions. In the initial lead Data Integration track session, Brad Adelberg, VP of Development, presented Oracle’s Data Integration 12c product strategy based on key trends from the initial OpenWorld keynotes. Brad talked about how Oracle's data integration products address the new data integration requirements that evolved with cloud computing, big data, and changing consumer expectations, and how they set the key themes in our products’ road map. Brad explained why and how fast time-to-value, high performance, and future-ready solutions are the top focus areas for product development. If you were not able to attend OpenWorld or this session I recommend reading the white paper: Five New Data Integration Requirements and How to Meet them with Oracle Data Integration, which provides an in-depth look into how Oracle addresses the new trends in the DI market. Following Brad’s session, Nick Wagner provided an in-depth review of Oracle GoldenGate’s latest features and roadmap. Nick discussed how Oracle GoldenGate’s tight integration with Oracle Database sets the product apart from the competition. We also heard that heterogeneity of the product is still a major focus for GoldenGate’s development and there will be more news on that front when there is a major release. 
After GoldenGate’s product strategy session, Denis Gray from the PM team presented Oracle Data Integrator’s product strategy session, talking about the latest and greatest on ODI. Another good session was delivered by long-time GoldenGate users, Comcast.  Jason Hurd and Amit Patel of Comcast talked about the various use cases in which they deploy Oracle GoldenGate throughout their enterprise, from database upgrades and feeding reporting systems to active-active database synchronization.  The Comcast team shared many good tips on how to use GoldenGate for both zero-downtime upgrades and active-active replication with conflict management requirements. Another of our important goals this year for the Data Integration track at OpenWorld was hearing from our customers. We ended day 1 on just that, with a wonderful ceremony for the Oracle Excellence Awards for Oracle Fusion Middleware Innovation. The ceremony was held in the Yerba Buena Center for the Arts. Congratulations to Royal Bank of Scotland and Yalumba Wine Company, the winners in the Data Integration category. You can find more information on the award and the winners in our previous blog post: 2013 Oracle Excellence Awards for Fusion Middleware Innovation… Selected for their innovative use of Oracle’s Data Integration products, the winners in the Data Integration category are Royal Bank of Scotland and The Yalumba Wine Company. Congratulations! Royal Bank of Scotland’s Market and International Banking division provides clients across the globe with seamless trading and competitive pricing, underpinned by a deep knowledge of risk management across the full spectrum of financial products. They handle millions of transactions daily to keep the lifeblood of their clients’ businesses flowing, whether through payment management solutions or through bespoke trade finance solutions. Royal Bank of Scotland is leveraging Oracle GoldenGate and Oracle Data Integrator along with Oracle Business Intelligence Enterprise Edition and the Oracle Database for a variety of solutions. Mainly, Oracle GoldenGate and Oracle Data Integrator are used to feed their data warehouse, providing a real-time data integration solution that feeds transactional data to their analytics system in minutes to enable improved decision making with timely, accurate data for their business users. Oracle Data Integrator’s in-database transformation capabilities and its ability to integrate with Oracle GoldenGate for real-time data capture are the foundation of this implementation. This solution makes it such that changes happening in the operational system are available in the analytics systems the same day they are deployed, with 100% data quality guaranteed. Additionally, the solution has helped to reduce their operational database size from 150GB to 10GB. Impressive! Now what if I told you this solution was built in 3 months and had a less than 6 month return on investment? That’s outstanding! The Yalumba Wine Company is situated in the Barossa Valley of Australia. 
It is the oldest family owned winery in Australia with a unique way of aging their wines in specially crafted 100 liter barrels. Did you know that “Yalumba” is Aboriginal for “all the land around”? The Yalumba Wine Company is growing rapidly, and was in need of introducing a more modern standard to the existing manufacturing processes to meet globalization demands, overall time-to-market, and better operational efficiency objectives of product development. The Yalumba Wine Company worked with a partner, Bristlecone to develop a unique solution whereby Oracle Data Integrator is leveraged to pull data from Salesforce.com and JD Edwards, in addition to their other pre-existing source systems, for consumption into their data warehouse. They have emphasized the overall ease of developing integration workflows with Oracle Data Integrator. The solution has brought better visibility for the business users, shorter data loading and transformation performance to their data warehouse with rapid incorporation of new data sources, and a solid future-proof foundation for their organization. Moving forward, they plan on leveraging more from Oracle’s Data Integration portfolio. Terrific! In addition to these two customers on Tuesday we featured many other important Oracle Data Integrator and Oracle GoldenGate customers. On Tuesday the GoldenGate panel included: Land O’Lakes, Smuckers, and Veolia Water. Besides giving us yummy nutrition and healthy water, these companies have another aspect in common. They all use GoldenGate to boost their ERP application. Please read the recap by Irem Radzik. On Wednesday, the ODI Panel included: Barry Ralston and Ryan Weber of Infinity Insurance, Paul Stracke of Paychex Inc., and Ian Wall of Vertex Pharmaceuticals for a session filled with interesting projects, use cases and approaches to leveraging Oracle Data Integrator. Please read the recap by Sandrine Riley for more. Thanks to everyone who joined with us and we hope to stay connected! To hear more about our Data Integration12c products join us in an upcoming webcast to learn more. Follow us www.twitter.com/ORCLGoldenGate or goto our website at www.oracle.com/goto/dataintegration

    Read the article

  • HTML5-MVC application using VS2010 SP1

    - by nmarun
This is my first attempt at creating HTML5 pages. VS 2010 allows working with HTML5 now (you just need to make a small change after installing SP1), so my Razor view is now an HTML5 page. I call this application 5Commerce: an (over-simplified) HTML5 e-commerce site. Here’s the flow of the application:
- the home page renders
- the user enters first and last name, chooses a product and a quantity
- the user can enter additional instructions for the order
- the user places the order
- the user is then taken to another page showing the order details
Off to the details. This is what my page looks like in Google Chrome 10 beta (or later) soon after it renders. Here are some of the things to observe on this. Look a little closer and you’ll see a border around the first name textbox: this is ‘autofocus’ in action. I’ve set the autofocus attribute on this textbox, so as soon as the page loads, this control gets focus. 1: <input type="text" autofocus id="firstName" class="inputWidth" data_minlength="" 2: data_maxlength="" placeholder="first name" /> See the partially grayed out ‘last name’ text in the second textbox. This is set using a placeholder attribute (see above). It gets wiped out on focus and improves the UI visuals in general. The quantity textbox is actually a numerical-only textbox. 1: <input type="number" id="quantity" data_mincount="" class="inputWidth" /> The last line is for additional instructions. This looks like a label but its content is editable. Just adding the ‘contenteditable’ attribute to the span allows the user to edit the text inside. 1: <span contenteditable id="additionalInstructions" data_texttype="" class="editableContent">select text and edit </span> All of the above is just plain HTML (no JavaScript lurking in here), which makes it really clean and simple. Going deeper into the HTML, I see that the _Layout.cshtml is already using some HTML5 content. I created my project before installing SP1, which was the reason for my surprise. 1: <!DOCTYPE html> This is the doctype declaration in HTML5 and it is supported even by IE6 (just take my word on IE6 now, don’t go install it to test it, especially when MS is doing an IE6 countdown). That’s just amazing and extremely easy to read, remember and talk about, and it means a few less bytes on every call! I modified the rest of my _Layout.cshtml to the below: 1: <!DOCTYPE html> 2: <html> 3: <head> 4: <title>5Commerce - HTML 5 Ecommerce site</title> 5: <link href="@Url.Content("~/Content/Site.css")" rel="stylesheet" type="text/css" /> 6: <script src="@Url.Content("~/Scripts/jquery-1.4.4.min.js")" type="text/javascript"></script> 7: <script src="@Url.Content("~/Scripts/CustomScripts.js")" type="text/javascript"></script> 8: <script type="text/javascript"> 9: $(document).ready(function () { 10: WireupEvents(); 11: }); 12:</script> 13:  14: </head> 15:  16: <body role="document" class="bodybackground"> 17: <header role="heading"> 18: <h2>5Commerce - HTML 5 Ecommerce site!</h2> 19: </header> 20: <section id="mainForm"> 21: @RenderBody() 22: </section> 23: <footer id="page_footer" role="siteBaseInfo"> 24: <p>&copy; 2011 5Commerce Inc!</p> 25: </footer> 26: </body> 27: </html> I’m sure you’re seeing some of the new tags here. To give a brief intro about them:
- <header>, <footer>: mark the header/footer region of a page or section.
- <section>: a logical grouping of content.
- role attribute: identifies the responsibility of an element. This attribute can be used by screen readers and can also be filtered through jQuery.
SP1 also provides some IntelliSense for HTML5. 
You see the other types of input fields – email, date, datetime, month, url and there are others as well. So once my page loads, i.e., ‘on document ready’, I’m wiring up the events following the principles of unobtrusive javascript. In the snippet below, I’m controlling the behavior of the input controls for specific events. 1: $("#productList").bind('change blur', function () { 2: IsSelectedProductValid(); 3: }); 4:  5: $("#quantity").bind('blur', function () { 6: IsQuantityValid(); 7: }); 8:  9: $("#placeOrderButton").click( 10: function () { 11: if (IsPageValid()) { 12: LoadProducts(); 13: } 14: }); This enables some client-side validation to occur before the data is sent to the server. These validation constraints are obtained through a JSON call to the WCF service and are set to the ‘data_’ attributes of the input controls. Have a look at the ‘GetValidators()’ function below: 1: function GetValidators() { 2: // the post to your webservice or page 3: $.ajax({ 4: type: "GET", //GET or POST or PUT or DELETE verb 5: url: "http://localhost:14805/OrderService.svc/GetValidators", // Location of the service 6: data: "{}", //Data sent to server 7: contentType: "application/json; charset=utf-8", // content type sent to server 8: dataType: "json", //Expected data format from server 9: processdata: true, //True or False 10: success: function (result) {//On Successfull service call 11: if (result.length > 0) { 12: for (i = 0; i < result.length; i++) { 13: if (result[i].PropertyName == "FirstName") { 14: if (result[i].MinLength > 0) { 15: $("#firstName").attr("data_minLength", result[i].MinLength); 16: } 17: if (result[i].MaxLength > 0) { 18: $("#firstName").attr("data_maxLength", result[i].MaxLength); 19: } 20: } 21: else if (result[i].PropertyName == "LastName") { 22: if (result[i].MinLength > 0) { 23: $("#lastName").attr("data_minLength", result[i].MinLength); 24: } 25: if (result[i].MaxLength > 0) { 26: $("#lastName").attr("data_maxLength", result[i].MaxLength); 27: } 28: } 29: else if (result[i].PropertyName == "Quantity") { 30: if (result[i].MinCount > 0) { 31: $("#quantity").attr("data_minCount", result[i].MinCount); 32: } 33: } 34: else if (result[i].PropertyName == "AdditionalInstructions") { 35: if (result[i].TextType.length > 0) { 36: $("#additionalInstructions").attr("data_textType", result[i].TextType); 37: } 38: } 39: } 40: } 41: }, 42: error: function (result) {// When Service call fails 43: alert('Service call failed: ' + result.status + ' ' + result.statusText); 44: } 45: }); 46:  47: //.... 48: } Just before the GetValidators() function runs and sets the validation constraints, this is what the html looks like (seen through the Dev tools of Chrome): After the function executes, you see the values in the ‘data_’  attributes. As and when we enter valid data into these fields, the error messages disappear, since the validation is bound to the blur event of the control. There you see… no error messages (well, the catch here is that once you enter THAT name, all errors disappear automatically). Clicking on ‘Place Order!’ runs the SaveOrder function. You can see the JSON for the order object that is getting constructed and passed to the WCF Service. 
1: function SaveOrder() { 2: var addlInstructionsDefaultText = "select text and edit"; 3: var addlInstructions = $("span:first").text(); 4: if(addlInstructions == addlInstructionsDefaultText) 5: { 6: addlInstructions = ''; 7: } 8: var orderJson = { 9: AdditionalInstructions: addlInstructions, 10: Customer: { 11: FirstName: $("#firstName").val(), 12: LastName: $("#lastName").val() 13: }, 14: OrderedProduct: { 15: Id: $("#productList").val(), 16: Quantity: $("#quantity").val() 17: } 18: }; 19:  20: // the post to your webservice or page 21: $.ajax({ 22: type: "POST", //GET or POST or PUT or DELETE verb 23: url: "http://localhost:14805/OrderService.svc/SaveOrder", // Location of the service 24: data: JSON.stringify(orderJson), //Data sent to server 25: contentType: "application/json; charset=utf-8", // content type sent to server 26: dataType: "json", //Expected data format from server 27: processdata: false, //True or False 28: success: function (result) {//On Successfull service call 29: window.location.href = "http://localhost:14805/home/ShowOrderDetail/" + result; 30: }, 31: error: function (request, error) {// When Service call fails 32: alert('Service call failed: ' + request.status + ' ' + request.statusText); 33: } 34: }); 35: } The service saves this order into an XML file and returns the order id (a guid). On success, I redirect to the ShowOrderDetail action method passing the guid. This page will show all the details of the order. Although the back-end weightlifting is done by WCF, I did not show any of that plumbing-work as I wanted to concentrate more on the HTML5 and its associates. However, you can see it all in the source here. I do have one issue with HTML5 and this is an existing issue with HTML4 as well. If you see the snippet above where I’ve declared a textbox for first name, you’ll see the autofocus attribute just dangling by itself. It doesn’t follow the xml syntax of ‘key="value"’ allowing users to continue writing badly-formatted html even in the new version. You’ll see the same issue with the ‘contenteditable’ attribute as well. The work-around is that you can do ‘autofocus=”true”’ and it’ll work fine plus make it well-formatted. But unless the standards enforce this, there will be people (me included) who’ll get by, by just typing the bare minimum! Hoping this will get fixed in the coming version-updates. Source code here. Verdict: I think it’s time for us to embrace the new HTML5. Thank you HTML4 and Welcome HTML5.
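The post wires the blur events to helpers such as IsQuantityValid() but never shows their bodies. Here is a hedged sketch of how such a helper might read the “data_” attributes that GetValidators() populates; the inline error label (#quantityError) is hypothetical and not part of the original markup:

function IsQuantityValid() {
    var quantity = parseInt($("#quantity").val(), 10);
    var minCount = parseInt($("#quantity").attr("data_minCount"), 10);
    var isValid = !isNaN(quantity) && (isNaN(minCount) || quantity >= minCount);

    // Show or hide a hypothetical inline error label next to the field.
    $("#quantityError").toggle(!isValid);
    return isValid;
}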

    Read the article

  • Windows Azure Mobile Services: New support for iOS apps, Facebook/Twitter/Google identity, Emails, SMS, Blobs, Service Bus and more

    - by ScottGu
    A few weeks ago I blogged about Windows Azure Mobile Services - a new capability in Windows Azure that makes it incredibly easy to connect your client and mobile applications to a scalable cloud backend. Earlier today we delivered a number of great improvements to Windows Azure Mobile Services.  New features include: iOS support – enabling you to connect iPhone and iPad apps to Mobile Services Facebook, Twitter, and Google authentication support with Mobile Services Blob, Table, Queue, and Service Bus support from within your Mobile Service Sending emails from your Mobile Service (in partnership with SendGrid) Sending SMS messages from your Mobile Service (in partnership with Twilio) Ability to deploy mobile services in the West US region All of these improvements are now live in production and available to start using immediately. Below are more details on them: iOS Support This week we delivered initial support for connecting iOS based devices (including iPhones and iPads) to Windows Azure Mobile Services.  Like the rest of our Windows Azure SDK, we are delivering the native iOS libraries to enable this under an open source (Apache 2.0) license on GitHub.  We’re excited to get your feedback on this new library through our forum and GitHub issues list, and we welcome contributions to the SDK. To create a new iOS app or connect an existing iOS app to your Mobile Service, simply select the “iOS” tab within the Quick Start view of a Mobile Service within the Windows Azure Portal – and then follow either the “Create a new iOS app” or “Connect to an existing iOS app” link below it: Clicking either of these links will expand and display step-by-step instructions for how to build an iOS application that connects with your Mobile Service: Read this getting started tutorial to walkthrough how you can build (in less than 5 minutes) a simple iOS “Todo List” app that stores data in Windows Azure.  Then follow the below tutorials to explore how to use the iOS client libraries to store data and authenticate users. Get Started with data in Mobile Services for iOS Get Started with authentication in Mobile Services for iOS Facebook, Twitter, and Google Authentication Support Our initial preview of Mobile Services supported the ability to authenticate users of mobile apps using Microsoft Accounts (formerly called Windows Live ID accounts).  This week we are adding the ability to also authenticate users using Facebook, Twitter, and Google credentials.  These are now supported with both Windows 8 apps as well as iOS apps (and a single app can support multiple forms of identity simultaneously – so you can offer your users a choice of how to login). The below tutorials walkthrough how to register your Mobile Service with an identity provider: How to register your app with Microsoft Account How to register your app with Facebook How to register your app with Twitter How to register your app with Google The tutorials above walkthrough how to obtain a client ID and a secret key from the identity provider. You can then click on the “Identity” tab of your Mobile Service (within the Windows Azure Portal) and save these values to enable server-side authentication with your Mobile Service: You can then write code within your client or mobile app to authenticate your users to the Mobile Service.  
For example, below is the code you would write to have them log in to the Mobile Service using their Facebook credentials.

Windows Store App (using C#):

    var user = await App.MobileService
        .LoginAsync(MobileServiceAuthenticationProvider.Facebook);

iOS app (using Objective C):

    UINavigationController *controller = [self.todoService.client
        loginViewControllerWithProvider:@"facebook"
        completion:^(MSUser *user, NSError *error) {
            //...
        }];

Learn more about authenticating Mobile Services using Microsoft Account, Facebook, Twitter, and Google from these tutorials:

- Get started with authentication in Mobile Services for Windows Store (C#)
- Get started with authentication in Mobile Services for Windows Store (JavaScript)
- Get started with authentication in Mobile Services for iOS

Using Windows Azure Blob, Tables and Service Bus with your Mobile Services

Mobile Services provide a simple but powerful way to add server logic using server scripts. These scripts are associated with the individual CRUD operations on your mobile service's tables. Server scripts are great for data validation, custom authorization logic (e.g. does this user participate in this game session), augmenting CRUD operations, sending push notifications, and other similar scenarios. Server scripts are written in JavaScript and are executed in a secure server-side scripting environment built using Node.js. You can edit these scripts and save them on the server directly within the Windows Azure Portal. In this week's release we have added the ability to work with other Windows Azure services from your Mobile Service server scripts. This is supported using the existing "azure" module within the Windows Azure SDK for Node.js. For example, the code below could be used in a Mobile Service script to obtain a reference to a Windows Azure Table (after which you could query it or insert data into it):

    var azure = require('azure');
    var tableService = azure.createTableService("<< account name >>",
                                                "<< access key >>");

Follow the tutorials on the Windows Azure Node.js dev center to learn more about working with Blob, Tables, Queues and Service Bus using the azure module.

Sending emails from your Mobile Service

In this week's release we have also added the ability to easily send emails from your Mobile Service, building on our partnership with SendGrid. Whether you want to add a welcome email upon successful user registration, or make your app alert you of certain usage activities, you can do this now by sending email from Mobile Services server scripts. To get started, sign up for a SendGrid account at http://sendgrid.com . Windows Azure customers receive a special offer of 25,000 free emails per month from SendGrid. To sign up for this offer, or get more information, please visit http://www.sendgrid.com/azure.html . Once you have signed up, you can add the following script to your Mobile Service server scripts to send email via the SendGrid service:

    // 'item' is assumed to be the record handled by the surrounding insert script.
    var SendGrid = require('sendgrid').SendGrid;
    var sendgrid = new SendGrid('<< account name >>', '<< password >>');

    sendgrid.send({
        to: '<< enter email address here >>',
        from: '<< enter from address here >>',
        subject: 'New to-do item',
        text: 'A new to-do was added: ' + item.text
    }, function (success, message) {
        if (!success) {
            console.error(message);
        }
    });

Follow the Send email from Mobile Services with SendGrid tutorial to learn more.
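To connect the dots with the client code shown earlier: server scripts like the ones above run whenever a client performs the corresponding table operation. Below is a hedged C# sketch of a Windows Store client call that would trigger a table's insert script; the TodoItem class and its columns are assumptions borrowed from the quick-start tutorial, not something shown in this post.

    // Hypothetical model class matching a Mobile Services table named TodoItem.
    public class TodoItem
    {
        public int Id { get; set; }
        public string Text { get; set; }
    }

    // Inserting a record from a Windows Store app; any server-side insert script
    // registered on the table (validation, email, SMS, etc.) runs as part of this call.
    var todoTable = App.MobileService.GetTable<TodoItem>();
    await todoTable.InsertAsync(new TodoItem { Text = "Remember the milk" });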
Sending SMS messages from your Mobile Service

SMS is a key communication medium for mobile apps - it comes in handy if you want your app to send users a confirmation code during registration, allow your users to invite their friends to install your app, or reach out to mobile users without a smartphone. Using Mobile Service server scripts and Twilio's REST API, you can now easily send SMS messages from your app. To get started, sign up for a Twilio account. Windows Azure customers receive 1000 free text messages when using Twilio and Windows Azure together. Once signed up, you can add the following to your Mobile Service server scripts to send SMS messages:

    // 'from', 'to' and 'message' are assumed to be supplied by the surrounding script.
    var httpRequest = require('request');
    var account_sid = "<< account SID >>";
    var auth_token = "<< auth token >>";

    // Create the request body
    var body = "From=" + from + "&To=" + to + "&Body=" + message;

    // Make the HTTP request to Twilio
    httpRequest.post({
        url: "https://" + account_sid + ":" + auth_token +
             "@api.twilio.com/2010-04-01/Accounts/" + account_sid + "/SMS/Messages.json",
        headers: { 'content-type': 'application/x-www-form-urlencoded' },
        body: body
    }, function (err, resp, body) {
        console.log(body);
    });

I'm excited to be speaking at the TwilioCon conference this week, and will be showcasing some of the cool scenarios you can now enable with Twilio and Windows Azure Mobile Services.

Mobile Services availability in the West US region

Our initial preview of Windows Azure Mobile Services was only supported in the US East region of Windows Azure. As with every Windows Azure service, over time we will extend Mobile Services to all Windows Azure regions. With this week's preview update we've added support so that you can now create your Mobile Service in the West US region as well.

Summary

The above features are all now live in production and are available to use immediately. If you don't already have a Windows Azure account, you can sign up for a free trial and start using Mobile Services today. Visit the Windows Azure Mobile Developer Center to learn more about how to build apps with Mobile Services. We'll have even more new features and enhancements coming later this week – including .NET 4.5 support for Windows Azure Web Sites. Keep an eye out on my blog for details as new features become available.

Hope this helps,

Scott

P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Solaris: What comes next?

    - by alanc
    As you probably know by now, a few months ago, we released Solaris 11 after years of development. That of course means we now need to figure out what comes next - if Solaris 11 is “The First Cloud OS”, then what do we need to make future releases of Solaris be, to be modern and competitive when they're released? So we've been having planning and brainstorming meetings, and I've captured some notes here from just one of those we held a couple weeks ago with a number of the Silicon Valley based engineers. Now before someone sees an idea here and calls their product rep wanting to know what's up, please be warned what follows are rough ideas, and as I'll discuss later, none of them have any committment, schedule, working code, or even plan for integration in any possible future product at this time. (Please don't make me force you to read the full Oracle future product disclaimer here, you should know it by heart already from the front of every Oracle product slide deck.) To start with, we did some background research, looking at ideas from other Oracle groups, and competitive OS'es. We examined what was hot in the technology arena and where the interesting startups were heading. We then looked at Solaris to see where we could apply those ideas. Making Network Admins into Socially Networking Admins We all know an admin who has grumbled about being the only one stuck late at work to fix a problem on the server, or having to work the weekend alone to do scheduled maintenance. But admins are humans (at least most are), and crave companionship and community with their fellow humans. And even when they're alone in the server room, they're never far from a network connection, allowing access to the wide world of wonders on the Internet. Our solution here is not building a new social network - there's enough of those already, and Oracle even has its own Oracle Mix social network already. What we proposed is integrating Solaris features to help engage our system admins with these social networks, building community and bringing them recognition in the workplace, using achievement recognition systems as found in many popular gaming platforms. For instance, if you had a Facebook account, and a group of admin friends there, you could register it with our Social Network Utility For Facebook, and then your friends might see: Alan earned the achievement Critically Patched (April 2012) for patching all his servers. Matt is only at 50% - encourage him to complete this achievement today! To avoid any undue risk of advertising who has unpatched servers that are easier targets for hackers to break into, this information would be tightly protected via Facebook's world-renowned privacy settings to avoid it falling into the wrong hands. A related form of gamification we considered was replacing simple certfications with role-playing-game-style Experience Levels. Instead of just knowing an admin passed a test establishing a given level of competency, these would provide recruiters with a more detailed level of how much real-world experience an admin has. Achievements such as the one above would feed into it, but larger numbers of experience points would be gained by tougher or more critical tasks - such as recovering a down system, or migrating a service to a new platform. (As long as it was an Oracle platform of course - migrating to an HP or IBM platform would cause the admin to lose points with us.) Unfortunately, we couldn't figure out a good way to prevent (if you will) “gaming” the system. 
For instance, a disgruntled admin might decide to start ignoring warnings from FMA that a part is beginning to fail or skip preventative maintenance, in the hopes that they'd cause a catastrophic failure to earn more points for bolstering their resume as they look for a job elsewhere, and not worrying about the effect on your business of a mission critical server going down. More Z's for ZFS Our suggested new feature for ZFS was inspired by the worlds most successful Z-startup of all time: Zynga. Using the Social Network Utility For Facebook described above, we'd tie it in with ZFS monitoring to help you out when you find yourself in a jam needing more disk space than you have, and can't wait a month to get a purchase order through channels to buy more. Instead with the click of a button you could post to your group: Alan can't find any space in his server farm! Can you help? Friends could loan you some space on their connected servers for a few weeks, knowing that you'd return the favor when needed. ZFS would create a new filesystem for your use on their system, and securely share it with your system using Kerberized NFS. If none of your friends have space, then you could buy temporary use space in small increments at affordable rates right there in Facebook, using your Facebook credits, and then file an expense report later, after the urgent need has passed. Universal Single Sign On One thing all the engineers agreed on was that we still had far too many "Single" sign ons to deal with in our daily work. On the web, every web site used to have its own password database, forcing us to hope we could remember what login name was still available on each site when we signed up, and which unique password we came up with to avoid having to disclose our other passwords to a new site. In recent years, the web services world has finally been reducing the number of logins we have to manage, with many services allowing you to login using your identity from Google, Twitter or Facebook. So we proposed following their lead, introducing PAM modules for web services - no more would you have to type in whatever login name IT assigned and try to remember the password you chose the last time password aging forced you to change it - you'd simply choose which web service you wanted to authenticate against, and would login to your Solaris account upon reciept of a cookie from their identity service. Pinning notes to the cloud We also all noted that we all have our own pile of notes we keep in our daily work - in text files in our home directory, in notebooks we carry around, on white boards in offices and common areas, on sticky notes on our monitors, or on scraps of paper pinned to our bulletin boards. The contents of the notes vary, some are things just for us, some are useful for our groups, some we would share with the world. For instance, when our group moved to a new building a couple years ago, we had a white board in the hallway listing all the NIS & DNS servers, subnets, and other network configuration information we needed to set up our Solaris machines after the move. Similarly, as Solaris 11 was finishing and we were all learning the new network configuration commands, we shared notes in wikis and e-mails with our fellow engineers. Users may also remember one of the popular features of Sun's old BigAdmin site was a section for sharing scripts and tips such as these. Meanwhile, the online "pin board" at Pinterest is taking the web by storm. So we thought, why not mash those up to solve this problem? 
We proposed a new BigAddPin site where users could “pin” notes, command snippets, configuration information, and so on. For instance, once they had worked out the ideal Automated Installation manifest for their app server, they could pin it up to share with the rest of their group, or choose to make it public as an example for the world. Localized data, such as our group's notes on the servers for our subnet, could be shared only to users connecting from that subnet. And notes that they didn't want others to see at all could be marked private, such as the list of phone numbers to call for late night pizza delivery to the machine room, the birthdays and anniversaries they can never remember but would be sleeping on the couch if they forgot, or the list of automatically generated completely random, impossible to remember root passwords to all their servers. For greater integration with Solaris, we'd put support right into the command shells — redirect output to a pinned note, set your path to include pinned notes as scripts you can run, or bring up your recent shell history and pin a set of commands to save for the next time you need to remember how to do that operation. Location service for Solaris servers A longer term plan would involve convincing the hardware design groups to put GPS locators with wireless transmitters in future server designs. This would help both admins and service personnel trying to find servers in todays massive data centers, and could feed into location presence apps to help show potential customers that while they may not see many Solaris machines on the desktop any more, they are all around. For instance, while walking down Wall Street it might show “There are over 2000 Solaris computers in this block.” [Note: this proposal was made before the recent media coverage of a location service aggregrator app with less noble intentions, and in hindsight, we failed to consider what happens when such data similarly falls into the wrong hands. We certainly wouldn't want our app to be misinterpreted as “There are over $20 million dollars of SPARC servers in this building, waiting for you to steal them.” so it's probably best it was rejected.] Harnessing the power of the GPU for Security Most modern OS'es make use of the widespread availability of high powered GPU hardware in today's computers, with desktop environments requiring 3-D graphics acceleration, whether in Ubuntu Unity, GNOME Shell on Fedora, or Aero Glass on Windows, but we haven't yet made Solaris fully take advantage of this, beyond our basic offering of Compiz on the desktop. Meanwhile, more businesses are interested in increasing security by using biometric authentication, but must also comply with laws in many countries preventing discrimination against employees with physical limations such as missing eyes or fingers, not to mention the lost productivity when employees can't login due to tinted contacts throwing off a retina scan or a paper cut changing their fingerprint appearance until it heals. Fortunately, the two groups considering these problems put their heads together and found a common solution, using 3D technology to enable authentication using the one body part all users are guaranteed to have - pam_phrenology.so, a new PAM module that uses an array USB attached web cams (or just one if the user is willing to spin their chair during login) to take pictures of the users head from all angles, create a 3D model and compare it to the one in the authentication database. 
While Mythbusters has shown how easy it can be to fool common fingerprint scanners, we have not yet seen any evidence that people can impersonate the shape of another user's cranium, no matter how long they spend beating their head against the wall to reshape it. This could possibly be extended to group users, using modern versions of some of the older phrenological studies, such as giving all users with long grey beards access to the System Architect role, or automatically placing users with pointy spikes in their hair into an easy use mode. Unfortunately, there are still some unsolved technical challenges we haven't figured out how to overcome. Currently, a visit to the hair salon causes your existing authentication to expire, and some users have found that shaving their heads is the only way to avoid bad hair days becoming bad login days. Reaction to these ideas After gathering all our notes on these ideas from the engineering brainstorming meeting, we took them in to present to our management. Unfortunately, most of their reaction cannot be printed here, and they chose not to accept any of these ideas as they were, but they did have some feedback for us to consider as they sent us back to the drawing board. They strongly suggested our ideas would be better presented if we weren't trying to decipher ink blotches that had been smeared by the condensation when we put our pint glasses on the napkins we were taking notes on, and to that end let us know they would not be approving any more engineering offsites in Irish themed pubs on the Friday of a Saint Patrick's Day weekend. (Hopefully they mean that situation specifically and aren't going to deny the funding for travel to this year's X.Org Developer's Conference just because it happens to be in Bavaria and ending on the Friday of the weekend Oktoberfest starts.) They recommended our research techniques could be improved over just sitting around reading blogs and checking our Facebook, Twitter, and Pinterest accounts, such as considering input from alternate viewpoints on topics such as gamification. They also mentioned that Oracle hadn't fully adopted some of Sun's common practices and we might have to try harder to get those to be accepted now that we are one unified company. So as I said at the beginning, don't pester your sales rep just yet for any of these, since they didn't get approved, but if you have better ideas, pass them on and maybe they'll get into our next batch of planning.

    Read the article

  • Finding the Right Solution to Source and Manage Your Contractors

    - by mark.rosenberg(at)oracle.com
    Many of our PeopleSoft Enterprise applications customers operate in service-based industries, and all of our customers have at least some internal service units, such as IT, marketing, and facilities. Employing the services of contractors, often referred to as "contingent labor," to deliver either or both internal and external services is common practice. As we've transitioned from an industrial age to a knowledge age, talent has become a primary competitive advantage for most organizations. Contingent labor offers talent on flexible terms; it offers the ability to scale up operations, close skill gaps, and manage risk in the process of delivering services. Talent comes from many sources and the rise in the contingent worker (contractor, consultant, temporary, part time) has increased significantly in the past decade and is expected to reach 40 percent in the next decade. Managing the total pool of talent in a seamless integrated fashion not only saves organizations money and increases efficiency, but creates a better place for workers of all kinds to work. Although the term "contingent labor" is frequently used to describe both contractors and employees who have flexible schedules and relationships with an organization, the remainder of this discussion focuses on contractors. The term "contingent labor" is used interchangeably with "contractor." Recognizing the importance of contingent labor, our PeopleSoft customers often ask our team, "What Oracle vendor management system (VMS) applications should I evaluate for managing contractors?" In response, I thought it would be useful to describe and compare the three most common Oracle-based options available to our customers. They are:   The enterprise licensed software model in which you implement and utilize the PeopleSoft Services Procurement (sPro) application and potentially other PeopleSoft applications;  The software-as-a-service model in which you gain access to a derivative of PeopleSoft sPro from an Oracle Business Process Outsourcing Partner; and  The managed service provider (MSP) model in which staffing industry professionals utilize either your enterprise licensed software or the software-as-a-service application to administer your contingent labor program. At this point, you may be asking yourself, "Why three options?" The answer is that since there is no "one size fits all" in terms of talent, there is also no "one size fits all" for effectively sourcing and managing contingent workers. Various factors influence how an organization thinks about and relates to its contractors, and each of the three Oracle-based options addresses an organization's needs and preferences differently. For the purposes of this discussion, I will describe the options with respect to (A) pricing and software provisioning models; (B) control and flexibility; (C) level of engagement with contractors; and (D) approach to sourcing, employment law, and financial settlement. Option 1:  Enterprise Licensed Software In this model, you purchase from Oracle the license and support for the applications you need. Typically, you license PeopleSoft sPro as your VMS tool for sourcing, monitoring, and paying your contract labor. In conjunction with sPro, you can also utilize PeopleSoft Human Capital Management (HCM) applications (if you do not already) to configure more advanced business processes for recruiting, training, and tracking your contractors. 
Many customers choose this enterprise license software model because of the functionality and natural integration of the PeopleSoft applications and because the cost for the PeopleSoft software is explicit. There is no fee per transaction to source each contractor under this model. Our customers that employ contractors to augment their permanent staff on billable client engagements often find this model appealing because there are no fees to affect their profit margins. With this model, you decide whether to have your own IT organization run the software or have the software hosted and managed by either Oracle or another application services provider. Your organization, perhaps with the assistance of consultants, configures, deploys, and operates the software for managing your contingent workforce. This model offers you the highest level of control and flexibility since your organization can configure the contractor process flow exactly to your business and security requirements and can extend the functionality with PeopleTools. This option has proven very valuable and applicable to our customers engaged in government contracting because their contingent labor management practices are subject to complex standards and regulations. Customers find a great deal of value in the application functionality and configurability the enterprise licensed software offers for managing contingent labor. Some examples of that functionality are... The ability to create a tiered network of preferred suppliers including competencies, pricing agreements, and elaborate candidate management capabilities. Configurable alerts and online collaboration for bid, resource requisition, timesheet, and deliverable entry, routing, and approval for both resource and deliverable-based services. The ability to manage contractors with the same PeopleSoft HCM and Projects applications that are used to manage the permanent workforce. Because it allows you to utilize much of the same PeopleSoft HCM and Projects application functionality for contractors that you use for permanent employees, the enterprise licensed software model supports the deepest level of engagement with the contingent workforce. For example, you can: fill job openings with contingent labor; guide contingent workers through essential safety and compliance training with PeopleSoft Enterprise Learning Management; and source contingent workers directly to project-based assignments in PeopleSoft Resource Management and PeopleSoft Program Management. This option enables contingent workers to collaborate closely with your permanent staff on complex, knowledge-based efforts - R&D projects, billable client contracts, architecture and engineering projects spanning multiple years, and so on. With the enterprise licensed software model, your organization maintains responsibility for the sourcing, onboarding (including adherence to employment laws), and financial settlement processes. This means your organization maintains on staff or hires the expertise in these domains to utilize the software and interact with suppliers and contractors. Option 2:  Software as a Service (SaaS) The effort involved in setting up and operating VMS software to handle a contingent workforce leads many organizations to seek a system that can be activated and configured within a few days and for which they can pay based on usage. Oracle's Business Process Outsourcing partner, Provade, Inc., provides exactly this option to our customers. 
Provade offers its vendor management software as a service over the Internet and usually charges your organization a fee that is a percentage of your total contingent labor spending processed through the Provade software. (Percentage of spend is the predominant fee model, although not the only one.) In addition to lower implementation costs, the effort of configuring and maintaining the software is largely upon Provade, not your organization. This can be very appealing to IT organizations that are thinly stretched supporting other important information technology initiatives. Built upon PeopleSoft sPro, the Provade solution is tailored for simple and quick deployment and administration. Provade has added capabilities to clone users rapidly and has simplified business documents, like work orders and change orders, to facilitate enterprise-wide, self-service adoption with little to no training. Provade also leverages Oracle Business Intelligence Enterprise Edition (OBIEE) to provide integrated spend analytics and dashboards. Although pure customization is more limited than with the enterprise licensed software model, Provade offers a very effective option for organizations that are regularly on-boarding and off-boarding high volumes of contingent staff hired to perform discrete support tasks (for example, order fulfillment during the holiday season, hourly clerical work, desktop technology repairs, and so on) or project tasks. The software is very configurable and at the same time very intuitive to even the most computer-phobic users. The level of contingent worker engagement your organization can achieve with the Provade option is generally the same as with the enterprise licensed software model since Provade can automatically establish contingent labor resources in your PeopleSoft applications. Provade has pre-built integrations to Oracle's PeopleSoft and the Oracle E-Business Suite procurement, projects, payables, and HCM applications, so that you can evaluate, train, assign, and track contingent workers like your permanent employees. Similar to the enterprise licensed software model, your organization is responsible for the contingent worker sourcing, administration, and financial settlement processes. This means your organization needs to maintain the staff expertise in these domains. Option 3:  Managed Services Provider (MSP) Whether you are using the enterprise licensed model or the SaaS model, you may want to engage the services of sourcing, employment, payroll, and financial settlement professionals to administer your contingent workforce program. Firms that offer this expertise are often referred to as "MSPs," and they are typically staffing companies that also offer permanent and temporary hiring services. (In fact, many of the major MSPs are Oracle applications customers themselves, and they utilize the PeopleSoft Solution for the Staffing Industry to run their own business operations.) Usually, MSPs place their staff on-site at your facilities, and they can utilize either your enterprise licensed PeopleSoft sPro application or the Provade VMS SaaS software to administer the network of suppliers providing contingent workers. When you utilize an MSP, there is a separate fee for the MSP's service that is typically funded by the participating suppliers of the contingent labor. Also in this model, the suppliers of the contingent labor (not the MSP) usually pay the contingent labor force. 
With an MSP, you are intentionally turning over business process control for the advantages associated with having someone else manage the processes. The software option you choose will to a certain extent affect your process flexibility; however, the MSPs are often able to adapt their processes to the unique demands of your business. When you engage an MSP, you will want to give some thought to the level of engagement and "partnering" you need with your contingent workforce. Because the MSP acts as an intermediary, it can be very valuable in handling high volume, routine contracting for which there is a relatively low need for "partnering" with the contingent workforce. However, if your organization (or part of your organization) engages contingent workers for high-profile client projects that require diplomacy, intensive amounts of interaction, and personal trust, introducing an MSP into the process may prove less effective than handling the process with your own staff. In fact, in many organizations, it is common to enlist an MSP to handle contractors working on internal projects and to have permanent employees handle the contractor relationships that affect the portion of the services portfolio focused on customer-facing, billable projects. One of the key advantages of enlisting an MSP is that you do not have to maintain the expertise required for orchestrating the sourcing, hiring, and paying of contingent workers.  These are the domain of the MSPs. If your own staff members are not prepared to manage the essential "overhead" processes associated with contingent labor, working with an MSP can make solid business sense. Proper administration of a contingent workforce can make the difference between project success and failure, operating profit and loss, and legal compliance and fines. Concluding Thoughts There is little doubt that thoughtfully and purposefully constructing a service delivery strategy that leverages the strengths of contingent workers can lead to better projects, deliverables, and business results. What requires a bit more thinking is determining the platform (or platforms) that will enable each part of your organization to best deliver on its mission.

    Read the article

  • CodePlex Daily Summary for Tuesday, September 25, 2012

    CodePlex Daily Summary for Tuesday, September 25, 2012Popular ReleasesRawr: Rawr 5.0.0: This is the Downloadable WPF version of Rawr!For web-based version see http://elitistjerks.com/rawr.php You can find the version notes at: http://rawr.codeplex.com/wikipage?title=VersionNotes Rawr Addon (NOT UPDATED YET FOR MOP)We now have a Rawr Official Addon for in-game exporting and importing of character data hosted on Curse. The Addon does not perform calculations like Rawr, it simply shows your exported Rawr data in wow tooltips and lets you export your character to Rawr (including ba...Coevery - Free CRM: Coevery 1.0.0.26: The zh-CN issue has been solved. We also add a project management module.VidCoder: 1.4.1 Beta: Updated to HandBrake 4971. This should fix some issues with stuck PGS subtitles. Fixed build break which prevented pre-compiled XML serializers from showing up. Fixed problem where a preset would get errantly marked as modified when re-opening the encode settings window or importing a new preset.D3 Loot Tracker: 1.3: Added the ability to reload a previous session to be able to resume it. Removed goblin detection, let's keep this an item tracking utility only. Fixed a bug with crafting sound setting not working properly. Completely re-styled the UI.JSLint for Visual Studio 2010: 1.4.0: VS2012 support is alphaBlackJumboDog: Ver5.7.2: 2012.09.23 Ver5.7.2 (1)InetTest?? (2)HTTP?????????????????100???????????Player Framework by Microsoft: Player Framework for Windows 8 (Preview 6): IMPORTANT: List of breaking changes from preview 5 Added separate samples download with .vsix dependencies instead of source dependencies Support for FreeWheel SmartXML ad responses Support for Smooth Streaming SDK DownloaderPlugins Support for VMAP and TTML polling for live scenarios Support for custom smooth streaming byte stream and scheme handlers Support for new play time and position tracking plugin Added IsLiveChanged event Added AdaptivePlugin.MaxBitrate property Add...WPF Application Framework (WAF): WPF Application Framework (WAF) 2.5.0.8: Version: 2.5.0.8 (Milestone 8): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Changelog Legend: [B] Breaking change; [O] Marked member as obsolete WAF: Mark the class DataModel as serializable. InfoMan: Minor improvements. InfoMan: Add unit tests for all modules. Othe...LogicCircuit: LogicCircuit 2.12.9.20: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionToolbars on text note dialog are more flexible now. You can select font face, size, color, and background of text you are typing. RAM now can be initialized to one of the following: random va...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.2020.421): New features: Disable a specific part of SiteMap to keep the data without displaying them in the CRM application. 
It simply comments XML part of the sitemap (thanks to rboyers for this feature request) Right click an item and click on "Disable" to disable it Items disabled are greyed and a suffix "- disabled" is added Right click an item and click on "Enable" to enable it Refresh list of web resources in the web resources pickerHigLabo: HigLabo_20120919: Add XXXAsync method to all Client class for async await pattern. (HttpClient,BoxNetClient,DropboxClient,FacebookClient,FtpClient,RssClient,SugarSyncClient,TwitterClient,WindowsLiveClient) Add all api to HigLabo.Net.Ftp project. Add strong name to all assembly. Add HttpBodyMultipartFormData to provide upload multipart form data with http protocol. Add HttpBodyFormUrlEncodedData to provide form url encoded post data with http protocol. FacebookClient,RssClient,WindowsLiveClient,BoxNetClient cl...AJAX Control Toolkit: September 2012 Release: AJAX Control Toolkit Release Notes - September 2012 Release Version 60919September 2012 release of the AJAX Control Toolkit. AJAX Control Toolkit .NET 4.5 – AJAX Control Toolkit for .NET 4.5 and sample site (Recommended). AJAX Control Toolkit .NET 4 – AJAX Control Toolkit for .NET 4 and sample site (Recommended). AJAX Control Toolkit .NET 3.5 – AJAX Control Toolkit for .NET 3.5 and sample site (Recommended). Notes: - The current version of the AJAX Control Toolkit is not compatible with ...Sense/Net CMS - Enterprise Content Management: SenseNet 6.1.2 Community Edition: Sense/Net 6.1.2 Community EditionMain new featuresOur current release brings a lot of bugfixes, including the resolution of js/css editing cache issues, xlsx file handling from Office, expense claim demo workspace fixes and much more. Besides fixes 6.1.2 introduces workflow start options and other minor features like a reusable Reject client button for approval scenarios and resource editor enhancements. We have also fixed an issue with our install package to bring you a flawless installation...Solution Extender for Microsoft Dynamics CRM 2011: Solution Extender (2.0.0.6): Fix a problem when serializing entity records (this fix the problem when exporting queues)Visual C++ Directories Editor: VC++ Directories 2012 Editor v1.0 ML (x32-x64): version 1.0 ML for Visual C++ 2012WinRT XAML Toolkit: WinRT XAML Toolkit - 1.2.3: WinRT XAML Toolkit based on the Windows 8 RTM SDK. Download the latest source from the SOURCE CODE page. For compiled version use NuGet. You can add it to your project in Visual Studio by going to View/Other Windows/Package Manager Console and entering: PM> Install-Package winrtxamltoolkit Features AsyncUI extensions Controls and control extensions Converters Debugging helpers Imaging IO helpers VisualTree helpers Samples Recent changes NOTE: Namespace changes DebugConsol...Python Tools for Visual Studio: 1.5 RC: PTVS 1.5RC Available! We’re pleased to announce the release of Python Tools for Visual Studio 1.5 RC. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, Edit/Intellisense/Debug/Profile, Cloud, HPC, IPython, etc. support. The primary new feature for the 1.5 release is Django including Azure support! 
The http://www.djangoproject.com is a pop...Launchbar: Lanchbar 4.0.0: This application requires .NET 4.5 which you can find here: www.microsoft.com/visualstudio/downloadsAssaultCube Reloaded: 2.5.4 -: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, please wait while we try to package for those OSes. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) Changelog: New logo Improved airstrike! Reset nukes...Extended WPF Toolkit: Extended WPF Toolkit - 1.7.0: Want an easier way to install the Extended WPF Toolkit?The Extended WPF Toolkit is available on Nuget. What's new in the 1.7.0 Release?New controls Zoombox Pie New features / bug fixes PropertyGrid.ShowTitle property added to allow showing/hiding the PropertyGrid title. Modifications to the PropertyGrid.EditorDefinitions collection will now automatically be applied to the PropertyGrid. Modifications to the PropertyGrid.PropertyDefinitions collection will now be reflected automaticaly...New ProjectsAffine transformations (Iterated function system): Small app for generating fractals using iterated function system.aspnet mvc store: Lite ASP.NET MVC CMSAugmented Reality: .Autocomplete: AutocompleteBCS to provide stock information in SharePoint 2013: This .Net assembly BCS external system provides live, read only data on Dow Jones 30 stocks details from MSN money webservices.Blood Alcohol Measurement Tool: Az alkalmazás kijelzi a felhasználó véralkoholszint változását az elfogyasztott alkoholos italok függvényében.Busqueda Incremental con un TEXTBOX: Hola, aqui estoy de nuevo con un aporte mas para la comunidad, en muchos foros he visto que estan buscando como hacer una busqueda incremental en un TEXTBOX.CRM 2011: Reassign or Transfer Personal Views: This Project allows CRM Administrator to quickly transfer (i.e. assign) advanced find views from one CRM user to another CRM user. ctripITSM doc: this is a documentation share for ctripITSM projectet Sprint 3: etsprint3Finance App for Windows 8: This is an WinJS Windows 8 application that computes various financial metrics.gadgets: Windows Sidebar Gargetsjprj: jprjKerosene ORM: Kerosene is a self-adaptive and configuration-less ORM library, with a SQL syntax based on C# dynamics, WCF, and Entity Framework capabilities for POCO objects.Lobster: ?? ?? ?????????.My Google Map: MyGoogleMap est un outils de génération de carte. Onestop.Contrib.CustomAdmin: Onestop.Contrib.CustomAdmin is a Theme for Orchard CMS providing for changing the admin dashboard elements such as Title and Logo.Onestop.Contrib.Disqus: Onestop.Contrib.Disqus is an advanced commenting module for Orchard CMS that uses Disqus.Onestop.Contrib.LayoutSelector: Onestop.Contrib.LayoutSelector is a simple part for switching to different versions of Layout.cshtml when editing Orchard content items.Onestop.Contrib.Navigation: Onestop.Contrib.Navigation is an advanced Menu Management system designed for Orchard CMS. 
Onestop.Contrib.Seo: Onestop.Contrib.Seo is an advanced Search Engine Optimization module for Orchard CMSOnestop.Contrib.SlideShow: Onestop.Contrib.SlideShow is an advanced module for managing slide animations on a page.pf2012: Simple HTML5 game as PF2012 electronic greeting.PHP-2012: this is for php for schoolPruebaProyecto: Carmen Asencio AmbrosioPulawJS: MVC Platform for JavaScript. Inspired by Zend FrameworkRevolution Emulator: Habbo Hotel flash emulator targeting the .NET 4.5 VM, written in C#.Ruby Rookie: This Project is for learning Ruby purposesSISLOG_Proy2: SISLOGSkyShellEx: SkyShellEx allows to sync any folder to SkyDrive via a simple ShellExtension. The sync option appears on the context menu of folders where applicable. sosoft: Sosoft Project.C# WinForm.Include Alarm Clock.sound9: sound9testddgit09242012: dTFS Agile Work Item Rollover: This is a command line utility used to rollover incomplete work from one sprint to the next. The tool has both interactive and silent modes, allowing you to seTFS Deployment Studio: Helps deploying applications built using TFS to the servers where they belong.TreeView In-place Editing in MVVM: This project demonstrates a clean way of doing the in-place editing in the WPF TreeView controlUniSoft: Teste source controlValidation Framework for .NET: Framework for validation of method paramters and return values.WarOfDev: war of developer, make your coding interesting . Waterhouse: A C# console application that takes various text inputs and converts it to Morse Code by blinking the numlock indicator.Web Package Pro: in dev

    Read the article

  • Enabling XML-documentation for code contracts

    - by DigiMortal
    One nice feature that code contracts offer is updating of code documentation. If you are using source code documenting features of Visual Studio then code contracts may automate some tasks you otherwise have to implement manually. In this posting I will show you some XML documentation files with documented contracts. I will also explain how this feature works. Enabling XML-documentation in project settings As a first thing let’s enable generating of code documentation under project settings. Open project properties, move to Build page and make check to checkbox called “XML documentation file”. Save project settings and rebuild project. When project is built go to bin/Debug folder and open the XML-file. Here is my XML. <?xml version="1.0"?> <doc>     <assembly>         <name>Eneta.Examples.CodeContracts.Testable</name>     </assembly>     <members>         <member name="T:Eneta.Examples.CodeContracts.Testable.Randomizer">             <summary>             Class for generating random integers in user specified range.             </summary>         </member>         <member name="M:Eneta.Examples.CodeContracts.Testable.Randomizer.#ctor(Eneta.Examples.CodeContracts.Testable.IRandomGenerator)">             <summary>             Constructor of Randomizer. Initializes Randomizer class.             </summary>             <param name="generator">Instance of random number generator.</param>         </member>         <member name="M:Eneta.Examples.CodeContracts.Testable.Randomizer.GetRandomFromRangeContracted(System.Int32,System.Int32)">             <summary>             Returns random integer in given range.             </summary>             <param name="min">Minimum value of random integer.</param>             <param name="max">Maximum value of random integer.</param>         </member>     </members> </doc> You can see nothing about code contracts here. Enabling code contracts documentation Code contracts have their own settings and conditions for documentation. Open project properties and move to Code Contracts tab. From “Contract Reference Assembly” dropdown check Build and make check to checkbox “Emit contracts into XML doc file”. And again – save project setting, build the project and move to bin/Debug folder. Now you can see that there are two files for XML-documentation: <assembly name>.XML <assembly name>.old.XML First files is documentation with contracts, second file is original documentation without contracts. Let’s see now what is inside our new XML-documentation file. <?xml version="1.0"?> <doc>   <assembly>     <name>Eneta.Examples.CodeContracts.Testable</name>   </assembly>   <members>     <member name="T:Eneta.Examples.CodeContracts.Testable.Randomizer">       <summary>             Class for generating random integers in user specified range.             </summary>     </member>     <member name="M:Eneta.Examples.CodeContracts.Testable.Randomizer.#ctor(Eneta.Examples.CodeContracts.Testable.IRandomGenerator)">       <summary>             Constructor of Randomizer. Initializes Randomizer class.             </summary>       <param name="generator">Instance of random number generator.</param>     </member>     <member name="M:Eneta.Examples.CodeContracts.Testable.Randomizer.GetRandomFromRangeContracted(System.Int32,System.Int32)">       <summary>             Returns random integer in given range.             
</summary>
      <param name="min">Minimum value of random integer.</param>
      <param name="max">Maximum value of random integer.</param>
      <requires description="Min must be less than max" exception="T:System.ArgumentOutOfRangeException">
                min &lt; max</requires>
      <exception cref="T:System.ArgumentOutOfRangeException">
                min &gt;= max</exception>
      <ensures description="Return value is out of range">
                Contract.Result&lt;int&gt;() &gt;= min &amp;&amp;
                Contract.Result&lt;int&gt;() &lt;= max</ensures>
    </member>
  </members>
</doc>

As you can see, the code contracts are pretty well documented. The messages I provided with the code contracts are also available in the documentation. If I write good, informative messages, they are just as useful in the contracts documentation.

Code contracts and Sandcastle

Sandcastle knows nothing about code contracts by default. There is a separate package of files for Sandcastle that is provided by the code contracts installation. You can read about it in the code contracts manual: “Sandcastle (http://www.codeplex.com/Sandcastle) is a freely available tool that generates help files and web sites describing your APIs, based on the XML doc comments in your source code. The CodeContracts install contains a set of files that can be copied over a Sandcastle installation to take advantage of the additional contract information. The produced documentation adds a contract section to methods with declared requires and/or ensures. In order for Sandcastle to produce Contract sections, you need to patch a number of files in its installation. Please refer to the Sandcastle Readme.txt found under Start Menu/CodeContracts/Sandcastle for instructions. A future release of Sandcastle will hopefully support contract sections without the need for this patching step.” Integrating code contracts documentation into Sandcastle will be one of my next postings about code contracts.

Conclusion

If you are using code documentation, then documentation about code contracts can be added very easily. All you have to do is enable XML documentation for contracts and build your project. Later you can use the Sandcastle files provided by the code contracts installer to integrate the contracts documentation into your output documentation package.
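For reference, the contracted method that would produce the <requires> and <ensures> entries shown above might look roughly like the sketch below. The class, method, and parameter names come from the XML itself; the method body and the IRandomGenerator member are assumptions, since the post links to the full source rather than showing it.

    using System;
    using System.Diagnostics.Contracts;

    namespace Eneta.Examples.CodeContracts.Testable
    {
        // Assumed shape of the random number generator dependency.
        public interface IRandomGenerator
        {
            int Next(int min, int max);
        }

        public class Randomizer
        {
            private readonly IRandomGenerator _generator;

            public Randomizer(IRandomGenerator generator)
            {
                _generator = generator;
            }

            public int GetRandomFromRangeContracted(int min, int max)
            {
                // Emitted into the XML documentation as the <requires> element.
                Contract.Requires<ArgumentOutOfRangeException>(min < max,
                    "Min must be less than max");

                // Emitted into the XML documentation as the <ensures> element.
                Contract.Ensures(Contract.Result<int>() >= min &&
                                 Contract.Result<int>() <= max,
                    "Return value is out of range");

                return _generator.Next(min, max);
            }
        }
    }

With the “Emit contracts into XML doc file” option checked, rebuilding the project is what merges these Requires/Ensures calls into the generated XML shown above.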

    Read the article

  • CodePlex Daily Summary for Saturday, June 02, 2012

    CodePlex Daily Summary for Saturday, June 02, 2012Popular ReleasesZXMAK2: Version 2.6.2.3: - add support for ZIP files created on UNIX system; - improve WAV support (fixed PCM24, FLOAT32; added PCM32, FLOAT64); - fix drag-n-drop on modal dialogs; - tape AutoPlay feature (thanks to Woody for algorithm)Librame Utility: Librame Utility 3.5.1: 2012-06-01 ???? ?、????(System.Web.Caching ???) 1、??????(? Librame.Settings ??); 2、?? SQL ????; 3、??????; 4、??????; ?、???? 1、????:??MD5、SHA1、SHA256、SHA384、SHA512?; 2、???????:??BASE64、DES、??DES、AES?; ?:???? GUID (???????)??KEY,?????????????。 ?、????? 1、?????、??、?????????; 2、??????????; ?:??????????????(Enum.config)。 ?、???? 1、??????、??、??、??、????????; 2、?????????????????; ?、?????? 1、????? XML ? JSON ?????????(??? XML ??); ?、????? 1、??????????(??? MediaInfo.dll(32?)??); 2、????????(??? ffmpeg...TestProject_Git: asa: asdf.Net Code Samples: Full WCF Duplex Service Example: Full WCF Duplex Service ExampleVivoSocial: VivoSocial 2012.06.01: Version 2012.06.01 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 2012.06.01 release and see if they persist. You can download the new releases from social.codeplex.com or our downloads page. If you have any questions about this release, please post them in our Support forums. If you are experiencing a bug or would like to request a new feature, please submit it to our issue tracker. This release has been tested on ...Kendo UI ASP.NET Sample Applications: Sample Applications (2012-06-01): Sample application(s) demonstrating the use of Kendo UI in ASP.NET applications.Better Explorer: Better Explorer Beta 1: Finally, the first Beta is here! There were a lot of changes, including: Translations into 10 different languages (the translations are not complete and will be updated soon) Conditional Select new tools for managing archives Folder Tools tab new search bar and Search Tab new image editing tools update function many bug fixes, stability fixes, and memory leak fixes other new features as well! Please check it out and if there are any problems, let us know. :) Also, do not forge...myManga: myManga v1.0.0.3: Will include MangaPanda as a default option. ChangeLog Updating from Previous Version: Extract contents of Release - myManga v1.0.0.3.zip to previous version's folder. Replaces: myManga.exe BakaBox.dll CoreMangaClasses.dll Manga.dll Plugins/MangaReader.manga.dll Plugins/MangaFox.manga.dll Plugins/MangaHere.manga.dll Plugins/MangaPanda.manga.dllPlayer Framework by Microsoft: Player Framework for Windows 8 Metro (Preview 3): Player Framework for HTML/JavaScript and XAML/C# Metro Style Applications. 
Additional DownloadsIIS Smooth Streaming Client SDK for Windows 8 Microsoft PlayReady Client SDK for Metro Style Apps Release notes:Support for Windows 8 Release Preview (released 5/31/12) Advertising support (VAST, MAST, VPAID, & clips) Miscellaneous improvements and bug fixesMicrosoft Ajax Minifier: Microsoft Ajax Minifier 4.54: Fix for issue #18161: pretty-printing CSS @media rule throws an exception due to mismatched Indent/Unindent pair.Silverlight Toolkit: Silverlight 5 Toolkit Source - May 2012: Source code for December 2011 Silverlight 5 Toolkit release.Json.NET: Json.NET 4.5 Release 6: New feature - Added IgnoreDataMemberAttribute support New feature - Added GetResolvedPropertyName to DefaultContractResolver New feature - Added CheckAdditionalContent to JsonSerializer Change - Metro build now always uses late bound reflection Change - JsonTextReader no longer returns no content after consecutive underlying content read failures Fix - Fixed bad JSON in an array with error handling creating an infinite loop Fix - Fixed deserializing objects with a non-default cons...DotNetNuke® Community Edition CMS: 06.02.00: Major Highlights Fixed issue in the Site Settings when single quotes were being treated as escape characters Fixed issue loading the Mobile Premium Data after upgrading from CE to PE Fixed errors logged when updating folder provider settings Fixed the order of the mobile device capabilities in the Site Redirection Management UI The User Profile page was completely rebuilt. We needed User Profiles to have multiple child pages. This would allow for the most flexibility by still f...????: ????2.0.1: 1、?????。WiX Toolset: WiX v3.6 RC: WiX v3.6 RC (3.6.2928.0) provides feature complete Burn with VS11 support. For more information see Rob's blog post about the release: http://robmensching.com/blog/posts/2012/5/28/WiX-v3.6-Release-Candidate-availableJavascript .NET: Javascript .NET v0.7: SetParameter() reverts to its old behaviour of allowing JavaScript code to add new properties to wrapped C# objects. The behavior added briefly in 0.6 (throws an exception) can be had via the new SetParameterOptions.RejectUnknownProperties. TerminateExecution now uses its isolate to terminate the correct context automatically. Added support for converting all C# integral types, decimal and enums to JavaScript numbers. (Previously only the common types were handled properly.) Bug fixe...Phalanger - The PHP Language Compiler for the .NET Framework: 3.0 (May 2012): Fixes: unserialize() of negative float numbers fix pcre possesive quantifiers and character class containing ()[] array deserilization when the array contains a reference to ISerializable parsing lambda function fix round() reimplemented as it is in PHP to avoid .NET rounding errors filesize bypass for FileInfo.Length bug in Mono New features: Time zones reimplemented, uses Windows/Linux databaseSharePoint Euro 2012 - UEFA European Football Predictor: havivi.euro2012.wsp (1.1): New fetures:Admin enable / disable match Hide/Show Euro 2012 SharePoint lists (3 lists) Installing SharePoint Euro 2012 PredictorSharePoint Euro 2012 Predictor has been developed as a SharePoint Sandbox solution to support SharePoint Online (Office 365) Download the solution havivi.euro2012.wsp from the download page: Downloads Upload this solution to your Site Collection via the solutions area. Click on Activate to make the web parts in the solution available for use in the Site C...????SDK for .Net 4.0+(OAuth2.0+??V2?API): ??V2?SDK???: ?????????API?? ???????OAuth2.0?? 
????:????????????,??????????“SOURCE CODE”?????????Changeset,http://weibosdk.codeplex.com/SourceControl/list/changesets ???:????????,DEMO??AppKey????????????????,?????AppKey,????AppKey???????????,?????“????>????>????>??????”LINQ_Koans: LinqKoans v.02: Cleaned up a bitNew ProjectsAppleScript Slim: A super slimmed down library allowing you to execute AppleScript from your mono project (from your non MonoMac project).Ateneo Libri: Progetto web per la compravendita di libri universitariAzurehostedservicedashboard: Azure Hosted Service Dashboardcampus: Proyecto de pueba de MVC3 y codeplexDot Net Code Comment Analyzer: This Visual studio 2010 plugin, can count the comments in the C# code in the currently open solution in VS IDE. It shows a summary of the comments across all c# files in the project. this is useful when we want to enforce code comments , Code comments help in maintaining the code base , understanding code faster than going through the lines of code, makes code less dependant on a developer Individual.firstteamproject: H?c tfsFITClub: FITClub is platform fighting arcade game for 2 to 4 players. Enemies are controlled by AI. The goal is to force enemies down into the water or lava and keep safe from their attacks. You collect items to temporarily change your abilities. Multiplayer between more phones is coming soon.Jumpstart Branding for Sharepoint 2010: Basic Master Pages for SharePoint 2010 including a general, minified, heavily commented version of v4.master, a centered, fixed width, minified, commented Master Page and two Visual Studio 2010 solutions, one for farms and a second for sandboxes, to help you create a feature for deploying your Master Pages and other branding assets. Jumptart Branding for SP 2010 has been designed to help you quickly and easily jumpstart your next SharePoint 2010 Branding project.KelControl: Programme exe de controle d'activites. 1 - controle de la reponses de site web - http webrequest d'une Url - analyse du retour ( enetete http) - si ok ( Appel ws similaire a etat mais independant a faire apres niveau 4) - si erreur ( appel ws incrementer l'erreur) (par exemple au bout de 3 erreur declanchement alerte) dans un 1er temp on ne s'occupe pas du ws on inscrit les actions dans un fichier txt par exemple. process complet: - timer 15 minutes (param...KHTest: Visual Studio ??Librame Utility: Librame Utility 3.5.1Linux: this is the Linux project.Magic Morse: ?????????Maps: this is the Maps project.Mark Tarefas: Controlador de Tarefas para Mark AssessoriaMaxxFolderSize: MaxxUtils.FolderSizeMCTSTestCode: Project to hold code tried during learning MCTS CertificationMOBZHash: MOBZHash shows MD5 or SHA hash values for files, and reports files with identical hashes (which are most likely duplicates).NandleNF: Nandle NFNginx: this is the Nginx project. Oficina_SIGA: Siga, é um sistema de gerenciamento de oficinas.Plug-in: this is the Plug-in project.SharePoint Comments Anywhere: This is a very simple project which provides a commenting web part and a list template with the instance to store user's comments. SharePoint OTB only provides commenting capability on the Blog sites where users add their posts and anyone can view the page and add comments. 
Comments Anywhere can be configured on any list, pages library or any page of the SharePoint site with a web part zone enabling users to add their comments virtually anywhere you as an admin or a power user of your s...SharePoint Site Owners Webpart: SharePoint web part to display SharePoint site owners.Tarea AACQ: Proyecto para tarea FACCI 4A 01/06/12testprjct: test summaryTirailleur: Tirailleur code can be used to model an expanding wildfire (forest fire) perimeter. The code is implemented in VB.NET, and should be easy to translate to other languages. There are just a couple of classes handling the important work. These can be extracted and imported to another program. The code here includes some dummy client objects to represent the containing program. webdama: italian checkers game c#Webowo: wbowo projekt test obslugi tortoise i coldplex

    Read the article

  • CodePlex Daily Summary for Saturday, May 19, 2012

    CodePlex Daily Summary for Saturday, May 19, 2012Popular ReleasesZXMAK2: Version 2.6.1.8: - fix download links with badly formatted content-disposition - little refactoring for AY8910 code - added Sprinter emulation pluginGhostBuster: GhostBuster Setup (91520): Added WMI based RestorePoint support Removed test code from program.cs Improved counting. Changed color of ghosted but unfiltered devices. Changed HwEntries into an ObservableCollection. Added Properties Form. Added Properties MenuItem to Context Menu. Added Hide Unfiltered Devices to Context Menu. If you like this tool, leave me a note, rate this project or write a review or Donate to Ghostbuster. Donate to GhostbusterProject Tracy: Tracy 2.1 Stable (2.1.4): 2.1.4 ???:?dll?????Bin??? ??AppData??????ACCESS 2007?SQL Server2008??、??、????????: DataPie_V3.2: V3.2, 2012?5?19? ????ORACLE??????。AvalonDock: AvalonDock 2.0.0795: Welcome to the Beta release of AvalonDock 2.0 After 4 months of hard work I'm ready to upload the beta version of AvalonDock 2.0. This new version boosts a lot of new features and now is stable enough to be deployed in production scenarios. For this reason I encourage everyone is using AD 1.3 or earlier to upgrade soon to this new version. The final version is scheduled for the end of June. What is included in Beta: 1) Stability! thanks to all users contribution I’ve corrected a lot of issues...myCollections: Version 2.1.0.0: New in this version : Improved UI New Metro Skin Improved Performance Added Proxy Settings New Music and Books Artist detail Lot of Bug FixingfastJSON: v1.9.8: v1.9.8 - added DeepCopy(obj) and DeepCopy<T>(obj) - refactored code to JSONParameters and removed the JSON overloads - added support to serialize anonymous types (deserialize is not possible at the moment) - bug fix $types output with non object rootAspxCommerce: AspxCommerce1.1: AspxCommerce - 'Flexible and easy eCommerce platform' offers a complete e-Commerce solution that allows you to build and run your fully functional online store in minutes. You can create your storefront; manage the products through categories and subcategories, accept payments through credit cards and ship the ordered products to the customers. We have everything set up for you, so that you can only focus on building your own online store. Note: To login as a superuser, the username and pass...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1616.403): BUG FIX Hide save button when Titles or Descriptions element is selectedMapWindow 6 Desktop GIS: MapWindow 6.1.2: Looking for a .Net GIS Map Application?MapWindow 6 Desktop GIS is an open source desktop GIS for Microsoft Windows that is built upon the DotSpatial Library. This release requires .Net 4 (Client Profile). Are you a software developer?Instead of downloading MapWindow for development purposes, get started with with the DotSpatial template. The extensions you create from the template can be loaded in MapWindow.DotSpatial: DotSpatial 1.2: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Tutorials are available. Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). 
Components ...Mugen Injection: Mugen Injection 2.2.1 (WinRT supported): Added ManagedScopeLifecycle. Increase performance. Added support for resolve 'params'.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.52: Make preprocessor comment-statements nestable; add the ///#IFNDEF statement. (Discussion #355785) Don't throw an error for old-school JScript event handlers, and don't rename them if they aren't global functions.DotNetNuke® Events: 06.00.00: This is a serious release of Events. DNN 6 form pattern - We have take the full route towards DNN6: most notably the incorporation of the DNN6 form pattern with streamlined UX/UI. We have also tried to change all formatting to a div based structure. A daunting task, since the Events module contains a lot of forms. Roger has done a splendid job by going through all the forms in great detail, replacing all table style layouts into the new DNN6 div class="dnnForm XXX" type of layout with chang...LogicCircuit: LogicCircuit 2.12.5.15: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionThis release is fixing one but nasty bug. Two functions XOR and XNOR when used with 3 or more inputs were incorrectly evaluating their results. If you have a circuit that is using these functions...Image Popup Module dotnetnuke: Image Pop-up In HTML Module Source: Image Pop-up In HTML Module is a module to show pop ups Please Follow the steps to use this module 1 Install the module and drop on your page where you want to show the pop up 2 In your HTML module editor add the token "{imagepopup}" 3 In your HTML module editor add class="popup-img" in your images which you want to show in popup.FileZilla Server Config File Editor: FileZillaConfig 1.0.0.1: Sorry for not including the config file with the previous release. It was a "lost in translation" when I was moving my local repository to CodePlex repository. Sorry for the rookie mistake.LINQ to Twitter: LINQ to Twitter Beta v2.0.25: Supports .NET 3.5, .NET 4.0, Silverlight 4.0, Windows Phone 7.1, Client Profile, and Windows 8. 100% Twitter API coverage. Also available via NuGet! Follow @JoeMayo.BlogEngine.NET: BlogEngine.NET 2.6: Get DotNetBlogEngine for 3 Months Free! Click Here for More Info BlogEngine.NET Hosting - 3 months free! Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take...BlackJumboDog: Ver5.6.2: 2012.05.07 Ver5.6.2 (1) Web???????、????????·????????? (2) Web???????、?????????? COMSPEC PATHEXT WINDIR SERVERADDR SERVERPORT DOCUMENTROOT SERVERADMIN REMOTE_PORT HTTPACCEPTCHRSET HTTPACCEPTLANGUAGE HTTPACCEPTEXCODINGNew ProjectsAsset Tracking: Bespoke inhouse solution for managing asset's within the organisation.Chsword Project: Chsword project is a collection of .net project.conjee: Conjee UI DesignDealKhuyenMaiV2.com: d? án web cu?i kì nhóm g2Devtm.ServiceModel: ServiceFactory The library provides easy access to all your services through the helper ServiceFactory. 
This way to consume your services requires absolutely no place the call to service in a block (try / finally) because all proxies provided by the helper "ServiceFactory" are dynamically generated for the contract as a parameter. This block is built into the code provided for each method.Dream Runtime Analyzer: Dream Runtime Analyzer is a tool made to help Furcadia dreamweavers test their dreams for bandwidth usage and optimize their dragonspeak performance. It allows you to see which DragonSpeak lines were transmitted the most and thus tell you which areas need to be optimized.DynamicsNAV Protocol Handler: Target of this project is to develop DYNAMICSNAV protocol handler which will solve problems of side-by-side installation of many NAV versions on one PC. Today only one version could be handled through the hyperlinks. from.js: Powerful and High-speed LINQ implementation for JavaScriptFurcadia Installer Browser: A program that can access files within a Furcadia installer and allow the user to open them from within the install package, extract some or all the files inside the package, check data integrity of each file and compare the content of two installers.Furcadia Map Normalizer: Furcadia Map Normalizer is a small tool that helps recover a damaged Furcadia map after a live-edit bug. It restores out-of-range elements within back to zero.Homework: TSU students in action :DHRASP: human resourcesiseebooks: this is book s website for self developmentITORG CMS: ITORG Simple Content Managment System ASP.NET MVC 3Kinesthesia (Kinect-based MIDI controller): A simple yet highly configurable Kinect-based MIDI controller with MIDI playback, gesture recognition and voice control.LameBT: A .NET Bluetooth 2.0 stack (HOST and ACL only) based on LibUSB, supporting multiple USB bluetooth dongles.pongISEN: projet de l'ISEN pongRadminPassword: ????????? ??? ??????????????? ????? ??????? ? ????????? ????????? ?????????? ?????????? ?? Radmin. A program to automatically enter the passwords in the famous PC remote control software Radmin.RicciWebSiteSystem: soon websiteScripted Deployment of a System Center 2012 Configuration Manager Secondary Site: In System Center 2012 Configuration Manager, you can no longer deploy a secondary site server using Setup (wizard or scripted). Instead, you must use the Configuration Manager console to create a new secondary site. This is less than ideal if you want to deploy several secondary sites or want to automate the process for any other reason. This project provides a script that will allow you to install a new System Center 2012 Configuration Manager secondary site server without using the Con...Snapshot: Snap is a screen and desktop capture application that automatically uploads your screen captures to a remote image host and leaves you their direct links.SOA based Open Source E-Commerce System: This project will be a new Ecommerce System, based on service oriented architecture.Symphony Framework: The Symphony Framework is a set of classes and capabilities that are designed to assist the Synergy/DE developer enhance the power of the Synergy .NET development environment and migrate their traditional Synergy/DE applications to a Windows Presentation Foundation desktop user experience.testddgit0518201201: ghtestddtfs0518201201: ertesttom05072012git01: fsdfdstesttom05182012git01: fdstesttom05182012hg01: Summarytesttom05182012tfs01: fdsfdsfdsVisualCron - web client: VisualCron, www.visualcron.com, is an advanced scheduler and automation tool. 
VisualCron has a WinForms interface built on the VisualCron API. This projects is a proof of concept web client built upon the VisualCron API. The project was originally built by VisualCron developers as a test to provide a realtime/responsive web client.

    Read the article

  • CodePlex Daily Summary for Thursday, August 14, 2014

    CodePlex Daily Summary for Thursday, August 14, 2014Popular ReleasesWordMat: WordMat for Mac: WordMat for Mac has a few limitations compared to the Windows version - Graph is not supported (Gnuplot, GeoGebra and Excel works) - Units are not supported yet (Coming up) The Mac version is yet as tested as the windows version.Awake: Awake v1.4.0 (Stand-Alone-Exe): Awake is a tool, that resides in system tray and prevents the computer from entering the idle state, thus successfully preventing it from entering sleep/hibernation/the lock screen. It does not change any system settings, therefore it does not require administrative privileges. This tool is designed for those who cannot change the timings in their power settings, because of some corporate policy.Node.js Tools for Visual Studio: Latest dev build: An intermediate release with the latest changes and bug fixes.HP OneView PowerShell Library: HP OneView PowerShell Library 1.10.1193: Branch to HP OneView 1.10 Release. NOTE: This library version does not support older appliance versions. Fixed New-HPOVProfile to check for Firmware and BIOS management for supported platforms. Would erroneously error when neither -firmware or -bios were passed. Fixed Remove-HPOV* cmdlets which did not handle -force switch parameter correctly Fixed New-HPOVUplinkSet and New-HPOVNetwork Fixed Download-File where HTTP stream compression was not handled, resulting in incorrectly writt...Linq 4 Javascript: Version 2.3: Minor Changes Made In Queryable - don't check for collection length with >=. Use === (In The Next Method) TypeScript Change Only - Remove collection source and other inherit properties from all the chainables. Also in typescript add private - public to all properties. This should cleanup the typescript namespace a bit TypeScript Change Only - Change return type of ToDictionary to TKey, T instead of T, TKey Changed the unit test to Typescript so I can test how the caller experience is in...NeoLua (Lua for .net dynamic language runtime): NeoLua-0.8.17: Fix: table.insert Fix: table auto convert Fix: Runtime-functions were defined as private it should be internal. Fix: min,max MichaelSenko release.Azure Maching Learning Excel Add-In: Beta: Download the zip file and extract into your local directory. Then watch the video tutorials for installation steps.MFCMAPI: August 2014 Release: Build: 15.0.0.1042 Full release notes at SGriffin's blog. If you just want to run the MFCMAPI or MrMAPI, get the executables. If you want to debug them, get the symbol files and the source. The 64 bit builds will only work on a machine with Outlook 2010/2013 64 bit installed. All other machines should use the 32 bit builds, regardless of the operating system. Facebook BadgeOooPlayer: 1.1: Added: Support for speex, TAK and OptimFrog files Added: An option to not to load cover art Added: Smaller package size Fixed: Unable to drag&drop audio files to playlist Updated: FLAC, WacPack and Opus playback libraries Updated: ID3v1 and ID3v2 tag librariesEWSEditor: EwsEditor 1.10 Release: • Export and import of items as a full fidelity steam works - without proxy classes! - I used raw EWS POSTs. • Turned off word wrap for EWS request field in EWS POST windows. • Several windows with scrolling texts boxes were limiting content to 32k - I removed this restriction. • Split server timezone info off to separate menu item from the timezone info windows so that the timezone info window could be used without logging into a mailbox. • Lots of updates to the TimeZone window. 
• UserAgen...Python Tools for Visual Studio: 2.1 RC: Release notes for PTVS 2.1 RC We’re pleased to announce the release candidate for Python Tools for Visual Studio 2.1. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. PTVS supports a broad range of features including CPython/IronPython, editing, IntelliSense, interactive debugging, profiling, Microsoft Azure, IPython, and cross-platform debugging support. PTVS 2.1 RC is available for: Visual Studio Expre...Sense/Net ECM - Enterprise CMS: SenseNet 6.3.1 Community Edition: Sense/Net 6.3.1 Community EditionSense/Net 6.3.1 is an important step toward a more modular infrastructure, robustness and maintainability. With this release we finally introduce a packaging and a task management framework, and the Image Editor that will surely make the job of content editors more fun. Please review the changes and new features since Sense/Net 6.3 and give a feedback on our forum! Main new featuresSnAdmin (packaging framework) Task Management Image Editor OData REST A...Aspose for Apache POI: Missing Features of Apache POI SS - v 1.2: Release contain the Missing Features in Apache POI SS SDK in comparison with Aspose.Cells What's New ? Following Examples: Create Pivot Charts Detect Merged Cells Sort Data Printing Workbooks Feedback and Suggestions Many more examples are available at Aspose Docs. Raise your queries and suggest more examples via Aspose Forums or via this social coding site.MFCBDAINF: MFCBDAINF: Added recognition of TBS, Hauppauge, DVBWorld and FireDTV proprietary GUID'sFluffy: Fluffy 0.3.35.4: Change log: Text editorSKGL - Serial Key Generating Library: SKGL Extension Methods 4 (1.0.5.1): This library contains methods for: Time change check (make sure the time has not been changed on the client computer) Key Validation (this will use http://serialkeymanager.com/ to validate keys against the database) Key Activation (this will, depending on the settings, activate a key with a specific machine code) Key Activation Trial (allows you to update a key if it is a trial key) Get Machine Code (calculates a machine code given any hash function) Get Eight Byte Hash (returns an...Touchmote: Touchmote 1.0 beta 13: Changes Less GPU usage Works together with other Xbox 360 controls Bug fixesModern UI for WPF: Modern UI 1.0.6: The ModernUI assembly including a demo app demonstrating the various features of Modern UI for WPF. BREAKING CHANGE LinkGroup.GroupName renamed to GroupKey NEW FEATURES Improved rendering on high DPI screens, including support for per-monitor DPI awareness available in Windows 8.1 (see also Per-monitor DPI awareness) New ModernProgressRing control with 8 builtin styles New LinkCommands.NavigateLink routed command New Visual Studio project templates 'Modern UI WPF App' and 'Modern UI W...ClosedXML - The easy way to OpenXML: ClosedXML 0.74.0: Multiple thread safe improvements including AdjustToContents XLHelper XLColor_Static IntergerExtensions.ToStringLookup Exception now thrown when saving a workbook with no sheets, instead of creating a corrupt workbook Fix for hyperlinks with non-ASCII Characters Added basic workbook protection Fix for error thrown, when a spreadsheet contained comments and images Fix to Trim function Fix Invalid operation Exception thrown when the formula functions MAX, MIN, and AVG referenc...SEToolbox: SEToolbox 01.042.019 Release 1: Added RadioAntenna broadcast name to ship name detail. 
Added two additional columns for Asteroid material generation for Asteroid Fields. Added Mass and Block number columns to main display. Added Ellipsis to some columns on main display to reduce name confusion. Added correct SE version number in file when saving. Re-added in reattaching Motor when drag/dropping or importing ships (KeenSH have added RotorEntityId back in after removing it months ago). Added option to export and r...New ProjectsAndroid PCM Audio Recording: Android PCM Audio Recording The source code records the PCM audio in android device.Azure Maching Learning Excel Add-In: The Azure ML Excel Add-In enables you to interact with Microsoft Azure Machine Learning WebServices through excel by adding the scoring endpoint as a function.bitboxx bbcontact: The bitboxx bbcontact module is a DNN module for providing a simple configurable contact form with easy setup and email notificationJD eSurvey Java Open Source Online Survey Application: JD eSurvey is an open source enterprise survey web application written in Java and based on the Spring Framework and Hibernate ORM developed by JD Software.Kobayashi Royale: A tactical space combat turn based game.OneApp Framework: Framework for building true cross platform application.Raspberry Pi Control Center: A GTK+ based Raspberry Pi Control Center. Made to be simple and fastSharePoint Farm's Logs Collector: Get Farm Logs from a centralized place. Sonar Snitch: Ferramenta para filtrar e monitorar aplicações no Sonar. Indica o quanto cada aplicação foi alterada em uma série de indicadores conhecidos.

    Read the article

  • Sharing A Stage: JDeveloper/ADF & NetBeans/Java EE 6?

    - by Geertjan
    A highlight for me during last week's Oracle Developer Day in Romania (which I blogged about here) was meeting Jernej Kaše (who is from Slovenia, just like my philosopher hero Slavoj Žižek), who is an Oracle Fusion Middleware evangelist. At the conference, while I was presenting NetBeans and Java EE 6 in one room, Jernej was presenting JDeveloper and ADF in another room. The application he created looks as follows, i.e., a realistic CRUD app, with a master/detail view, a search feature, and validation:

    In a conversation during a break, we started imagining a scenario where the two of us would be on the same stage, taking turns talking about NetBeans/Java EE and JDeveloper/ADF. That way, attendees at a conference wouldn't need to choose which of the two topics to attend, because both would be handled in the same session, with the session possibly being longer so that sufficient time could be spent on the respective technologies. (The JDeveloper/ADF session would then not be competing with the NetBeans/Java EE 6 session, since they'd be handled simultaneously.) The session would focus on the similarities and differences between the two respective tools/solutions, which would be extremely interesting and also unique.

    The crucial question in making this kind of co-presentation possible is whether (and how quickly) an application such as the one created above with JDeveloper/ADF could be created with NetBeans/Java EE 6. The NetBeans/Java EE 6 story is extremely strong on the model and controller levels, but less strong on the view layer. Though there are choices between using PrimeFaces, RichFaces, and IceFaces, that support is quite limited in the absence of a visual designer or of other specific tools (e.g., code generators to generate snippets of PrimeFaces) connected to JSF component libraries. However, it so happens that in recent months we at NetBeans have established really good connections with the PrimeFaces team (more about that another time). So I asked them what it would take to write the above UI in PrimeFaces.

    The PrimeFaces team were very helpful. They sent me the following screenshot, which is of the UI they created in PrimeFaces, reproducing the ADF screenshot above:

    Of course, the above is purely the UI layer; there are no EJBs, entity classes, or data connections hooked into it yet.
    However, this is the Facelets file that the PrimeFaces team sent me, i.e., using the PrimeFaces component library, that produces the above result:

    <?xml version="1.0" encoding="UTF-8"?>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN"
        "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml"
          xmlns:h="http://java.sun.com/jsf/html"
          xmlns:f="http://java.sun.com/jsf/core"
          xmlns:p="http://primefaces.org/ui">
        <f:view>
            <h:head>
                <style type="text/css">
                    .alignRight { text-align: right; }
                    .alignLeft { text-align: left; }
                    .alignTop { vertical-align: top; }
                    .ui-validation-required { color: red; font-size: 14px; margin-right: 5px; position: relative; vertical-align: top; }
                    .ui-selectonemenu .ui-selectonemenu-trigger .ui-icon { margin-top: 7px !important; }
                </style>
            </h:head>
            <h:body>
                <h:form prependId="false" id="form">
                    <p:panel header="Employees">
                        <h:panelGrid columns="4" id="searchPanel">
                            Search
                            <p:selectOneMenu>
                                <f:selectItem itemLabel="FirstName" itemValue="FirstName" />
                                <f:selectItem itemLabel="LastName" itemValue="LastName" />
                                <f:selectItem itemLabel="Email" itemValue="Email" />
                                <f:selectItem itemLabel="PhoneNumber" itemValue="PhoneNumber" />
                            </p:selectOneMenu>
                            <p:inputText />
                            <p:commandLink process="searchPanel" update="@form">
                                <h:graphicImage name="next.gif" library="img" />
                            </p:commandLink>
                        </h:panelGrid>
                        <h:panelGrid columns="3" columnClasses="alignTop,,alignTop" style="width:90%;margin-left:10%">
                            <h:panelGrid columns="2" columnClasses="alignRight,alignLeft">
                                <h:outputLabel for="firstName">FirstName</h:outputLabel>
                                <p:inputText id="firstName" />
                                <h:outputLabel for="lastName"><sup class="ui-validation-required">*</sup>LastName</h:outputLabel>
                                <p:inputText id="lastName" style="width:250px;" />
                                <h:outputLabel for="email"><sup class="ui-validation-required">*</sup>Email</h:outputLabel>
                                <p:inputText id="email" style="width:250px;" />
                                <h:outputLabel for="phoneNumber" value="PhoneNumber" />
                                <p:inputMask id="phoneNumber" mask="999.999.9999" />
                                <h:outputLabel for="hireDate"><sup class="ui-validation-required">*</sup>HireDate</h:outputLabel>
                                <p:calendar id="hireDate" pattern="MM/dd/yyyy" showOn="button" />
                            </h:panelGrid>
                            <p:outputPanel style="min-width:40px;" />
                            <h:panelGrid columns="2" columnClasses="alignRight,alignLeft">
                                <h:outputLabel for="jobId"><sup class="ui-validation-required">*</sup>JobId</h:outputLabel>
                                <p:selectOneMenu id="jobId">
                                    <f:selectItem itemLabel="Administration Vice President" itemValue="Administration Vice President" />
                                    <f:selectItem itemLabel="Vice President" itemValue="Vice President" />
                                </p:selectOneMenu>
                                <h:outputLabel for="salary">Salary</h:outputLabel>
                                <p:inputText id="salary" styleClass="alignRight" />
                                <h:outputLabel for="commissionPct">CommissionPct</h:outputLabel>
                                <p:inputText id="commissionPct" style="width:30px;" maxlength="3" />
                                <h:outputLabel for="manager">ManagerId</h:outputLabel>
                                <p:selectOneMenu id="manager">
                                    <f:selectItem itemLabel="Steven King" itemValue="Steven" />
                                    <f:selectItem itemLabel="Michael Cook" itemValue="Michael" />
                                    <f:selectItem itemLabel="John Benjamin" itemValue="John" />
                                    <f:selectItem itemLabel="Dav Glass" itemValue="Dav" />
                                </p:selectOneMenu>
                                <h:outputLabel for="department">DepartmentId</h:outputLabel>
                                <p:selectOneMenu id="department">
                                    <f:selectItem itemLabel="90" itemValue="90" />
                                    <f:selectItem itemLabel="80" itemValue="80" />
                                    <f:selectItem itemLabel="70" itemValue="70" />
                                    <f:selectItem itemLabel="60" itemValue="60" />
                                    <f:selectItem itemLabel="50" itemValue="50" />
                                    <f:selectItem itemLabel="40" itemValue="40" />
                                    <f:selectItem itemLabel="30" itemValue="30" />
                                    <f:selectItem itemLabel="20" itemValue="20" />
                                </p:selectOneMenu>
                            </h:panelGrid>
                        </h:panelGrid>
                        <p:outputPanel id="buttonPanel">
                            <p:commandButton value="First" process="@this" update="@form" />
                            <p:commandButton value="Previous" process="@this" update="@form" style="margin-left:15px;" />
                            <p:commandButton value="Next" process="@this" update="@form" style="margin-left:15px;" />
                            <p:commandButton value="Last" process="@this" update="@form" style="margin-left:15px;" />
                        </p:outputPanel>
                        <p:tabView style="margin-top:25px">
                            <p:tab title="Job History">
                                <p:dataTable var="history">
                                    <p:column headerText="StartDate">
                                        <h:outputText value="#{history.startDate}">
                                            <f:convertDateTime pattern="MM/dd/yyyy" />
                                        </h:outputText>
                                    </p:column>
                                    <p:column headerText="EndDate">
                                        <h:outputText value="#{history.endDate}">
                                            <f:convertDateTime pattern="MM/dd/yyyy" />
                                        </h:outputText>
                                    </p:column>
                                    <p:column headerText="JobId">
                                        <h:outputText value="#{history.jobId}" />
                                    </p:column>
                                    <p:column headerText="DepartmentId">
                                        <h:outputText value="#{history.departmentId}" />
                                    </p:column>
                                </p:dataTable>
                            </p:tab>
                        </p:tabView>
                    </p:panel>
                </h:form>
            </h:body>
        </f:view>
    </html>

    Right now, NetBeans IDE only has code completion to create the above. So there's not much help for creating such a UI right now. I don't believe that a visual designer is mandatory to create the above; a few code generators and file templates could do the job too. And I'm looking forward to seeing those kinds of tools for PrimeFaces, as well as other JSF component libraries, appearing in NetBeans IDE in upcoming releases. A related option would be for the NetBeans-generated CRUD app to include the option of having a master/detail view, as well as the option of having a search feature, i.e., the application generators would provide the option of having additional features typical in Java enterprise apps.

    In the absence of such tools, there still is room, I believe, for NetBeans/Java EE and JDeveloper/ADF to share a stage at a conference. The above file would have been prepared up front, and the presenter would state that fact. The UI layer is only one aspect of a Java EE 6 application, so the presenter would have ample other features to show (i.e., the entity class generation, the tools for working with servlets, with session beans, etc.) prior to getting to the point where the statement would be made: "On the UI layer, I have prepared this Facelets file, which I will now show you can be connected to the lower layers of the application as follows." At that point, the session beans could be hooked into the Facelets file, the file would be saved, the browser refreshed, and then the whole application would work exactly as the ADF application does. So, Jernej, let's share a stage soon!
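
    To make that last step a bit more concrete, here is a minimal sketch, in Java, of the kind of session bean and backing bean that could be hooked into the Facelets file above. The article itself doesn't show this code, and every name in it (EmployeeBean, EmployeeSessionBean, JobHistory, the hrPU persistence unit) is hypothetical, purely for illustration. With something like this in place, the Job History table in the page would simply be wired to the bean, e.g. <p:dataTable var="history" value="#{employeeBean.jobHistory}">.

    // EmployeeSessionBean.java -- hypothetical stateless session bean; JobHistory is assumed
    // to be a JPA entity generated from the JOB_HISTORY table.
    import java.util.List;
    import javax.ejb.Stateless;
    import javax.persistence.EntityManager;
    import javax.persistence.PersistenceContext;

    @Stateless
    public class EmployeeSessionBean {

        @PersistenceContext(unitName = "hrPU") // assumed persistence unit name
        private EntityManager em;

        public List<JobHistory> findJobHistory(int employeeId) {
            return em.createQuery(
                    "SELECT h FROM JobHistory h WHERE h.employeeId = :id", JobHistory.class)
                     .setParameter("id", employeeId)
                     .getResultList();
        }
    }

    // EmployeeBean.java -- hypothetical CDI backing bean, referenced from the page as #{employeeBean}.
    import java.io.Serializable;
    import java.util.List;
    import javax.ejb.EJB;
    import javax.enterprise.context.SessionScoped;
    import javax.inject.Named;

    @Named("employeeBean")
    @SessionScoped
    public class EmployeeBean implements Serializable {

        @EJB
        private EmployeeSessionBean service;

        // In the real app this would be driven by the First/Previous/Next/Last buttons.
        private int currentEmployeeId = 100;

        public List<JobHistory> getJobHistory() {
            return service.findJobHistory(currentEmployeeId);
        }
    }

    That, in essence, is the "hook the session beans in, save, refresh the browser" moment such a shared session would build up to.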

    Read the article

  • Code Reuse is (Damn) Hard

    - by James Michael Hare
    As a development team lead, interviewing new candidates was a regular part of my job.  Like any typical interview, we started with some easy questions to get them warmed up and help calm their nerves before hitting the hard stuff. One of those easier questions was almost always: "Name some benefits of object-oriented development."  Nearly every time, the candidate would chime in with a plethora of canned answers which typically included: "it helps ease code reuse."  Of course, this is a gross oversimplification.  Tools only ease reuse; it's developers who ultimately make code reusable or not, regardless of the language or methodology.

    But it did get me thinking…  we always used to say that as part of our mantra as to why Object-Oriented Programming was so great.  With polymorphism, inheritance, encapsulation, etc., we in essence set up the concepts to help facilitate reuse as much as possible.  And yes, as a developer now of many years, I unquestionably held that belief for ages before it really struck me how jaded my views on reuse have become over the years.  In fact, in many ways Agile rightly lets reuse take a backseat to developing what's needed for the here and now.  It used to be that I was in complete opposition to that view, but more and more I've come to see the logic in it.  Too many times I've seen developers (myself included) get lost in design paralysis trying to come up with the perfect abstraction that would stand the test of time.  Nearly without fail, all of these pieces of code become obsolete in a matter of months or years.

    It's not that I don't like reuse – it's just that reuse is hard.  In fact, reuse is DAMN hard.  Many times it is just a distraction that eats up architect and developer time, and worse yet it can be counter-productive and force wrong decisions.  Now don't get me wrong, I love the idea of reusable code when it makes sense.  Those are the few cases where you are designing something that is inherently reusable.  The problem is, most business-class code is inherently unfit for reuse!

    Furthermore, code that is reusable will often fail to be reused if you don't have the proper framework in place for effective reuse, one that includes standardized versioning, building, releasing, and documenting of the components.  That should always be standard across the board when promoting reusable code.  All of this is hard, and it should only be done when you have code that is truly reusable; otherwise you will be exerting a large amount of development effort for very little bang for your buck.

    But my goal here is not to get into how to reuse (that is a topic unto itself) but what should be reused.  First, let's look at an extension method.  There are many times where I want to kick off a thread to handle a task, and when I want to rein that thread in, of course I want to do a Join on it.  But what if I only want to wait a limited amount of time and then Abort?  Well, I could of course write that logic out by hand each time, but it seemed like a great extension method:

    public static class ThreadExtensions
    {
        public static bool JoinOrAbort(this Thread thread, TimeSpan timeToWait)
        {
            bool isJoined = false;

            if (thread != null)
            {
                isJoined = thread.Join(timeToWait);

                if (!isJoined)
                {
                    thread.Abort();
                }
            }
            return isJoined;
        }
    }

    When I look at this code, I can immediately see things that jump out at me as reasons why this code is very reusable.
    Some of them are standard OO principles, and some are kind-of home-grown litmus tests:

    Single Responsibility Principle (SRP) – The only reason this extension method needs to change is if the Thread class itself changes (one responsibility).
    Stable Dependencies Principle (SDP) – This method only depends on classes that are more stable than it is (System.Threading.Thread), and is itself very stable, hence other classes may safely depend on it. It is also not dependent on any business domain, and thus isn't subject to changes as the business itself changes.
    Open-Closed Principle (OCP) – This class is inherently closed to change.
    Small and Stable Problem Domain – This method only cares about System.Threading.Thread.
    All-or-None Usage – A user of a reusable class should want the functionality of that class, not parts of that functionality.  That's not to say they must use every method, but they shouldn't be using a method just to get half of its result.
    Cost of Reuse vs. Cost to Recreate – Since this class is highly stable and minimally complex, we can offer it up for reuse very cheaply by promoting it as "ready-to-go" and already unit tested (important!) and available through a standard release cycle (very important!).

    Okay, all seems good there, so now let's look at an entity and DAO.  I don't know about you all, but there have been times I've been in organizations that get the grand idea that all DAOs and entities should be standardized and shared.  While this may work for small or static organizations, it's near ludicrous for anything large or volatile.

    namespace Shared.Entities
    {
        public class Account
        {
            public int Id { get; set; }

            public string Name { get; set; }

            public Address HomeAddress { get; set; }

            public int Age { get; set; }

            public DateTime LastUsed { get; set; }

            // etc, etc, etc...
        }
    }

    ...

    namespace Shared.DataAccess
    {
        public class AccountDao
        {
            public Account FindAccount(int id)
            {
                // dao logic to query and return account
            }

            ...

        }
    }

    Now to be fair, I'm not saying there doesn't exist an organization where some entities may be extremely static and unchanging.  But at best such entities and DAOs will be problematic cases of reuse.  Let's examine those same tests:

    Single Responsibility Principle (SRP) – The reasons for these classes to change will be strongly dependent on what the definition of the account is, which can change over time and may have multiple influences depending on the number of systems an account can cover.
    Stable Dependencies Principle (SDP) – This method depends on the data model beneath itself, which in turn is largely dependent on the business definition of an account, which can be inherently unstable.
    Open-Closed Principle (OCP) – This class is not really closed for modification.  Every time the account definition changes, you'd need to modify this class.
    Small and Stable Problem Domain – The definition of an account is inherently unstable and in fact may be very large.  What if you are designing a system that aggregates account information from several sources?
    All-or-None Usage – What if your view of the account encompasses data from 3 different sources but you only care about one of those sources or one piece of data?  Should you have to take the hit of looking up all the other data?  On the other hand, should you have ten different methods returning portions of data in chunks people tend to ask for?  Neither is really a great solution.
    Cost of Reuse vs. Cost to Recreate – DAOs are really trivial to rewrite, and unless your definition of an account is EXTREMELY stable, the cost to promote, support, and release a reusable account entity and DAO is usually far higher than the cost to recreate them as needed.

    It's no accident that my case for reuse was a utility class and my case for non-reuse was an entity/DAO.  In general, the smaller and more stable an abstraction is, the higher its level of reuse.  When I became the lead of the Shared Components Committee at my workplace, one of the original goals we looked at satisfying was to find (or create), version, release, and promote a shared library of common utility classes, frameworks, and data access objects.  Now, of course, many of you will point to nHibernate and Entity Framework for the latter, but we were looking at larger, macro collections of data that span multiple data sources of varying types (databases, web services, etc).

    As we got deeper and deeper into the details of how to manage and release these items, it quickly became apparent that while the case for reuse was typically a slam dunk for utilities and frameworks, the data access objects just didn't "smell" right.  We ended up having session after session of design meetings to try and find the right way to share these data access components.  When someone asked me why it was taking so long to iron out the shared entities, my response was quite simple, "Reuse is hard..."  And that's when I realized that while reuse is an awesome goal and we should strive to make code maintainable, oftentimes you end up creating far more work for yourself than necessary by trying to force code to be reusable that inherently isn't.

    Think about the times you've worked in a company where, in the design session, people fight over the best way to implement a class to make it maximally reusable, extensible, and any other buzzword-able.  Then think about how quickly that design became obsolete.  Many times I set out to do a project and think, "yes, this is the best design, I can extend it easily!" only to find out the business requirements change COMPLETELY in such a way that the design is rendered invalid.  Code, in general, tends to rust and age over time.  As such, writing reusable code can often be difficult, many times ends up being a futile exercise, and worse yet, sometimes makes the code harder to maintain because it obfuscates the design in the name of extensibility or reusability.

    So what do I think are reusable components?

    Generic Utility classes – these tend to be small classes that assist in a task and have no business context whatsoever.
    Implementation Abstraction Frameworks – home-grown frameworks that try to isolate changes to third-party products you may be depending on (like writing a messaging abstraction layer for publishing/subscribing that is independent of whether you use JMS, MSMQ, etc.; a minimal sketch of such a layer appears at the end of this post).
    Simplification and Uniformity Frameworks – To some extent this is similar to an abstraction framework, but here there may be one chosen provider and a development-shop mandate to perform certain complex tasks in a certain way.  Or, perhaps, to simplify and dumb down a complex task for the average developer (such as implementing a particular development shop's method of encryption).

    And what are less reusable?

    Application and Business Layers – tend to fluctuate a lot as requirements change and new features are added, so they tend to be an unstable dependency.  They may be reused across applications but are also very volatile.
    Entities and Data Access Layers – these tend to be tuned to the scope of the application, so reusing them can be hard unless the abstraction is very stable.

    So what's the big lesson?  Reuse is hard.  In fact it's damn hard.  And much of the time I'm not convinced we should focus too hard on it.  If you're designing a utility or framework, then by all means design it for reuse.  But you must also really set down a good versioning, release, and documentation process to maximize your chances.  For anything else, design it to be maintainable and extendable, but don't waste the effort on reusability for something that most likely will be obsolete in a year or two anyway.
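
    As a footnote to the "Implementation Abstraction Frameworks" bullet above, here is a minimal sketch of what such a messaging abstraction might look like. It is written in Java purely for illustration (the post's own examples are C#), and every name in it (MessagePublisher, InMemoryPublisher, PublisherDemo) is hypothetical; the point is only that application code depends on the small, stable interface while the JMS-, MSMQ-, or queue-specific details live behind it in a single, swappable implementation.

    import java.util.ArrayList;
    import java.util.HashMap;
    import java.util.List;
    import java.util.Map;
    import java.util.function.Consumer;

    // The small, stable abstraction that application code depends on.
    interface MessagePublisher {
        void publish(String topic, String payload);
        void subscribe(String topic, Consumer<String> handler);
    }

    // One interchangeable implementation; a JMS- or MSMQ-backed one would implement the same interface.
    final class InMemoryPublisher implements MessagePublisher {
        private final Map<String, List<Consumer<String>>> handlers = new HashMap<>();

        @Override
        public void publish(String topic, String payload) {
            for (Consumer<String> handler : handlers.getOrDefault(topic, List.of())) {
                handler.accept(payload);
            }
        }

        @Override
        public void subscribe(String topic, Consumer<String> handler) {
            handlers.computeIfAbsent(topic, t -> new ArrayList<>()).add(handler);
        }
    }

    public class PublisherDemo {
        public static void main(String[] args) {
            MessagePublisher bus = new InMemoryPublisher(); // swap the implementation here, nowhere else
            bus.subscribe("orders", msg -> System.out.println("received: " + msg));
            bus.publish("orders", "order #42 created");
        }
    }

    Because the interface is tiny and has no business context, it passes the same litmus tests as the JoinOrAbort example: the only reason for it to change is if the underlying publish/subscribe concept itself changes.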

    Read the article

  • CodePlex Daily Summary for Tuesday, May 27, 2014

    CodePlex Daily Summary for Tuesday, May 27, 2014Popular ReleasesAD4 Application Designer for flow based .NET applications: AD4.AppDesigner.18.11: AD4.Iteration.18.11(Rendering of flow chart) Bugfix: AppWrapperClassContainer MakeWrapperBuildInstances MakeFlowClassCtor Note: Currently not all elements are shown in flow chart area! This version is able to generate the source code of some samples. See: Sample Applications (You find the flow description in the documents folder of each sample). The gluing code of the AD4.AppDesigner was created by the previous version of the AD4.AppDesigner. You find the current app definition in t...ClosedXML - The easy way to OpenXML: ClosedXML 0.71.1: More performance improvements. It's faster and consumes less memory.SQL Server Change Data Capture Application: Release 1: This is the initial release of SQLCDCApp. Download the zip file. Unzip and run SQLCDCApp.exe to start.Role Based Views in Microsoft Dynamics CRM 2011: Role Based Views in CRM 2011 and 2013 - 1.1.0.0: Issues fixed in this build: 1. Works for CRM 2013 2. Lookup view not getting blockedMathos Parser: 1.0.6.1 + source code: Removed the bug with comma/dot reported and fixed by Diego.Dynamics AX Build Scripts: DynamicsAXCommunity Powershell module (0.3.0): Change log(previous release was 0.2.4) 0.3.0 AxBuild (http://msdn.microsoft.com/en-us/library/dn528954.aspx) is supported and used by default. Smarter dealing with versions - e.g. listing configurations for all versions in the same time. $AxVersionPreference shouldn't be normally needed. Additional properties returned by Get-AXConfig. 0.2.5 -StartupCmd and -Wait added to Start-AXClient. Handles console output sent from AX (http://dev.goshoom.net/en/2012/05/console-output-ax/).xFunc: xFunc 2.15.6: Fidex #77QuickMon: Version 3.12: This release is mostly just to improve the UI for the Windows client. There are a few minor fixes as well. 1. Polling frequency presets fixed (slow, normal and fast) 2. Added collector call duration to history 3. History now displays time, state, duration and details in separate columns 4. Added a quick launch drop down list to main Window (only visible when mouse hover over it) 5. Removed the toolbar border. 6. Changed Windows service collector to report error only when all services from al...VK.NET - Vkontakte API for .NET: VkNet 1.0.5: ?????????? ????? ??????.Kartris E-commerce: Kartris v2.6002: Minor release: Double check that Logins_GetList sproc is present, sometimes seems to get missed earlier if upgrading which can give error when viewing logins page Added CSV and TXT export option; this is not Google Products compatible, but can give a good base for creating a file for some other systems such as Amazon Fixed some minor combination and options issues to improve interface back and front Turn bitcoin and some other gateways off by default Minor CSS changes Fixed currenc...SimCityPak: SimCityPak 0.3.1.0: Main New Features: Fixed Importing of Instance Names (get rid of the Dutch translations) Added advanced editor for Decal Dictionaries Added possibility to import .PNG to generate new decals Added advanced editor for Path display entriesTiny Deduplicator: Tiny Deduplicator 1.0.1.0: Increased version number to 1.0.1.0 Moved all options to a separate 'Options' dialog window. Allows the user to specify a selection strategy which will help when dealing with large numbers of duplicate files. 
Available options are "None," "Keep First," and "Keep Last"C64 Studio: 3.5: Add: BASIC renumber function Add: !PET pseudo op Add: elseif for !if, } else { pseudo op Add: !TRACE pseudo op Add: Watches are saved/restored with a solution Add: Ctrl-A works now in export assembly controls Add: Preliminary graphic import dialog (not fully functional yet) Add: range and block selection in sprite/charset editor (Shift-Click = range, Alt-Click = block) Fix: Expression evaluator could miscalculate when both division and multiplication were in an expression without parenthesisSEToolbox: SEToolbox 01.031.009 Release 1: Added mirroring of ConveyorTubeCurved. Updated Ship cube rotation to rotate ship back to original location (cubes are reoriented but ship appears no different to outsider), and to rotate Grouped items. Repair now fixes the loss of Grouped controls due to changes in Space Engineers 01.030. Added export asteroids. Rejoin ships will merge grouping and conveyor systems (even though broken ships currently only maintain the Grouping on one part of the ship). Installation of this version wi...Player Framework by Microsoft: Player Framework for Windows and WP v2.0: Support for new Universal and Windows Phone 8.1 projects for both Xaml and JavaScript projects. See a detailed list of improvements, breaking changes and a general overview of version 2 ADDITIONAL DOWNLOADSSmooth Streaming Client SDK for Windows 8 Applications Smooth Streaming Client SDK for Windows 8.1 Applications Smooth Streaming Client SDK for Windows Phone 8.1 Applications Microsoft PlayReady Client SDK for Windows 8 Applications Microsoft PlayReady Client SDK for Windows 8.1 Applicat...TerraMap (Terraria World Map Viewer): TerraMap 1.0.6: Added support for the new Terraria v1.2.4 update. New items, walls, and tiles Added the ability to select multiple highlighted block types. Added a dynamic, interactive highlight opacity slider, making it easier to find highlighted tiles with dark colors (and fixed blurriness from 1.0.5 alpha). Added ability to find Enchanted Swords (in the stone) and Water Bolt books Fixed Issue 35206: Hightlight/Find doesn't work for Demon Altars Fixed finding Demon Hearts/Shadow Orbs Fixed inst...DotNet.Highcharts: DotNet.Highcharts 4.0 with Examples: DotNet.Highcharts 4.0 Tested and adapted to the latest version of Highcharts 4.0.1 Added new chart type: Heatmap Added new type PointPlacement which represents enumeration or number for the padding of the X axis. Changed target framework from .NET Framework 4 to .NET Framework 4.5. Closed issues: 974: Add 'overflow' property to PlotOptionsColumnDataLabels class 997: Split container from JS 1006: Series/Categories with numeric names don't render DotNet.Highcharts.Samples Updated s...PowerShell App Deployment Toolkit: PowerShell App Deployment Toolkit v3.1.3: Added CompressLogs option to the config file. Each Install / Uninstall creates a timestamped zip file with all MSI and PSAppDeployToolkit logs contained within Added variable expansion to all paths in the configuration file Added documentation for each of the Toolkit internal variables that can be used Changed Install-MSUpdates to continue if any errors are encountered when installing updates Implement /Force parameter on Update-GroupPolicy (ensure that any logoff message is ignored) ...WordMat: WordMat v. 1.07: A quick fix because scientific notation was broken in v. 
1.06 read more at http://wordmat.blogspot.com????: 《????》: 《????》(c???)??“????”???????,???????????????C?????????。???????,???????????????????????. ??????????????????????????????????;????????????????????????????。New Projects2112110016: BÀI T?P OOP2112110072: 2112110072 Bai tap OOPA more efficient algorithm for an NP-complete problem: Algorithm for finding Hamiltonian cycle.Asprise OCR for C#/VB.NET Sample Applications: Embeded with a high performance OCR (optical character recognition) engine, Asprise OCR SDK library for Java, VB.NET, CSharp.NET, VC++, VB6.0, C, C++, Delphi onDisenio1Obli1: Obligatorio desarrollado por Santiago Perez y Germán Otero Universidad ORT, Año 2014DuAnCuoiKy: lap trinh windows form cuoi ky, quan ly sinh vien truong hoc student management systemEnterprise Integration and BPM - A WF Integration and BPM blueprint: A CQRS based EAI (Enterprise Application Integration) and BMP (Business Process Management) blueprint using Message Queues and Windows Workflow Foundation.GU.ERP: saLaunchPad2: A remote device timing and control systemNMusicCreator: A simple program for creating music.PPM: ??SharePoint 2013 Latency Health Analyzer Rule: Analyze network latency for all SQL Servers using this custom health analyzer rule.SnapPea: A simple MVC content management system.The Bubble Index: The Bubble Index, a Java (TM) application to measure the level of financial bubbles. Published with GNU General Public License version 2 (GPLv2).Tools for Manufacturing: t4mfg is a set of programs to be used by small to medium sized companies that manufacture goods to order.Unix Project Template: A simple framework for creating unix projects using C++ and make.

    Read the article

  • CodePlex Daily Summary for Monday, January 31, 2011

    CodePlex Daily Summary for Monday, January 31, 2011Popular ReleasesMVC Controls Toolkit: Mvc Controls Toolkit 0.8: Fixed the following bugs: *Variable name error in the jvascript file that prevented the use of the deleted item template of the Datagrid *Now after the changes applied to an item of the DataGrid are cancelled all input fields are reset to the very initial value they had. *Other minor bugs. Added: *This version is available both for MVC2, and MVC 3. The MVC 3 version has a release number of 0.85. This way one can install both version. *Client Validation support has been added to all control...Office Web.UI: Beta preview (Source): This is the first Beta. it includes full source code and all available controls. Some designers are not ready, and some features are not finalized allready (missing properties, draft styles) ThanksASP.net Ribbon: Version 2.2: This release brings some new controls (part of Office Web.UI). A few bugs are fixed and it includes the "auto resize" feature as you resize the window. (It can cause an infinite loop when the window is too reduced, it's why this release is not marked as "stable"). I will release more versions 2.3, 2.4... until V3 which will be the official launch of Office Web.UI. Both products will evolve at the same speed. Thanks.Barcode Rendering Framework: 2.1.1.0: Final release for VS2008 Finally fixed bugs with code 128 symbology.HERB.IQ: HERB.IQ.UPGRADE.0.5.3.exe: HERB.IQ.UPGRADE.0.5.3.exexUnit.net - Unit Testing for .NET: xUnit.net 1.7: xUnit.net release 1.7Build #1540 Important notes for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. This release adds the following new features: Added support for ASP.NET MVC 3 Added Assert.Equal(double expected, double actual, int precision) Ad...DoddleReport - Automatic HTML/Excel/PDF Reporting: DoddleReport 1.0: DoddleReport will add automatic tabular-based reporting (HTML/PDF/Excel/etc) for any LINQ Query, IEnumerable, DataTable or SharePoint List For SharePoint integration please click Here PDF Reporting has been placed into a separate assembly because it requies AbcPdf http://www.websupergoo.com/download.htmSpark View Engine: Spark v1.5: Release Notes There have been a lot of minor changes going on since version 1.1, but most important to note are the major changes which include: Support for HTML5 "section" tag. Spark has now renamed its own section tag to "segment" instead to avoid clashes. You can still use "section" in a Spark sense for legacy support by specifying ParseSectionAsSegment = true if needed while you transition Bindings - this is a massive feature that further simplifies your views by giving you a powerful ...Marr DataMapper: Marr DataMapper 1.0.0 beta: First release.WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.3: Version: 2.0.0.3 (Milestone 3): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. 
You can use ...Rawr: Rawr 4.0.17 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 and on the Version Notes page: http://rawr.codeplex.com/wikipage?title=VersionNotes As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you...Squiggle - A Free open source LAN Messenger: Squiggle 2.5 Beta: In this release following are the new features: Localization: Support for Arabic, French, German and Chinese (Simplified) Bridge: Connect two Squiggle nets across the WAN or different subnets Aliases: Special codes with special meaning can be embedded in message like (version),(datetime),(time),(date),(you),(me) Commands: cls, /exit, /offline, /online, /busy, /away, /main Sound notifications: Get audio alerts on contact online, message received, buzz Broadcast for group: You can ri...VivoSocial: VivoSocial 7.4.2: Version 7.4.2 of VivoSocial has been released. If you experienced any issues with the previous version, please update your modules to the 7.4.2 release and see if they persist. If you have any questions about this release, please post them in our Support forums. If you are experiencing a bug or would like to request a new feature, please submit it to our issue tracker. Web Controls * Updated Business Objects and added a new SQL Data Provider File. Groups * Fixed a security issue whe...PHP Manager for IIS: PHP Manager 1.1.1 for IIS 7: This is a minor release of PHP Manager for IIS 7. It contains all the functionality available in 56962 plus several bug fixes (see change list for more details). Also, this release includes Russian language support. SHA1 codes for the downloads are: PHPManagerForIIS-1.1.0-x86.msi - 6570B4A8AC8B5B776171C2BA0572C190F0900DE2 PHPManagerForIIS-1.1.0-x64.msi - 12EDE004EFEE57282EF11A8BAD1DC1ADFD66A654mojoPortal: 2.3.6.1: see release notes on mojoportal.com http://www.mojoportal.com/mojoportal-2361-released.aspx Note that we have separate deployment packages for .NET 3.5 and .NET 4.0 The deployment package downloads on this page are pre-compiled and ready for production deployment, they contain no C# source code. To download the source code see the Source Code Tab I recommend getting the latest source code using TortoiseHG, you can get the source code corresponding to this release here.Parallel Programming with Microsoft Visual C++: Drop 6 - Chapters 4 and 5: This is Drop 6. It includes: Drafts of the Preface, Introduction, Chapters 2-7, Appendix B & C and the glossary Sample code for chapters 2-7 and Appendix A & B. The new material we'd like feedback on is: Chapter 4 - Parallel Aggregation Chapter 5 - Futures The source code requires Visual Studio 2010 in order to run. There is a known bug in the A-Dash sample when the user attempts to cancel a parallel calculation. We are working to fix this.NodeXL: Network Overview, Discovery and Exploration for Excel: NodeXL Excel Template, version 1.0.1.160: The NodeXL Excel template displays a network graph using edge and vertex lists stored in an Excel 2007 or Excel 2010 workbook. What's NewThis release improves NodeXL's Twitter and Pajek features. See the Complete NodeXL Release History for details. Installation StepsFollow these steps to install and use the template: Download the Zip file. Unzip it into any folder. 
Use WinZip or a similar program, or just right-click the Zip file in Windows Explorer and select "Extract All." Close Ex...Kooboo CMS: Kooboo CMS 3.0 CTP: Files in this downloadkooboo_CMS.zip: The kooboo application files Content_DBProvider.zip: Additional content database implementation of MSSQL, RavenDB and SQLCE. Default is XML based database. To use them, copy the related dlls into web root bin folder and remove old content provider dlls. Content provider has the name like "Kooboo.CMS.Content.Persistence.SQLServer.dll" View_Engines.zip: Supports of Razor, webform and NVelocity view engine. Copy the dlls into web root bin folder to enable...UOB & ME: UOB ME 2.6: UOB ME 2.6????: ???? V1.0: ???? V1.0 ??New ProjectsAuto Complete Control for ASP.NET: Autocomplete Control is a fully functional ASP.NET control for word suggestions and autocomplete. We had been using Ajax Control Toolkit AutoComplete Extender in our projects before, but we have needed some extra features and functionalities.Cours ESIEE: MAJ des cours ESIEE depuis la plateforme Icampus et autres documentsEngineering World Expenses: Demo expenses application for Engineering World 2011Entity Framework / Linq to Sql Poco Code Generator: Poco Orm data access layer (Dto) code generator for Entity Framework and Linq to Sql. Customizable code generation via simple templating system. Utilizes Managed Extensibility Framework (MEF) in order for application parts to dynamically composed and plug-able.linqish.py: Python module for manipulating iterables. An implementation of the .Net Framework's Linq to Objects for Python.Machinekey setter: This code sample is Windows Azure SDK 1.3 custom plugin. This sample do working at set custom key to machinekey of web.config file in your WebRole.MapReduce.NET: MapReduce.NET intends to implement the original paper proposed by Google on MapReduce.Marr DataMapper: Marr DataMapper provides a fast and easy to use wrapper around ADO.NET that enables you to focus more on your data access queries without having to write plumbing code. Load one-to-one, one-to-many, and hierarchical entity models with ease. No special base class required.Orchard Silverlight: Orchard module enabling embedding Silverlight applications and creating Silverlight-based content.RouteMagic: Library of useful routing helpers and classes.Smart Skelta Utilites: Smart Skelta Utilies will provide utilties like Visual Studio 2008 Skelta Starter Kit(Project Templates and Project Item Templates),Code Snippets for Skelta Components,Skleta Attachment Extracter Web based Logger,Skelta Server utility and others for skelta based development.Solfix: Solfix is a programming language tbat is work-in-progress, but it has a lot of functionality! You can make applications for console to windows applications. The main point of Solfix is to make coding easier and less time than before.SQLite Manager: A minimal manage for sqlite databases.State Search: StateSearch provides state search algoritms such as A*, IDA*, BestFirst, etc to solve problems such as puzzles and/or path searchingTable Check Custom Field Type: SharePoint Custom Field Type for displaying a list of values with checkboxes and people editors.testsgb: testWindows Phone 7 Extension Framework: An extension method framework for Windows Phone 7 to make your code more fluent and adding a lot of common functions you don't need to reproduce.

    Read the article

  • Ubuntu 12.04 does not see Windows already installed on my computer (dual installation)

    - by jacinta
    I was trying to install Ubuntu 12.04 alongside Windows 7 on my new HP Pavilion 64k desktop that came with Windows 7, but Ubuntu said "This computer has no detected operating system". Someone suggested: "I suggest you chkdsk your Windows partition. I also suggest you resize the NTFS in Windows then install Ubuntu to the free space." So I did what the Windows help describes for shrinking a simple or spanned volume: in Disk Management, right-click the volume you want to shrink, click Shrink Volume, and follow the instructions on screen. When I tried to install Ubuntu 12.04 again after doing this, I received the same error. I was going to undo what I did, but I see that I lose 1 GB when I do that, so now what do I do? It says I can create a new simple volume, and maybe then the space will no longer be unallocated. Please help me. I think I have a bad CD (Ubuntu 12.04), because from my research I see that I am not supposed to get a screen saying "This computer has no detected operating system". I hope I did not mess up my computer.

OK, I think I am following what you said about how to edit my question, irrational john. I did chkdsk as you and actionparsnip (andrew-woodhead666) told me to, and also did a lot of other things before I found out how to run chkdsk. No problems, thank you. Then I put back the (extended) space I had taken from the System partition; I was still only able to put back 15 and not 16, so it is at 99 MB, not back to 100 MB. Then I shrank HP (C:) as you told me, by 13,240 MB, which shows as 12.93 GB of unallocated space. I did not turn it into NTFS with the New Simple Volume action; I just left it. Then I tried to install from the Ubuntu 12.04 amd64 live CD and it gave me the result it sometimes gave before: Ubuntu does not tell me whether or not I already have Windows 7 installed. It just goes to a window that should have shown me information about what I have, with "Device for boot loader installation: /dev/sda" at the bottom and the options to go Back, Quit, or Install (I think it is the Installation Type window). So I did what I have been doing and chose Quit. What do I do now? Sorry that it seems like I cannot do anything on my own.

In the YouTube video "how to install ubuntu dual-boot alongside windows", Ubuntu installs so easily: the installation options page gives three options, including dual installation, and the disk even has a slider to choose the partition size you want. Yet my Ubuntu live CD is a mess, and when I checked it, as one of you told me to, it came back as good. The guy in that video also says you should press a key to tell the computer which device to boot from before that screen comes up; I guess that is because the video is old. This page also shows easy steps that do not show up on my CD: how to dual-boot UBUNTU and windows 7.

P.S. I saw this on the Windows 7 website, windows.microsoft.com/en-US/windows7/Formatting-disks-and-drives-frequently-asked-questions (CREATE A BOOT PARTITION; I had to leave out the http parts because I am only allowed two links per post). It said: To create a boot partition. Warning: If you are installing different versions of Windows, you must install the earliest version first. If you don't do this, your computer may become inoperable.
Open Computer Management by clicking the Start button Picture of the Start button, clicking Control Panel, clicking System and Security, clicking Administrative Tools, and then double-clicking Computer Management.? Administrator permission required If you're prompted for an administrator password or confirmation, type the password or provide confirmation. In the left pane, under Storage, click Disk Management. Right-click an unallocated region on your hard disk, and then click New Simple Volume. In the New Simple Volume Wizard, click Next. Type the size of the volume you want to create in megabytes (MB) or accept the maximum default size, and then click Next. Accept the default drive letter or choose a different drive letter to identify the volume, and then click Next. In the Format Partition dialog box, do one of the following: If you don't want to format the volume right now, click Do not format this volume, and then click Next. To format the volume with the default settings, click Next. For more information about formatting, see Formatting disks and drives: frequently asked questions. Review your choices, and then click Finish. AND THIS ON ANOTHER PAGE. Formatting disks and drives: frequently asked questions Hard disks, the primary storage devices on your computer, need to be formatted before you can use them. When you format a disk, you configure it with a file system so that Windows can store information on the disk. Hard disks in new computers running Windows are already formatted. If you buy an additional hard disk to expand the storage of your computer, you might need to format it. Storage devices such as USB flash drives and flash memory cards usually come preformatted by the manufacturer, so you probably won't need to format them. CDs and DVDs, on the other hand, use different formats from hard disks and removable storage devices. For information about formatting CDs and DVDs, see Which CD or DVD format should I use? Warning Warning Formatting erases any existing files on a hard disk. If you format a hard disk that has files on it, the files will be deleted. WHAT I DID WAS I GOT TO COMPUTER MANAGEMENT SECTION THEN I CLICKED ON DRIVE HP(C) (it put stripes on to show it is selected) Then I click on ACTION selected ALL TASKS AND THEN selected SHRINK VOLUME and then chose how much space from what it was giving me that I wanted. (12.93gb) AND THAT WAS ALL I DID. THEN I TRIED TO INSTALL UBUNTU i NEVER GOT THE 3RD SCREEN THAT IS IN THE VIDEO I INCLUDED (THE YOUTUBE WITH THE ENGLISH GUY) INSTALLATION TYPE I ALSO DID NOT GET THE 4TH SCREEN THAT ALLOWS YOU TO SELECT PARTITION SIZE what i got next was the 2nd INSTILLATION TYPE window shown on the (LINUX BS DOS.COM) PAGE THAT I INCLUDED and it showed no information about any drives (no drives /partition or stuff was shown) only the Boot Loader statement and the dev/sda bar and that's why i did not press install but chose to QUIT. SORRY I JUST NOW SAW YOUR ANSWER IRRATIONAL JOHN. I SHRANK HP(C) BY 12.93GB MY UNALLOCATED SPACE IS NOW 12.93GB HP(C) = 907.17gb NTSF...YOU ARE CORRECT WITH EVERYTHING YOU SAID This is what i read on (http://)windows.microsoft.com/en-US/windows7/Create-a-boot-partition I am only allowed 2 links Create a boot partition You must be logged on as an administrator to perform these steps. A boot partition is a partition that contains the files for the Windows operating system. 
If you want to install a second operating system on your computer (called a dual-boot or multiboot configuration), you need to create another partition on the hard disk, and then install the additional operating system on the new partition. Your hard disk would then have one system partition and two boot partitions. (A system partition is the partition that contains the hardware-related files. These tell the computer where to look to start Windows.) To create a partition on a basic disk, there must be unallocated disk space on your hard disk. With Disk Management, you can create a maximum of three primary partitions on a hard disk. You can create extended partitions, which include logical drives within them, if you need more partitions on the disk. Picture of disk space in Computer ManagementUnallocated disk space If there is no unallocated space, you will either need to create space by shrinking or deleting an existing partition or by using a third-party partitioning tool to repartition your hard disk. For more information, see Can I repartition my hard disk? To create a boot partition Warning Warning If you are installing different versions of Windows, you must install the earliest version first. If you don't do this, your computer may become inoperable. Open Computer Management by clicking the Start button Picture of the Start button, clicking Control Panel, clicking System and Security, clicking Administrative Tools, and then double-clicking Computer Management.? Administrator permission required If you're prompted for an administrator password or confirmation, type the password or provide confirmation. In the left pane, under Storage, click Disk Management. Right-click an unallocated region on your hard disk, and then click New Simple Volume. In the New Simple Volume Wizard, click Next. Type the size of the volume you want to create in megabytes (MB) or accept the maximum default size, and then click Next. Accept the default drive letter or choose a different drive letter to identify the volume, and then click Next. In the Format Partition dialog box, do one of the following: If you don't want to format the volume right now, click Do not format this volume, and then click Next. To format the volume with the default settings, click Next. For more information about formatting, see Formatting disks and drives: frequently asked questions. Review your choices, and then click Finish. I did what you told me @irrational john and this is the screen shot. I ENTERED ubuntu@ubuntu:~$ sudo os-prober computer did not respond so I entered ubuntu@ubuntu:~$ sudo apt-get -y remove dmraid computer responded with Reading package lists... Done Building dependency tree Reading state information... Done The following packages will be REMOVED: dmraid 0 upgraded, 0 newly installed, 1 to remove and 0 not upgraded. After this operation, 141 kB disk space will be freed. (Reading database ... 147515 files and directories currently installed.) Removing dmraid ... update-initramfs is disabled since running on read-only media Processing triggers for man-db ... I entered ubuntu@ubuntu:~$ sudo os-prober Computer Responded with /dev/sda1:Windows 7 (loader):Windows:chain /dev/sda3:Windows Recovery Environment (loader):Windows1:chain ubuntu@ubuntu:~$ ............... @obsessiveFOSS I don't know what is a Grub menu and I do not know what is the Ubuntu boot option The answer you gave to me was correct. This one {This apparently removes the dmraid metadata. 
After doing that, you can use the desktop icon Install Ubuntu 12.04 LTS to start the Ubuntu installer. This time the Installation Type window should contain the option to Install Ubuntu alongside Windows 7.} That is what I decided to do. I did not see the rest of your help until now. Nevertheless, I think the best thing for me to do now is to get a cheap used laptop and either do a dual installation on it or just install Ubuntu on it. That way, if I have an issue I cannot solve, like the one I had here, I will at least still have a usable computer to work on and to use for getting answers, because I am not an expert like the people on this forum. Thanks a lot; I will try to keep learning and do enough research to some day help someone else.
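For reference, here is a small shell sketch that collects the live-CD terminal commands already quoted in this thread; it adds nothing beyond what the posters ran. os-prober only reports what it detects, and removing the dmraid package changes only the running live session.

#!/bin/bash
# Run from the Ubuntu 12.04 live CD session, as described in the thread above.

# Ask the installer's probe which operating systems it can see.
sudo os-prober

# If fakeRAID metadata is hiding Windows from the installer, remove dmraid
# from the live session and probe again (this is the step after which
# os-prober reported Windows 7 for the original poster).
sudo apt-get -y remove dmraid
sudo os-prober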

    Read the article

  • unexpected EOF and end of document

    - by WASasquatch
    I have been fiddling with this code for a few days. Mind you I am a beginner. I just want to get my script to be able to download a remote file, and scan MineCraft plugins. I got the scan plugins to work, but I'm having two other issues. One, I can't get the mc_addplugin to work correctly, and I get a Unexpected EOF and unexpected end of document when running any other command besides mc_scanplugins or mc_start bash: -c: line 0: unexpected EOF while looking for matching `"' bash: -c: line 1: syntax error: unexpected end of file Help would be so much appreciated! Thanks in advance. #!/bin/bash # /etc/init.d/craftbukkit # version 0.9.1 2012-07-06 (YYYY-MM-DD) ### BEGIN INIT INFO # Provides: craftbukkit # Required-Start: $local_fs $remote_fs # Required-Stop: $local_fs $remote_fs # Should-Start: $network # Should-Stop: $network # Default-Start: 2 3 4 5 # Default-Stop: 0 1 6 # Short-Description: Starts craftbukkit server # Description: Starts and controls the craftbukkit server ### END INIT INFO # SETTINGS SERVICE='craftbukkit-1.2.5-R1.0.jar' OPTIONS='nogui' USERNAME='smith' # LIST ALL THE WORLDS IN YOUR CRAFTBUKKIT SERVER FOLDER WORLDS[1]='world' WORLDS[2]='world_nether' WORLDS[3]='world_the_end' WORLDS[4]='flat_world' MCPATH='/var/www/servers/Foundation' PLUGINSPATH='/var/www/servers/Foundation/plugins' TEMPPLUGINS='/var/www/servers/Foundationplugins/temp_plugins' BACKUPPATH='/var/www/servers/Foundation/backup' CPU_COUNT=2 INVOCATION="java -Xmx2024M -Xms2024M -XX:+UseConcMarkSweepGC -XX:+CMSIncrementalPacing -XX:ParallelGCThreads=$CPU_COUNT -XX:+AggressiveOpts -jar $SERVICE $OPTIONS" ME=`whoami` as_user() { if [ $ME == $USERNAME ] ; then bash -c "$1" else su - $USERNAME -c "$1" fi } mc_start() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is already running!" else echo "Starting $SERVICE..." cd $MCPATH as_user "cd $MCPATH && screen -dmS craftbukkit $INVOCATION" sleep 7 if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is now running." else echo "Error! Could not start $SERVICE!" fi fi } mc_saveoff() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running... suspending saves" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"say The server is preforming a backup. Server going to read-only mode. Do not build...\"\015'" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"save-off\"\015'" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"save-all\"\015'" sync sleep 10 else echo "$SERVICE is not running. Not suspending saves." fi } mc_save() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running... Saving worlds..." as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"save-all\"\015'" sync sleep 10 echo "Save complete!" else echo "$SERVICE is not running. Cannot save!" fi } mc_saveon() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running... re-enabling saves" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"save-on\"\015'" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"say Server backup has completed. Server going to read-write mode. You can now continue building...\"\015'" else echo "$SERVICE is not running. Not resuming saves." fi } mc_stop() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "Stopping $SERVICE" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"say $SERVERNAME is shutting down in 30 seconds! Please stop what you are doing. 
Check back later, we'll be back!\"\015'" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"save-all\"\015'" sleep 30 as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"stop\"\015'" sleep 7 else echo "$SERVICE was not running." fi if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "Error! $SERVICE could not be stopped." else echo "$SERVICE is stopped." fi } mc_update() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running! Will not start update." else MC_SERVER_URL=http://dl.bukkit.org/latest-rb/craftbukkit.jar as_user "cd $MCPATH && wget -q -O $MCPATH/craftbukkit_server.jar.update $MC_SERVER_URL" if [ -f $MCPATH/craftbukkit_server.jar.update ] then if `diff $MCPATH/$SERVICE $MCPATH/craftbukkit_server.jar.update >/dev/null` then echo "You are already running the latest version of $SERVICE. Update anyway? [Y/n]" select yn in "Yes" "No"; do case $yn in Yes ) as_user "mv $MCPATH/$SERVICE $MCPATH/${SERVICE}_old.jar" as_user "mv $MCPATH/craftbukkit_server.jar.update $MCPATH/$SERVICE" echo "$SERVICE updated successfully!"; break;; No ) echo "The update was not installed! Removing temporary files and exiting..." as_user "rm $MCPATH/craftbukkit_server.jar.update" exit;; esac done else as_user "mv $MCPATH/$SERVICE $MCPATH/${SERVICE}_old.jar" as_user "mv $MCPATH/craftbukkit_server.jar.update $MCPATH/$SERVICE" echo "$SERVICE updated successfully!" fi else echo "$SERVICE update could not be downloaded." fi fi } mc_addplugin() { if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running! Please stop the service before adding a plugin." else echo "Paste the URL to the .JAR Plugin..." read JARURL JARNAME=$(basename "$JARURL") if [ -d "$TEMPPLUGINS" ] then as_user "cd $PLUGINSPATH && wget -r -A.jar $JARURL -o temp_plugins/$JARNAME" else as_user "cd $PLUGINSPATH && mkdir $TEMPPLUGINS && wget -r -A.jar $JARURL -o temp_plugins/$JARNAME" fi if [ -f "$TMPDIR/$JARNAME" ] then if [ -f "$PLUGINSPATH/$JARNAME" ] then if `diff $PLUGINSPATH/$JARNAME $TMPDIR/$JARNAME >/dev/null` then echo "You are already running the latest version of $JARNAME." else NOW=`date "+%Y-%m-%d_%Hh%M"` echo "Are you sure you want to overwrite this plugin? [Y/n]" echo "Note: Your old plugin will be moved to the "$TEMPPLUGINS" folder with todays date." select yn in "Yes" "No"; do case $yn in Yes ) as_user "mv $PLUGINSPATH/$JARNAME $TEMPPLUGINS/${JARNAME}_${NOW} && mv $TEMPPLUGINS/$JARNAME $PLUGINSPATH/$JARNAME"; break;; No ) echo "The plugin has not been installed! Removing temporary plugin and exiting..." as_user "rm $TEMPPLUGINS/$JARNAME"; exit;; esac done echo "Would you like to start the $SERVICE now? [Y/n]" select yn in "Yes" "No"; do case $yn in Yes ) mc_start; break;; No ) "$SERVICE not running! To start the service run: /etc/init.d/craftbukkit start"; exit;; esac done fi else echo "Are you sure you want to add this new plugin? [Y/n]" select yn in "Yes" "No"; do case $yn in Yes ) as_user "mv $PLUGINSPATH/$JARNAME $TEMPPLUGINS/${JARNAME}_${NOW} && mv $TEMPPLUGINS/$JARNAME $PLUGINSPATH/$JARNAME"; break;; No ) echo "The plugin has not been installed! Removing temporary plugin and exiting..." as_user "rm $TEMPPLUGINS/$JARNAME"; exit;; esac done echo "Would you like to start the $SERVICE now? [Y/n]?" select yn in "Yes" "No"; do case $yn in Yes ) mc_start; break;; No ) "$SERVICE not running! To start the service run: /etc/init.d/craftbukkit start"; exit;; esac done fi else echo "Failed to download the plugin from the URL you specified!" 
exit; fi fi } mc_scanplugins() { if [ "$(ls -A $PLUGINSPATH)" ] then shopt -s nullglob PLUGINS=($PLUGINSPATH/*.jar) i=1 for f in "${PLUGINS[@]}" do echo "${i}: $f" PLUGIN[$i]=$f i=$(( $i + 1 )) done echo "Enter the ID of a plugin you want removed, or any other key to cancel." read INPUT if [ ! -z "${INPUT##*[!0-9]*}" ] then if [ -f "${PLUGIN[INPUT]}" ] then echo "Removing plugin..." JAR=$(basename ${PLUGIN[INPUT]}) JARNAME=${JAR%.jar} as_user "rm -f ${PLUGIN[INPUT]}" sleep 2 as_user "cd $PLUGINSPATH && rm -rf ./${JARNAME}" if [ -f "${PLUGINSPATH}/${JARNAME}" ] then echo "Plugin folder could not be removed..." fi echo "Plugin removed." else echo "${PLUGIN[INPUT]}" echo "Invalid plugin! Does not exist! Canceling..." exit; fi else echo "Canceling..." exit; fi else echo "You have no plugins installed." exit; fi } mc_backup() { mc_saveoff for i in "${WORLDS[@]}"; do NOW=`date "+%Y-%m-%d_%Hh%M"` BACKUP_FILE="$BACKUPPATH/${i}_${NOW}.tar" echo "Backing up world: $i..." #as_user "cd $MCPATH && cp -r $i $BACKUPPATH/${i}_`date "+%Y.%m.%d_%H.%M""` as_user "tar -C \"$MCPATH\" -cf \"$BACKUP_FILE\" $i" done echo "Backing up $SERVICE" as_user "tar -C \"$MCPATH\" -rf \"$BACKUP_FILE\" $SERVICE" #as_user "cp \"$MCPATH/$SERVICE\" \"$BACKUPPATH/craftbukkit_server_${NOW}.jar\"" mc_saveon echo "Compressing backup..." as_user "tar -cvzf $BACKUPPATH/server_backup_${NOW}.tar.gz $MCPATH" echo "Backup has completed successfully." } mc_command() { command="$1"; if pgrep -u $USERNAME -f $SERVICE > /dev/null then pre_log_len=`wc -l "$MCPATH/server.log" | awk '{print $1}'` echo "$SERVICE is running... executing command" as_user "screen -p 0 -S craftbukkit -X eval 'stuff \"$command\"\015'" sleep .1 # assumes that the command will run and print to the log file in less than .1 seconds # print output tail -n $[`wc -l "$MCPATH/server.log" | awk '{print $1}'`-$pre_log_len] "$MCPATH/server.log" fi } #Start-Stop here case "$1" in start) mc_start ;; stop) mc_stop ;; restart) mc_stop mc_start ;; save) mc_save ;; update) mc_stop mc_backup mc_update mc_start ;; scanplugins) mc_scanplugins ;; addplugin) mc_addplugin ;; backup) mc_backup ;; status) if pgrep -u $USERNAME -f $SERVICE > /dev/null then echo "$SERVICE is running." else echo "$SERVICE is not running." fi ;; command) if [ $# -gt 1 ]; then shift mc_command "$*" else echo "Must specify server command (try 'help'?)" fi ;; *) echo "Usage: $0 {start|stop|update|backup|status|restart|command \"server command\"}" exit 1 ;; esac exit 0
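A side note for anyone debugging the same script: one plausible source of the "unexpected EOF while looking for matching" error is the apostrophe in "we'll" inside the mc_stop shutdown broadcast. Everything handed to as_user is re-parsed by bash -c (or su -c), and in that second parse the stray single quote unbalances the quoting. This is only a guess from reading the script, not a tested fix; the sketch below reproduces the symptom with echo standing in for the screen command so it can be run anywhere, and the message text is shortened for illustration.

#!/bin/bash
# Minimal reproduction of the quoting problem, independent of CraftBukkit or screen.

as_user() {
    # Same pattern as the init script: the whole command string is re-parsed by a new shell.
    bash -c "$1"
}

# Fails with "unexpected EOF while looking for matching" errors: the apostrophe in
# "we'll" ends the single-quoted argument early, leaving an unterminated double quote.
as_user "echo 'stuff \"say Shutting down in 30 seconds. Check back later, we'll be back!\"\015'"

# Works: avoid the bare apostrophe (here by writing "we will") so the quotes stay balanced.
as_user "echo 'stuff \"say Shutting down in 30 seconds. Check back later, we will be back!\"\015'"

Separately, mc_addplugin downloads with wget ... -o temp_plugins/$JARNAME; with wget, lowercase -o names the log file while uppercase -O names the saved document, so the jar may never land where the later checks look for it. Again, that is an untested observation from reading the script.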

    Read the article

  • Building Publishing Pages in Code

    - by David Jacobus
    Originally posted on: http://geekswithblogs.net/djacobus/archive/2013/10/27/154478.aspx

One of the mantras we developers try to follow: ensure that the solution package we deliver to the client is complete. We build web parts, master pages, images, CSS files and other artifacts that we push to the client with a WSP (solution package), and then we have them finish the solution by building their site pages and adding the web parts to those pages. I am a proponent that we, the developers, should minimize this time-consuming work and build these site pages in code. I found a few blogs and some MSDN documentation, but not really a complete solution that has all these artifacts working together. What I will discuss and provide a solution for is a package that has:

1. Master Page
2. Page Layout
3. Page Web Parts
4. Site Pages

Almost all of it is done in code, without the development team having to finish up the site-building process by spending a few hours or days completing the site. I am not implying that we skip this in development; in fact, we build these pages incrementally while testing our web parts and so on. I am saying that the final action in our solution is to take all these artifacts and add them to the site pages in code; the client then only needs to activate a few features and, voila, their site appears. I had a project that required me to build eight pages like this as part of the solution.

In this blog post, I am starting from a master page solution I call DJGreenMaster. On my Office 365 development site it is a generic master page for a SharePoint 2010 site, along with a three-column layout, centered, with a footer that uses a SharePoint list and web part for the footer links. I use this master page a lot in my site development; it is easy to change the color and site logo with a little CSS. I am going to add a few web parts for discussion purposes and then add those web parts to a site page in code. Let's look at the solution package for DJGreenMaster, as it will be the basis project for building the site pages. What you are seeing is a complete solution for adding a master page to a site collection, which contains:

1. A Master Page module which contains the Master Page and Page Layout
2. A Footer module to add the Footer Web Part
3. Miscellaneous modules to add images, jQuery, CSS and a subsite page
4. Three features and two feature event receivers:
   a. DJGreenCSS, used to add the master page CSS file to the Style Sheet Library, with an event receiver to check it in
   b. DJGreenMaster, used to add the Master Page and Page Layout; its event receiver changes the site's master page to DJGreenMaster, creates the footer list and checks the files in
   c. DJGreenMasterWebParts, which adds the Footer Web Part to the site collection

I won't go over the code for this, as I will give it to you at the end of this blog post; I have discussed creating a list in code in a previous post. So we have the basis to begin what is germane to this discussion: the first two requirements are completed, and I now need to add the page web parts and build the pages in code. For the page web parts, I will use two: the Weather Web Part, downloaded from CodePlex (which, for simplicity, does not use a SharePoint custom list), and a SharePoint custom calendar web part downloaded from MSDN, to which I added some jQuery so that color-coded events can exceed the built-in 10 calendar overlays.
Here is the solution with the added projects:     Here is a screen shot of the Weather Web Part Deployed:   Here is a screen shot of the Site Calendar with JQuery:     Okay, Now we get to the final item:  To create Publishing pages.   We need to add a feature receiver to the DJGreenMaster project I will name it DJSitePages and also add a Event Receiver:       We will build the page at the site collection level and all of the code necessary will be contained in the event receiver.   Added a reference to the Microsoft.SharePoint.Publishing.dll contained in the ISAPI folder of the 14 Hive.   First we will add some static methods from which we will call  in our Event Receiver:   1: private static void checkOut(string pagename, PublishingPage p) 2: { 3: if (p.Name.Equals(pagename, StringComparison.InvariantCultureIgnoreCase)) 4: { 5: 6: if (p.ListItem.File.CheckOutType == SPFile.SPCheckOutType.None) 7: { 8: p.CheckOut(); 9: } 10:   11: if (p.ListItem.File.CheckOutType == SPFile.SPCheckOutType.Online) 12: { 13: p.CheckIn("initial"); 14: p.CheckOut(); 15: } 16: } 17: } 18: private static void checkin(PublishingPage p,PublishingWeb pw) 19: { 20: SPFile publishFile = p.ListItem.File; 21:   22: if (publishFile.CheckOutType != SPFile.SPCheckOutType.None) 23: { 24:   25: publishFile.CheckIn( 26:   27: "CheckedIn"); 28:   29: publishFile.Publish( 30:   31: "published"); 32: } 33: // In case of content approval, approve the file need to add 34: //pulishing site 35: if (pw.PagesList.EnableModeration) 36: { 37: publishFile.Approve("Initial"); 38: } 39: publishFile.Update(); 40: }   In a Publishing Site, CheckIn and CheckOut  are required when dealing with pages in a publishing site.  Okay lets look at the Feature Activated Event Receiver: 1: public override void FeatureActivated(SPFeatureReceiverProperties properties) 2: { 3:   4:   5:   6: object oParent = properties.Feature.Parent; 7:   8:   9:   10: if (properties.Feature.Parent is SPWeb) 11: { 12:   13: currentWeb = (SPWeb)oParent; 14:   15: currentSite = currentWeb.Site; 16:   17: } 18:   19: else 20: { 21:   22: currentSite = (SPSite)oParent; 23:   24: currentWeb = currentSite.RootWeb; 25:   26: } 27: 28:   29: //create the publishing pages 30: CreatePublishingPage(currentWeb, "Home.aspx", "ThreeColumnLayout.aspx","Home"); 31: //CreatePublishingPage(currentWeb, "Dummy.aspx", "ThreeColumnLayout.aspx","Dummy"); 32: }     Basically we are calling the method Create Publishing Page with parameters:  Current Web, Name of the Page, The Page Layout, Title of the page.  
Let’s look at the Create Publishing Page method:   1:   2: private void CreatePublishingPage(SPWeb site, string pageName, string pageLayoutName, string title) 3: { 4: PublishingSite pubSiteCollection = new PublishingSite(site.Site); 5: PublishingWeb pubSite = null; 6: if (pubSiteCollection != null) 7: { 8: // Assign an object to the pubSite variable 9: if (PublishingWeb.IsPublishingWeb(site)) 10: { 11: pubSite = PublishingWeb.GetPublishingWeb(site); 12: } 13: } 14: // Search for the page layout for creating the new page 15: PageLayout currentPageLayout = FindPageLayout(pubSiteCollection, pageLayoutName); 16: // Check or the Page Layout could be found in the collection 17: // if not (== null, return because the page has to be based on 18: // an excisting Page Layout 19: if (currentPageLayout == null) 20: { 21: return; 22: } 23:   24: 25: PublishingPageCollection pages = pubSite.GetPublishingPages(); 26: foreach (PublishingPage p in pages) 27: { 28: //The page allready exists 29: if ((p.Name == pageName)) return; 30:   31: } 32: 33:   34:   35: PublishingPage newPage = pages.Add(pageName, currentPageLayout); 36: newPage.Description = pageName.Replace(".aspx", ""); 37: // Here you can set some properties like: 38: newPage.IncludeInCurrentNavigation = true; 39: newPage.IncludeInGlobalNavigation = true; 40: newPage.Title = title; 41: 42: 43:   44:   45: 46:   47: //build the page 48:   49: 50: switch (pageName) 51: { 52: case "Homer.aspx": 53: checkOut("Courier.aspx", newPage); 54: BuildHomePage(site, newPage); 55: break; 56:   57:   58: default: 59: break; 60: } 61: // newPage.Update(); 62: //Now we can checkin the newly created page to the “pages” library 63: checkin(newPage, pubSite); 64: 65: 66: }     The narrative in what is going on here is: 1.  We need to find out if we are dealing with a Publishing Web.  2.  Get the Page Layout 3.  Create the Page in the pages list. 4.  Based on the page name we build that page.  (Here is where we can add all the methods to build multiple pages.) In the switch we call Build Home Page where all the work is done to add the web parts.  Prior to adding the web parts we need to add references to the two web part projects in the solution. using WeatherWebPart.WeatherWebPart; using CSSharePointCustomCalendar.CustomCalendarWebPart;   We can then reference them in the Build Home Page method.   
Let’s look at Build Home Page: 1:   2: private static void BuildHomePage(SPWeb web, PublishingPage pubPage) 3: { 4: // build the pages 5: // Get the web part manager for each page and do the same code as below (copy and paste, change to the web parts for the page) 6: // Part Description 7: SPLimitedWebPartManager mgr = web.GetLimitedWebPartManager(web.Url + "/Pages/Home.aspx", System.Web.UI.WebControls.WebParts.PersonalizationScope.Shared); 8: WeatherWebPart.WeatherWebPart.WeatherWebPart wwp = new WeatherWebPart.WeatherWebPart.WeatherWebPart() { ChromeType = PartChromeType.None, Title = "Todays Weather", AreaCode = "2504627" }; 9: //Dictionary<string, string> wwpDic= new Dictionary<string, string>(); 10: //wwpDic.Add("AreaCode", "2504627"); 11: //setWebPartProperties(wwp, "WeatherWebPart", wwpDic); 12:   13: // Add the web part to a pagelayout Web Part Zone 14: mgr.AddWebPart(wwp, "g_685594D193AA4BBFABEF2FB0C8A6C1DD", 1); 15:   16: CSSharePointCustomCalendar.CustomCalendarWebPart.CustomCalendarWebPart cwp = new CustomCalendarWebPart() { ChromeType = PartChromeType.None, Title = "Corporate Calendar", listName="CorporateCalendar" }; 17:   18: mgr.AddWebPart(cwp, "g_20CBAA1DF45949CDA5D351350462E4C6", 1); 19:   20:   21: pubPage.Update(); 22:   23: } Here is what we are doing: 1.  We got  a reference to the SharePoint Limited Web Part Manager and linked/referenced Home.aspx  2.  Instantiated the a new Weather Web Part and used the Manager to add it to the page in a web part zone identified by ID,  thus the need for a Page Layout where the developer knows the ID’s. 3.  Instantiated the Calendar Web Part and used the Manager to add it to the page. 4. We the called the Publishing Page update method. 5.  Lastly, the Create Publishing Page method checks in the page just created.   Here is a screen shot of the page right after a deploy!       Okay!  I know we could make a home page look much better!  However, I built this whole Integrated solution in less than a day with the caveat that the Green Master was already built!  So what am I saying?  Build you web parts, master pages, etc.  At the very end of the engagement build the pages.  The client will be very happy!  Here is the code for this solution Code

    Read the article

  • A tale from a Stalker

    - by Peter Larsson
    Today I thought I should write something about a stalker I've got. Don't get me wrong, I have way more fans than stalkers, but this stalker is particular persistent towards me. It all started when I wrote about Relational Division with Sets late last year(http://weblogs.sqlteam.com/peterl/archive/2010/07/02/Proper-Relational-Division-With-Sets.aspx) and no matter what he tried, he didn't get a better performing query than me. But this I didn't click until later into this conversation. He must have saved himself for 9 months before posting to me again. Well... Some days ago I get an email from someone I thought i didn't know. Here is his first email Hi, I want a proper solution for achievement the result. The solution must be standard query, means no using as any native code like TOP clause, also the query should run in SQL Server 2000 (no CTE use). We have a table with consecutive keys (nbr) that is not exact sequence. We need bringing all values related with nearest key in the current key row. See the DDL: CREATE TABLE Nums(nbr INTEGER NOT NULL PRIMARY KEY, val INTEGER NOT NULL); INSERT INTO Nums(nbr, val) VALUES (1, 0),(5, 7),(9, 4); See the Result: pre_nbr     pre_val     nbr         val         nxt_nbr     nxt_val ----------- ----------- ----------- ----------- ----------- ----------- NULL        NULL        1           0           5           7 1           0           5           7           9           4 5           7           9           4           NULL        NULL The goal is suggesting most elegant solution. I would like see your best solution first, after that I will send my best (if not same with yours)   Notice there is no name, no please or nothing polite asking for my help. So, on the top of my head I sent him two solutions, following the rule "Work on SQL Server 2000 and only standard non-native code".     
-- Peso 1
SELECT      pre_nbr,
            (
                SELECT  x.val
                FROM    dbo.Nums AS x
                WHERE   x.nbr = d.pre_nbr
            ) AS pre_val,
            d.nbr,
            d.val,
            d.nxt_nbr,
            (
                SELECT  x.val
                FROM    dbo.Nums AS x
                WHERE   x.nbr = d.nxt_nbr
            ) AS nxt_val
FROM        (
                SELECT  (
                            SELECT  MAX(x.nbr) AS nbr
                            FROM    dbo.Nums AS x
                            WHERE   x.nbr < n.nbr
                        ) AS pre_nbr,
                        n.nbr,
                        n.val,
                        (
                            SELECT  MIN(x.nbr) AS nbr
                            FROM    dbo.Nums AS x
                            WHERE   x.nbr > n.nbr
                        ) AS nxt_nbr
                FROM    dbo.Nums AS n
            ) AS d

-- Peso 2
CREATE TABLE #Temp
(
    ID  INT IDENTITY(1, 1) PRIMARY KEY,
    nbr INT,
    val INT
)

INSERT      #Temp (nbr, val)
SELECT      nbr,
            val
FROM        dbo.Nums
ORDER BY    nbr

SELECT      pre.nbr AS pre_nbr,
            pre.val AS pre_val,
            t.nbr,
            t.val,
            nxt.nbr AS nxt_nbr,
            nxt.val AS nxt_val
FROM        #Temp AS pre
RIGHT JOIN  #Temp AS t ON t.ID = pre.ID + 1
LEFT JOIN   #Temp AS nxt ON nxt.ID = t.ID + 1

DROP TABLE  #Temp

Notice there are no indexes on the #Temp table yet. And here is where the conversation derailed. First I got this response back:

Now my solutions:

--My 1st Slt
SELECT T2.*, T1.*, T3.*
  FROM Nums AS T1
       LEFT JOIN Nums AS T2
         ON T2.nbr = (SELECT MAX(nbr)
                        FROM Nums
                       WHERE nbr < T1.nbr)
       LEFT JOIN Nums AS T3
         ON T3.nbr = (SELECT MIN(nbr)
                        FROM Nums
                       WHERE nbr > T1.nbr);

--My 2nd Slt
SELECT MAX(CASE WHEN N1.nbr > N2.nbr THEN N2.nbr ELSE NULL END) AS pre_nbr,
       (SELECT val FROM Nums WHERE nbr = MAX(CASE WHEN N1.nbr > N2.nbr THEN N2.nbr ELSE NULL END)) AS pre_val,
       N1.nbr AS cur_nbr, N1.val AS cur_val,
       MIN(CASE WHEN N1.nbr < N2.nbr THEN N2.nbr ELSE NULL END) AS nxt_nbr,
       (SELECT val FROM Nums WHERE nbr = MIN(CASE WHEN N1.nbr < N2.nbr THEN N2.nbr ELSE NULL END)) AS nxt_val
  FROM Nums AS N1,
       Nums AS N2
 GROUP BY N1.nbr, N1.val;

/*
My 1st Slt
Table 'Nums'. Scan count 7, logical reads 14
My 2nd Slt
Table 'Nums'. Scan count 4, logical reads 23
Peso 1
Table 'Nums'. Scan count 9, logical reads 28
Peso 2
Table '#Temp'. Scan count 0, logical reads 7
Table 'Nums'. Scan count 1, logical reads 2
Table '#Temp'. Scan count 3, logical reads 16
*/

To this, I emailed him back asking for a scalability test: "What if you try with a Nums table with 100,000 rows?" His response to that started to get nasty:

I have to say Peso 2 is not acceptable. As I said before the solution must be standard, ORDER BY is not part of standard SELECT. Try this without ORDER BY:
Truncate Table Nums
INSERT INTO Nums (nbr, val) VALUES (1, 0),(9,4), (5, 7)

So now we have new rules. No ORDER BY, because it's not standard SQL! Of course I asked him: "Why do you have that idea? ORDER BY is not standard?" To this, his replies went stranger and stranger:

Standard Select = Set-based (no any cursor) It's free to know, just refer to Advanced SQL Programming by Celko or mail to him if you accept comments from him.

What the stalker probably doesn't know is that Mr Celko and I are occasionally involved in some conversation and thus we exchange emails. I don't know if this reference to Mr Celko was made to intimidate me. So I answered him, still polite: "What do you mean? The SELECT itself has a 'cursor under the hood'." Now the stalker gets rude:

But however I mean the solution must no containing any order by, top... No problem, I do not like Peso 2, it's very non-intelligent and elementary.

Yes, Peso 2 is elementary, but most performing queries are... And now is the time where I started to feel the stalker really wanted to achieve something else, so I wrote to him: "So what is your goal? Have a query that performs well, or a query that is super-portable? My Peso 2 outperforms any of your code with a factor of 100 when using more than 100,000 rows."
While I awaited his answer, I posted him this query Ok, here is another one -- Peso 3 SELECT             MAX(CASE WHEN d = 1 THEN nbr ELSE NULL END) AS pre_nbr,                    MAX(CASE WHEN d = 1 THEN val ELSE NULL END) AS pre_val,                    MAX(CASE WHEN d = 0 THEN nbr ELSE NULL END) AS nbr,                    MAX(CASE WHEN d = 0 THEN val ELSE NULL END) AS val,                    MAX(CASE WHEN d = -1 THEN nbr ELSE NULL END) AS nxt_nbr,                    MAX(CASE WHEN d = -1 THEN val ELSE NULL END) AS nxt_val FROM               (                              SELECT    nbr,                                        val,                                        ROW_NUMBER() OVER (ORDER BY nbr) AS SeqID                              FROM      dbo.Nums                    ) AS s CROSS JOIN         (                              VALUES    (-1),                                        (0),                                        (1)                    ) AS x(d) GROUP BY           SeqID + x.d HAVING             COUNT(*) > 1 And here is the stats Table 'Nums'. Scan count 1, logical reads 2, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0. It beats the hell out of your queries…. Now I finally got a response from my stalker and now I also clicked who he was. This is his reponse Why you post my original method with a bit change under you name? I do not like it. See: http://www.sqlservercentral.com/Forums/Topic468501-362-14.aspx ;WITH C AS ( SELECT seq_nbr, k,        DENSE_RANK() OVER(ORDER BY seq_nbr ASC) + k AS grp_fct   FROM [Sample]         CROSS JOIN         (VALUES (-1), (0), (1)         ) AS D(k) ) SELECT MIN(seq_nbr) AS pre_value,        MAX(CASE WHEN k = 0 THEN seq_nbr END) AS current_value,        MAX(seq_nbr) AS next_value   FROM C GROUP BY grp_fct HAVING min(seq_nbr) < max(seq_nbr); These posts: Posted Tuesday, April 12, 2011 10:04 AM Posted Tuesday, April 12, 2011 1:22 PM Why post a solution where will not work in SQL Server 2000? Wait a minute! His own solution is using both a CTE and a ranking function so his query will not work on SQL Server 2000! Bummer... The reference to "Me not like" are my exact words in a previous topic on SQLTeam.com and when I remembered the phrasing, I also knew who he was. See this topic http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=159262 where he writes a query and posts it under my name, as if I wrote it. So I answered him this (less polite). Like I keep track of all topics in the whole world… J So you think you are the only one coming up with this idea? Besides, “M S solution” doesn’t work.   This is the result I get pre_value        current_value                             next_value 1                           1                           5 1                           5                           9 5                           9                           9   And I did nothing like you did here, where you posted a solution which you “thought” I should write http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=159262 So why are you yourself using ranking function when this was not allowed per your original email, and no cte? You use CTE in your link above, which do not work in SQL Server 2000. All this makes no sense to me, other than you are trying your best to once in a lifetime create a better performing query than me? After a few hours I get this email back. I don't fully understand it, but it's probably a language barrier. 
>>Like I keep track of all topics in the whole world… J So you think you are the only one coming up with this idea?<< You right, but do not think you are the first creator of this.   >>Besides, “M S Solution” doesn’t work. This is the result I get <<   Why you get so unimportant mistake? See this post to correct it: Posted 4/12/2011 8:22:23 PM >> So why are you yourself using ranking function when this was not allowed per your original email, and no cte? You use CTE in your link above, which do not work in SQL Server 2000. <<  Again, why you get some unimportant incompatibility? You offer that solution for current goals not me  >> All this makes no sense to me, other than you are trying your best to once in a lifetime create a better performing query than me? <<  No, I only wanted to know who you will solve it. Now I know you do not have a special solution. No problem. No problem for me either. So I just answered him I am not the first, and you are not the first to come up with this idea. So what is your problem? I am pretty sure other people have come up with the same idea before us. I used this technique all the way back to 2007, see http://www.sqlteam.com/forums/topic.asp?TOPIC_ID=93911 Let's see if he returns...  He did! >> So what is your problem? << Nothing Thanks for all replies; maybe we have some competitions in future, maybe. Also I like you but you do not attend it. Your behavior with me is not friendly. Not any meeting… Regards //Peso

    Read the article

  • Who writes the words? A rant with graphs.

    - by Roger Hart
    If you read my rant, you'll know that I'm getting a bit of a bee in my bonnet about user interface text. But rather than just yelling about the way the world should be (short version: no UI text would suck), it seemed prudent to actually gather some data. Rachel Potts has made an excellent first foray, by conducting a series of interviews across organizations about how they write user interface text. You can read Rachel's write up here. She presents the facts as she found them, and doesn't editorialise. The result is insightful, but impartial isn't really my style. So here's a rant with graphs. My method, and how it sucked I sent out a short survey. Survey design is one of my hobby-horses, and since some smartarse in the comments will mention it if I don't, I'll step up and confess: I did not design this one well. It was potentially ambiguous, implicitly excluded people, and since I only really advertised it on Twitter and a couple of mailing lists the sample will be chock full of biases. Regardless, these were the questions: What do you do? Select the option that best describes your role What kind of software does your organization make? (optional) In your organization, who writes the text on your software user interfaces? (for example: button names, static text, tooltips, and so on) Tick all that apply. In your organization who is responsible for user interface text? Who "owns" it? The most glaring issue (apart from question 3 being a bit broken) was that I didn't make it clear that I was asking about applications. Desktop, mobile, or web, I wouldn't have minded. In fact, it might have been interesting to categorize and compare. But a few respondents commented on the seeming lack of relevance, since they didn't really make software. There were some other issues too. It wasn't the best survey. So, you know, pinch of salt time with what follows. Despite this, there were 100 or so respondents. This post covers the overview, and you can look at the raw data in this spreadsheet What did people do? Boring graph number one: I wasn't expecting that. Given I pimped the survey on twitter and a couple of Tech Comms discussion lists, I was more banking on and even Content Strategy/Tech Comms split. What the "Others" specified: Three people chipped in with Technical Writer. Author, apparently, doesn't cut it. There's a "nobody reads the instructions" joke in there somewhere, I'm sure. There were a couple of hybrid roles, including Tech Comms and Testing, which sounds gruelling and thankless. There was also, an Intranet Manager, a Creative Director, a Consultant, a CTO, an Information Architect, and a Translator. That's a pretty healthy slice through the industry. Who wrote UI text? Boring graph number two: Annoyingly, I made this a "tick all that apply" question, so I can't make crude and inflammatory generalizations about percentages. This is more about who gets involved in user interface wording. So don't panic about the number of developers writing UI text. First off, it just means they're involved. Second, they might be good at it. What? It could happen. Ours are involved - they write a placeholder and flag it to me for changes. Sometimes I don't make any. It's also not surprising that there's so much UX in the mix. Some of that will be people taking care, and crafting an understandable interface. Some of it will be whatever text goes on the wireframe making it into production. 
I'm going to assume that's what happened at eBay, when their iPhone app purportedly shipped with the placeholder text "Some crappy content goes here". Ahem. Listing all 17 "other" responses would make this post lengthy indeed, but you can read them in the raw data spreadsheet. The award for the approach that sounds the most like a good idea yet carries the highest risk of ending badly goes to whoever offered up "External agencies using focus groups". If you're reading this, and that actually works, leave a comment. I'm fascinated. Who owned UI text Stop. Bar chart time: Wow. Let's cut to the chase, and by "chase", I mean those inflammatory generalizations I was talking about: In around 60% of cases the person responsible for user interface text probably lacks the relevant expertise. Even in the categories I count as being likely to have relevant skills (Marketing Copywriters, Content Strategists, Technical Authors, and User Experience Designers) there's a case for each role being unsuited, as you'll see in Rachel's blog post So it's not as simple as my headline. Does that mean that you personally, Mr Developer reading this, write bad button names? Of course not. I know nothing about you. It rather implies that as a category, the majority of people looking after UI text have neither communication nor user experience as their primary skill set, and as such will probably only be good at this by happy accident. I don't have a way of measuring those frequency of those accidents. What the Others specified: I don't know who owns it. I assume the project manager is responsible. "copywriters" when they wish to annoy me. the client's web maintenance person, often PR or MarComm That last one chills me to the bone. Still, at least nobody said "the work experience kid". You can see the rest in the spreadsheet. My overwhelming impression here is of user interface text as an unloved afterthought. There were fewer "nobody" responses than I expected, and a much broader split. But the relative predominance of developers owning and writing UI text suggests to me that organizations don't see it as something worth dedicating attention to. If true, that's bothersome. Because the words on the screen, particularly the names of things, are fundamental to the ability to understand an use software. It's also fascinating that Technical Authors and Content Strategists are neck and neck. For such a nascent discipline, Content Strategy appears to have made a mark on software development. Or my sample is skewed. But it feels like a bit of validation for my rant: Content Strategy is eating Tech Comms' lunch. That's not a bad thing. Well, not if the UI text is getting done well. And that's the caveat to this whole post. I couldn't care less who writes UI text, provided they consider the user and don't suck at it. I care that it may be falling by default to people poorly disposed to doing it right. And I care about that because so much user interface text sucks. The most interesting question Was one I forgot to ask. It's this: Does your organization have technical authors/writers? Like a lot of survey data, that doesn't tell you much on its own. But once we get a bit dimensional, it become more interesting. So taken with the other questions, this would have let me find out what I really want to know: What proportion of organizations have Tech Comms professionals but don't use them for UI text? Who writes UI text in their place? Why this happens? 
It's possible (feasible is another matter) that hundreds of companies have tech authors who don't work on user interfaces because they've empirically discovered that someone else, say the Marketing Copywriter, is better at it. And once we've all finished laughing, I'll point out that I've met plenty of tech authors who just aren't used to thinking about users at the point of need in the way UI text and embedded user assistance require. If you've got what I regard, perhaps unfairly, as the bad kind of tech author - the old-school kind with the thousand-page pdf and the grammar obsession - if you've got one of those then you probably are better off getting the UX folk or the copywriters to do your UI text. At the very least, they'll derive terminology from user research.

    Read the article

  • Fraud Detection with the SQL Server Suite Part 1

    - by Dejan Sarka
    While working on different fraud detection projects, I developed my own approach to the solution for this problem. In my PASS Summit 2013 session I am introducing this approach. I also wrote a whitepaper on the same topic, which was generously reviewed by my friend Matija Lah. In order to spread this knowledge faster, I am starting a series of blog posts which will at the end make the whole whitepaper. Abstract With the massive usage of credit cards and web applications for banking and payment processing, the number of fraudulent transactions is growing rapidly and on a global scale. Several fraud detection algorithms are available within a variety of different products. In this paper, we focus on using the Microsoft SQL Server suite for this purpose. In addition, we will explain our original approach to solving the problem by introducing a continuous learning procedure. Our preferred type of service is mentoring; it allows us to perform the work and consulting together with transferring the knowledge onto the customer, thus making it possible for a customer to continue to learn independently. This paper is based on practical experience with different projects covering online banking and credit card usage. Introduction A fraud is a criminal or deceptive activity with the intention of achieving financial or some other gain. Fraud can appear in multiple business areas. You can find a detailed overview of the business domains where fraud can take place in Sahin Y., & Duman E. (2011), Detecting Credit Card Fraud by Decision Trees and Support Vector Machines, Proceedings of the International MultiConference of Engineers and Computer Scientists 2011 Vol 1. Hong Kong: IMECS. Dealing with frauds includes fraud prevention and fraud detection. Fraud prevention is a proactive mechanism, which tries to disable frauds by using previous knowledge. Fraud detection is a reactive mechanism with the goal of detecting suspicious behavior when a fraudster surpasses the fraud prevention mechanism. A fraud detection mechanism checks every transaction and assigns a weight in terms of probability between 0 and 1 that represents a score for evaluating whether a transaction is fraudulent or not. A fraud detection mechanism cannot detect frauds with a probability of 100%; therefore, manual transaction checking must also be available. With fraud detection, this manual part can focus on the most suspicious transactions. This way, an unchanged number of supervisors can detect significantly more frauds than could be achieved with traditional methods of selecting which transactions to check, for example with random sampling. There are two principal data mining techniques available both in general data mining as well as in specific fraud detection techniques: supervised or directed and unsupervised or undirected. Supervised techniques or data mining models use previous knowledge. Typically, existing transactions are marked with a flag denoting whether a particular transaction is fraudulent or not. Customers at some point in time do report frauds, and the transactional system should be capable of accepting such a flag. Supervised data mining algorithms try to explain the value of this flag by using different input variables. When the patterns and rules that lead to frauds are learned through the model training process, they can be used for prediction of the fraud flag on new incoming transactions. 
Unsupervised techniques analyze data without prior knowledge, without the fraud flag; they try to find transactions which do not resemble other transactions, i.e. outliers. In both cases, there should be more frauds in the data set selected for checking by using the data mining knowledge compared to selecting the data set with simpler methods; this is known as the lift of a model. Typically, we compare the lift with random sampling. The supervised methods typically give a much better lift than the unsupervised ones. However, we must use the unsupervised ones when we do not have any previous knowledge. Furthermore, unsupervised methods are useful for controlling whether the supervised models are still efficient. Accuracy of the predictions drops over time. Patterns of credit card usage, for example, change over time. In addition, fraudsters continuously learn as well. Therefore, it is important to check the efficiency of the predictive models with the undirected ones. When the difference between the lift of the supervised models and the lift of the unsupervised models drops, it is time to refine the supervised models. However, the unsupervised models can become obsolete as well. It is also important to measure the overall efficiency of both, supervised and unsupervised models, over time. We can compare the number of predicted frauds with the total number of frauds that include predicted and reported occurrences. For measuring behavior across time, specific analytical databases called data warehouses (DW) and on-line analytical processing (OLAP) systems can be employed. By controlling the supervised models with unsupervised ones and by using an OLAP system or DW reports to control both, a continuous learning infrastructure can be established. There are many difficulties in developing a fraud detection system. As has already been mentioned, fraudsters continuously learn, and the patterns change. The exchange of experiences and ideas can be very limited due to privacy concerns. In addition, both data sets and results might be censored, as the companies generally do not want to publically expose actual fraudulent behaviors. Therefore it can be quite difficult if not impossible to cross-evaluate the models using data from different companies and different business areas. This fact stresses the importance of continuous learning even more. Finally, the number of frauds in the total number of transactions is small, typically much less than 1% of transactions is fraudulent. Some predictive data mining algorithms do not give good results when the target state is represented with a very low frequency. Data preparation techniques like oversampling and undersampling can help overcome the shortcomings of many algorithms. SQL Server suite includes all of the software required to create, deploy any maintain a fraud detection infrastructure. The Database Engine is the relational database management system (RDBMS), which supports all activity needed for data preparation and for data warehouses. SQL Server Analysis Services (SSAS) supports OLAP and data mining (in version 2012, you need to install SSAS in multidimensional and data mining mode; this was the only mode in previous versions of SSAS, while SSAS 2012 also supports the tabular mode, which does not include data mining). Additional products from the suite can be useful as well. SQL Server Integration Services (SSIS) is a tool for developing extract transform–load (ETL) applications. 
    The SQL Server suite includes all of the software required to create, deploy, and maintain a fraud detection infrastructure. The Database Engine is the relational database management system (RDBMS) that supports all activities needed for data preparation and for data warehouses. SQL Server Analysis Services (SSAS) supports OLAP and data mining (in version 2012, you need to install SSAS in multidimensional and data mining mode; this was the only mode in previous versions of SSAS, while SSAS 2012 also supports the tabular mode, which does not include data mining). Additional products from the suite can be useful as well. SQL Server Integration Services (SSIS) is a tool for developing extract-transform-load (ETL) applications. SSIS is typically used for loading a DW, and, in addition, it can use SSAS data mining models for building intelligent data flows. SQL Server Reporting Services (SSRS) is useful for presenting the results in a variety of reports. Data Quality Services (DQS) supports the occasional data cleansing process by maintaining a knowledge base. Master Data Services (MDS) is an application that helps companies maintain a central, authoritative source of their master data, i.e. the most important data of any organization.

    For an overview of the SQL Server business intelligence (BI) part of the suite, which includes the Database Engine, SSAS, and SSRS, please refer to Veerman E., Lachev T., & Sarka D. (2009). MCTS Self-Paced Training Kit (Exam 70-448): Microsoft® SQL Server® 2008 Business Intelligence Development and Maintenance. MS Press. For an overview of the enterprise information management (EIM) part, which includes SSIS, DQS, and MDS, please refer to Sarka D., Lah M., & Jerkic G. (2012). Training Kit (Exam 70-463): Implementing a Data Warehouse with Microsoft® SQL Server® 2012. O'Reilly. For details about SSAS data mining, please refer to MacLennan J., Tang Z., & Crivat B. (2009). Data Mining with Microsoft SQL Server 2008. Wiley.

    The SQL Server Data Mining Add-ins for Office, a free download for Office versions 2007, 2010, and 2013, bring the power of data mining to Excel, enabling advanced analytics in a familiar tool. Together with PowerPivot for Excel, which is also a free download for Excel 2010 and is already included in Excel 2013, they bring OLAP functionality directly into Excel, making it possible for an advanced analyst to build a complete learning infrastructure there. This way, many more people, including employees in subsidiaries, can contribute to the learning process by examining local transactions and quickly identifying new patterns.
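    The continuous learning loop described above ultimately rests on tracking, period by period, how many of all known frauds (predicted plus reported) the models actually flagged; in the paper this is done with a DW and OLAP reports. Purely as an illustration of the kind of measure involved, and with a hypothetical table layout, a pandas sketch of such a report might look like this:

    1: # Illustrative sketch only: a DW-style report comparing predicted frauds with
    2: # all known frauds (predicted + reported) per month; the table layout is hypothetical.
    3: import pandas as pd
    4: 
    5: tx = pd.read_csv("scored_transactions.csv", parse_dates=["transaction_date"])
    6: tx["month"] = tx["transaction_date"].dt.to_period("M")
    7: 
    8: report = tx.groupby("month").agg(
    9:     predicted_frauds=("predicted_fraud", "sum"),
    10:     total_frauds=("is_fraud", "sum"))
    11: 
    12: # A steady drop in this ratio over time is the signal that the models
    13: # are losing efficiency and need to be refined.
    14: report["detection_rate"] = report["predicted_frauds"] / report["total_frauds"]
    15: print(report)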

    Read the article

  • CodePlex Daily Summary for Sunday, March 25, 2012

    CodePlex Daily Summary for Sunday, March 25, 2012

    Popular Releases

    - Asp.NET Url Router: v1.0: Build for .NET 2.0 and .NET 4.0.
    - menu4web: menu4web 0.0.3.
    - Craig's Utility Library: Craig's Utility Library 3.1: This update adds about 60 new extension methods, a couple of new classes, and a number of fixes, including: added DateSpan class; added GenericDelimited class; random additions: added a static, thread-friendly version of Random.Next called ThreadSafeNext; AOP Manager additions: added Destroy function to AOPManager (clears out all data so the system can be recreated; really only useful for testing...); ORM additions: added PagedCommand and PageCount functions to ObjectBaseClass (same as M...
    - MemoryLifter: MemoryLifter 2.4.1: KNOWN ISSUE: Sometimes the automatic installation of the SQL Compact Edition dependency does not work when you are behind a proxy server. To solve that problem, install SQL CE 3.5 SP2 manually from the following link: http://www.microsoft.com/download/en/details.aspx?id=5783
    - SQL Monitor - managing sql server performance: SQLMon 4.2 alpha 14: Improved accuracy of logic fault checking in analysis.
    - DotSpatial: DotSpatial 1.1: This is a minor release; see the changes in the issue tracker. Minimal -- includes the DotSpatial core and essential extensions. Extended -- includes debugging symbols and additional extensions. Just want to run the software? An end user (non-programmer) version is available, branded as MapWindow. Want to add your own feature? Develop a plugin using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components are available as NuGet pa...
    - Microsoft All-In-One Code Framework - a centralized code sample library: C++, .NET Coding Guideline: This document describes the coding style guideline for native C++ and .NET (C# and VB.NET) programming used by the Microsoft All-In-One Code Framework project team.
    - WebDAV for WHS: Version 1.0.67: Added: check whether Remote Web Access is turned on or not; added: check for Add-In updates.
    - Phalanger - The PHP Language Compiler for the .NET Framework: 3.0 (March 2012) for .NET 4.0: The March release of Phalanger 3.0 significantly enhances performance, adds new features, and fixes many issues. New features: Phalanger Tools installable for Visual Studio 2011 Beta; the "filter" extension with several of the most used filters implemented; DomDocument HTML parser, loadHTML() method; mail() PHP-compatible function; PHP 5.4 T_CALLABLE token; PHP 5.4 "callable" type hint; PCRE: UTF32 characters in range support; configuration supports <c...
    - Nearforums - ASP.NET MVC forum engine: Nearforums v8.0: Version 8.0 of Nearforums, the ASP.NET MVC Forum Engine, containing new features: internationalization, custom authentication provider, access control list for forums and threads. Webdeploy package checksum: abc62990189cf0d488ef915d4a55e4b14169bc01. Visit Roadmap for more details.
    - BIDS Helper: BIDS Helper 1.6: This beta release is the first to support SQL Server 2012 (in addition to SQL Server 2005, 2008, and 2008 R2). Since it is marked as a beta release, we are looking for bug reports in the next few months as you use BIDS Helper on real projects. In addition to getting all existing BIDS Helper functionality working appropriately in SQL Server 2012 (SSDT), the following features are new: Analysis Services Tabular Smart Diff, Tabular Actions Editor, Tabular HideMemberIf, Tabular Pre-Build ...
    - Json.NET: Json.NET 4.5 Release 1: New feature - Windows 8 Metro build. New feature - JsonTextReader automatically reads ISO strings as dates. New feature - added DateFormatHandling to control whether dates are written in the MS format or ISO format, with ISO as the default. New feature - added DateTimeZoneHandling to control reading and writing DateTime time zone details. New feature - added async serialize/deserialize methods to JsonConvert. New feature - added Path to JsonReader/JsonWriter/ErrorContext and exceptions w...
    - SCCM Client Actions Tool: SCCM Client Actions Tool v1.11: The latest version. It comes with the following changes since the last version: fixed a bug where ping and cmd.exe kept running in an endless loop after the action progress was finished; fixed update checking from the Codeplex RSS feed. The tool is downloadable as a ZIP file that contains four files: ClientActionsTool.hta – the tool itself; Cmdkey.exe – a command line tool for managing cached credentials, needed for the alternate credentials feature when running the HTA...
    - WebSocket4Net: WebSocket4Net 0.5: Changes in this release: fixed the wss default port bug; improved JsonWebSocket; supported setting the client access policy protocol for Silverlight; fixed a handshake issue in Silverlight; fixed a bug where the "Host" field in the handshake did not contain the port when the port is not the default; supported passing in the Origin parameter for handshaking; supported reacting to pings from the server side; fixed a bug in data sending; fixed a bug of sending a closing handshake with no message, which would cause an excepti...
    - SuperWebSocket, a .NET WebSocket Server: SuperWebSocket 0.5: Changes included in this release: supported closing handshake queue checking; improved JSON subprotocol; supported sending ping from server to client; fixed a bug about sending a closing handshake with no message; refactored the code to improve protocol compatibility; fixed a bug about sub-protocol configuration loading in Mono; improved BasicSubProtocol; added JsonWebSocketSession.
    - Ultra Presenter Desktop: UltraPresenter Desktop Version 2012.03.18: New release with a new interface design.
    - LoU: LoU.Dungeons V0.
    - SSIS GoogleAnalyticsSource: Version 1.0.3 x64: It is now possible to select 225 metrics and 93 dimensions, and almost all of the data types have been modified.
    - HTML to docx Converter: htmltodocx_0.5.1_alpha: Improved UTF-8 support. Please note that, for proper UTF-8 support, there appears to be an issue with PHPWord that needs to be addressed. See http://phpword.codeplex.com/discussions/261365 for a discussion.
    - Daun Management Studio: Daun Management Studio 0.1 (Alpha Version): These are the alpha application packages for Daun Management Studio, a tool to manage MongoDB Server. Please visit our official website http://www.daun-project.com

    New Projects

    - Bundle Transformer: a modular extension for System.Web.Optimization. The classes CssTransformer and JsTransformer, included in the core of Bundle Transformer, implement the interface IBundleTransform. They are intended to replace the standard classes CssMinify and JsMinify.
    - Catel fody plugin: Catel plugin for Fody. For more information about Fody, see http://code.google.com/p/fody/.
    - CharsetConverter: CharsetConverter makes it easier to convert a bunch of files from one charset to another. Files are backed up and recoded, so you will no longer need to convert one file after another. It's developed in VB.NET.
    - FNSYS: ??
    - GM Spriter: GM Spriter creates copy rectangle and scaling data for sprites that can be drawn in pieces, supporting a more advanced, optimized, and less resource-intensive method.
    - htty: htty is the HTTP TTY, a console application for interacting with web servers.
    - menu4web: menu4web is lightweight JavaScript code for creating dynamic menus on web sites.
    - mvcpractice2: mvcpractice2.
    - National Transportation Information Control Protocol Suite: a standards-compliant, extensible NTCIP software solution, utilized in federally funded projects the world over. It contains utilities for encoding and decoding in various layer formats such as HDLC, PMPP, and SNMP (BER, PER, etc.), along with some proprietary features. This software was developed in my free time because the software in use by my organization was severely lacking and always required some type of major change for the smallest thing. After implementation and...
    - OpenEMR: OpenEMR is an open source medical practice management application (EHR, EMR, PMS) featuring fully integrated electronic health records, scheduling, electronic billing, internationalization, free support, a vibrant community, and a whole lot more. Mirror of the SourceForge OpenEMR project.
    - Opera Extension Creator: The program helps create extensions for Opera. You can write code in another editor (the program can automatically check for changes on disk), and only one click is needed to create an .oex file and install it in Opera. It's developed in (V)C# 2005 and .NET 2.0.
    - PADNUG Site Redesign: Bringing the PADNUG web site up to current technologies and giving it a facelift.
    - Practico1: exercises for practical assignment 1.
    - Roslyn and C#-Derived Languages: Roslyn and C#-derived languages (for example: Axum).
    - Series Organiser: An app that organises TV series episodes, based on the file name, into a series folder structure.
    - Silverlight socket component: Beetle.SL, a Silverlight socket component for TCP communication.
    - Skylark: Skylark is an n-tier application to test drive different .NET technologies across various domain problems. It's developed in .NET C#.

    Read the article
