Search Results

Search found 17157 results on 687 pages for 'cloud services'.

  • The Best Free Alternatives to the Windows Task Manager

    - by Lori Kaufman
    The Windows Task Manager is a built-in tool that lets you check which services are running in the background, see how many resources each program is using, and perform the all-too-common task of killing programs that are not responding. Even though the Windows Task Manager has several useful tools, there are many free alternatives available that provide additional or expanded features, allowing you to more closely monitor and tweak your system.

    Read the article

  • Host AngularJS (Html5Mode) in ASP.NET vNext

    - by Shaun
    Originally posted on: http://geekswithblogs.net/shaunxu/archive/2014/06/10/host-angularjs-html5mode-in-asp.net-vnext.aspx

    Microsoft announced ASP.NET vNext at BUILD and TechEd recently, and as a developer I found that we can add features such as MVC, WebAPI and SignalR into one ASP.NET vNext application. It's also cross-platform, which means I can host ASP.NET on Windows, Linux and OS X. If you follow my blog you may know that I'm currently working on a project which uses ASP.NET WebAPI, SignalR and AngularJS. Currently the AngularJS part is hosted by Express in Node.js while WebAPI and SignalR are hosted in ASP.NET. I was looking for a solution to host all of them on one platform so that my SignalR can utilize WebSocket. Currently AngularJS and SignalR are hosted in the same domain but on different ports, so SignalR has to use Server-Sent Events. It can be upgraded to WebSocket if I host both of them on the same port.

    Host AngularJS in ASP.NET vNext

    Static File Middleware

    ASP.NET vNext uses the middleware pattern to register the features it needs, which is very similar to Express in Node.js. Since AngularJS is a pure client-side framework, in theory all I need to do is use ASP.NET vNext as a static file server. This is very easy as there's a built-in middleware shipped along with ASP.NET vNext. Assuming I have "index.html" as below.

      <html data-ng-app="demo">
      <head>
          <script type="text/javascript" src="angular.js"></script>
          <script type="text/javascript" src="angular-ui-router.js"></script>
          <script type="text/javascript" src="app.js"></script>
      </head>
      <body>
          <h1>ASP.NET vNext with AngularJS</h1>
          <div>
              <a href="javascript:void(0)" data-ui-sref="view1">View 1</a> |
              <a href="javascript:void(0)" data-ui-sref="view2">View 2</a>
          </div>
          <div data-ui-view></div>
      </body>
      </html>

    And the AngularJS JavaScript file is below. Notice that I have two views, each containing only a one-line literal indicating the view name.

      'use strict';

      var app = angular.module('demo', ['ui.router']);

      app.config(['$stateProvider', '$locationProvider', function ($stateProvider, $locationProvider) {
          $stateProvider.state('view1', {
              url: '/view1',
              templateUrl: 'view1.html',
              controller: 'View1Ctrl' });

          $stateProvider.state('view2', {
              url: '/view2',
              templateUrl: 'view2.html',
              controller: 'View2Ctrl' });
      }]);

      app.controller('View1Ctrl', function ($scope) {
      });

      app.controller('View2Ctrl', function ($scope) {
      });

    All AngularJS files are located in the "app" folder and my ASP.NET vNext files sit beside it. The "project.json" contains all dependencies I need to host a static file server.

      {
          "dependencies": {
              "Helios": "0.1-alpha-*",
              "Microsoft.AspNet.FileSystems": "0.1-alpha-*",
              "Microsoft.AspNet.Http": "0.1-alpha-*",
              "Microsoft.AspNet.StaticFiles": "0.1-alpha-*",
              "Microsoft.AspNet.Hosting": "0.1-alpha-*",
              "Microsoft.AspNet.Server.WebListener": "0.1-alpha-*"
          },
          "commands": {
              "web": "Microsoft.AspNet.Hosting server=Microsoft.AspNet.Server.WebListener server.urls=http://localhost:22222"
          },
          "configurations": {
              "net45": {
              },
              "k10": {
                  "System.Diagnostics.Contracts": "4.0.0.0",
                  "System.Security.Claims": "0.1-alpha-*"
              }
          }
      }

    Below is "Startup.cs", the entry point of my ASP.NET vNext application. All I need to do is let the application use the file server middleware.
      using System;
      using Microsoft.AspNet.Builder;
      using Microsoft.AspNet.FileSystems;
      using Microsoft.AspNet.StaticFiles;

      namespace Shaun.AspNet.Plugins.AngularServer.Demo
      {
          public class Startup
          {
              public void Configure(IBuilder app)
              {
                  app.UseFileServer(new FileServerOptions() {
                      EnableDirectoryBrowsing = true,
                      FileSystem = new PhysicalFileSystem(System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "app"))
                  });
              }
          }
      }

    Next, I need to create a "NuGet.Config" file in the parent folder so that when I run the "kpm restore" command later it can find the ASP.NET vNext NuGet packages successfully.

      <?xml version="1.0" encoding="utf-8"?>
      <configuration>
          <packageSources>
              <add key="AspNetVNext" value="https://www.myget.org/F/aspnetvnext/api/v2" />
              <add key="NuGet.org" value="https://nuget.org/api/v2/" />
          </packageSources>
          <packageSourceCredentials>
              <AspNetVNext>
                  <add key="Username" value="aspnetreadonly" />
                  <add key="ClearTextPassword" value="4d8a2d9c-7b80-4162-9978-47e918c9658c" />
              </AspNetVNext>
          </packageSourceCredentials>
      </configuration>

    Now I run "kpm restore" to resolve all dependencies of my application. Finally, "k web" starts the application, which serves the "app" subfolder as a static file server on local port 22222.

    Support AngularJS Html5Mode

    AngularJS works well in the previous demo, but you will notice a "#" in the browser address bar. This is because, by default, AngularJS adds "#" after its entry page to ensure all requests are handled by that entry page. For example, in this case my entry page is "index.html", so when I click "View 1" the address changes to "/#/view1", which still tells the web server that I'm looking for "index.html". This works, but it makes the address look ugly. Hence AngularJS introduces a feature called Html5Mode, which gets rid of the annoying "#" in the address bar. Below is "app.js" with Html5Mode enabled; it takes just one line of code.

      'use strict';

      var app = angular.module('demo', ['ui.router']);

      app.config(['$stateProvider', '$locationProvider', function ($stateProvider, $locationProvider) {
          $stateProvider.state('view1', {
              url: '/view1',
              templateUrl: 'view1.html',
              controller: 'View1Ctrl' });

          $stateProvider.state('view2', {
              url: '/view2',
              templateUrl: 'view2.html',
              controller: 'View2Ctrl' });

          // enable html5mode
          $locationProvider.html5Mode(true);
      }]);

      app.controller('View1Ctrl', function ($scope) {
      });

      app.controller('View2Ctrl', function ($scope) {
      });

    Now go to the root path of the website and click "View 1"; you will see there's no "#" in the address. The problem is that if we hit F5 the browser turns blank. In this mode the browser asks the web server for a static file named "view1", but there's no such file on the server, so the underlying web server, built on ASP.NET vNext, responds with 404. To fix this we need to create our own ASP.NET vNext middleware. It should first try to serve the static file request with the default StaticFileMiddleware; if the response status code is 404, it changes the request path to the entry page and tries again.
      public class AngularServerMiddleware
      {
          private readonly AngularServerOptions _options;
          private readonly RequestDelegate _next;
          private readonly StaticFileMiddleware _innerMiddleware;

          public AngularServerMiddleware(RequestDelegate next, AngularServerOptions options)
          {
              _next = next;
              _options = options;

              _innerMiddleware = new StaticFileMiddleware(next, options.FileServerOptions.StaticFileOptions);
          }

          public async Task Invoke(HttpContext context)
          {
              // try to resolve the request with the default static file middleware
              await _innerMiddleware.Invoke(context);
              Console.WriteLine(context.Request.Path + ": " + context.Response.StatusCode);

              // route to the entry page if the status code is 404
              // and AngularJS html5mode support is required
              if (context.Response.StatusCode == 404 && _options.Html5Mode)
              {
                  context.Request.Path = _options.EntryPath;
                  await _innerMiddleware.Invoke(context);
                  Console.WriteLine(">> " + context.Request.Path + ": " + context.Response.StatusCode);
              }
          }
      }

    We need an options class where the user can specify the host root path and the entry page path.

      public class AngularServerOptions
      {
          public FileServerOptions FileServerOptions { get; set; }

          public PathString EntryPath { get; set; }

          public bool Html5Mode
          {
              get
              {
                  return EntryPath.HasValue;
              }
          }

          public AngularServerOptions()
          {
              FileServerOptions = new FileServerOptions();
              EntryPath = PathString.Empty;
          }
      }

    We also need an extension method so that the user can plug this feature into "Startup.cs" easily.

      public static class AngularServerExtension
      {
          public static IBuilder UseAngularServer(this IBuilder builder, string rootPath, string entryPath)
          {
              var options = new AngularServerOptions()
              {
                  FileServerOptions = new FileServerOptions()
                  {
                      EnableDirectoryBrowsing = false,
                      FileSystem = new PhysicalFileSystem(System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, rootPath))
                  },
                  EntryPath = new PathString(entryPath)
              };

              builder.UseDefaultFiles(options.FileServerOptions.DefaultFilesOptions);

              return builder.Use(next => new AngularServerMiddleware(next, options).Invoke);
          }
      }

    With these classes ready we change our "Startup.cs" to use this middleware instead of the default one, telling the server to load "index.html" whenever it cannot find the requested resource. The code below is for demo purposes only: it falls back to "index.html" in all cases once the StaticFileMiddleware returns 404. In practice we also need validation to make sure the request is an AngularJS route request rather than a normal static file request (a minimal sketch of such a check follows after the summary).

      using System;
      using Microsoft.AspNet.Builder;
      using Microsoft.AspNet.FileSystems;
      using Microsoft.AspNet.StaticFiles;
      using Shaun.AspNet.Plugins.AngularServer;

      namespace Shaun.AspNet.Plugins.AngularServer.Demo
      {
          public class Startup
          {
              public void Configure(IBuilder app)
              {
                  app.UseAngularServer("app", "/index.html");
              }
          }
      }

    Now run "k web" again and refresh the browser; the page loads successfully. In the console window we can see that the original request got a 404, after which the middleware fell back to "index.html" and returned the correct result.

    Summary

    In this post I showed how to use ASP.NET vNext to host an AngularJS application as a static file server, and how to extend ASP.NET vNext so that it supports AngularJS Html5Mode.
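    The author notes above that the fallback should be limited to AngularJS route requests. As an aside, here is a minimal sketch of one possible check, assuming the alpha-era HttpContext exposes Request.Path as used in the middleware above; the helper name and the extension heuristic are illustrative additions, not part of the original post.

      // Hypothetical helper: only fall back to the entry page for extension-less
      // paths such as "/view1"; requests for real assets like "/app.js" keep their 404.
      private static bool LooksLikeAngularRoute(HttpContext context)
      {
          var path = context.Request.Path.Value ?? string.Empty;
          return string.IsNullOrEmpty(System.IO.Path.GetExtension(path));
      }

      // Inside Invoke(), the fallback condition would then read:
      // if (context.Response.StatusCode == 404 && _options.Html5Mode && LooksLikeAngularRoute(context))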
    You can download the source code here. Hope this helps, Shaun.

    All documents, related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Windows Azure Learning Plan - SQL Azure

    - by BuckWoody
    This is one in a series of posts on a Windows Azure Learning Plan. You can find the main post here. This one deals with SQL Azure.

    Overview and Training - overview and general information about SQL Azure: what it is, how it works, and where you can learn more.
      General Overview (sign-in required, but free): http://social.technet.microsoft.com/wiki/contents/articles/inside-sql-azure.aspx
      General Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ee336245.aspx
      Microsoft SQL Azure Documentation: http://msdn.microsoft.com/en-us/windowsazure/sqlazure/default.aspx

    Samples and Learning - sources for online and other SQL Azure training.
      Free Online Training: http://blogs.msdn.com/b/sqlazure/archive/2010/05/06/10007449.aspx
      60-minute Overview (webcast): https://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?culture=en-US&EventID=1032458620&CountryCode=US

    Architecture - SQL Azure internals and architectures for scale-out and other use cases.
      SQL Azure Architecture: http://social.technet.microsoft.com/wiki/contents/articles/inside-sql-azure.aspx
      Scale-out Architectures: http://tinyurl.com/247zm33
      Federation Concepts: http://tinyurl.com/34eew2w
      Use Cases: http://blogical.se/blogs/jahlen/archive/2010/11/23/sql-azure-why-use-it-and-what-makes-it-different-from-sql-server.aspx
      SQL Azure Security Model (video): http://www.msdev.com/Directory/Description.aspx?EventId=1491

    Administration - standard administrative tasks and tools.
      Tools Options: http://social.technet.microsoft.com/wiki/contents/articles/overview-of-tools-to-use-with-sql-azure.aspx
      SQL Azure Migration Wizard: http://sqlazuremw.codeplex.com/
      Managing Databases and Login Security: http://msdn.microsoft.com/en-us/library/ee336235.aspx
      General Security for SQL Azure: http://msdn.microsoft.com/en-us/library/ff394108.aspx
      Backup and Recovery: http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-backup-and-restore-strategy.aspx
      More Backup and Recovery Options: http://social.technet.microsoft.com/wiki/contents/articles/current-options-for-backing-up-data-with-sql-azure.aspx
      Syncing Large Databases to SQL Azure: http://blogs.msdn.com/b/sync/archive/2010/09/24/how-to-sync-large-sql-server-databases-to-sql-azure.aspx

    Programming - programming patterns and architectures for SQL Azure systems.
      How to Build and Manage a Business Database on SQL Azure: http://tinyurl.com/25q5v6g
      Connection Management: http://social.technet.microsoft.com/wiki/contents/articles/sql-azure-connection-management-in-sql-azure.aspx
      Transact-SQL Supported by SQL Azure: http://msdn.microsoft.com/en-us/library/ee336250.aspx

    Read the article

  • Podcast Show Notes &ndash; Oracle Coherence and Data Grid Technology - Part 1

    - by Bob Rhubart
    This week’s ArchBeat Podcast program kicks off a three-part series featuring a discussion of Oracle Coherence and data grid technology. Listen to Part 1. The panelists for this discussion are:

    Cameron Purdy, VP of Development, Oracle - Blog | Twitter | LinkedIn | Oracle Mix
    Aleksandar Seovic, founder and managing director at S4HC Inc. - Blog | Twitter | LinkedIn | Oracle Mix | Oracle ACE Profile (Aleks is also the author of Oracle Coherence 3.5 from Packt Publishing.)
    John Stouffer, independent consultant, Oracle Applications DBA/Architect - Blog | LinkedIn | Oracle Mix | Oracle ACE Profile

    Part 2 will be available on June 23, Part 3 on June 30. Coming soon: on July 7 the ArchBeat Podcast kicks off a series featuring an open discussion of Architecture and Agility. Stay tuned: RSS

    Technorati Tags: oracle, otn, arch2arch, archbeat, coherence, data grid, cameron purdy, aleks seovic

    Read the article

  • MAAS not working

    - by Zimika
    I've been trying for a whole week to make MAAS work, but with no results. This is the procedure:

      maas createsuperuser
      sudo apt-get install maas-dhcp
      maas-import-isos
      apt-get install maas-enlist tftpd-hpa
      maas-import-isos

    After this I start the node machine with PXE as the default first boot device and select the maas-enlist option. The system starts installing some things, and after that the MAAS WebUI shows 1 node. But when I click the Node button in the WebUI or refresh the page, it shows 0 nodes. The command "cobbler system list" shows:

      root@maas-serv:~# cobbler system list
      default
      node-3ee116ea-a8b2-11e1-a735-50e549e38206

    Also, sometimes this procedure registers one node, but when registering a second one the same problem occurs as described above. /var/lib/tftpboot/pxelinux.cfg/ is empty; only "default" is there. The maas-import-isos command produces a warning:

      ftpboot/images/precise-x86_64-maas-ephemeral/initrd
      copying images
      generating GPXE/PXE configuration files
      generating PXE menu structure
      warning: kernel option length exceeds 255
      warning: kernel option length exceeds 255
      warning: kernel option length exceeds 255
      warning: kernel option length exceeds 255
      copying files for distro: precise-x86_64

    Read the article

  • Amazon CloudFront Cache Invalidation – Fill out the Survey!

    - by joelvarty
    Amazon has come up with a survey regarding how cached objects stored on their CloudFront servers can be invalidated. http://survey.amazonwebservices.com/survey/s?s=1369 This is a key feature for Agility CMS, and for a lot of other applications. If it’s important to you, I suggest you spend a few minutes and fill it out. more later - joel

    Read the article

  • Markus Zirn, "Big Data with CEP and SOA" @ SOA, Cloud &amp; Service Technology Symposium 2012

    - by JuergenKress
    ORACLE PROMOTIONAL DISCOUNT - FOR AN EXCLUSIVE ORACLE DISCOUNT, ENTER PROMO CODE: DJMXZ370. Early-Bird Registration is Now Open with Special Pricing! Register before July 1, 2012 to qualify for discounts. Visit the Registration page for details.

    The International SOA, Cloud + Service Technology Symposium is a yearly event that features the top experts and authors from around the world, providing a series of keynotes, talks, demonstrations, and panels, as well as training and certification workshops - all dedicated to empowering IT professionals to realize modern service technologies and practices in the real world. Click here for a two-page printable conference overview (PDF).

    Big Data with CEP and SOA - September 25, 2012 - 14:15
    Speakers: Markus Zirn, Oracle, and Baz Kuthi, Avocent

    The "Big Data" trend is driving new kinds of IT projects that process machine-generated data. Such projects store and mine data using Hadoop/MapReduce, but they also analyze streaming data via event-driven patterns, which can be called "Fast Data", complementary to "Big Data". This session highlights how "Big Data" and "Fast Data" design patterns can be combined with SOA design principles into modern, event-driven architectures. We will describe specific architectures that combine CEP, Distributed Caching, Event-driven Network, SOA Composites, Application Development Framework, as well as Hadoop. Architecture patterns include pre-processing and filtering event streams as close as possible to the event source, in-memory master data for event pattern matching, event-driven user interfaces, as well as distributed event processing. The focus is on how "Fast Data" requirements are elegantly integrated into a traditional SOA architecture.

    Markus Zirn is Vice President of Product Management covering Oracle SOA Suite, SOA Governance, Application Integration Architecture, BPM, BPM Solutions, Complex Event Processing and UPK, an end user learning solution. He is the author of “The BPEL Cookbook” (rated best book on Service Oriented Architecture in 2007) as well as “Fusion Middleware Patterns”. Previously, he was a management consultant with Booz Allen & Hamilton’s High Tech practice in Duesseldorf as well as San Francisco, and Vice President of Product Marketing at QUIQ. Mr. Zirn holds a Master's of Electrical Engineering from the University of Karlsruhe and is an alumnus of the Tripartite program, a joint European degree from the University of Karlsruhe, Germany, the University of Southampton, UK, and ESIEE, France.

    KEYNOTES & SPEAKERS
    More than 80 international subject matter experts will be speaking at the Symposium. Below are confirmed keynotes and speakers so far. Over 50% of the agenda has not yet been finalized. Many more speakers to come. View the partial program calendars on the Conference Agenda page.

    CONFERENCE THEMES & TRACKS
    Cloud Computing Architecture & Patterns
    New SOA & Service-Orientation Practices & Models
    Emerging Service Technology Innovation
    Service Modeling & Analysis Techniques
    Service Infrastructure & Virtualization
    Cloud-based Enterprise Architecture
    Business Planning for Cloud Computing Projects
    Real World Case Studies
    Semantic Web Technologies (with & without the Cloud)
    Governance Frameworks for SOA and/or Cloud Computing Projects
    Service Engineering & Service Programming Techniques
    Interactive Services & the Human Factor
    New REST & Web Services Tools & Techniques

    Oracle Specialized SOA & BPM Partners
    Oracle Specialized partners have proven their skills by certifications and customer references. To find a local Specialized partner please visit http://solutions.oracle.com

    SOA & BPM Partner Community
    For regular information on Oracle SOA Suite become a member of the SOA & BPM Partner Community; for registration please visit www.oracle.com/goto/emea/soa (OPN account required). If you need support with your account please contact the Oracle Partner Business Center. Blog Twitter LinkedIn Mix Forum

    Technorati Tags: Markus Zirn, SOA Symposium, Thomas Erl, SOA Community, Oracle SOA, Oracle BPM, BPM Community, OPN, Jürgen Kress

    Read the article

  • Open Source Survey: Oracle Products on Top

    - by trond-arne.undheim
    Oracle continues to work with the open source community to bring the most innovative and productive software to market (more). Oracle products received the most votes in several key categories of the 2010 Linux Journal Reader's Choice Awards. With over 12,000 technologists reporting, these products earned top spots:

    Best Office Suite: OpenOffice.org
    Best Single Office Program: OpenOffice.org Writer
    Best Database: MySQL
    Best Virtualization Solution: VirtualBox

    "As the leading open source technology and service provider, Oracle continues to work with the community stakeholders to rapidly innovate many open source products for use in fully tested production environments," says Edward Screven, Oracle's chief corporate architect. "Supporting open source is important to Oracle and our customers, and we continue to invest in it." According to a recent report by the Linux Foundation, Oracle is one of the top ten contributors to the Linux kernel. Oracle also contributes millions of lines of code to these important projects:

    OpenJDK: 7,002,579
    Eclipse: 1,800,000 (#3 in active committers)
    MySQL: 5,073,113
    NetBeans: 7,870,446
    JSF: 701,980
    Apache MyFaces Trinidad: 1,316,840
    Hudson: 1,209,779
    OpenOffice.org: 7,500,000

    Read the article

  • Unlock the Full Value of Oracle CRM On Demand

    - by charles.knapp
    Register for this live Oracle CRM On Demand Virtual Community Session! Oracle CRM On Demand delivers the most complete On Demand CRM available anywhere. But how can you ensure you are getting maximum value from the many powerful features that Oracle CRM On Demand offers? Join our interactive Oracle CRM On Demand Virtual Community Session on Tuesday, January 11, 2011 from 10-11 a.m. PT / 1-2 p.m. ET to get expert advice and discuss the best ways to unlock the full potential of Oracle CRM On Demand with Mike Lairson, author of 'Oracle CRM On Demand Reporting'.

    Book Offer: Send your Oracle CRM On Demand configuration ideas before the webcast to [email protected] and you could win a free copy of 'Oracle CRM On Demand Reporting' by Mike Lairson. Learn more and register now!

    Read the article

  • Access and Manage Your Ubuntu One Account in Chrome and Iron

    - by Asian Angel
    Do you have an Ubuntu One account that you access across different operating systems? Whether you are using Ubuntu, a different flavor of Linux, Windows, or Mac, the Ubuntu One web app makes it easy to access and manage your Ubuntu One account in just moments. The Ubuntu One web app will definitely be useful if you find yourself away from your favorite Ubuntu computer but need to get important files uploaded to your account. Ubuntu One [Chrome Web Store]

    Read the article

  • Running a Silverlight application in the Google App Engine platform

    - by rajbk
    This post shows you how to host a Silverlight application on the Google App Engine (GAE) platform. You deploy and host your Silverlight application on Google’s infrastructure by creating a configuration file and uploading it along with your application files. I tested this by uploading an old demo of mine - the four stroke engine Silverlight demo. It is currently being served by GAE here: http://fourstrokeengine.appspot.com/ The steps to run your Silverlight application in GAE are as follows:

    Account Creation
    Create an account at http://appengine.google.com/. You are allocated a free quota at signup. Select “Create an Application”, verify your account by SMS, and create your application by clicking on “Create an Application”. Pick an application identifier on the next screen. The identifier has to be unique. You will use this identifier when uploading your application. The application you create will by default be accessible at [applicationidentifier].appspot.com. You can also use custom domains if needed (refer to the docs). Save your application.

    Download SDK
    We will use the Windows Launcher for Google App Engine tool to upload our apps (it is possible to do the same through the command line). This is a GUI for creating, running and deploying applications. The launcher lets you test the app locally before deploying it to GAE. This tool is available in the Google App Engine SDK. The GUI is written in Python and therefore needs an installation of Python to run. Download and install the Python binaries from here: http://www.python.org/download/ Download and install the Google App Engine SDK from here: http://code.google.com/appengine/downloads.html Run the GAE Launcher and select Create New Application. On the next dialog, give your application a name (this must match the identifier we created earlier). For Parent Directory, point to the directory containing your Silverlight files. Change the port if you want to; the port is used by the GAE local web server, which is started if you choose to run the application locally for testing purposes. Hit Save.

    Configure, Test and Upload
    The files I am interested in uploading for my Silverlight demo app are: the html page used to host the Silverlight control, the xap file containing the compiled Silverlight application, and a favicon.ico file. We now create a configuration file for our application called app.yaml. The app.yaml file specifies how URL paths correspond to request handlers and static files. We edit the file by selecting our app in the GUI and clicking “Edit”. The contents of the file after editing are shown below (note that the contents of the file should be plain text):

      application: fourstrokeengine
      version: 1
      runtime: python
      api_version: 1

      handlers:
      - url: /
        static_files: Default.html
        upload: Default.html
      - url: /favicon.ico
        static_files: favicon.ico
        upload: favicon.ico
      - url: /FourStrokeEngine.xap
        static_files: FourStrokeEngine.xap
        upload: FourStrokeEngine.xap
        mime_type: application/x-silverlight-app
      - url: /.*
        static_files: Default.html
        upload: Default.html

    We have listed URL patterns for our files, specified them as static files and specified a mime type for our xap file. The wild card URL at the end will match all URLs that are not found to our default page (you would normally include a html file that displays a 404 message). To understand more about app.yaml, refer to this page. Save the file. Run the application locally by selecting “Browse” in the GUI.
A web server listening on the port you specified is started (8080 in my case). The app is loaded in your default web browser pointing to http://localhost:8080/. Make sure the application works as expected. We are now ready to deploy. Click the “Deploy” icon. You will be prompted for your username and password. Hit OK. The files will get uploaded and you should get a dialog telling you to “close the window”. We are done uploading our Silverlight application. Go to http://appengine.google.com/ and launch the application by clicking on the link in the “Current Version” column.   You should be taken to a URL which points to your application running in Google’s infrastructure : http://fourstrokeengine.appspot.com/. We are done deploying our application! Clicking on the link in the Application column will take you to the Admin console where you can see stats related to system usage.  To learn more about the Google Application Engine, go here: http://code.google.com/appengine/docs/whatisgoogleappengine.html

    Read the article

  • Moesion Webinar: Managing BizTalk Server from your Smartphone or Tablet Without Upsetting your Boss

    - by gsusx
    BizTalkers, This Thursday we will be hosting a webinar to highlight how to use Moesion to manage your BizTalk Server environment from your mobile device. We will walk through the complete feature set of Moesion HTML5 BizTalk management console as well as complementary features of the Moesion platform that can be used to manage your BizTalk environment from your mobile device. More importantly, if you are a BizTalk developer or IT Pro we REALLY REALLY REALLY would love to get your feedback about the...(read more)

    Read the article

  • Java EE 7 interview @ InfoQ

    - by alexismp
    Anil Gaur, the head of Java EE and GlassFish at Oracle, was recently interviewed by InfoQ on the progress and scope of Java EE 7. Make sure you read it for an up-to-date status. It turns out that the final release of Java EE 7 is now slated for late Q1 / early Q2 2013, with most specs expected to be at the Public Review stage by this summer. In addition to the improvements to the JMS, JAX-RS, JPA and JSF specifications, Anil also covers the new JSON and WebSockets JSRs and gives a complete overview of the Java EE platform as it is shaping up with PaaS as its main theme.

    Read the article

  • Simultaneously calling multiple methods on a WCF service from silverlight

    - by ola karlsson
    A while back I had to debug some performance issues in an existing Silverlight app. As the problem / solution was a bit obscure and finding info about it was quite tricky, I thought I’d share - maybe it can help the next person with this problem.

    The App
    On start, the app would do a number of calls to different methods on a WCF service, this to populate the UI with the necessary data. Recently one of those services had been changed and was now taking quite a bit longer than it used to. This was resulting in quite a long loading time for the whole UI, which was set up so it wouldn’t let the user interact with anything until all the service calls had finished. First I broke out the longer running service call from the others, then removed the constraint that it had to be loaded for the UI in general to become responsive. I also added a loading indicator just on that area of the UI, thinking that the main UI would load while this particular section could keep loading independently.

    The Problem
    However, this is where things started to get a bit strange. I found that even after these changes, the main UI wouldn’t activate until the long running call returned. So now I did what I should have done to start with: I got Fiddler out and had a look at what was really happening. What I found was that, once the call to the long running service method was placed, all subsequent calls were waiting for that one to return before executing. Not having really worked with WCF previously or knowing much about it in general, I was stumped… I knew of the issues where Silverlight is restricted by the browser's networking features in regards to the number of simultaneous connections etc. However, that just didn’t seem to be the issue here; you can clearly see in Fiddler that there are numerous calls, but they’re just not returning. I thought the problem might be in the WCF service, but the calls were really not that complicated and surely the service should be able to handle a lot more than what I was throwing at it! So I did what every developer does in this type of scenario: I hit the search engines. I did a whole bunch of searching on things like “multiple simultaneous WCF calls from Silverlight” and “Calling long running WCF services from Silverlight” etc. This, however, pretty much got me nowhere. I found a whole heap of resources on how to do WCF calls from Silverlight but most of them were very basic and of no use whatsoever.

    The fog is clearing
    It wasn’t until I came across the term “WCF blocking calls” and started incorporating that in my searches that I started to get somewhere. Those searches quite quickly brought me to the following thread in the Silverlight forum, “Long-running WCF call blocking subsequent calls”, which discussed the exact problem I was facing and, best of all, one of the guys there had the solution! The short answer is in the forum post, and the guy answering has also written a more extensive blog post about it called “Silverlight, WCF, and ASP.Net Configuration Gotchas” which covers it very well. So come on, what’s the solution?! I hear you ask, unless you’ve already gone to the links and looked it up ;)

    The Solution
    Well, it turns out that the issue is founded in a mix of Silverlight, Asp.Net and WCF. Basically, if you’re doing multiple calls to a single WCF web service and you have Asp.Net session state enabled, the calls will be executed sequentially by the service, hence any long running calls will block subsequent ones. So why is Asp.Net session state affecting us - we’re working in Silverlight, right? Well, as mentioned earlier, by default Silverlight uses the browser's networking stack when doing service calls, hence to the WCF service the call looks like it might as well be coming from a normal Asp.Net page. To get around this, we look to a feature introduced in Silverlight 3, namely the Client HTTP Stack.

    The Client HTTP Stack to the rescue
    By using the following syntax (for example in our App.xaml.cs, Application_Startup method) WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp); we can set our Silverlight application to use the Client HTTP Stack, which incidentally solves our problem! By using Silverlight's own networking stack, rather than that of the browser, we get around the Asp.Net - WCF session state issue. The above code specifies that all calls to addresses starting with “http://” should go through the client stack; this can actually be set more granularly and you can specify it to be used only for certain domains etc.

    Summary
    The actual solution is well covered in the forum and blog posts I link to above. This post is more about sharing my experience, hopefully helping to spread the word about this and maybe making it a bit easier for the next poor guy with this issue to find the solution. Until next time, Ola
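    To make the registration step concrete, here is a minimal sketch of where that one line might live, following the default Silverlight project template; the class, the startup handler and the MainPage reference are illustrative, only the RegisterPrefix line itself comes from the post above.

      using System.Net;
      using System.Net.Browser;
      using System.Windows;

      public partial class App : Application
      {
          private void Application_Startup(object sender, StartupEventArgs e)
          {
              // Route all http:// requests through Silverlight's client HTTP stack
              // instead of the browser stack, so long-running WCF calls no longer
              // block subsequent calls behind the ASP.NET session lock.
              WebRequest.RegisterPrefix("http://", WebRequestCreator.ClientHttp);

              this.RootVisual = new MainPage();
          }
      }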

    Read the article

  • Squeezing hardware

    - by [email protected]
    It's very common that high availability means duplicate hardware, so costs go up. Nowadays, CIOs and DBAs face the challenge of reducing the money spent while increasing performance and availability. Since Grid Infrastructure 11gR2, there is a new feature that helps them meet this challenge: Server Pools.

    In Grid Infrastructure 11gR2 you can define server pools across the cluster, setting the minimum number of servers, the maximum, and how important the pool is. For example, consider that "Velasco, Boixeda & co" has 3 apps in a 6-server cluster:

    1. The first is the main core-business app
    2. The second is a mid-range app
    3. The third is a database that is not very important

    We define the following resource requirements for the expected workload:

    1. The main app requires 2 servers
    2. The mid-range app requires 1 server
    3. The third app is not required in case of disaster

    Then we define 3 server pools across the cluster:

    1. Main pool: min two servers, max three servers, importance four
    2. Mid pool: min one server, max two servers, importance two
    3. Test pool: min zero servers, max one server, importance one

    So the initial configuration is:

    - Main pool has three servers
    - Mid pool has two servers
    - Test pool has one server

    Logically, we can see the cluster like this: [diagram omitted in this excerpt]

    If any server fails, the following algorithm is applied to decide which pool gives up a server:

    1. The server pool of least importance loses a server first
    2. If server pools are of the same importance, the server pool that has more than its defined minimum number of servers is chosen

    Hope it helps

    Read the article

  • The new Auto Scaling Service in Windows Azure

    - by shiju
    One of the key features of the cloud is on-demand scalability, which lets cloud application developers scale the number of compute resources hosted in the cloud up or down. Auto Scaling provides the capability to dynamically scale your compute resources up and down based on user-defined policies, Key Performance Indicators (KPIs), health status checks, and schedules, without any manual intervention. Auto Scaling is an important feature to consider when designing and architecting cloud based solutions: it can unleash the real power of the cloud for apps that need truly on-demand scalability, and it can also guard the organizational budget for cloud based application deployment. In the past, you had to leverage the Microsoft Enterprise Library Autoscaling Application Block (WASABi) or a service like MetricsHub to implement automatic scaling for your cloud apps hosted on Windows Azure. WASABi required you to host the auto scaling block in a Windows Azure Worker Role to apply auto scaling behaviour to your Windows Azure apps. The newly announced Auto Scaling service in Windows Azure lets you add automatic scaling capability to your Windows Azure compute services such as Cloud Services, Web Sites and Virtual Machines. Unlike WASABi hosted in a Worker Role, you don't need to host any monitoring service to use the new Auto Scaling service, and the Auto Scaling service is available to individual Windows Azure compute services as part of scaling.

    Configure Auto Scaling for a Windows Azure Cloud Service
    Currently the Auto Scaling service supports Cloud Services, Web Sites and Virtual Machines. In this demo, I will use a Cloud Services app with a Web Role and a Worker Role. To enable Auto Scaling, select your Windows Azure app in the Windows Azure management portal and choose the "SCALE" tab. The Scale tab shows all information regarding Auto Scaling. The image below shows that we have currently disabled the AutoScale service. To enable Auto Scaling, you need to choose either CPU or QUEUE. The QUEUE option is not available for Web Sites. The image below demonstrates how to configure Auto Scaling for a Web Role app based on CPU utilization. We have configured the web role app to run with 1 to 5 virtual machine instances based on CPU utilization in a range of 50 to 80%. If aggregate utilization goes above 80%, it will scale up instances, and it will scale down instances when utilization falls below 50%. The image below demonstrates how to configure Auto Scaling for a Worker Role app based on the messages added to a Windows Azure storage queue. We configured the worker role app to run with 1 to 3 virtual machine instances based on the queue messages added to the Windows Azure storage queue. Here we have specified a target of 2000 messages per machine. The image below shows the summary of Auto Scaling for the Cloud Service after configuring the auto scaling service.

    Summary
    Auto Scaling is an extremely important behaviour of cloud applications for providing on-demand scalability without any manual intervention. Windows Azure provides great support for enabling Auto Scaling for the apps deployed on the Windows Azure cloud platform. The new Auto Scaling service in Windows Azure lets you add automatic scaling capability to your Windows Azure compute services such as Cloud Services, Web Sites and Virtual Machines. In the new Auto Scaling service, you don't have to host any monitoring service as you had to with the WASABi block. The Auto Scaling service is an excellent alternative to manually hosting the WASABi block in a Worker Role app.
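    The queue-based rule described above simply counts messages sitting in a Windows Azure storage queue. For context, here is a minimal sketch of how a web tier might enqueue the work items that drive that rule, using the Windows Azure Storage client library; the queue name, connection string and message payload are illustrative, not taken from the post.

      using Microsoft.WindowsAzure.Storage;
      using Microsoft.WindowsAzure.Storage.Queue;

      // Illustrative: "workitems" is the queue the scale rule would be pointed at.
      var account = CloudStorageAccount.Parse(storageConnectionString);
      var queueClient = account.CreateCloudQueueClient();
      var queue = queueClient.GetQueueReference("workitems");
      queue.CreateIfNotExists();

      // Every message enqueued here counts toward the "target messages per machine"
      // threshold configured in the portal, so a growing backlog adds worker instances.
      queue.AddMessage(new CloudQueueMessage("process-order:12345"));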

    Read the article

  • Migration & Modernization: Windows/VB6 Apps to ASP.NET HTML5

    - by Visual WebGui
    I would like to invite you to a webinar we are doing in collaboration with Jeffrey S. Hammond, Principal Analyst serving Application Development & Delivery Professionals at Forrester Research. The webinar is free and it will introduce the substantial changes brought on by the move to web applications and open web architectures, and the challenges they place on application development shops. We’ll also introduce how we at Gizmox are helping clients navigate this mobile shift and evolve existing...(read more)

    Read the article

  • Azure Flavor for the Sharepoint Media Component

    - by spano
    Some time ago I wrote about a Media Processing Component for Sharepoint that I was working on. It is a Media Assets list for Sharepoint that lets you choose where to store the blob files. It provides also intelligence for encoding videos, generating thumbnail and poster images, obtaining media metadata, etc. On that first post the component was explained in detail, with the original 3 storage flavors: Sharepoint list, Virtual Directoy or FTP. The storage manager is extensible, so a new flavor was...(read more)

    Read the article

  • Where is the SQL Azure Development Environment

    - by BuckWoody
    Recently I posted an entry explaining that you can develop in Windows Azure without having to connect to the main service on the Internet, using the Software Development Kit (SDK) which installs two emulators - one for compute and the other for storage. That brought up the question of the same kind of thing for SQL Azure. The short answer is that there isn’t one. While we’ll make the development experience for all versions of SQL Server, including SQL Azure, easier to write against, you can simply treat it as another edition of SQL Server. For instance, many of us use the SQL Server Developer Edition - which in versions up to 2008 is actually the Enterprise Edition - to develop our code. We might write that code against all kinds of environments, from SQL Express through Enterprise Edition. We know which features work on a certain edition, what T-SQL it supports and so on, and develop accordingly. We then test on the actual platform to ensure the code runs as expected. You can simply fold SQL Azure into that same development process. When you’re ready to deploy, if you’re using SQL Server Management Studio 2008 R2 or higher, you can script out the database when you’re done as a SQL Azure script (with change notifications where needed) by selecting the right “Engine Type” on the scripting panel. (Thanks to David Robinson for pointing this out and my co-worker Rick Shahid for the screen-shot - saved me firing up a VM this morning!) Will all this change? Will SSMS, “Data Dude” and other tools change to include SQL Azure? Well, I don’t have a specific roadmap for those tools, but we’re making big investments in Windows Azure and SQL Azure, so I can say that as time goes on, it will get easier. For now, make sure you know what features are and are not included in SQL Azure, and what T-SQL is supported. Here are a couple of references to help:

    General Guidelines and Limitations: http://msdn.microsoft.com/en-us/library/ee336245.aspx
    Transact-SQL Supported by SQL Azure: http://msdn.microsoft.com/en-us/library/ee336250.aspx
    SQL Azure Learning Plan: http://blogs.msdn.com/b/buckwoody/archive/2010/12/13/windows-azure-learning-plan-sql-azure.aspx
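    As a small illustration of treating SQL Azure as "another edition", the same ADO.NET code used against an on-premises server works against SQL Azure once the connection string follows the SQL Azure conventions. The server, database and credentials below are placeholders; the tcp: prefix, the user@server login form and Encrypt=True reflect the documented SQL Azure connection requirements.

      using System;
      using System.Data.SqlClient;

      // Placeholder values - substitute your own server, database and credentials.
      var connectionString =
          "Server=tcp:myserver.database.windows.net,1433;" +
          "Database=mydb;User ID=myuser@myserver;Password=...;" +
          "Encrypt=True;TrustServerCertificate=False;";

      using (var connection = new SqlConnection(connectionString))
      using (var command = new SqlCommand("SELECT name FROM sys.objects", connection))
      {
          connection.Open();
          using (var reader = command.ExecuteReader())
          {
              while (reader.Read())
              {
                  Console.WriteLine(reader.GetString(0));
              }
          }
      }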

    Read the article

  • Windows Azure Learning Plan - Compute

    - by BuckWoody
    This is one in a series of posts on a Windows Azure Learning Plan. You can find the main post here. This one deals with the "compute" function of Windows Azure, which includes Configuration Files, the Web Role, the Worker Role, and the VM Role. There is a general programming guide for Windows Azure that you can find here to help with the overall process.

    Configuration Files - Configuration Files define the environment for a Windows Azure application, similar to an ASP.NET application. This section explains how to work with them.
      General Introduction and Overview: http://blogs.itmentors.com/bill/2009/11/04/configuration-files-and-windows-azure/
      Service Definition File Schema: http://msdn.microsoft.com/en-us/library/ee758711.aspx
      Service Configuration File Schema: http://msdn.microsoft.com/en-us/library/ee758710.aspx

    Windows Azure Web Role - The Web Role runs code (such as ASP pages) that requires a User Interface.
      Web Role "Boot Camp" Video: https://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?culture=en-US&EventID=1032470854&CountryCode=US
      Web Role Deployment Checklist: http://blogs.infragistics.com/blogs/anton_staykov/archive/2010/06/30/windows-azure-web-role-deployment-checklist.aspx
      Using a Web Role as a Worker Role for Small Applications: http://www.31a2ba2a-b718-11dc-8314-0800200c9a66.com/2010/12/how-to-combine-worker-and-web-role-in.html

    Windows Azure Worker Role - The Worker Role is used for code that does not require a direct User Interface.
      Worker Role "Boot Camp" Video: https://msevents.microsoft.com/CUI/WebCastEventDetails.aspx?culture=en-US&EventID=1032470871&CountryCode=US
      Worker Role versus Web Roles: http://msdn.microsoft.com/en-us/library/gg433012.aspx
      Deploying other applications (like Java) in a Windows Azure Worker Role: http://blogs.msdn.com/b/mariok/archive/2011/01/05/deploying-java-applications-in-azure.aspx

    Windows Azure VM Role - The Windows Azure VM Role is an Operating System-level mechanism for code deployment.
      VM Role Overview and Details: http://msdn.microsoft.com/en-us/library/gg465398.aspx
      The proper use of the VM Role: http://blogs.msdn.com/b/buckwoody/archive/2010/12/28/the-proper-use-of-the-vm-role-in-windows-azure.aspx

    Read the article

  • Eloqua API Full Code Example in JAVA

    - by Shawn Spencer
    Is there anyone out there who has mastered to retrieve some data programmatically from Eloqua? First of all, I'm more or less a newbie, as far as JAVA. I can follow tutorials, take directions and will Google till my fingers bleed. I understand the basics and am slightly familiar with OOP. My main problem is that I have a Friday deadline (and tomorrow is Thanksgiving). At any rate, all the Eloqua code snippets (that I've been able to find) illustrate one aspect of a specific issue, and that's it. In my case, I would greatly appreciate a JAVA project of some sort, with all the necessary files to do web services (WSDL, SOAP and perhaps WSIT) and the main class and all that included. No, I don't want you to do my work for me! Just give me enough to find my way around, enter the information I need to retrieve and all that. I'll take it from there. Any pointers, links or suggestions?

    Read the article

  • How to Automatically Backup Your Gmail Attachments With IFTTT

    - by Mark Wilson
    When it comes to getting things done quickly, automation is the name of the game. We’ve looked at IFTTT before, and a new batch of updates has introduced a number of options that can be used to automatically do things with files that are sent to your Gmail address. What could this be used for? Well the most obvious starting point is to simply create a backup of any files that you receive via email. This is useful if you find that you often reach the size limit for your inbox as it enables you to delete emails without having to worry about losing the associated files. Start by paying a visit to the IFTTT website and then either sign into an existing account or create a new one.     

    Read the article

  • Practical Approaches to increasing Virtualization Density-Part 1

    - by Girish Venkat
    Happy New Year, everyone! Let me kick the year off by talking about virtualization density.

    What is it? The number of virtual servers that a physical server can support, and its increase over the prior physical infrastructure, expressed as a percentage.

    Why is it important? Because the density should be indicative of how well the server is being utilized.

    So what is wrong? Virtualization density fails to convey the "real usage" of a server. Most hypervisor-based OS virtualization evangelists take pride in the fact that they are now running a virtual server farm of X machines compared to a physical server farm of Y (with Y less than X, obviously). The real question is: has your utilization of the server really increased or not? In an internal study conducted by one of the top financial institutions, the utilization of servers only went up by 15 percentage points, from 30% to 45%. So this really means that just by increasing virtualization density one will not be achieving the goal of using the servers in the server farm better. I will write about the possible approaches to increase virtualization density in the next entry.

    Read the article
