Search Results

Search found 1914 results on 77 pages for 'platinum azure'.

Page 40 of 77

  • Need help configuring juju on Windows 8

    - by Claude Tyler McAdams
    I am trying to get Juju working on Windows 8, but I am running into errors when juju tries to read my SSH keys: C:\Users\username> juju bootstrap error: error parsing environment "azure": read C:\Users\user\SkyDrive\Documents\Azure\ssh\: The handle is invalid. I've added a public key, generated with PuTTY, to the directory above; the key file is called azure. My environments.yaml file has this in it: authorized-keys-path: C:\Users\user\SkyDrive\Documents\Azure\ssh\ Any ideas?
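    For what it's worth, the error suggests juju is trying to read the directory itself rather than a key file. Below is a sketch of an environments.yaml that points authorized-keys-path at the public-key file instead; the file name "azure" is taken from the question, and the rest of the layout is an assumption, not a confirmed fix.

    environments:
      azure:
        type: azure
        # Assumption: authorized-keys-path should reference the public-key file,
        # not the folder that contains it.
        authorized-keys-path: C:\Users\user\SkyDrive\Documents\Azure\ssh\azure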

    Read the article

  • Parsing error when bootstrapping on Windows

    - by Claude Tyler McAdams
    I am trying to get Juju working on Windows 8, but I am running into errors when juju tries to read my SSH keys: C:\Users\username> juju bootstrap error: error parsing environment "azure": read C:\Users\user\SkyDrive\Documents\Azure\ssh\: The handle is invalid. I've added a public key, generated with PuTTY, to the directory above; the key file is called azure. My environments.yaml file has this in it: authorized-keys-path: C:\Users\user\SkyDrive\Documents\Azure\ssh\ Any ideas?

    Read the article

  • ASP.NET: Invalid temp directory in chart handler configuration [c:\TempImageFiles\]

    - by veda
    I am getting the error "Invalid temp directory in chart handler configuration [c:\TempImageFiles\]." while running my code. Initially I was getting a "No http handler was found for request type 'GET'" error, which I solved by referring to the "no http handler" question. But now I am getting the above error. The details of the error are:

    Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.

    Exception Details: System.IO.DirectoryNotFoundException: Invalid temp directory in chart handler configuration [c:\TempImageFiles\].

    Source Error: An unhandled exception was generated during the execution of the current web request. Information regarding the origin and location of the exception can be identified using the exception stack trace below.

    The stack trace of this error:
    [DirectoryNotFoundException: Invalid temp directory in chart handler configuration [c:\TempImageFiles\].]
    System.Web.UI.DataVisualization.Charting.ChartHttpHandlerSettings.Inspect() +851
    System.Web.UI.DataVisualization.Charting.ChartHttpHandlerSettings.ParseParams(String parameters) +1759
    System.Web.UI.DataVisualization.Charting.ChartHttpHandlerSettings..ctor(String parameters) +619
    System.Web.UI.DataVisualization.Charting.ChartHttpHandler.InitializeParameters() +237
    System.Web.UI.DataVisualization.Charting.ChartHttpHandler.EnsureInitialized(Boolean hardCheck) +208
    System.Web.UI.DataVisualization.Charting.ChartHttpHandler.EnsureInstalled() +33
    System.Web.UI.DataVisualization.Charting.Chart.GetImageStorageMode() +57
    System.Web.UI.DataVisualization.Charting.Chart.Render(HtmlTextWriter writer) +257
    System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +144
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +583
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +91
    System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +410
    System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +118
    System.Web.UI.HtmlControls.HtmlForm.RenderChildren(HtmlTextWriter writer) +489
    System.Web.UI.HtmlControls.HtmlContainerControl.Render(HtmlTextWriter writer) +84
    System.Web.UI.HtmlControls.HtmlForm.Render(HtmlTextWriter output) +713
    System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +144
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +583
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +91
    System.Web.UI.HtmlControls.HtmlForm.RenderControl(HtmlTextWriter writer) +91
    System.Web.UI.Control.RenderChildrenInternal(HtmlTextWriter writer, ICollection children) +410
    System.Web.UI.Control.RenderChildren(HtmlTextWriter writer) +118
    System.Web.UI.Control.Render(HtmlTextWriter writer) +60
    System.Web.UI.Page.Render(HtmlTextWriter writer) +66
    System.Web.UI.Control.RenderControlInternal(HtmlTextWriter writer, ControlAdapter adapter) +144
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer, ControlAdapter adapter) +583
    System.Web.UI.Control.RenderControl(HtmlTextWriter writer) +91
    System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint) +7761

    Can anyone tell me how to solve this problem? Should I create the temporary directory manually, or what should I do?
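    A hedged note on the usual fix: the chart HTTP handler's temp directory has to exist and be writable by the application pool identity, or the handler can be told to keep rendered images in memory instead. The web.config appSettings sketch below shows both variants; the directory path is the one from the error message and the timeout value is only an example.

    <appSettings>
      <!-- Option 1: keep file storage; c:\TempImageFiles\ must exist and be writable
           by the application pool account. -->
      <add key="ChartImageHandler" value="storage=file;timeout=20;dir=c:\TempImageFiles\;" />
      <!-- Option 2: avoid the temp directory entirely by storing rendered charts in memory. -->
      <!-- <add key="ChartImageHandler" value="storage=memory;timeout=20;" /> -->
    </appSettings>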

    Read the article

  • Switching from one connection string to another when moving from development to cloud

    - by Nancy Walker
    Hello, I am working on a cloud application. When I test out the application on my computer I want to have my connection string set as follows in ServiceConfiguration.cscfg: <Setting name="DataConnectionString" value="UseDevelopmentStorage=true" /> When I publish to the cloud I need to have it set as follows: <Setting name="DataConnectionString" value="DefaultEndpointsProtocol=https;AccountName=xxxx;AccountKey=yyy" /> I keep going from one environment to the other and keep having to change the DataConnectionString. Is there a way that I can automate this? I looked around and can't see any examples but I'm sure some others have the same problem as me. Thanks, Nancy
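    One way to stop hand-editing the value (a sketch, not the only option): keep the cloud value in ServiceConfiguration.cscfg and fall back to development storage in code whenever the role environment is not available. The helper below is a minimal C# sketch under that assumption; the class and method names are illustrative. Newer versions of the Visual Studio Azure tools also support multiple service configurations (for example Local and Cloud .cscfg files), which achieves the same thing declaratively at publish time.

    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public static class StorageHelper
    {
        // Returns development storage for plain local runs and the configured
        // DataConnectionString when running in the emulator or in the cloud.
        public static CloudStorageAccount GetDataStorageAccount()
        {
            if (RoleEnvironment.IsAvailable)
            {
                var setting = RoleEnvironment.GetConfigurationSettingValue("DataConnectionString");
                return CloudStorageAccount.Parse(setting);
            }

            return CloudStorageAccount.DevelopmentStorageAccount;
        }
    }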

    Read the article

  • Management API - The request body XML was invalid or not correctly specified

    - by maartenba
    Cross posting from http://social.msdn.microsoft.com/Forums/en-US/windowsazuretroubleshooting/thread/31b6aedc-c069-4e32-8e8f-2ff4b7c30793 I'm getting this error on changing configuration through the service management API: The request body XML was invalid or not correctly specified The request body payload: <?xml version="1.0" encoding="utf-8"?> <ChangeConfiguration xmlns="http://schemas.microsoft.com/windowsazu re"> <Configuration>PD94bWwgdmVyc2lvbj0iMS4wIj8+CjxTZXJ2aWNlQ29uZmlndX JhdGlvbiB4bWxuczp4c2k9Imh0dHA6Ly93d3cudzMub3JnLzIwMDEvWE1MU2NoZW1hLWluc3RhbmNlIi B4bWxuczp4c2Q9Imh0dHA6Ly93d3cudzMub3JnLzIwMDEvWE1MU2NoZW1hIiB4bWxucz0iaHR0cDovL3 NjaGVtYXMubWljcm9zb2Z0LmNvbS9TZXJ2aWNlSG9zdGluZy8yMDA4LzEwL1NlcnZpY2VDb25maWd1cm F0aW9uIiBzZXJ2aWNlTmFtZT0iIiBvc0ZhbWlseT0iMSIgb3NWZXJzaW9uPSIqIj4KICA8Um9sZSBuYW 1lPSJXZWJSb2xlMSI+CiAgICA8Q29uZmlndXJhdGlvblNldHRpbmdzPgogICAgICA8U2V0dGluZyBuYW 1lPSJNaWNyb3NvZnQuV2luZG93c0F6dXJlLlBsdWdpbnMuRGlhZ25vc3RpY3MuQ29ubmVjdGlvblN0cm luZyIgdmFsdWU9IlVzZURldmVsb3BtZW50U3RvcmFnZT10cnVlIi8+CiAgICA8L0NvbmZpZ3VyYXRpb2 5TZXR0aW5ncz4KICAgIDxJbnN0YW5jZXMgY291bnQ9IjIiLz4KICAgIDxDZXJ0aWZpY2F0ZXMvPgogID wvUm9sZT4KPC9TZXJ2aWNlQ29uZmlndXJhdGlvbj4K</Configuration> </ChangeConfiguration> I'm passing it the following configuration: $configuration = '<?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" serviceName="" osFamily="1" osVersion="*"> <Role name="WebRole1"> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true"/> </ConfigurationSettings> <Instances count="2"/> <Certificates/> </Role> </ServiceConfiguration>'; Does anyone know why this error occurs? I suspect it has something to do with encoding but not sure.
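    As an aside, the <Configuration> element is just the service configuration XML, base64-encoded. A minimal PHP sketch of how that body might be assembled (assuming $configuration holds the UTF-8 XML shown above, with no byte-order mark) is below; it illustrates the payload shape rather than diagnosing the error.

    // Sketch only: build the ChangeConfiguration request body from a UTF-8,
    // BOM-free service configuration string.
    $body = '<?xml version="1.0" encoding="utf-8"?>'
          . '<ChangeConfiguration xmlns="http://schemas.microsoft.com/windowsazure">'
          . '<Configuration>' . base64_encode($configuration) . '</Configuration>'
          . '</ChangeConfiguration>';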

    Read the article

  • Migrating Databases from SQL Server 2008 to SQL Azure

    - by kaleidoscope
    1. Connect to SQL Azure through SSMS. (It will connect only if you have port 1433 open.)
    2. Create the required databases on SQL Azure.
    3. Create and execute schema scripts for the databases. (Make sure the scripts are 100% compatible with SQL Azure; a SQL Azure migration tool available on CodePlex helps with this.)
    4. Create and execute insert scripts for the databases. (You can use SSMS to generate insert scripts; see the sqlcmd sketch below for running them.)
    For further details refer to the Windows Azure Platform Kit Nov 2009.
    Anish
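    As a concrete illustration of steps 3 and 4, the scripts can be run against SQL Azure with sqlcmd over port 1433. The server, database, login and file names below are placeholders.

    REM Illustrative only: run the schema script, then the insert script, against SQL Azure.
    sqlcmd -S yourserver.database.windows.net -d YourDatabase -U youruser@yourserver -P yourpassword -i schema.sql
    sqlcmd -S yourserver.database.windows.net -d YourDatabase -U youruser@yourserver -P yourpassword -i inserts.sql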

    Read the article

  • TechEd 2014 Day 2

    - by John Paul Cook
    Today people asked me about backing up older versions of SQL Server to Azure. Older versions back to SQL Server 2005 can easily be backed up to Azure Storage by installing the Microsoft SQL Server Backup to Windows Azure Tool. It installs a service of the same name that applies rules to SQL Server backups. You can tell the tool to compress or encrypt your SQL Server backups, and you can have it automatically upload your backups to Azure Storage. Even if you don’t want to upload your backups to Azure, you might...(read more)
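    For context, as I understand the tool it does not change how you write the backup itself; a rule covering the backup path is what triggers the upload. A minimal T-SQL sketch (database name and path are placeholders):

    -- Illustrative only: an ordinary disk backup. If the Microsoft SQL Server Backup
    -- to Windows Azure Tool has a rule covering this path, its service picks the file
    -- up and sends it to Azure Storage (optionally compressed or encrypted).
    BACKUP DATABASE [YourDatabase]
    TO DISK = N'C:\Backups\YourDatabase.bak'
    WITH INIT, STATS = 10;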

    Read the article

  • Where does Rundeck execute local tasks from

    - by Leon Stafford
    I'm trying to interact with the Node.js Azure SDK from a CentOS installation of Rundeck. If I try from the "run" ad hoc virtual shell, I am able to, after running azure account import <mykey>, and I can then also execute other Azure commands inside of jobs if I set them as Rundeck node tasks and do not select "dispatch to nodes" in the job settings. Trying to run the Azure SDK commands as commands dispatched to the node (local) fails with the error: localhost1-NodeDispatch-localexec 04:53:04 /usr/bin/env: node: No such file or directory 04:53:04 Failed: NonZeroResultCode: Result code was 127 I am not able to "jumpstart" the same environment by running azure account import <mykey> I am assuming this is a permissions/environmental issue, though I'm not sure how to fix it. UPDATE: Executing whoami from the same job returns rundeck, so I assume I will need to either modify that to execute tasks as my system user, or grant permissions to get the rundeck user into the node environment the Azure SDK is running in?
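    A hedged suggestion: when Rundeck dispatches to the node it runs commands as the rundeck service account with a non-interactive environment, so node (which the azure CLI needs) is probably just not on that account's PATH. Two common workarounds are sketched below; the paths are assumptions for a typical CentOS layout.

    # Option 1: put node somewhere every account can find it.
    sudo ln -s "$(which node)" /usr/bin/node

    # Option 2: set the environment explicitly at the top of the job's script step,
    # including HOME so credentials imported with 'azure account import' are found.
    export PATH=$PATH:/usr/local/bin
    export HOME=/var/lib/rundeck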

    Read the article

  • Future of BizTalk

    - by Vamsi Krishna
    The future of BizTalk: the last TechEd conference was a very important one from a BizTalk perspective. Microsoft will continue innovating BizTalk and releasing BizTalk versions "for the next few years to come", so all the investments that clients have made so far will continue giving returns. There are three flavors of BizTalk: BizTalk on-premises, BizTalk as IaaS, and BizTalk as PaaS (Azure Service Bus).
    Windows Azure IaaS and How It Works (June 12, 2012, speaker: Corey Sanders) covers the significant investments that Microsoft is making in the Windows Azure Infrastructure-as-a-Service (IaaS) solution and how it works: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2012/AZR201
    TechEd provided two sessions around BizTalk futures:
    AZR207: Application Integration Futures - The Road Map and What's Next on Windows Azure: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2012/AZR207
    AZR211: Building Integration Solutions Using Microsoft BizTalk On-Premises and on Windows Azure: http://channel9.msdn.com/Events/TechEd/NorthAmerica/2012/AZR211
    Here are some highlights from the two sessions at TechEd. Bala Sriram, Director of Development for BizTalk, provided the introduction at the road map session and covered some key points around BizTalk:
    - More than 12,000 customers
    - 79% of BizTalk users have already adopted BizTalk Server 2010
    - BizTalk Server will be released for "years to come" after this release, i.e. there will be more releases of BizTalk after BizTalk 2010 R2
    - BizTalk 2010 R2 will be releasing 6 months after Windows 8
    - New Azure (cloud-based) BizTalk scenarios will be available in IaaS and PaaS
    - BizTalk Server on-premises, IaaS, and PaaS are complementary and designed to work together
    - BizTalk investments will be taken forward
    The second session was mainly around drilling into and demonstrating the upcoming capabilities in BizTalk Server on-premises, IaaS, and PaaS:
    - BizTalk IaaS: users will be able to bring their own or choose from an Azure IaaS template to quickly provision BizTalk Server virtual machines (VHDs).
    - BizTalk Server 2010 R2: native adapter to connect to Azure Service Bus Queues, Topics and Relay; native send port adapter for REST (demonstrated by connecting to Salesforce to get and update Salesforce information); the ESB Toolkit will be incorporated as part of the product and setup.
    - BizTalk PaaS: HTTP, FTP, Service Bus, and LOB application connectivity; XML and flat file processing; message properties; transformation, archiving, tracking; content-based routing.

    Read the article

  • CodePlex Daily Summary for Thursday, November 24, 2011

    CodePlex Daily Summary for Thursday, November 24, 2011Popular ReleasesASP.NET Comet Ajax Library (Reverse Ajax - Server Push): ASP.NET Reverse Ajax Samples: Chat, MVC Razor, DesktopClient, Reverse Ajax for VB.NET and C#Windows Azure SDK for PHP: Windows Azure SDK for PHP v4.0.5: INSTALLATION Windows Azure SDK for PHP requires no special installation steps. Simply download the SDK, extract it to the folder you would like to keep it in, and add the library directory to your PHP include_path. INSTALLATION VIA PEAR Maarten Balliauw provides an unofficial PEAR channel via http://www.pearplex.net. Here's how to use it: New installation: pear channel-discover pear.pearplex.net pear install pearplex/PHPAzure Or if you've already installed PHPAzure before: pear upgrade p...Anno 2070 Assistant: Beta v1.0 (STABLE): Anno 2070 Assistant Beta v1.0 Released! Features Included: Complete Building Layouts for Ecos, Tycoons & Techs Complete Production Chains for Ecos, Tycoons & Techs Completed Credits Screen Known Issues: Not all production chains and building layouts may be on the lists because they have not yet been discovered. However, data is still 99.9% complete. Currently the Supply & Demand, including Calculator screen are disabled until version 1.1.Minemapper: Minemapper v0.1.7: Including updated Minecraft Biome Extractor and mcmap to support the new Minecraft 1.0.0 release (new block types, etc).Metro Pandora: Metro Pandora SDK V1: For more information on this release please see Metro Pandora SDK Introduction. Supported platforms: Windows Phone 7 / Silverlight Windows 8 .Net 4.0, WPF, WinformsVisual Leak Detector for Visual C++ 2008/2010: v2.2.1: Enhancements: * strdup and _wcsdup functions support added. * Preliminary support for VS 11 added. Bugs Fixed: * Low performance after upgrading from VLD v2.1. * Memory leaks with static linking fixed (disabled calloc support). * Runtime error R6002 fixed because of wrong memory dump format. * version.h fixed in installer. * Some PVS studio warning fixed.NetSqlAzMan - .NET SQL Authorization Manager: 3.6.0.10: 3.6.0.10 22-Nov-2011 Update: Removed PreEmptive Platform integration (PreEmptive analytics) Removed all PreEmptive attributes Removed PreEmptive.dll assembly references from all projects Added first support to ADAM/AD LDS Thanks to PatBea. Work Item 9775: http://netsqlazman.codeplex.com/workitem/9775Developer Team Article System Management: DTASM v1.3: ?? ??? ???? 3 ????? ???? ???? ????? ??? : - ????? ?????? ????? ???? ?? ??? ???? ????? ?? ??? ? ?? ???? ?????? ???? ?? ???? ????? ?? . - ??? ?? ???? ????? ???? ????? ???? ???? ?? ????? , ?????? ????? ????? ?? ??? . - ??? ??????? ??? ??? ???? ?? ????? ????? ????? .VideoLan DotNet for WinForm, WPF & Silverlight 5: VideoLan DotNet for WinForm, WPF, SL5 - 2011.11.22: The new version contains Silverlight 5 library: Vlc.DotNet.Silverlight. A sample could be tested here The new version add and correct many features : Correction : Reinitialize some variables Deprecate : Logging API, since VLC 1.2 (08/20/2011) Add subitem in LocationMedia (for Youtube videos, ...) Update Wpf sample to use Youtube videos Many others correctionsSharePoint 2010 FBA Pack: SharePoint 2010 FBA Pack 1.2.0: Web parts are now fully customizable via html templates (Issue #323) FBA Pack is now completely localizable using resource files. Thank you David Chen for submitting the code as well as Chinese translations of the FBA Pack! 
The membership request web part now gives the option of having the user enter the password and removing the captcha (Issue # 447) The FBA Pack will now work in a zone that does not have FBA enabled (Another zone must have FBA enabled, and the zone must contain the me...SharePoint 2010 Education Demo Project: Release SharePoint SP1 for Education Solutions: This release includes updates to the Content Packs for SharePoint SP1. All Content Packs have been updated to install successfully under SharePoint SP1SQL Monitor - managing sql server performance: SQLMon 4.1 alpha 6: 1. improved support for schema 2. added find reference when right click on object list 3. added object rename supportBugNET Issue Tracker: BugNET 0.9.126: First stable release of version 0.9. Upgrades from 0.8 are fully supported and upgrades to future releases will also be supported. This release is now compiled against the .NET 4.0 framework and is a requirement. Because of this the web.config has significantly changed. After upgrading, you will need to configure the authentication settings for user registration and anonymous access again. Please see our installation / upgrade instructions for more details: http://wiki.bugnetproject.c...Free SharePoint 2010 Sites Templates: SharePoint Server 2010 Sites Templates: here is the list of sites templates to be downloadedVsTortoise - a TortoiseSVN add-in for Microsoft Visual Studio: VsTortoise Build 30 Beta: Note: This release does not work with custom VsTortoise toolbars. These get removed every time when you shutdown Visual Studio. (#7940) Build 30 (beta)New: Support for TortoiseSVN 1.7 added. (the download contains both setups, for TortoiseSVN 1.6 and 1.7) New: OpenModifiedDocumentDialog displays conflicted files now. New: OpenModifiedDocument allows to group items by changelist now. Fix: OpenModifiedDocumentDialog caused Visual Studio 2010 to freeze sometimes. Fix: The installer didn...nopCommerce. Open source shopping cart (ASP.NET MVC): nopcommerce 2.30: Highlight features & improvements: • Performance optimization. • Back in stock notifications. • Product special price support. • Catalog mode (based on customer role) To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).WPF Converters: WPF Converters V1.2.0.0: support for enumerations, value types, and reference types in the expression converter's equality operators the expression converter now handles DependencyProperty.UnsetValue as argument values correctly (#4062) StyleCop conformance (more or less)Json.NET: Json.NET 4.0 Release 4: Change - JsonTextReader.Culture is now CultureInfo.InvariantCulture by default Change - KeyValurPairConverter no longer cares about the order of the key and value properties Change - Time zone conversions now use new TimeZoneInfo instead of TimeZone Fix - Fixed boolean values sometimes being capitalized when converting to XML Fix - Fixed error when deserializing ConcurrentDictionary Fix - Fixed serializing some Uris returning the incorrect value Fix - Fixed occasional error when...Media Companion: MC 3.423b Weekly: Ensure .NET 4.0 Full Framework is installed. (Available from http://www.microsoft.com/download/en/details.aspx?id=17718) Ensure the NFO ID fix is applied when transitioning from versions prior to 3.416b. (Details here) Replaced 'Rebuild' with 'Refresh' throughout entire code. Rebuild will now be known as Refresh. mc_com.exe has been fully updated TV Show Resolutions... 
Resolved issue #206 - having to hit save twice when updating runtime manually Shrunk cache size and lowered loading times f...ASP.net Awesome jQuery Ajax Controls Samples and Tutorials: 1.0 samples: Demos and Tutorials for ASP.net Awesome VS2008 are in .NET 3.5 VS2010 are in .NET 4.0 (demos for the ASP.net Awesome jQuery Ajax Controls)New Projects"My" Search SharePoint 2010 WebParts: Solution consists of two SharePoint 2010 web parts inheriting from core results web part. The "My Core Search Results" webpart and the "Sortable Results" web part.Alerta Mensagens: Projeto de alerta de mensagens.Analyzer GT3000: VU MIF PS1 "Utopine Amnezija" PSI projektasAutoReservation: Miniprojekt an der HSR im Fach MsTechContestMeter: Program for measuring participant scores in programming contests.CountriesWFA: oby ostatniDotNet.Framework.Common: <DotNet.Framework.Common> ASP.NET?????? Esaan Windows Phone 7 Application Compitition: Resource for Esaan Windows Phone 7 Application Excel add-in to enable users View Azure Storage Tables: An Excel 2007 add-in using Azure Storage REST APIs (no Azure libraries required) and Excel custom task pane. Uses of such Office 2007 add-in: 1. Read data from Azure storage, populate that in excel and use it by sorting (not directly available in Azure storage) and so on. 2. Write Sorted data back (in sorted order) back to Azure storage. 3. Have data used frequently (such as dictionary of legal terms, or math formulae, and so on) on Azure storage and let your Office Excel users use th...fastconv: fastconv, a fast charset encoding convertor.gaosu: this is a gaosu projectGems: Gossip-enabled monitoring system.GridIT: GridIT is a Puzzle written in Visual Basic.Habanero Faces: A set of user interface libraries for use with Habanero Core used to build Windows Forms or Visual WebGUI user interfaces for your business objects. HomeSite: HomeSiteHTML to docx Converter: This converts HTML into Word documents (docx format). The code is written in PHP and works with PHPWord.Image Curator: image curation programjLemon: jLemon is an LALR(1) parser generator. Lemon is similar to the much more famous programs "YACC", "BISON" and "LEMON". jLemon is a pure Java program and compatible with "LEMON".jngsoftware: JNG Computer ServicesKinectoid: Kinectoid is a Kinect based pong game based on Neat Game Engine.Linq Accelerator: coming soon!Linq to SSRS (SQL Server Reporting Services): Linq to SSRS allows you as developer generate objects and map them to the repots on SQL Server Reporting Services in linq to sql fashion using lambda expressions and linq notation.MicroBitmap - Image Compressor: MicroBitmap is a image compressor which is different from all other compressors This compressor is focussed at compressing the bitmap and making it so small as possible so fast as possible You can find more information in the source codeOMX_AL_test_environment: OMX application layer simulation/testing environmentShuriken Plugins: Shuriken Plugins is a project for developing plugins for Shuriken, the free launcher app on Windows. Simple Command Line Backup Tool: Simple Command Line Backup Tool automates the backup of files from the command line for use in batch files. Includes file type filters/ recursive/non recursive and number days to keep backups for. Right now this is simply a quick app I created for a client of mine that needed a simple backup utility to go with a product I delivered. 
.NET 2.0 C# Console Application Hopefully this will morph into a useful utility with features not common in other command line backup utilities.SomethingSpacial: Initially designed via Interact Designer Ariel (http://www.facingblend.com/) Something Spacial as a design was used to help give some look and feel for the Silverlight User Group Starter Kit and then finally used as part of the Seattle Silverlight User Group Community Site.SqlServer Packer: SqlServer???????????????????,?????????。TVDBMetaData: Construct TV Series and Episode XML metadata from TheTvDB database.Ukázkové projekty: Obsahuje ukázkové projekty uživatele TenCoKaciStromy.Uzing Inklude: UI = Uzing + Inklude Uzing : Utility classes for .Net written in C# Inklude : Utility classes written in native C++ It supports very low level communications, but in very simple way. Current Capability - TCP, UDP, Thread in C# & C++ - String in C++ - Shared memory in C# - Communication between C# and C++ via shared memory Dependency: - TBB (http://threadingbuildingblocks.org/) - Boost (http://www.boost.org/) Even though it depends on such heavy libs, end developer does...Visual Studio Project Converter: This project is inspired by the original source code provided by Emmet Gray to convert Visual Studio solution and project files from one version to another (both backwards and forwards). This project has been converted to C# and updated to support new features.WarrantyPrint: WarrantyPrintWordPress???? on Windows Azure: WordPress?????Windows Azure Platform????????。 ???????SQL Azure??????。WPFResumeVideo: Play video on any computer, and resume where you left offWyvern's Depot: Personal code repository.

    Read the article

  • SQLOS and Cloud Infrastructure sessions at PASS Summit 2012

    - by SQLOS Team
    The SQL Pass Summit 2012, the largest yet, is in full swing. Here's a summary of the sessions this week on cloud infrastructure and SQLOS topics. Some of these were today, and you can catch the recordings. One more session takes place on Friday covering SQL Server solution patterns in Windows Azure VMs... Also, catch Thursday's keynote with Quentin Clark which will feature a cool IaaS demo!   SQL Server in Windows Azure VM Sessions CLD-309-A SQLCAT: Best Practices and Lessons Learned on SQL Server in an Azure VM Steve Howard, Arvind Ranasaria - Wednesday 11/6 10:15 This session looked at some best practices to optimize Networking, Memory, Disk IO and high availability based on lessons learned during SQLCat work with customer deployments. Well worth catching the recording.   SQL Server in Azure VM patterns: Hybrid Disaster Recovery, data movement and BI Guy Bowerman, Peter Saddow, Michael Washam, Ross LoForte - Friday 11/9 9:45 Rm 613 [Note: In the guides this has an outdated title.] This session has a focus on SQL Server Azure VM solutions. Starting with the basics and then going deeper into: - New features in the Microsoft Assessment and Planning Toolkit 8.0 to help plan and size SQL VM migrations.- A Look at a Windows Azure VM SQL Server app making use of load balancing and SQL Server high availability features.- A BI case study running SQL BI components in Azure VMs and making use of Windows 8 tiles.- A training class in a VM case study.   SQLOS Sessions DBA-500-HD Inside SQLOS 2012 (half-day session) Bob Ward - Wednesday 11/6 1:30pm Bob Ward from CSS applies his wealth of experience to look at the internals of SQLOS and what's changed in the various SQL 2012 components, including memory, resource governor, scheduler.   DBA-403-M: SQLCAT: Memory Manager Changes in SQL Server 2012 Gus Apostol, Jerome Halmans - 1:30pm Covers the redesigned SQLOS memory manager in SQL Server 2012 including the new page allocator for any size pages (and all that implies), DMVs, demo's. Not sure why this was placed at the same time as the SQLOS half-day session, but since it's recorded it's available for catch-up.   - Guy   Originally posted at http://blogs.msdn.com/b/sqlosteam/

    Read the article

  • ‘Publish…’ Resulting in Directory With No Files

    - by ToStringTheory
    I was pulling my hair out with this one…  Which isn’t good considering I have so little of it left!  I had just upgraded to the Windows Azure 1.7 SDK the day before with no problems, and used the upgraded ‘Publish…’ dialog to successfully publish a website to my hard disk for hosting on an internal development server.  However, when trying to deploy another project to my file system, it said it was successful, but there were no files in the directory.  The only difference, the first project was an Azure project, the second was a standard ASP.Net Web Application.  If you installed the Windows Azure 1.7 SDK, you may want to read this. The Problem At first it appears that there is no problem: However you may remember that when publishing a web application, the output window will generally iterate through each of the directories as it copies the files from that directory over.  Sure enough, when looking at the output directory – there are no files, no bin directory, no nothing… Troubleshooting Since one site published and the other did not, I believed that the failure may have been to a failed SQL Server 2012 installation that happened between publish.  I rolled back the installation, however that did not work either.  I also checked the Configuration Manager dialog, and ensured that the projects were selected to actually build (just checking, even though the output said it built them..)  I checked the properties of the solution and the projects, and a selection of files in the project to make sure that they were selected for content…  Nothing seemed to work. I then decided to uninstall the Azure 1.7 SDK to see if that was the culprit.  When I opened the Windows 7 ‘Uninstall a Program’ dialog, I noticed that the Azure SDK came with 2 extra packages that just so happen to be in a Release Candidate state from Microsoft – ‘Microsoft Web Deploy 3.0’ and ‘Microsoft Web Publish – Visual Studio 2010’.  It dawned on me that the publish dialog must not be just for Azure, since it appeared when I tried to deploy the regular web application as well.  Therefore, it must have been an upgrade to the publish mechanism in Visual Studio.  I uninstalled both of the programs and received my old publish dialog once again, and was able to successfully publish the solution above as I had done before. After celebrating solving the problem, I tried reinstalling the Azure package, to see if it would repair the publishing process. Even though it brought back the updated dialogs, it did not publish any files. Instead of uninstalling and retreating, I now KNEW what the cause was, and these were packages not just for Azure. I now knew a product name to search for. The Solution Sure enough, with the correct search term in Google – ‘microsoft web publish no files’, and setting the timeline to 1 week, I found what I needed - Microsoft Connect - Publish Web Application FAILS! (by Andrew Rits). I am surprised that I missed something that ended up being so simple…  In the Configuration Manager, I had the following settings: This is how I had been building and debugging the solution always…  However, apparently when installing the new Web Publishing package, it does things a little differently in its configuration for publishing: You see the difference?  The configuration here is set to ‘x86’ instead of ‘Any CPU’.  Sure enough, as soon as I switched the configuration to ‘Release – Any CPU’, the deployment built and published all of my files as I expected. 
Conclusion It was a small change, but apparently the new ‘Publish web application’ defaults to the x86 configuration, thereby not copying any of the project/bin files to the publish target directory.  I spent forever trying things, but this small drop down eluded me until I was able to target that the dialog was actually working apparently, I just didn’t have the correct configuration. I hope that this saves you the hours of frustration and hastened hair loss that it caused me…  I also hope that before Microsoft brings this publishing package out of RC status, that they change the behavior of that menu to default to the settings of the old publish menu for the first time. Happy Coding!

    Read the article

  • Azure – Part 5 – Repository Pattern for Table Service

    - by Shaun
    In my last post I created a very simple WCF service with the user registration functionality. I created an entity for the user data and a DataContext class which provides some methods for operating the entities such as add, delete, etc. And in the service method I utilized it to add a new entity into the table service. But I didn’t have any validation before registering which is not acceptable in a real project. So in this post I would firstly add some validation before perform the data creation code and show how to use the LINQ for the table service.   LINQ to Table Service Since the table service utilizes ADO.NET Data Service to expose the data and the managed library of ADO.NET Data Service supports LINQ we can use it to deal with the data of the table service. Let me explain with my current example: I would like to ensure that when register a new user the email address should be unique. So I need to check the account entities in the table service before add. If you remembered, in my last post I mentioned that there’s a method in the TableServiceContext class – CreateQuery, which will create a IQueryable instance from a given type of entity. So here I would create a method under my AccountDataContext class to return the IQueryable<Account> which named Load. 1: public class AccountDataContext : TableServiceContext 2: { 3: private CloudStorageAccount _storageAccount; 4:  5: public AccountDataContext(CloudStorageAccount storageAccount) 6: : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials) 7: { 8: _storageAccount = storageAccount; 9:  10: var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri, 11: _storageAccount.Credentials); 12: tableStorage.CreateTableIfNotExist("Account"); 13: } 14:  15: public void Add(Account accountToAdd) 16: { 17: AddObject("Account", accountToAdd); 18: SaveChanges(); 19: } 20:  21: public IQueryable<Account> Load() 22: { 23: return CreateQuery<Account>("Account"); 24: } 25: } The method returns the IQueryable<Account> so that I can perform the LINQ operation on it. And back to my service class, I will use it to implement my validation. 1: public bool Register(string email, string password) 2: { 3: var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString"); 4: var accountToAdd = new Account(email, password) { DateCreated = DateTime.Now }; 5: var accountContext = new AccountDataContext(storageAccount); 6:  7: // validation 8: var accountNumber = accountContext.Load() 9: .Where(a => a.Email == accountToAdd.Email) 10: .Count(); 11: if (accountNumber > 0) 12: { 13: throw new ApplicationException(string.Format("Your account {0} had been used.", accountToAdd.Email)); 14: } 15:  16: // create entity 17: try 18: { 19: accountContext.Add(accountToAdd); 20: return true; 21: } 22: catch (Exception ex) 23: { 24: Trace.TraceInformation(ex.ToString()); 25: } 26: return false; 27: } I used the Load method to retrieve the IQueryable<Account> and use Where method to find the accounts those email address are the same as the one is being registered. If it has I through an exception back to the client side. Let’s run it and test from my simple client application. Oops! Looks like we encountered an unexpected exception. It said the “Count” is not support by the ADO.NET Data Service LINQ managed library. That is because the table storage managed library (aka. TableServiceContext) is based on the ADO.NET Data Service and it supports very limit LINQ operation. 
Although I didn’t find a full list or documentation about which LINQ methods it supports I could even refer a page on msdn here. It gives us a roughly summary of which query operation the ADO.NET Data Service managed library supports and which doesn't. As you see the Count method is not in the supported list. Not only the query operation, there inner lambda expression in the Where method are limited when using the ADO.NET Data Service managed library as well. For example if you added (a => !a.DateDeleted.HasValue) in the Where method to exclude those deleted account it will raised an exception said "Invalid Input". Based on my experience you should always use the simple comparison (such as ==, >, <=, etc.) on the simple members (such as string, integer, etc.) and do not use any shortcut methods (such as string.Compare, string.IsNullOrEmpty etc.). 1: // validation 2: var accountNumber = accountContext.Load() 3: .Where(a => a.Email == accountToAdd.Email) 4: .ToList() 5: .Count; 6: if (accountNumber > 0) 7: { 8: throw new ApplicationException(string.Format("Your account {0} had been used.", accountToAdd.Email)); 9: } We changed the a bit and try again. Since I had created an account with my mail address so this time it gave me an exception said that the email had been used, which is correct.   Repository Pattern for Table Service The AccountDataContext takes the responsibility to save and load the account entity but only for that specific entity. Is that possible to have a dynamic or generic DataContext class which can operate any kinds of entity in my system? Of course yes. Although there's no typical database in table service we can threat the entities as the records, similar with the data entities if we used OR Mapping. As we can use some patterns for ORM architecture here we should be able to adopt the one of them - Repository Pattern in this example. We know that the base class - TableServiceContext provide 4 methods for operating the table entities which are CreateQuery, AddObject, UpdateObject and DeleteObject. And we can create a relationship between the enmity class, the table container name and entity set name. So it's really simple to have a generic base class for any kinds of entities. Let's rename the AccountDataContext to DynamicDataContext and make the type of Account as a type parameter if it. 1: public class DynamicDataContext<T> : TableServiceContext where T : TableServiceEntity 2: { 3: private CloudStorageAccount _storageAccount; 4: private string _entitySetName; 5:  6: public DynamicDataContext(CloudStorageAccount storageAccount) 7: : base(storageAccount.TableEndpoint.AbsoluteUri, storageAccount.Credentials) 8: { 9: _storageAccount = storageAccount; 10: _entitySetName = typeof(T).Name; 11:  12: var tableStorage = new CloudTableClient(_storageAccount.TableEndpoint.AbsoluteUri, 13: _storageAccount.Credentials); 14: tableStorage.CreateTableIfNotExist(_entitySetName); 15: } 16:  17: public void Add(T entityToAdd) 18: { 19: AddObject(_entitySetName, entityToAdd); 20: SaveChanges(); 21: } 22:  23: public void Update(T entityToUpdate) 24: { 25: UpdateObject(entityToUpdate); 26: SaveChanges(); 27: } 28:  29: public void Delete(T entityToDelete) 30: { 31: DeleteObject(entityToDelete); 32: SaveChanges(); 33: } 34:  35: public IQueryable<T> Load() 36: { 37: return CreateQuery<T>(_entitySetName); 38: } 39: } I saved the name of the entity type when constructed for performance matter. The table name, entity set name would be the same as the name of the entity class. 
The Load method returned a generic IQueryable instance which supports the lazy load feature. Then in my service class I changed the AccountDataContext to DynamicDataContext and that's all. 1: var accountContext = new DynamicDataContext<Account>(storageAccount); Run it again and register another account. The DynamicDataContext now can be used for any entities. For example, I would like the account has a list of notes which contains 3 custom properties: Account Email, Title and Content. We create the note entity class. 1: public class Note : TableServiceEntity 2: { 3: public string AccountEmail { get; set; } 4: public string Title { get; set; } 5: public string Content { get; set; } 6: public DateTime DateCreated { get; set; } 7: public DateTime? DateDeleted { get; set; } 8:  9: public Note() 10: : base() 11: { 12: } 13:  14: public Note(string email) 15: : base(email, string.Format("{0}_{1}", email, Guid.NewGuid().ToString())) 16: { 17: AccountEmail = email; 18: } 19: } And no need to tweak the DynamicDataContext we can directly go to the service class to implement the logic. Notice here I utilized two DynamicDataContext instances with the different type parameters: Note and Account. 1: public class NoteService : INoteService 2: { 3: public void Create(string email, string title, string content) 4: { 5: var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString"); 6: var accountContext = new DynamicDataContext<Account>(storageAccount); 7: var noteContext = new DynamicDataContext<Note>(storageAccount); 8:  9: // validate - email must be existed 10: var accounts = accountContext.Load() 11: .Where(a => a.Email == email) 12: .ToList() 13: .Count; 14: if (accounts <= 0) 15: throw new ApplicationException(string.Format("The account {0} does not exsit in the system please register and try again.", email)); 16:  17: // save the note 18: var noteToAdd = new Note(email) { Title = title, Content = content, DateCreated = DateTime.Now }; 19: noteContext.Add(noteToAdd); 20: } 21: } And updated our client application to test the service. I didn't implement any list service to show all notes but we can have a look on the local SQL database if we ran it at local development fabric.   Summary In this post I explained a bit about the limited LINQ support for the table service. And then I demonstrated about how to use the repository pattern in the table service data access layer and make the DataContext dynamically. The DynamicDataContext I created in this post is just a prototype. In fact we should create the relevant interface to make it testable and for better structure we'd better separate the DataContext classes for each individual kind of entity. So it should have IDataContextBase<T>, DataContextBase<T> and for each entity we would have class AccountDataContext<Account> : IDataContextBase<Account>, DataContextBase<Account> { … } class NoteDataContext<Note> : IDataContextBase<Note>, DataContextBase<Note> { … }   Besides the structured data saving and loading, another common scenario would be saving and loading some binary data such as images, files. In my next post I will show how to use the Blob Service to store the bindery data - make the account be able to upload their logo in my example.   Hope this helps, Shaun   All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Q&A: How do I cancel my Windows Azure Platform Introductory Special? (or any Subscription)

    - by Eric Nelson
    Short answer: Don’t! Just kidding :-) Long answer: I believe it is the same process as for other Microsoft Online Services – but I have never tried it. Hence please post a comment if you follow this successfully or not and I will amend. From http://www.microsoft.com/online/help/en-us/mocp/, search for “cancel” and you get: What I am not clear about is whether an Introductory Special is classed as a trial. Either way, the answer is to contact support and ask to cancel. I would suggest you are fully armed with details of your subscription which you can get from signing in to https://mocp.microsoftonline.com. You can contact support via a online web form at https://mocp-support.custhelp.com/ Or You can call them. The details are again on the support page http://www.microsoft.com/online/help/en-us/mocp/  In the UK you can call 0800 731 8457 or (0) 20 3027 6039 Monday – Friday 09:00 – 17:00 GMT (UTC). I hope that helps.

    Read the article

  • Mobile My Oracle Support 6.3 Release is live!

    - by JanSyss
    We released Mobile My Oracle Support 6.3 last Saturday (13-Oct-2012), including 10 enhancements and almost 40 bug fixes. Mobile My Oracle Support is My Oracle Support's web application optimized for mobile devices: manage your Service Requests and your On Demand Requests for Change (RFCs), search Support's Knowledge Base, Bug database or Sun System Handbook, and manage your pending user requests (CUA). You can find the application at http://support.oracle.mobi or get redirected from http://support.oracle.com when using a mobile device.
    Overall: several UI optimizations in different screens.
    Service Request area: show the platinum icon for Platinum SRs and the restore status for Platinum Sev 1s; email sent with the Share functionality now contains links to Mobile MOS and the full site.
    Knowledge Management area: ability in Advanced Search to search the Sun System Handbook; better rendering of KB documents to avoid horizontal scrolling where possible.
    Don't hesitate to share your feedback, comments or even requests.

    Read the article

  • Adding RDP access to a LightSwitch 2011 application deployed on Windows Azure, an article by Avkash Chauhan, translated by Deepin Prayag

    Quote: Visual Studio LightSwitch 2011 offers starter kits and flexible deployment options that help you create and easily publish polished, professional-looking custom business applications without writing code. Visual Studio LightSwitch offers a simple way to develop desktop or cloud business applications. LightSwitch handles all the plumbing for you, so you can focus on creating business value. Part 3:

    Read the article

  • June 2013 Release of the Ajax Control Toolkit

    - by Stephen.Walther
    I’m happy to announce the June 2013 release of the Ajax Control Toolkit. For this release, we enhanced the AjaxFileUpload control to support uploading files directly to Windows Azure. We also improved the SlideShow control by adding support for CSS3 animations. You can get the latest release of the Ajax Control Toolkit by visiting the project page at CodePlex (http://AjaxControlToolkit.CodePlex.com). Alternatively, you can execute the following NuGet command from the Visual Studio Library Package Manager window: Uploading Files to Azure The AjaxFileUpload control enables you to efficiently upload large files and display progress while uploading. With this release, we’ve added support for uploading large files directly to Windows Azure Blob Storage (You can continue to upload to your server hard drive if you prefer). Imagine, for example, that you have created an Azure Blob Storage container named pictures. In that case, you can use the following AjaxFileUpload control to upload to the container: <toolkit:ToolkitScriptManager runat="server" /> <toolkit:AjaxFileUpload ID="AjaxFileUpload1" StoreToAzure="true" AzureContainerName="pictures" runat="server" /> Notice that the AjaxFileUpload control is declared with two properties related to Azure. The StoreToAzure property causes the AjaxFileUpload control to upload a file to Azure instead of the local computer. The AzureContainerName property points to the blob container where the file is uploaded. .int3{position:absolute;clip:rect(487px,auto,auto,444px);}SMALL cash advance VERY CHEAP To use the AjaxFileUpload control, you need to modify your web.config file so it contains some additional settings. You need to configure the AjaxFileUpload handler and you need to point your Windows Azure connection string to your Blob Storage account. <configuration> <appSettings> <!--<add key="AjaxFileUploadAzureConnectionString" value="UseDevelopmentStorage=true"/>--> <add key="AjaxFileUploadAzureConnectionString" value="DefaultEndpointsProtocol=https;AccountName=testact;AccountKey=RvqL89Iw4npvPlAAtpOIPzrinHkhkb6rtRZmD0+ojZupUWuuAVJRyyF/LIVzzkoN38I4LSr8qvvl68sZtA152A=="/> </appSettings> <system.web> <compilation debug="true" targetFramework="4.5" /> <httpRuntime targetFramework="4.5" /> <httpHandlers> <add verb="*" path="AjaxFileUploadHandler.axd" type="AjaxControlToolkit.AjaxFileUploadHandler, AjaxControlToolkit"/> </httpHandlers> </system.web> <system.webServer> <validation validateIntegratedModeConfiguration="false" /> <handlers> <add name="AjaxFileUploadHandler" verb="*" path="AjaxFileUploadHandler.axd" type="AjaxControlToolkit.AjaxFileUploadHandler, AjaxControlToolkit"/> </handlers> <security> <requestFiltering> <requestLimits maxAllowedContentLength="4294967295"/> </requestFiltering> </security> </system.webServer> </configuration> You supply the connection string for your Azure Blob Storage account with the AjaxFileUploadAzureConnectionString property. If you set the value “UseDevelopmentStorage=true” then the AjaxFileUpload will upload to the simulated Blob Storage on your local machine. After you create the necessary configuration settings, you can use the AjaxFileUpload control to upload files directly to Azure (even very large files). Here’s a screen capture of how the AjaxFileUpload control appears in Google Chrome: After the files are uploaded, you can view the uploaded files in the Windows Azure Portal. 
You can see that all 5 files were uploaded successfully: New AjaxFileUpload Events In response to user feedback, we added two new events to the AjaxFileUpload control (on both the server and the client): · UploadStart – Raised on the server before any files have been uploaded. · UploadCompleteAll – Raised on the server when all files have been uploaded. · OnClientUploadStart – The name of a function on the client which is called before any files have been uploaded. · OnClientUploadCompleteAll – The name of a function on the client which is called after all files have been uploaded. These new events are most useful when uploading multiple files at a time. The updated AjaxFileUpload sample page demonstrates how to use these events to show the total amount of time required to upload multiple files (see the AjaxFileUpload.aspx file in the Ajax Control Toolkit sample site). SlideShow Animated Slide Transitions With this release of the Ajax Control Toolkit, we also added support for CSS3 animations to the SlideShow control. The animation is used when transitioning from one slide to another. Here’s the complete list of animations: · FadeInFadeOut · ScaleX · ScaleY · ZoomInOut · Rotate · SlideLeft · SlideDown You specify the animation which you want to use by setting the SlideShowAnimationType property. For example, here is how you would use the Rotate animation when displaying a set of slides: <%@ Page Language="C#" AutoEventWireup="true" CodeBehind="ShowSlideShow.aspx.cs" Inherits="TestACTJune2013.ShowSlideShow" %> <%@ Register TagPrefix="toolkit" Namespace="AjaxControlToolkit" Assembly="AjaxControlToolkit" %> <script runat="Server" type="text/C#"> [System.Web.Services.WebMethod] [System.Web.Script.Services.ScriptMethod] public static AjaxControlToolkit.Slide[] GetSlides() { return new AjaxControlToolkit.Slide[] { new AjaxControlToolkit.Slide("slides/Blue hills.jpg", "Blue Hills", "Go Blue"), new AjaxControlToolkit.Slide("slides/Sunset.jpg", "Sunset", "Setting sun"), new AjaxControlToolkit.Slide("slides/Winter.jpg", "Winter", "Wintery..."), new AjaxControlToolkit.Slide("slides/Water lilies.jpg", "Water lillies", "Lillies in the water"), new AjaxControlToolkit.Slide("slides/VerticalPicture.jpg", "Sedona", "Portrait style picture") }; } </script> <!DOCTYPE html> <html > <head runat="server"> <title></title> </head> <body> <form id="form1" runat="server"> <div> <toolkit:ToolkitScriptManager ID="ToolkitScriptManager1" runat="server" /> <asp:Image ID="Image1" Height="300" Runat="server" /> <toolkit:SlideShowExtender ID="SlideShowExtender1" TargetControlID="Image1" SlideShowServiceMethod="GetSlides" AutoPlay="true" Loop="true" SlideShowAnimationType="Rotate" runat="server" /> </div> </form> </body> </html> In the code above, the set of slides is exposed by a page method named GetSlides(). The SlideShowAnimationType property is set to the value Rotate. The following animated GIF gives you an idea of the resulting slideshow: If you want to use either the SlideDown or SlideRight animations, then you must supply both an explicit width and height for the Image control which is the target of the SlideShow extender. 
For example, here is how you would declare an Image and SlideShow control to use a SlideRight animation: <toolkit:ToolkitScriptManager ID="ToolkitScriptManager1" runat="server" /> <asp:Image ID="Image1" Height="300" Width="300" Runat="server" /> <toolkit:SlideShowExtender ID="SlideShowExtender1" TargetControlID="Image1" SlideShowServiceMethod="GetSlides" AutoPlay="true" Loop="true" SlideShowAnimationType="SlideRight" runat="server" /> Notice that the Image control includes both a Height and Width property. Here’s an approximation of this animation using an animated GIF: Summary The Superexpert team worked hard on this release. We hope you like the new improvements to both the AjaxFileUpload and the SlideShow controls. We’d love to hear your feedback in the comments. On to the next sprint!

    Read the article

  • Combining Shared Secret and Certificates

    - by Michael Stephenson
    As discussed in the introduction article this walkthrough will explain how you can implement WCF security with the Windows Azure Service Bus to ensure that you can protect your endpoint in the cloud with a shared secret but also combine this with certificates so that you can identify the sender of the message.   Prerequisites As in the previous article before going into the walk through I want to explain a few assumptions about the scenario we are implementing but to keep the article shorter I am not going to walk through all of the steps in how to setup some of this. In the solution we have a simple console application which will represent the client application. There is also the services WCF application which contains the WCF service we will expose via the Windows Azure Service Bus. The WCF Service application in this example was hosted in IIS 7 on Windows 2008 R2 with AppFabric Server installed and configured to auto-start the WCF listening services. I am not going to go through significant detail around the IIS setup because it should not matter in relation to this article however if you want to understand more about how to configure WCF and IIS for such a scenario please refer to the following paper which goes into a lot of detail about how to configure this. The link is: http://tinyurl.com/8s5nwrz   Setting up the Certificates To keep the post and sample simple I am going to use the local computer store for all certificates but this bit is really just the same as setting up certificates for an example where you are using WCF without using Windows Azure Service Bus. In the sample I have included two batch files which you can use to create the sample certificates or remove them. Basically you will end up with: A certificate called PocServerCert in the personal store for the local computer which will be used by the WCF Service component A certificate called PocClientCert in the personal store for the local computer which will be used by the client application A root certificate in the Root store called PocRootCA with its associated revocation list which is the root from which the client and server certificates were created   For the sample Im just using development certificates like you would normally, and you can see exactly how these are configured and placed in the stores from the batch files in the solution using makecert and certmgr.   The Service Component To begin with let's look at the service component and how it can be configured to listen to the service bus using a shared secret but to also accept a username token from the client. In the sample the service component is called Acme.Azure.ServiceBus.Poc.Cert.Services. It has a single service which is the Visual Studio template for a WCF service when you add a new WCF Service Application so we have a service called Service1 with its Echo method. Nothing special so far!.... The next step is to look at the web.config file to see how we have configured the WCF service. In the services section of the WCF configuration you can see I have created my service and I have created a local endpoint which I simply used to do a little bit of diagnostics and to check it was working, but more importantly there is the Windows Azure endpoint which is using the ws2007HttpRelayBinding (note that this should also work just the same if your using netTcpRelayBinding). The key points to note on the above picture are the service behavior called MyServiceBehaviour and the service bus endpoints behavior called MyEndpointBehaviour. 
We will go into these in more detail later.   The Relay Binding The relay binding for the service has been configured to use the TransportWithMessageCredential security mode. This is the important bit where the transport security really relates to the interaction between the service and listening to the Azure Service Bus and the message credential is where we will use our certificate like we have specified in the message/clientCrentialType attribute. Note also that we have left the relayClientAuthenticationType set to RelayAccessToken. This means that authentication will be made against ACS for accessing the service bus and messages will not be accepted from any sender who has not been authenticated by ACS.   The Endpoint Behaviour In the below picture you can see the endpoint behavior which is configured to use the shared secret client credential for accessing the service bus and also for diagnostic purposes I have included the service registry element.     Hopefully if you are familiar with using Windows Azure Service Bus relay feature the above is very familiar to you and this is a very common setup for this section. There is nothing specific to the username token implementation here. The Service Behaviour Now we come to the bit with most of the certificate stuff in it. When you configure the service behavior I have included the serviceCredentials element and then setup to use the clientCertificate check and also specifying the serviceCertificate with information on how to find the servers certificate in the store.     I have also added a serviceAuthorization section where I will implement my own authorization component to perform additional security checks after the service has validated that the message was signed with a good certificate. I also have the same serviceSecurityAudit configuration to log access to my service. My Authorization Manager The below picture shows you implementation of my authorization manager. WCF will eventually hand off the message to my authorization component before it calls the service code. This is where I can perform some logic to check if the identity is allowed to access resources. In this case I am simple rejecting messages from anyone except the PocClientCertificate.     The Client Now let's take a look at the client side of this solution and how we can configure the client to authenticate against ACS but also send a certificate over to the service component so it can implement additional security checks on-premise. I have a console application and in the program class I want to use the proxy generated with Add Service Reference to send a message via the Azure Service Bus. You can see in my WCF client configuration below I have setup my details for the azure service bus url and am using the ws2007HttpRelayBinding.   Next is my configuration for the relay binding. You can see below I have configured security to use TransportWithMessageCredential so we will flow the token from a certificate with the message and also the RelayAccessToken relayClientAuthenticationType which means the component will validate against ACS before being allowed to access the relay endpoint to send a message.     After the binding we need to configure the endpoint behavior like in the below picture. This contains the normal transportClientEndpointBehaviour to setup the ACS shared secret configuration but we have also configured the clientCertificate to look for the PocClientCert.     
Finally, the client code in the console application that calls the Service Bus is very plain: we create the proxy and make a normal WCF call in exactly the usual way, and the configuration ensures that a token representing the client certificate is passed along with the message. (A rough sketch of a client along these lines is included at the end of this post.)

Conclusion

As you can see from the walkthrough above, it is not too difficult to configure a service to use both a shared secret and a certificate-based token at the same time. This gives you the power and protection offered by the Access Control Service in the cloud, plus the ability to flow additional tokens to the on-premise component so that further security checks can be implemented.

Sample

The sample used in this post is available at the following location: https://s3.amazonaws.com/CSCBlogSamples/Acme.Azure.ServiceBus.Poc.Cert.zip
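To round things off, here is a rough sketch of a client along the lines described above. The actual sample uses a proxy generated with Add Service Reference plus configuration; purely so that the sketch is self-contained, it does the equivalent in code with a ChannelFactory. The contract, namespace name and issuer credentials are placeholders, and it assumes the Microsoft.ServiceBus relay SDK of that era.

using System;
using System.Security.Cryptography.X509Certificates;
using System.ServiceModel;
using Microsoft.ServiceBus;

// Contract assumed to match the sample's Service1.Echo operation.
[ServiceContract]
public interface IService1
{
    [OperationContract]
    string Echo(string value);
}

class ClientSketch
{
    static void Main()
    {
        Uri address = ServiceBusEnvironment.CreateServiceUri("https", "mynamespace", "Service1");

        // Same relay binding settings as on the service side.
        var binding = new WS2007HttpRelayBinding();
        binding.Security.Mode = EndToEndSecurityMode.TransportWithMessageCredential;
        binding.Security.Message.ClientCredentialType = MessageCredentialType.Certificate;
        binding.Security.RelayClientAuthenticationType = RelayClientAuthenticationType.RelayAccessToken;

        var factory = new ChannelFactory<IService1>(binding, new EndpointAddress(address));

        // Client certificate presented to the service as the message credential.
        factory.Credentials.ClientCertificate.SetCertificate(
            StoreLocation.LocalMachine, StoreName.My, X509FindType.FindBySubjectName, "PocClientCert");

        // Shared secret used to authenticate against ACS for the relay (placeholder values).
        factory.Endpoint.Behaviors.Add(new TransportClientEndpointBehavior
        {
            TokenProvider = TokenProvider.CreateSharedSecretTokenProvider("owner", "issuerSecret")
        });

        IService1 channel = factory.CreateChannel();
        Console.WriteLine(channel.Echo("Hello via the Service Bus"));
        ((IClientChannel)channel).Close();
        factory.Close();
    }
}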

    Read the article

  • How to convert this query to a "django model query"?

    - by fabriciols
    Hello! What I want is simple. The models:

    class userLastTrophy(models.Model):
        user = models.ForeignKey(userInfo)
        platinum = models.IntegerField()
        gold = models.IntegerField()
        silver = models.IntegerField()
        bronze = models.IntegerField()
        level = models.IntegerField()
        rank = models.IntegerField()
        perc_level = models.IntegerField()
        date_update = models.DateTimeField(default=datetime.now, blank=True)
        total = models.IntegerField()
        points = models.IntegerField()

    class userTrophy(models.Model):
        user = models.ForeignKey(userInfo)
        platinum = models.IntegerField()
        gold = models.IntegerField()
        silver = models.IntegerField()
        bronze = models.IntegerField()
        total = models.IntegerField()
        level = models.IntegerField()
        perc_level = models.IntegerField()
        date_update = models.DateTimeField(default=datetime.now, blank=True)
        rank = models.IntegerField(default=0)
        total = models.IntegerField(default=0)
        points = models.IntegerField(default=0)
        last_trophy = models.ForeignKey(userLastTrophy, default=0)

    I have this query:

    select t2.user_id as id,
           t2.platinum - t1.platinum as plat,
           t2.gold - t1.gold as gold,
           t2.silver - t1.silver as silver,
           t2.bronze - t1.bronze as bronze,
           t2.points - t1.points as points
    from myps3t_usertrophy t2, myps3t_userlasttrophy t1
    where t1.id = t2.last_trophy_id
    order by points;

    How can I do this with Django models?

    Read the article

  • How to create a local Windows-based service bus outside of Azure, similar to Redis with automatic fail-over?

    - by ElHaix
    We are implementing a service/message-bus feature in our SignalR application and have been looking at Redis, with automatic fail-over using Redis Sentinel. We would like to maintain our own servers and have read about "SignalR powered by Service Bus". Since that is a Windows Azure implementation, how can I accomplish the same thing on our internal network with VMs, with automatic fail-over similar to the Redis solution discussed above?

    Read the article

  • SQL SERVER – Microsoft SQL Server Migration Assistant V6.0 Released

    - by Pinal Dave
    Every company makes a different decision about its database platform when it starts out, but as it matures it makes decisions based on experience and the best interests of the organization. Quite a few organizations begin on databases such as Sybase, MySQL, Oracle or Access and, as time passes, decide they want to move to a different platform. Microsoft makes this easy for SQL Server professionals by releasing various Migration Assistant tools. Last week, Microsoft released Microsoft SQL Server Migration Assistant v6.0; here are the different editions released to migrate various products to SQL Server.

    Microsoft SQL Server Migration Assistant v6.0 for Sybase: SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Sybase Adaptive Server Enterprise (ASE) to SQL Server and Azure SQL DB. SSMA automates all aspects of migration including migration assessment analysis, schema and SQL statement conversion, data migration, as well as migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for MySQL: SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from MySQL to SQL Server and Azure SQL DB. SSMA automates all aspects of migration including migration assessment analysis, schema and SQL statement conversion, data migration, as well as migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for Oracle: SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Oracle to SQL Server and Azure SQL DB. SSMA automates all aspects of migration including migration assessment analysis, schema and SQL statement conversion, data migration, as well as migration testing.

    Microsoft SQL Server Migration Assistant v6.0 for Access: SQL Server Migration Assistant (SSMA) is a free supported tool from Microsoft that simplifies the database migration process from Access to SQL Server. SSMA for Access automates the conversion of Microsoft Access database objects to SQL Server database objects, loads the objects into SQL Server and Azure SQL DB, and then migrates data from Microsoft Access to SQL Server and Azure SQL DB.

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, T SQL Tagged: SQL Migration

    Read the article

  • ASP.NET Universal Providers (System.Web.Providers)

    - by shiju
    The Microsoft Web Platform and Tools (WPT) team has announced the release of the ASP.NET Universal Providers, which let you use the Session, Membership, Roles and Profile providers with all editions of SQL Server 2005 and later, including SQL Server Express, SQL Server Compact and SQL Azure. The ASP.NET Universal Providers are available as a NuGet package, and the following command installs the package via NuGet:

    PM> Install-Package System.Web.Providers

    The support for SQL Azure will help Azure developers easily migrate their ASP.NET applications to the Azure platform. System.Web.Providers.DefaultMembershipProvider is the equivalent of the current SqlMembershipProvider; you put the right connection string name in the configuration and it will work with any supported version of SQL Server based on that connection string. System.Web.Providers.DefaultProfileProvider is the equivalent of the existing System.Web.Profile.SqlProfileProvider, and System.Web.Providers.DefaultRoleProvider is the equivalent of the existing System.Web.Security.SqlRoleProvider.

    Read the article
