Search Results

Search found 26945 results on 1078 pages for 'azure virtual machine'.


  • What makes a project suitable for Azure/the cloud?

    - by dotnetdev
    Hi, I have read about Windows Azure but to get deeper into this technology, I need to (obviously) use it. I have a small ASP.NET site which gets little traffic and I am thinking that hosting this on Azure would save me money. Other than this, what other factors would contribute to a project being suitable for the cloud? Thanks

    Read the article

  • Remotely sync Time Machine drives

    - by Off Rhoden
    I have an Xserve that runs Time Machine to a local terabyte drive. I also connected my external terabyte drive for a time period and had Time Machine use it to establish the seed data. I plan to take my drive back home with me (out of state) and have the Xserve return to using its local drive for Time Machine. But when I get back home, is there a way to keep my external drive's copy of the Time Machine Backups folder in sync with the Backups folder back on the Xserve? I want a full copy of the history (it makes an awesome remote backup). I've thought of using the unix command rsync. In fact, that's how I had been doing it, but I was looking for the compactness that Time Machine is able to achieve. Thanks.
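    For what it's worth, a sketch of the rsync approach (the host name and volume paths below are hypothetical): Time Machine's compactness comes from hard links between snapshots, so -H (preserve hard links) is the flag that keeps the copy from ballooning. Apple's bundled rsync 2.6.9 may also need -E to carry Mac metadata; a newer rsync would use -X and -A instead.

      # Mirror the Xserve's Backups.backupdb to the external drive over ssh.
      # -a archive mode, -H preserve hard links (keeps Time Machine's
      # compactness), --delete mirrors deletions from the source.
      rsync -aH --delete \
          admin@xserve.example.com:/Volumes/TimeMachine/Backups.backupdb/ \
          /Volumes/ExternalTM/Backups.backupdb/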

    Read the article

  • Windows Azure Recipe: Software as a Service (SaaS)

    - by Clint Edmonson
    The cloud was tailor-made for aspiring companies to create innovative internet-based applications and solutions. Whether you're a garage startup with very little capital or a Fortune 1000 company, the ability to quickly set up, deliver, and iterate on new products is key to capturing market and mind share. And if you can capture that share and go viral, having resiliency and infinite scale at your fingertips is great peace of mind.

    Drivers
    - Cost avoidance
    - Time to market
    - Scalability

    Solution
    Here's a sketch of how a basic Software as a Service solution might be built out:

    Ingredients
    - Web Role – this hosts the core web application. Each web role will host an instance of the software, and as the user base grows, additional roles can be spun up to meet demand.
    - Access Control – this service is essential to managing user identity. It's backed by a full-blown implementation of Active Directory and allows the definition and management of users, groups, and roles. A pre-built ASP.NET membership provider is included in the training kit to leverage this capability, but it's also flexible enough to be combined with external identity providers including Windows LiveID, Google, Yahoo!, and Facebook. The provider model provides extensibility to hook into other industry-specific identity providers as well.
    - Databases – nearly every modern SaaS application is backed by a relational database for its core operational data. If the solution is sold to organizations, there's a good chance multi-tenancy will be needed. An emerging best practice for SaaS applications is to stand up separate SQL Azure database instances for each tenant's proprietary data to ensure isolation from other tenants (see the sketch after this recipe).
    - Worker Role – this is the best place to handle autonomous background processing such as data aggregation, billing through external services, and other specialized tasks that can be performed asynchronously. Placing these tasks in a worker role frees the web roles to focus completely on user interaction and data input, and provides finer-grained control over the system's scalability and throughput.
    - Caching (optional) – as web site traffic grows, caching can be leveraged to keep frequently used read-only, user-specific, and application resource data in a high-speed distributed in-memory cache for faster response times and ultimately higher scalability without spinning up more web and worker roles. It includes a token-based security model that works alongside the Access Control service.
    - Blobs (optional) – depending on the nature of the software, users may be creating or uploading large volumes of heterogeneous data such as documents or rich media. Blob storage provides a scalable, resilient way to store terabytes of user data. The storage facilities can also integrate with the Access Control service to ensure users' data is delivered securely.

    Training & Examples
    These links point to online Windows Azure training labs and examples where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.)
    - Windows Azure (16 labs) – Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML.
    - SQL Azure (7 labs) – Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
    - Windows Azure Services (9 labs) – As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure.
    - Developing Applications for the Cloud, 2nd Edition (eBook) – This book demonstrates how you can create from scratch a multi-tenant, Software as a Service (SaaS) application to run in the cloud using the latest versions of the Windows Azure Platform and tools. The book is intended for any architect, developer, or information technology (IT) professional who designs, builds, or operates applications and services that run on or interact with the cloud.
    - Fabrikam Shipping (SaaS reference application) – This is a full end-to-end sample scenario which demonstrates how to use the Windows Azure platform for exposing an application as a service. We developed this demo just as you would: we had an existing on-premises sample, Fabrikam Shipping, and we wanted to see what it would take to transform it into a full subscription-based solution. The demo you find here is the result of that investigation.
    See my Windows Azure Resource Guide for more guidance on how to get started, including more links to web portals, training kits, samples, and blogs related to Windows Azure.
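    As a minimal sketch of the per-tenant database pattern mentioned under Databases above (illustrative only and not part of the original recipe; the tenant names, server address, and lookup table are hypothetical):

      // Hypothetical tenant -> SQL Azure database routing: each tenant
      // gets its own database instance so its data stays isolated.
      var tenantDatabases = {
          "contoso":  "Server=tcp:example.database.windows.net,1433;Database=saas_contoso;...",
          "fabrikam": "Server=tcp:example.database.windows.net,1433;Database=saas_fabrikam;..."
      };

      // Resolve the connection string for the tenant making the request.
      function connectionStringFor(tenantId) {
          var cs = tenantDatabases[tenantId];
          if (!cs) {
              throw new Error("Unknown tenant: " + tenantId);
          }
          return cs;
      }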

    Read the article

  • Lost Internet access in Windows XP Mode virtual pc under Windows 7

    - by kousen
    In my office, I created and configured a virtual pc in Windows XP Mode. Everything was working fine. Now I'm on the road, and my Internet access (in the host operating system) is either via a hotel wifi or through my Verizon air card. Either way, I've lost Internet access in the virtual pc. I went into the Virtual PC settings and set the Networking value to Shared Networking (NAT). Actually, I've tried every combination I can find, but I can't get from the virtual pc to the web. I'm hoping to use the virtual pc at a client site, so I really need that access. Is there anything I can do to get it back? Thanks for any help.

    Read the article

  • VMWare Fusion - Cannot communicate between Host Mac and Virtual Mac running on same machine [migrated]

    - by Jeff Gold
    I'm running a "virtual" Mac OS machine on a Mac running VMWare Fusion. The Virtual Mac is setup with Bridged Networking, and has its own separate IP address. The outside world can connect to either the Mac itself or the virtual Mac via their respective IP addresses just fine, this works great! The problem... the Mac itself cannot connect to the virtual Mac's IP address, nor can the virtual Mac connect to the real Mac's IP address. Some things I've read mention something about enabling VMCI, but I have no idea how to do this, or if this is even the correct solution. Any suggestions?

    Read the article

  • Does Windows Virtual PC support virtual applications with an XP Home Edition guest?

    - by endolith
    I've installed Windows Virtual PC in Windows 7 and have the XP Mode virtual PC working. I can run virtual applications with it and the integration features all work. I used Disk2VHD to convert my existing XP Home drive into a VHD, so I can use it as a virtual PC, too. It works in general, but it sometimes pops up the "Could not enable integration features" error. I don't see the host computer's drives in the guest, and I don't see the guest's applications in the host's Start menu. Is this just because the guest is XP Home instead of XP Pro? Do I have to reinstall all these apps in the XP Mode VHD in order to get them as virtual apps? Could something else be preventing it from working?

    Read the article

  • Virtual box lost files

    - by Paul Lloyd
    I have been running Virtual Box on a Macbook Pro for a year or more, with an old version of Windows XP running on the virtual machine. Recently my Mac battery died completely while I had left VB / Windows running. When I re-started the Mac, VB would not load. I reinstalled a recent download of VB from a DMG file I had stored. Now VB starts, but when I try to start the virtual machine I get the following message: Failed to open a session for the virtual machine Windows XP. Failed to load VMMR0.r0 (VERR_SUPLIB_WRITE_NON_SYS_GROUP). Result Code: NS_ERROR_FAILURE (0x80004005) Component: Console Interface: IConsole {1968b7d3-e3bf-4ceb-99e0-cb7c913317bb} I have been backing up the Mac using Time Machine and have backups going back months. I really need to access the Windows files as they have my accounts and other business-critical stuff. Any ideas please? Does anyone know where I can get some support for this combination of hardware / software? Apologies, I am a novice in this area. Thanks in advance.
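    For context, VERR_SUPLIB_WRITE_NON_SYS_GROUP usually indicates that VirtualBox found a directory along its install path writable by a non-system group, so it refuses to load its kernel support files; the Windows XP files themselves are still inside the virtual disk image. A hedged sketch of how one might inspect and tighten the permissions (assuming the default /Applications install path; check the output before changing anything):

      # Show ownership and permissions along the install path.
      ls -ld /Applications /Applications/VirtualBox.app

      # If a directory is writable by a non-system group, restore the
      # expected ownership and drop the write bit for group/others.
      sudo chown root:admin /Applications/VirtualBox.app
      sudo chmod go-w /Applications/VirtualBox.app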

    Read the article

  • Virtual PC and hardware-assisted virtualization (VT-x) problem

    - by Vesa Huovi
    I've installed Microsoft's Virtual PC on Windows 7, but when I try to start a virtual machine I get the following error message: '<Virtual machine name>' could not be started because hardware-assisted virtualization is disabled. Please enable hardware virtualization in the BIOS settings and try again. If the hardware virtualization setting is already enabled, you may have to disable the Trusted Execution Technology (TXT) setting in the BIOS or update the system BIOS. However, if I download and run the Hardware-Assisted Virtualization Detection Tool, it gives the following positive message: This computer is configured with hardware-assisted virtualization. This computer meets the processor requirements to run Windows Virtual PC. If this computer runs a supported edition of Windows® 7, you can install Windows Virtual PC. I've also used the MSR Walker in the third-party utility CrystalCPUID to examine MSR 0x3a on both processors on my system, and it's 0x5 (0x4 = VT enabled, 0x1 = VT lock), as expected. Does anyone have any ideas of what else to check? Thanks.
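    As a quick sanity check on that MSR value (the bit meanings below come from Intel's documentation of IA32_FEATURE_CONTROL, not from the post itself):

      MSR 0x3A (IA32_FEATURE_CONTROL) = 0x5 = binary 101
        bit 0 (0x1): lock bit set (the BIOS has locked the register)
        bit 2 (0x4): VMX enabled outside SMX operation
      So 0x5 means VT-x is enabled and locked, which matches the detection tool's report.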

    Read the article

  • virtual machines

    - by André Alçada Padez
    Well, I hope this doesn't get categorized as a boating question, but it really is related to programming. I have Windows XP, and I am going to have to have a VM running:
    - Windows 7
    - Visual Studio 2008
    - SQL Server 2008
    - IIS 7 (8 in a little while)
    - Wamp
    - Photoshop CS5
    - etc...
    So I was wondering what I should use for the easiest installation and configuration and the best performance: VirtualBox or Microsoft's Virtual PC? Thank you.
    Well, I tried VirtualBox; it's always crashing for some reason. I think I'm going to try Virtual PC, just to stick to an all-Microsoft solution.

    Read the article

  • Windows 2000 under Windows 7 Virtual PC not working correctly

    - by dave
    I have just moved my Windows 2000 Virtual PCs from Vista to Windows 7 Professional (64-bit). The machines work to a point, but I have found some problems:
    - Drive mapping does not seem to work any more. I need this to exchange data. I do not need network access to the virtual PC so would rather leave it unconnected.
    - The virtual PC automatically shuts down the session and goes to the login screen after a few minutes of inactivity.
    I tried installing the Virtual PC Integration Components but the install failed (one of the messages basically says it's XP+ only). Now I'm stuck in 640x480 mode with mouse capture. I have heard that you can install an older version of the Integration Components, but this sounds a bit suspect. Does anyone have any ideas on how to get Windows 2000 working with drive sharing on a Virtual PC?

    Read the article

  • Running a Linux virtual machine on Windows 7

    - by hekevintran
    I want to do two things:
    1. Set up a virtual machine on Windows 7 to run Ubuntu.
    2. Set up a way for the virtual machine to read the Windows disk, or for Windows to have read/write access to the virtual machine's disk.
    My goal is to have a place where both Ubuntu and Windows can read and write. What software is good for this task? Are there free programs that can run virtual machines? Also, if my machine is running Windows 7 64-bit, can I install Ubuntu 32-bit? Or am I forced to use Ubuntu 64-bit? Or does it not matter?
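    One common way to get such a shared read/write location is a VirtualBox shared folder; a sketch under assumed names (the VM name "UbuntuVM", folder name "shared", and paths are hypothetical):

      # On the Windows 7 host: expose a host directory to the VM.
      VBoxManage sharedfolder add "UbuntuVM" --name shared --hostpath C:\vmshared --automount

      # Inside the Ubuntu guest, after installing the Guest Additions:
      sudo mount -t vboxsf shared /mnt/shared

    With --automount, the Guest Additions will typically also mount the folder automatically under /media.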

    Read the article

  • sizes of RAM, of virtual memory and of swap for 32-bit OS

    - by Tim
    If I understand correctly, a 32-bit OS (Ubuntu) can only address 4 GiB of memory, so with RAM larger than 4 GiB only 4 GiB of it will be used and the rest is wasted. I am now confused about how this situation for RAM compares with virtual memory and with swap.
    1. With virtual memory being swap + RAM, if the size of the virtual memory exceeds 4 GiB, will the exceeding part be wasted on the 32-bit OS?
    2. If I now have to choose the size of my swap partition, is the fact that the 32-bit OS can only address 4 GiB a factor to consider? Does the size of swap have to be chosen with respect to the 4 GiB addressing limitation? Will swap exceeding 4 GiB always be wasted?
    3. Is virtual memory equal to RAM plus swap? Or can virtual memory use space on the hard drive outside the swap partition?
    Thanks and regards!
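    For reference, the arithmetic behind the 4 GiB figure (a general property of 32-bit addressing, not something stated in the post):

      2^32 addresses x 1 byte/address = 4,294,967,296 bytes = 4 GiB

    So a single 32-bit address space spans at most 4 GiB.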

    Read the article

  • Virtual Machine and Virus

    - by tellme
    I have a requirement for which I have to get online without protection (firewall, anti-virus). At the same time, I don't want to risk getting infected with viruses. If I install a virtual machine (VirtualBox) to test, and it does get infected with viruses, will it also infect my host system? In other words, can I use the virtual machine for testing without being concerned about a virus on the virtual machine infecting my host?
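    Not an answer to the isolation question itself, but if you go the VirtualBox route, snapshots give you a cheap way to throw away whatever a risky session picks up; a sketch with a hypothetical VM name:

      # Snapshot the clean state before going online unprotected.
      VBoxManage snapshot "TestVM" take clean-state

      # ... run the risky session inside the VM ...

      # Afterwards, power off and discard everything since the snapshot.
      VBoxManage controlvm "TestVM" poweroff
      VBoxManage snapshot "TestVM" restore clean-state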

    Read the article

  • Virtual Box for everyday use?

    - by Mark
    I was just thinking about how nice virtual machines are... and how even "rebooting" is less painful, because at the very least, you don't have to wait for your physical computer to turn off and on with all the mobo shenanigans at the start... So, what if I ran everything in a virtual machine? Then I wouldn't really need a primary OS; I just need something that can run VirtualBox or what have you. So what's the lightest-weight OS I could install that supports a good virtual machine?

    Read the article

  • Can't use VHD in a particular machine

    - by madth3
    (All the machines referred to below have Windows 7 Professional.) I have a virtual machine targeted for Virtual PC. I'm able to use that disk in two different machines (32-bit and 64-bit, respectively). Yet on the machine where I finally wanted to install this virtual machine, I get the error that the VHD file is not valid. The faulty machine is 32-bit and has the exact same Virtual PC version as the other machines. The file was correctly copied (SHA1sum says so). The default XP Mode works (therefore Virtual PC itself is not at fault). Does anyone have any idea what might be wrong with the target machine?

    Read the article

  • Node.js Adventure - Storage Services and Service Runtime

    - by Shaun
    When I described how to host a Node.js application on Windows Azure, one question that might be raised is how to consume the various Windows Azure services, such as the storage, service bus, access control, etc. Interacting with Windows Azure services is available in Node.js through the Windows Azure Node.js SDK, which is a module available in NPM. In this post I would like to describe how to use Windows Azure Storage (a.k.a. WAS) as well as the service runtime.

    Consume Windows Azure Storage

    Let's first have a look at how to consume WAS through Node.js. As we know from the previous post, we can host a Node.js application on Windows Azure Web Site (a.k.a. WAWS) as well as Windows Azure Cloud Service (a.k.a. WACS). In theory, WAWS is also built on top of WACS worker roles with some more features. Hence in this post I will only demonstrate hosting in a WACS worker role. The Node.js code can also be used to consume WAS when hosted on WAWS. But since there are no roles in WAWS, the code for consuming the service runtime mentioned in the next section cannot be used for a WAWS node application. We can use the solution that I created in my last post. Alternatively, we can create a new Windows Azure project in Visual Studio with a worker role, add "node.exe" and "index.js", install the "express" and "node-sqlserver" modules, and mark all files as "Copy always". In order to use Windows Azure services we need the Windows Azure Node.js SDK, also known as a module named "azure", which can be installed through NPM. Once we have downloaded and installed it, we need to include it in our worker role project and mark it as "Copy always". You can use my "Copy all always" tool mentioned in my last post to update the current worker role project file. You can also find the source code of this tool here. The source code of the Windows Azure SDK for Node.js can be found on its GitHub page. It contains two parts. One is a CLI tool which provides a cross-platform command line package for Mac and Linux to manage WAWS and Windows Azure Virtual Machines (a.k.a. WAVM). The other is a library for managing and consuming various Windows Azure services, including tables, blobs, queues, service bus, and the service runtime. I will not cover all of them, but will only demonstrate how to use tables and service runtime information in this post. You can find the full documentation of this SDK here. Back to Visual Studio: open "index.js" and let's continue our application from the last post, which was working against Windows Azure SQL Database (a.k.a. WASD). The code should look like this.
      var express = require("express");
      var sql = require("node-sqlserver");

      var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
      var port = 80;

      var app = express();

      app.configure(function () {
        app.use(express.bodyParser());
      });

      app.get("/", function (req, res) {
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                res.json(results);
              }
            });
          }
        });
      });

      app.get("/text/:key/:culture", function (req, res) {
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            var key = req.params.key;
            var culture = req.params.culture;
            var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
            conn.queryRaw(command, function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                res.json(results);
              }
            });
          }
        });
      });

      app.get("/sproc/:key/:culture", function (req, res) {
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            var key = req.params.key;
            var culture = req.params.culture;
            var command = "EXEC GetItem '" + key + "', '" + culture + "'";
            conn.queryRaw(command, function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                res.json(results);
              }
            });
          }
        });
      });

      app.post("/new", function (req, res) {
        var key = req.body.key;
        var culture = req.body.culture;
        var val = req.body.val;

        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
            conn.queryRaw(command, function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                res.send(200, "Inserted Successful");
              }
            });
          }
        });
      });

      app.listen(port);

    Now let's create a new function to copy the records from WASD to the table service:
    1. Delete the table named "resource".
    2. Create a new table named "resource". These 2 steps ensure that we have an empty table.
    3. Load all records from the "resource" table in WASD.
    4. For each record loaded from WASD, insert it into the table one by one.
    5. Prompt the user when finished.
    In order to use the table service we need the storage account and key, which can be found in the developer portal. Just select the storage account and click the Manage Keys button. Then create two local variables in our Node.js application for the storage account name and key. Since we need to use WAS we need to import the azure module. I also created another variable storing the table name. In order to work with the table service I need to create the storage client for the table service.
    This is very similar to the Windows Azure SDK for .NET. In the code below I created a new variable named "client" and used "createTableService", specifying my storage account name and key.

      var azure = require("azure");
      var storageAccountName = "synctile";
      var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw==";
      var tableName = "resource";
      var client = azure.createTableService(storageAccountName, storageAccountKey);

    Now create a new function for the URL "/was/init" so that we can trigger it through the browser. In this function we will first load all records from WASD.

      app.get("/was/init", function (req, res) {
        // load all records from windows azure sql database
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                if (results.rows.length > 0) {
                  // begin to transform the records into table service
                }
              }
            });
          }
        });
      });

    When we have successfully loaded all records we can start to transform them into the table service. First I need to recreate the table in the table service. This can be done by deleting and creating the table through the table client I had just created previously.

      app.get("/was/init", function (req, res) {
        // load all records from windows azure sql database
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                if (results.rows.length > 0) {
                  // begin to transform the records into table service
                  // recreate the table named 'resource'
                  client.deleteTable(tableName, function (error) {
                    client.createTableIfNotExists(tableName, function (error) {
                      if (error) {
                        error["target"] = "createTableIfNotExists";
                        res.send(500, error);
                      }
                      else {
                        // transform the records
                      }
                    });
                  });
                }
              }
            });
          }
        });
      });

    As you can see, the azure SDK provides its methods in the callback pattern. In fact, almost all modules in Node.js use the callback pattern. For example, when I deleted a table I invoked the "deleteTable" method, providing the name of the table and a callback function which will be performed when the table has been deleted or the operation has failed. Under the hood, the azure module performs the table deletion operation asynchronously in a POSIX async thread pool, and once it's done the callback function is invoked. This is the reason we need to nest the table creation code inside the deletion callback. If we placed the table creation code after the deletion code, the two would be invoked in parallel. Next, for each record from WASD I create an entity and insert it into the table service. Finally I send the response to the browser. Can you find the bug in the code below? I will describe it later in this post.
1: app.get("/was/init", function (req, res) { 2: // load all records from windows azure sql database 3: sql.open(connectionString, function (err, conn) { 4: if (err) { 5: console.log(err); 6: res.send(500, "Cannot open connection."); 7: } 8: else { 9: conn.queryRaw("SELECT * FROM [Resource]", function (err, results) { 10: if (err) { 11: console.log(err); 12: res.send(500, "Cannot retrieve records."); 13: } 14: else { 15: if (results.rows.length > 0) { 16: // begin to transform the records into table service 17: // recreate the table named 'resource' 18: client.deleteTable(tableName, function (error) { 19: client.createTableIfNotExists(tableName, function (error) { 20: if (error) { 21: error["target"] = "createTableIfNotExists"; 22: res.send(500, error); 23: } 24: else { 25: // transform the records 26: for (var i = 0; i < results.rows.length; i++) { 27: var entity = { 28: "PartitionKey": results.rows[i][1], 29: "RowKey": results.rows[i][0], 30: "Value": results.rows[i][2] 31: }; 32: client.insertEntity(tableName, entity, function (error) { 33: if (error) { 34: error["target"] = "insertEntity"; 35: res.send(500, error); 36: } 37: else { 38: console.log("entity inserted"); 39: } 40: }); 41: } 42: // send the 43: console.log("all done"); 44: res.send(200, "All done!"); 45: } 46: }); 47: }); 48: } 49: } 50: }); 51: } 52: }); 53: }); Now we can publish it to the cloud and have a try. But normally we’d better test it at the local emulator first. In Node.js SDK there are three build-in properties which provides the account name, key and host address for local storage emulator. We can use them to initialize our table service client. We also need to change the SQL connection string to let it use my local database. The code will be changed as below. 1: // windows azure sql database 2: //var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:ac6271ya9e.database.windows.net,1433;Database=synctile;Uid=shaunxu@ac6271ya9e;Pwd=eszqu94XZY;Encrypt=yes;Connection Timeout=30;"; 3: // sql server 4: var connectionString = "Driver={SQL Server Native Client 11.0};Server={.};Database={Caspar};Trusted_Connection={Yes};"; 5:  6: var azure = require("azure"); 7: var storageAccountName = "synctile"; 8: var storageAccountKey = "/cOy9L7xysXOgPYU9FjDvjrRAhaMX/5tnOpcjqloPNDJYucbgTy7MOrAW7CbUg6PjaDdmyl+6pkwUnKETsPVNw=="; 9: var tableName = "resource"; 10: // windows azure storage 11: //var client = azure.createTableService(storageAccountName, storageAccountKey); 12: // local storage emulator 13: var client = azure.createTableService(azure.ServiceClient.DEVSTORE_STORAGE_ACCOUNT, azure.ServiceClient.DEVSTORE_STORAGE_ACCESS_KEY, azure.ServiceClient.DEVSTORE_TABLE_HOST); Now let’s run the application and navigate to “localhost:12345/was/init” as I hosted it on port 12345. We can find it transformed the data from my local database to local table service. Everything looks fine. But there is a bug in my code. If we have a look on the Node.js command window we will find that it sent response before all records had been inserted, which is not what I expected. The reason is that, as I mentioned before, Node.js perform all IO operations in non-blocking model. When we inserted the records we executed the table service insert method in parallel, and the operation of sending response was also executed in parallel, even though I wrote it at the end of my logic. 
    The correct logic should be: when all entities have been copied to the table service with no error, then I send the response to the browser; otherwise I send an error message to the browser. To do so I need to import another module named "async", which helps us coordinate our asynchronous code. Install the module and import it at the beginning of the code. Then we can use its "forEach" method for the asynchronous code of inserting table entities. The first argument of "forEach" is the array to iterate over. The second argument is the operation to perform on each item in the array. And the third argument will be invoked when all items have been processed or any error has occurred. Here we can send our response to the browser.

      app.get("/was/init", function (req, res) {
        // load all records from windows azure sql database
        sql.open(connectionString, function (err, conn) {
          if (err) {
            console.log(err);
            res.send(500, "Cannot open connection.");
          }
          else {
            conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
              if (err) {
                console.log(err);
                res.send(500, "Cannot retrieve records.");
              }
              else {
                if (results.rows.length > 0) {
                  // begin to transform the records into table service
                  // recreate the table named 'resource'
                  client.deleteTable(tableName, function (error) {
                    client.createTableIfNotExists(tableName, function (error) {
                      if (error) {
                        error["target"] = "createTableIfNotExists";
                        res.send(500, error);
                      }
                      else {
                        async.forEach(results.rows,
                          // transform the records
                          function (row, callback) {
                            var entity = {
                              "PartitionKey": row[1],
                              "RowKey": row[0],
                              "Value": row[2]
                            };
                            client.insertEntity(tableName, entity, function (error) {
                              if (error) {
                                callback(error);
                              }
                              else {
                                console.log("entity inserted.");
                                callback(null);
                              }
                            });
                          },
                          // send response
                          function (error) {
                            if (error) {
                              error["target"] = "insertEntity";
                              res.send(500, error);
                            }
                            else {
                              console.log("all done");
                              res.send(200, "All done!");
                            }
                          }
                        );
                      }
                    });
                  });
                }
              }
            });
          }
        });
      });

    Run it locally and now we can see that the response is sent after all entities have been inserted. Querying entities against the table service is simple as well. Just use the "queryEntity" method from the table service client, providing the partition key and row key. We can also provide more complex query criteria, for example the code here. In the code below I queried an entity by the partition key and row key, and returned the proper localization value in the response.

      app.get("/was/:key/:culture", function (req, res) {
        var key = req.params.key;
        var culture = req.params.culture;
        client.queryEntity(tableName, culture, key, function (error, entity) {
          if (error) {
            res.send(500, error);
          }
          else {
            res.json(entity);
          }
        });
      });

    And then I tested it in the local emulator. Finally, if we want to publish this application to the cloud we should change the database connection string and storage account. For more information about how to consume the blob and queue services, as well as the service bus, please refer to the MSDN page.

    Consume Service Runtime

    As I mentioned above, before we publish our application to the cloud we need to change the connection string and account information in our code.
    But if you have played with WACS you should know that the service runtime provides the ability to retrieve configuration settings, endpoints, and local resource information at runtime. This means we can have these values defined in the CSCFG and CSDEF files and the runtime will be able to retrieve the proper values. For example, we can add some role settings through the property window of the role, specifying the connection string and storage account for cloud and local. And we can also use the endpoint defined in the role environment in our Node.js application. In the Node.js SDK we can get an object from "azure.RoleEnvironment" which provides the functionality to retrieve the configuration settings and endpoints, etc. In the code below I defined the connection string variables and then used the SDK to retrieve them and initialize the table client.

      var connectionString = "";
      var storageAccountName = "";
      var storageAccountKey = "";
      var tableName = "";
      var client;

      azure.RoleEnvironment.getConfigurationSettings(function (error, settings) {
        if (error) {
          console.log("ERROR: getConfigurationSettings");
          console.log(JSON.stringify(error));
        }
        else {
          console.log(JSON.stringify(settings));
          connectionString = settings["SqlConnectionString"];
          storageAccountName = settings["StorageAccountName"];
          storageAccountKey = settings["StorageAccountKey"];
          tableName = settings["TableName"];

          console.log("connectionString = %s", connectionString);
          console.log("storageAccountName = %s", storageAccountName);
          console.log("storageAccountKey = %s", storageAccountKey);
          console.log("tableName = %s", tableName);

          client = azure.createTableService(storageAccountName, storageAccountKey);
        }
      });

    In this way we don't need to amend the code for the configurations between the local and cloud environments, since the service runtime takes care of it. At the end of the code we listen on the port retrieved from the SDK as well.

      azure.RoleEnvironment.getCurrentRoleInstance(function (error, instance) {
        if (error) {
          console.log("ERROR: getCurrentRoleInstance");
          console.log(JSON.stringify(error));
        }
        else {
          console.log(JSON.stringify(instance));
          if (instance["endpoints"] && instance["endpoints"]["nodejs"]) {
            var endpoint = instance["endpoints"]["nodejs"];
            app.listen(endpoint["port"]);
          }
          else {
            app.listen(8080);
          }
        }
      });

    But if we test the application right now we will find that it cannot retrieve any values from the service runtime. This is because, by default, the entry point of this role is defined as the worker role class. In the Windows Azure environment the service runtime opens a named pipe to the entry point instance so that it can connect to the runtime and retrieve values. But in this case, since the entry point was the worker role class and Node.js was launched inside the role, the named pipe was established between our worker role class and the service runtime, so our Node.js application cannot use it. To fix this problem we need to open the CSDEF file under the azure project and add a new element named Runtime, then add an element named EntryPoint which specifies the Node.js command line. That way the Node.js application will have the connection to the service runtime and will be able to read the configurations. Starting Node.js in the local emulator, we can see it retrieved the connection string and storage account for local.
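    As a rough illustration of the CSDEF change described above (a sketch only; the role name and command line are assumptions, and the exact schema should be checked against the Windows Azure documentation):

      <WorkerRole name="WorkerRole1">
        <Runtime>
          <EntryPoint>
            <!-- Make node.exe the role's entry point so the service runtime
                 opens its named pipe to the Node.js process. -->
            <ProgramEntryPoint commandLine="node.exe index.js" setReadyOnProcessStart="true" />
          </EntryPoint>
        </Runtime>
      </WorkerRole>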
    And if we publish our application to azure, it works with WASD and the storage service through the configurations for the cloud.

    Summary

    In this post I demonstrated how to use the Windows Azure SDK for Node.js to interact with the storage service, especially the table service. I also demonstrated how to use the WACS service runtime, and how to retrieve the configuration settings and the endpoint information. In order to make the service runtime available to my Node.js application I needed to create an entry point element in the CSDEF file and set "node.exe" as the entry point. I used five posts to introduce and demonstrate how to run a Node.js application on the Windows platform, and how to use Windows Azure Web Site and a Windows Azure Cloud Service worker role to host our Node.js application. I also described how to work with other services provided by the Windows Azure platform through the Windows Azure SDK for Node.js. Node.js is a very new and young network application platform. But since it is very simple, easy to learn and deploy, and utilizes a single-threaded non-blocking IO model, Node.js has become more and more popular for web application and web service development, especially for IO-intensive projects. And as Node.js is very good at scaling out, it is all the more useful on a cloud computing platform. Using Node.js on the Windows platform is new, too. The modules for SQL database and the Windows Azure SDK are still under development and enhancement. "node-sqlserver" doesn't support SQL parameters. "azure" does support using a storage connection string to create the storage client. But Microsoft is working on making them easier to use and on adding more features and functionality.

    PS: you can download the source code here. You can download the source code of my "Copy all always" tool here.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • SQL Azure Security: DoS

    - by Herve Roggero
    Since I decided to understand in more depth how SQL Azure works, I started to dig into its performance characteristics. So I decided to write an application that allows me to put SQL Azure to the test and compare results with a local SQL Server database. One of the options I added is the ability to issue the same command on multiple threads to get certain performance metrics. That's when I stumbled on an interesting security feature of SQL Azure: its Denial of Service (DoS) detection engine. What this security feature does is perform a check on the number of connections being established, and if the rate of connections is too high, SQL Azure blocks all communication from that machine. I am still trying to learn more about this specific feature, but it appears that going to the SQL Azure portal and testing the connection from the portal "resets" the feature and you are allowed to connect again... until you reach the login threshold. In the specific test I was performing, all the logins were successful. I haven't tried to log in with an invalid account or password... that will be for next time. On my LinkedIn group (SQL Server and SQL Azure Security: http://www.linkedin.com/groups?gid=2569994&trk=hb_side_g) Chip Andrews (www.sqlsecurity.com) pointed out that this feature in itself could present an internal threat. In theory, a rogue application could be issuing many login requests from a NATed network, which could potentially prevent any production system from connecting to SQL Azure from within the same network. My initial response was that this could indeed be the case. However, while the TCP protocol contains the latest NATed IP address of a machine (which masks the origin of the machine making the SQL request), the TDS protocol itself contains the IP address of the machine making the initial request; so technically there would be a way for SQL Azure to block only the internal IP address making the rogue requests. So this warrants further investigation... stay tuned...

    Read the article

  • Azure Florida Association

    - by Dave Noderer
    Herve Roggero, SQL Azure MVP, has created a virtual community to focus on Azure. Here is the outline from Herve:

    User Group Name: Azure Florida Association
    Purpose: Start a virtual Florida user group that targets the Azure platform
    Venues: Most meetings will be virtual; however, I plan to host a few physical events across Florida from time to time if possible; physical events may be a few hours long with potentially more than one speaker
    Possible Topics: The topics will touch on Azure generally speaking, but can cover a wide array of concerns such as Integration, Data Migration, Hosting, Security, Scalability, Mobile Device integration, successful ventures/lessons learned, cross-cloud integration patterns, testing in the cloud, deployment management, reporting…
    Target Members: Architects, Developers, IT Managers
    Membership: Membership will be free; virtual events will be free; physical events may involve a minimal cover charge
    Speakers: If you are interested in speaking or if you have topic ideas, please let me know
    Frequency: Initially these meetings will be held every other month

    The first meeting will be held on January 25, 2012 at 4 PM EST. Vikas Sahni, SQL Azure MVP, will be presenting on Demystifying SQL Azure. Vikas will introduce SQL Azure, its value proposition, usage scenarios, concepts and architecture, what is there and what is not, including tips and tricks. The actual meeting link will be available in January, but please join the LinkedIn group now to be kept informed of this and future events: http://www.linkedin.com/groups?gid=4177626.

    Read the article

  • Windows Azure Interop

    - by kaleidoscope
    How is the Windows Azure Platform an open cloud platform? What makes it interoperable? The Windows Azure platform supports popular standards and protocols including SOAP, REST, and XML. Developers can now use their preferred programming frameworks, including .NET and PHP. Tools for Eclipse have been created for PHP developers building Windows Azure applications. External endpoints (inbound traffic) have now been enabled for worker roles, which enables applications that receive internet traffic without running under IIS.

    Windows Azure interoperable with Java: At PDC 09, a solution accelerator for Tomcat was delivered. Tomcat is an open source software implementation of the Java Servlet and JavaServer Pages technologies. The Windows Azure solution accelerator leverages a PDC09 feature that enables arbitrary processes to bind to inbound service endpoints.

    Windows Azure interoperable with PHP: The Windows Azure tools for Eclipse extension builds upon the PHP Development Toolkit (PDT) and integrates the Web Tools Platform (WTP) to provide a complete toolkit for Windows Azure web application development.

    For more details please refer to the link: http://www.microsoft.com/windowsazure/faq/

    Rituraj

    Read the article

  • SQL Azure Book

    - by ScottKlein
    One of the hottest technology topics of the day is Azure. Being a SQL guy, I am all over this technology, especially SQL Azure. So much so that Herve Roggero and I are currently writing a book for Apress on SQL Azure. This book will be out in September and will include deep and thorough coverage of SQL Azure, best practices, and how-tos. We are excited about this book and the technology. However, we'd like to hear from you. As we go around evangelizing SQL Azure at user groups, code camps, and SQL Saturdays, we see the range from "heard of it" to "experimenting with it". Very few are actually doing something with Azure. I'd like to know what your concerns/questions are regarding Azure. More specifically, what functionality do you think is cool, what is lacking, and what would be your list of "must-haves" to do something with Azure? We hear a lot regarding security concerns, lack of backup/restore, etc. Is there more? Herve and I will be posting frequently as we write to let you know what we find, what is cool, and what you can look forward to. I'm heading to Tech-Ed in June and hopefully will come back with some great things to tell you!

    Read the article

  • Looking for a webhost to support SSRS Hosting with SQL Azure

    - by Adrian Grigore
    Since SQL Azure does not currently support SSRS, the only possible workaround is to host my own SSRS server and have it point to my SQL Azure instance for data retrieval. Now, for me it would be total overkill to rent a dedicated server with SQL Server on it just for hosting SSRS. Are there any (shared) web hosts that offer SSRS hosting with third-party SQL servers? I've already asked discountasp.net, but they don't allow this. Thanks, Adrian

    Read the article
