Search Results

Search found 2619 results on 105 pages for 'azure diagnostics'.

Page 18/105

  • Windows Azure Tools for Microsoft Visual Studio 1.2 (June 2010)

    - by Jim Duffy
    I have good news! Microsoft has released the June 2010 Windows Azure Tools + SDK. These new tools extend Visual Studio (both VS 2010 & VS 2008) for Windows Azure development. With these tools you can create, configure, debug, build, run, and deploy scalable web apps on Windows Azure. At first glance, the most interesting points are that Visual Studio 2010 RTM is fully supported, as is .NET 4: you can choose to build your apps with either the .NET 3.5 or .NET 4 framework. Another area I’ll be digging into is the cloud storage explorer, which provides a read-only view of your Windows Azure tables and blob containers from within Visual Studio via the Server Explorer. I’m sure I’ll have more to say about the Windows Azure Tools as I dig deeper… Have a day. :-|

    Read the article

  • Configure time sync for Azure VMs

    - by Pharao2k
    I have several ExtraSmall-sized Azure VMs (PaaS / Cloud Service based) that are all experiencing drifting of the Windows clock. Research showed that this is quite common, especially in VMs with shared cores. Unfortunately, even after configuring the w32time service to sync with time.windows.com and forcing a resync (w32tm /resync), there is still a time difference of about 2 seconds from the configured NTP server. Though Microsoft states that w32tm is not meant as a high-precision sync tool, a difference of 2 seconds is (IMO) quite a lot for server-side processing. What does one have to do to get more accurate time sync?
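
    For reference, a minimal sketch of the w32time configuration usually tried first in this situation (run from an elevated prompt; the 0x8 flag marks the peer as a client-mode association):

        w32tm /config /manualpeerlist:"time.windows.com,0x8" /syncfromflags:manual /update
        net stop w32time && net start w32time
        w32tm /resync /rediscover
        w32tm /stripchart /computer:time.windows.com /samples:5 /dataonly

    The stripchart command measures the remaining offset directly against the server, which is a more reliable check than the event log. Shortening the NtpClient SpecialPollInterval registry value can also tighten the drift between syncs, though on shared-core VMs some residual offset is likely unavoidable.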

    Read the article

  • Migrating an Existing ASP.NET App to run on Windows Azure

    - by kaleidoscope
    Converting an existing ASP.NET application to Windows Azure. Method 1:
    1. Add a Windows Azure Cloud Service to the existing solution.
    2. Add a Web Role project to the solution and select the existing ASP.NET project in it.
    Method 2: create a new Cloud Service project and add the ASP.NET project (which you want to deploy on Azure) to it using:
    1. Solution | Add | Existing Project
    2. Add | Web Role Project in solution
    Converting a SQL Server database to SQL Azure:
    Step 1: Migrate the <Existing Application>.MDF
    Step 2: Migrate the ASP.NET providers
    Step 3: Change the connection strings (a sketch follows below)
    More details can be found at http://blogs.msdn.com/jnak/archive/2010/02/08/migrating-an-existing-asp-net-app-to-run-on-windows-azure.aspx   Ritesh, D
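
    For step 3, a hedged sketch of what the rewritten connection string typically looks like (server, database, and credentials are placeholders):

        <connectionStrings>
          <!-- SQL Azure expects tcp, port 1433, encryption, and the user@server login form -->
          <add name="ApplicationServices"
               connectionString="Server=tcp:myserver.database.windows.net,1433;Database=MyAppDb;User ID=myuser@myserver;Password={your-password};Encrypt=True;"
               providerName="System.Data.SqlClient" />
        </connectionStrings>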

    Read the article

  • Windows Azure openSUSE: rcnetwork not starting

    - by djechelon
    I have a Linux VM in Azure, created from their default image. My problem is simply that the network init script doesn't seem to start, so dependent services (apache, postfix, ...) won't start either. If I run yast runlevel and try to start postfix, it asks me to start network first; if I accept, network starts without errors and then postfix starts. So while network is configured to start on boot, it just doesn't appear to have started. Oddly, SSH connections work fine. For now I have edited my init scripts and removed network from the Required-Start list, but that didn't work for postfix (even after running systemctl --system daemon-reload). How can I fix all this?
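
    A few diagnostics that would usually narrow this down on a SUSE-style init system (a sketch, not a confirmed fix; availability of these tools on the Azure image is an assumption):

        systemctl status network.service      # what systemd thinks happened at boot
        chkconfig --list network              # is the script enabled for the current runlevel?
        insserv -v network                    # (re-)enable it and recompute script dependencies
        systemctl --system daemon-reload      # refresh systemd's generated units after header edits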

    Read the article

  • Azure Florida Association: New user group announcement

    - by Herve Roggero
    I am proud to announce the creation of a new virtual user group: the Azure Florida Association. The mission of this group is to bring national and international speakers to the forefront of the Florida Azure community. Speakers include Microsoft employees, MVPs and senior developers who use the Azure platform extensively. How to learn about meetings and the group: go to http://www.linkedin.com/groups?gid=4177626 First Meeting Announcement - Date: January 25 2012 @4PM ET. Topic: Demystifying SQL Azure. Description: what SQL Azure is, its value proposition, usage scenarios, concepts and architecture, what is there and what is not, tips and tricks. Bio: Vikas is a versatile technical consultant whose knowledge and experience ranges from products to projects, from .NET to IBM Mainframe Assembler. He has led and mentored people on different technical platforms, and has focused on new technologies from Microsoft for the past few years. He also takes a keen interest in methodologies, quality and processes.

    Read the article

  • How does Azure memory usage work?

    - by Jed Grant
    I have a Windows Azure website. In the dashboard it shows me that I have used 1.51 GB of the 2 GB available per hour. I keep increasing the number of instances available in the shared node so the site doesn't shut down. After each hour finishes, the memory usage still shows 1.51 GB used. I assumed this would start at zero and then grow as time goes on, but that doesn't appear to be the case. How does server memory work? What are some reasons my application is using this much memory? (I use no output caching and have generally just built off of the basic MVC templates provided in Visual Studio.) What other considerations should I be making to decrease the amount of memory needed?

    Read the article

  • PowerShell create new Azure VM from uploaded disk (not image)

    - by MikeBaz
    I have a VHD in Azure storage. That VHD is configured as an OS disk through a command like the following:

        Add-AzureDisk -DiskName $newCode `
            -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" `
            -Label $newCode -OS "Windows"

    I would like to create a new VM pointing at that disk. From what I can tell, if I were doing this with an image I would do something like:

        New-AzureVMConfig -Name $newCode -InstanceSize $instanceSize `
            -MediaLocation "http://$script:accountName.blob.core.windows.net/$newCode/$sourceVhdName.vhd" `
            -ImageName $newCode |
            Add-AzureProvisioningConfig -Windows -Password $adminPassword |
            New-AzureVM -ServiceName $newCode

    However this is wrong for me because I don't have an image - I have a configured VHD that is not sysprepped and can't be. How can I create the VM in PowerShell to point at the existing disk, as I can through the portal?
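
    For what it's worth, the service-management cmdlets of that era also expose a -DiskName parameter set on New-AzureVMConfig, which is the usual route for booting from an existing, already-provisioned OS disk. A sketch, assuming the variables from the question (the -Location value is a placeholder and only applies if the cloud service does not exist yet; Add-AzureProvisioningConfig is skipped because the disk already contains a configured OS):

        New-AzureVMConfig -Name $newCode -InstanceSize $instanceSize -DiskName $newCode |
            New-AzureVM -ServiceName $newCode -Location "West US"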

    Read the article

  • Azure Linux out of band management

    - by faker
    I have a Linux (Ubuntu) virtual machine running on Azure. It seems the only way to connect to it is via SSH. This is fine for normal operation, but what do you do when something goes wrong (fsck waiting for user input, a new kernel that doesn't boot, a misconfigured network, etc.)? There is a grayed-out "Connect" button in the management interface, and the help for it says: "To access a virtual machine running Windows Server, click Connect and follow the instructions. Enter the password that was set when the virtual machine was created. The Connect button is not available for a virtual machine running Linux, but you can use your favorite SSH program to access it." I've read the documentation on the command line tools, but found no way to connect through them either. Is there any way for me to get such a console?

    Read the article

  • Once the VPN connection is up, how do I reach the other side's address space?

    - by sports
    I'm using Windows Azure and I created a site-to-site VPN, configured like this:

    My virtual network:
    - My address space: 10.2.0.0/16 (65531)
    - Subnet1: 10.2.1.0/24 (251)
    - Subnet2: 10.2.2.0/24 (251)
    - Gateway: 10.2.3.0/29 (3)
    - My public gateway IP: 137.135.x.z (I won't show x and z for security reasons)
    This public gateway uses, as you can see, 5 IPs on subnet1, 5 IPs on subnet2, and 5 IPs on the gateway.

    "Their" virtual network (in Azure this would be a "Local network"):
    - Their address space: 172.60.100.67/32, 172.60.100.68/32, 172.60.100.69/32 (notice their address space is just 3 IPs)
    - Their device public IP: x.x.x.x (omitted for security reasons)

    So: the VPN is "green" (in Azure it literally shows these two as connected) and now my question is: how do I reach their address space? I've tried creating a virtual machine (Windows Server 2008, but it could be Ubuntu) on my virtual network, and it is automatically placed on subnet1 or subnet2, so it gets the IP 10.2.1.0 (as an example); I can't choose to place the virtual machine in the gateway address space. How do I reach any of the IPs 172.60.100.67, 172.60.100.68, 172.60.100.69? In other words: how can I telnet or ping any of these IPs, or see them in my network? Please provide answers for Windows Server 2008 or for Ubuntu; I'm open to creating any virtual machine.

    Read the article

  • Big AdventureWorks2012

    - by jamiet
    Last week I launched AdventureWorks on Azure, an initiative to make SQL Azure accessible to anyone, in my blog post AdventureWorks2012 now available for all on SQL Azure. Since then I think it's fair to say that the reaction has been lukewarm, with 31 insertions into the [dbo].[SqlFamily] table and only 8 donations via PayPal to support it; on the other hand, those 8 donors have been incredibly generous and we nearly have enough in the bank to cover a full year's worth of availability. It was always my intention to try and make this offering more appealing, and to that end I have used an adapted version of Adam Machanic's make_big_adventure.sql script to massively increase the amount of data in the database and give the community more scope to really push SQL Azure and see what it is capable of. There are now two new tables in the database:
    - [dbo].[bigProduct] with 25200 rows
    - [dbo].[bigTransactionHistory] with 7827579 rows
    The credentials to log in and use AdventureWorks on Azure are as they were before:
    - Server: mhknbn2kdz.database.windows.net
    - Database: AdventureWorks2012
    - User: sqlfamily
    - Password: sqlf@m1ly
    Remember, if you want to support AdventureWorks on Azure, simply click here to launch a pre-populated PayPal Send Money form - all you have to do is log in, fill in an amount, and click Send. We need more donations to keep this up and running, so if you think this is useful and worth supporting, please please donate.
    I mentioned that I had to adapt Adam's script, the main reasons being:
    - Cross-database queries are not yet supported in SQL Azure, so I had to create a local copy of [dbo].[spt_values] rather than reference the one in [master]
    - SELECT…INTO is not supported in SQL Azure (see the sketch below for the usual workaround)
    - The 1GB limit of SQL Azure Web edition meant that there would not be enough space to store all the data generated by Adam's script, so I had to decrease the total number of rows
    The amended script is available on my SkyDrive at https://skydrive.live.com/redir.aspx?cid=550f681dad532637&resid=550F681DAD532637!16756&parid=550F681DAD532637!16755 @Jamiet
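
    For anyone adapting similar scripts, a sketch of the usual SELECT…INTO workaround on SQL Azure (table and column names here are illustrative only):

        -- SELECT ... INTO is not supported, so create the target table first
        -- (SQL Azure of this era also requires a clustered index on every table),
        -- then populate it with INSERT ... SELECT:
        CREATE TABLE dbo.bigProductCopy (
            ProductID int NOT NULL PRIMARY KEY CLUSTERED,
            Name nvarchar(50) NOT NULL
        );
        INSERT INTO dbo.bigProductCopy (ProductID, Name)
        SELECT ProductID, Name
        FROM dbo.bigProduct;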

    Read the article

  • Squid on an Azure VM

    - by LantisGaius
    I can't get it to work. Here's exactly what I did:
    1. Create a new Azure VM, Windows Server 2012.
    2. RDP to the new VM.
    3. Download & extract Squid for Windows (2.7.STABLE8).
    4. Rename the conf files (squid, mime & cachemgr).
    5. Add the following lines at the end of squid.conf:

        auth_param basic program c:/squid/libexec/ncsa_auth.exe c:/squid/etc/passwd.txt
        auth_param basic children 5
        auth_param basic realm Welcome to http://abcde.fg Squid Proxy!
        auth_param basic credentialsttl 12 hours
        auth_param basic casesensitive off
        acl ncsa_users proxy_auth REQUIRED
        http_access allow ncsa_users

    6. Use http://www.htaccesstools.com/htpasswd-generator-windows/ to create passwd.txt.
    7. Test passwd.txt via c:/squid/libexec/ncsa_auth.exe c:/squid/etc/passwd.txt (success).
    8. Run squid -z, then squid -i, then net start squid (no errors so far).
    9. Go to https://manage.windowsazure.com, Virtual Machines - myVM - Endpoints, and add an endpoint: Name: Squid, Protocol: TCP, Public Port: 80, Private Port: 3128.
    That's it. Unfortunately, it doesn't work. I think I screwed something up at the endpoint? I'm not sure... help?
    EDIT: I'm testing it via Firefox - Options - Advanced - Network, and the exact error is "The proxy server is refusing connections." I'm using my DNS name "abcdef.cloudapp.net" as the proxy server, with port 80 (since that's my public endpoint).
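
    One thing worth ruling out here (an assumption, not a confirmed fix): the endpoint only forwards public port 80 to private port 3128, so Windows Firewall on the VM must also allow inbound TCP 3128, and Squid must be listening on an external interface rather than only localhost. A quick sketch:

        netsh advfirewall firewall add rule name="Squid proxy" dir=in action=allow protocol=TCP localport=3128
        netstat -an | find "3128"

    If netstat shows Squid bound only to 127.0.0.1:3128, setting http_port 3128 (without a bind address) in squid.conf makes it listen on all interfaces.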

    Read the article

  • Azure load-balancing strategy

    - by growse
    I'm currently building out a small web deployment using VM instances on MS Azure. The main problem I'm facing at the moment is trying to figure out how to get the load balancing to detect if a particular VM has failed and not route traffic to that VM. As far as I can tell, there are only two load-balancing options:
    1. Have multiple VMs (web01, web02, web03 etc.) within the same 'cloud service' behind a single VIP, and configure the endpoints to be load balanced.
    2. Create multiple 'cloud services', put a single web VM in each and create a traffic manager service across all these services.
    It appears that (1) is extremely simplistic and doesn't attempt to do any host failure detection. (2) appears to be much more varied, but requires me to put all my web servers in their own individual cloud services. Traffic manager appears to be aimed much more at a geographic failover scenario, where you have multiple cloud services across different regions. This approach also has the disadvantage that my web servers won't be able to communicate with my databases on internal IP addresses, unlike scenario (1). What's the best approach here?
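
    For what it's worth, option (1) may be less simplistic than it first appears: load-balanced endpoint sets in the service-management model support custom health probes, so the balancer stops routing to an instance whose probe fails. A hedged PowerShell sketch (service, VM, and probe path are illustrative):

        Get-AzureVM -ServiceName "mysvc" -Name "web01" |
            Add-AzureEndpoint -Name "http" -Protocol tcp -LocalPort 80 -PublicPort 80 `
                -LBSetName "weblb" -ProbeProtocol http -ProbePort 80 -ProbePath "/healthcheck" |
            Update-AzureVM

    Repeating this for web02 and web03 with the same -LBSetName puts them all behind one probed, load-balanced VIP.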

    Read the article

  • Azure's Ubuntu 12.04 fails to install PHP5

    - by Alex Kennberg
    Similar to this article from Azure themselves: http://www.windowsazure.com/en-us/manage/linux/common-tasks/install-lamp-stack/ I am trying to install PHP5 on an Ubuntu 12.04 virtual machine. However, it fails installing ssl-cert:

        $ sudo apt-get install php5
        Reading package lists... Done
        Building dependency tree
        Reading state information... Done
        php5 is already the newest version.
        0 upgraded, 0 newly installed, 0 to remove and 49 not upgraded.
        1 not fully installed or removed.
        After this operation, 0 B of additional disk space will be used.
        Do you want to continue [Y/n]? y
        Setting up ssl-cert (1.0.28) ...
        Could not create certificate. Openssl output was:
        Generating a 2048 bit RSA private key
        ............................+++
        ...................................................................................................................+++
        writing new private key to '/etc/ssl/private/ssl-cert-snakeoil.key'
        -----
        problems making Certificate Request
        140320238503584:error:0D07A097:asn1 encoding routines:ASN1_mbstring_ncopy:string too long:a_mbstr.c:154:maxsize=64
        dpkg: error processing ssl-cert (--configure):
        subprocess installed post-installation script returned error exit status 1
        Errors were encountered while processing:
        ssl-cert
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Any tips appreciated.
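
    The openssl message ("string too long ... maxsize=64") suggests the VM's fully-qualified hostname exceeds the 64-character limit openssl enforces on a certificate CN, which the long Azure-generated names can hit. A hedged sketch of the usual workaround (the short hostname is a placeholder):

        sudo hostname myshortname
        sudo make-ssl-cert generate-default-snakeoil --force-overwrite
        sudo apt-get -f install

    make-ssl-cert regenerates the snakeoil certificate against the shorter name, after which dpkg can finish configuring ssl-cert and php5.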

    Read the article

  • Q&amp;A: Can you develop for the Windows Azure Platform using Windows XP?

    - by Eric Nelson
    This question has come up several times recently as we take several hundred UK developers through 6 Weeks of Windows Azure training (sorry – we are full). Short answer: in the main, yes. Longer answer: the question is sparked by the requirements as stated on the Windows Azure SDK download page. Namely: Supported Operating Systems: Windows 7; Windows Vista; Windows Vista 64-bit Editions Service Pack 1; Windows Vista Business; Windows Vista Business 64-bit edition; Windows Vista Enterprise; Windows Vista Enterprise 64-bit edition; Windows Vista Home Premium; Windows Vista Home Premium 64-bit edition; Windows Vista Service Pack 1; Windows Vista Service Pack 2; Windows Vista Ultimate; Windows Vista Ultimate 64-bit edition. Notice there is no mention of Windows XP. However, things are not quite that simple. The Windows Azure Platform consists of three released technologies: Windows Azure, SQL Azure, and Windows Azure platform AppFabric. The Windows Azure SDK is only for one of the three technologies, Windows Azure. What about SQL Azure and AppFabric? Well, it turns out that you can develop for both of these technologies just fine with Windows XP: SQL Azure development is really just SQL Server development with a few gotchas – and for local development you can simply use SQL Server 2008 R2 Express (other versions will also work). AppFabric also has no local simulation environment, and its SDK will install fine on Windows XP (SDK download). Actually, it is also possible to do Windows Azure development on Windows XP if you are willing to always work directly against the real Azure cloud running in Microsoft datacentres. In practice, however, this would be painful and time consuming, which is why the Windows Azure SDK installs a local simulation environment. Therefore, if you want to develop for Windows Azure, I would recommend you either upgrade from Windows XP to Windows 7 or use a virtual machine running Windows 7. If this is a temporary requirement, you could consider building a virtual machine using the Windows 7 Enterprise 90 day eval. Or you could download a pre-configured VHD – but I can’t quite find the link for a Windows 7 VHD. Pointers welcomed. Thanks.

    Read the article

  • New SQL Azure Development Accelerator Core promotional offer announced

    - by Eric Nelson
    This is (almost) a straight copy and paste but represents an important announcement worthy of a little more “exposure” :-) Starting August 1, 2010, we will release a new SQL Azure Development Accelerator Core promotional offer. This new offer will give you the flexibility to purchase commitment quantities of SQL Azure Business Edition databases independent of other Windows Azure platform services at a deeply discounted monthly price. The offer is valid only for a six month term. You may purchase in 10 GB increments the amount of our Business Edition relational database that you require (each Business Edition database is capable of storing up to 50 GB). The offer price will be $74.95 per 10 GB per month. This promotional offer represents 25% off of our normal consumption rates. Monthly Business Edition relational database usage exceeding the purchased commitment amount, and usage for other Windows Azure platform services under this offer, will be charged at our normal consumption rates. Please click here for full details of our new SQL Azure Development Accelerator Core offer.
    Related Links:
    - Details of 5GB and 50GB databases have been released
    - http://ukazure.ning.com UK community site
    - Getting started with the Windows Azure Platform

    Read the article

  • New partnership allows auto-transposition of client/server application to Windows Azure

    - by Webgui
    The economics of IT is changing rapidly, and organizations are searching to widen and secure availability of their systems while lowering costs, which is exactly what the cloud is meant to do. Running your systems on Microsoft’s Windows Azure cloud, for example, would improve and secure the availability, accessibility and scalability (both up and down) of your systems and support the new IT economics. However, in order to take advantage of the cloud's promise of lower cost of ownership, applications must be built or adjusted to work on that platform, and in most cases this is not a simple task. Even existing web applications cannot always be transferred to Azure without some changes, and for client/server applications the task is far more challenging, even to the point where it seems impossible. The reason is the gap between client/server desktop technology and the cloud's. For that reason, most of the known methodologies for migrating existing client/server applications actually involve a rewrite of the desktop systems for the cloud. A unique approach is introduced by Visual WebGui: it creates a virtualization layer atop the ASP.NET web server, moves the transformed or generated .NET code to that layer, and then, using a patent-pending protocol, renders a user interface within a plain browser. The end result is pure .NET code serving as the base code for a rich web application, and now, due to a collaboration with Microsoft Windows Azure, Visual WebGui provides the shortest path from client/server to the Azure cloud, handling close to 95% of the transformation to the cloud platform automatically. Application Migration to Azure without migraines. More information about the Instant CloudMove Azure solution here.

    Read the article

  • Windows Azure Event

    - by Blog Author
    Get cloud ready with Windows Azure

    The cloud is everywhere and here at Microsoft we’re flying high with our cloud computing release, Windows Azure. As most of you saw at the Professional Developers Conference, the reaction to Windows Azure has been nothing short of “wow” – and based on your feedback, we’ve organized this special, all-day Windows Azure Firestarter event to help you take full advantage of the cloud. Maybe you've already watched a webcast, attended a recent MSDN Event on the topic, or done your own digging on Azure. Well, here's your chance to go even deeper. This one-of-a-kind event will focus on helping developers get ‘cloud ready’ with concrete details and hands-on tactics. We’ll start by revealing Microsoft’s strategic vision for the cloud, and then offer an end-to-end view of the Windows Azure platform from a developer’s perspective. We’ll also talk about migrating your data and existing applications (regardless of platform) onto the cloud. We’ll finish up with an open panel and lots of time to ask questions. Following this event, please join us for an engaging conversation about any and all Cloud Computing topics. This FREE event is hosted by Northwest Cloud, the cloud agnostic community group, and sponsored by Microsoft. http://www.nwcloud.org/redmond/2010-04-06

    Read the article

  • Feature Updates to the Windows Azure Portal

    - by Clint Edmonson
    Lots of activity over at the Windows Azure portal this weekend, including some exciting new features and major improvements to existing features. Here are the highlights:

    Support for Managing Co-administrators
    - Set up account co-administrators to allow others to share service management duties for each Azure subscription

    Import/Export support for SQL Databases
    - Export existing SQL Azure databases to blob storage using SQL Server 2012’s BACPAC format
    - Create a new SQL Azure database from an existing BACPAC stored in blob storage

    Storage Container Management and Access Control
    - Create blob storage containers directly within the portal
    - Edit their public/private access settings
    - Drill into storage containers and see the blobs contained within them

    Improved Cloud Service Status Notifications
    - Detailed health status information about cloud services and roles as they transition between states

    Virtual Machine Experience Enhancements
    - Option to automatically delete corresponding VHD files from blob storage when deleting VM disks

    Service Bus Management and Monitoring
    - Ability to create and manage service bus Namespaces, Queues, Topics, Relays and Subscriptions
    - Rich monitoring of Topics, Queues, and Subscriptions with detailed and customizable dashboard metrics
    - Entity status (Topic, Queue, or Subscription) can be changed interactively via the dashboard
    - Direct links to the Access Control Services (ACS) namespaces when working with service bus access keys

    Media Services Monitoring Support
    - Monitor encoding jobs that are queued for processing, as well as active, failed and queued tasks for encoding jobs

    The above features are all now live in production and available to use immediately. If you don’t already have a Windows Azure account, you can sign up for a free trial and start using them today. Stay tuned to my twitter feed for Windows Azure announcements, updates, and links: @clinted Reference ID: P7VVJCM38V8R

    Read the article

  • MSBuild / PowerShell: Copy SQL Server 2012 database to SQL Azure via BACPAC (for Continuous Integration)

    - by giveme5minutes
    I'm creating a continuous integration MSBuild script which copies a database on an on-premise SQL Server 2012 instance to SQL Azure. Easy, right? After a fair bit of research I've come across the following methods:
    1. Use PowerShell to access the DAC library directly, then use the MSBuild PowerShell extension to wrap the script. This would require installing PowerShell 3 and working out how to make the MSBuild PowerShell extension work with it, as apparently MS moved the DAC API to a different namespace in the latest version of the library. PowerShell would give direct access to the API, but may require quite a bit of boilerplate.
    2. Use the sample DAC Framework Client Side Tools, which requires compiling them myself, as the downloads available from Codeplex only include the hosted version. It would also require fixing them to use DAC 3.0 classes, as they appear to currently use an earlier version of DAC. I could then call these tools from an <Exec Command="" /> in the MSBuild script. Less boilerplate, and if I hit any bumps in the road I can just make changes to the source.
    Using either method, the process could be:
    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Upload the BACPAC to blob storage
    3. Import the BACPAC into SQL Azure via the hosted DAC
    Or:
    1. Export from on-premise SQL Server 2012 to a local BACPAC
    2. Import the BACPAC into SQL Azure via the client DAC
    All of the above seems like a lot of effort for something that seems to be a standard feature... so before I start reinventing the wheel and documenting the results for all to see: is there something really obvious I've missed here? Is there a pre-written script from MS that I have not yet uncovered? There's a command in the GUI of SQL Server Management Studio 2012 that does EXACTLY what I'm trying to do (right-click on a local database, click "Tasks", click "Deploy Database to SQL Azure"). Surely if it's a few clicks in the GUI it must be a single command on the command line somewhere??
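
    As a starting point for method 1, a sketch of driving the DAC 3.0 library directly from PowerShell (the assembly path, connection strings, and database names are all placeholders and will vary by machine):

        Add-Type -Path "C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\Microsoft.SqlServer.Dac.dll"

        # Export the on-premise database to a local BACPAC
        $local = New-Object Microsoft.SqlServer.Dac.DacServices "Server=.;Integrated Security=True"
        $local.ExportBacpac("C:\build\MyDb.bacpac", "MyDb")

        # Import the BACPAC into SQL Azure (client-side DAC, no blob storage hop)
        $azure = New-Object Microsoft.SqlServer.Dac.DacServices "Server=tcp:myserver.database.windows.net;User ID=myuser@myserver;Password={your-password}"
        $package = [Microsoft.SqlServer.Dac.BacPackage]::Load("C:\build\MyDb.bacpac")
        $azure.ImportBacpac($package, "MyDb")

    Wrapped in an <Exec> task or the MSBuild PowerShell extension, this would cover the local-export/client-import process without compiling the Codeplex samples.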

    Read the article

  • Windows Azure v1.7 Spring Release Today – New Management Dashboard

    - by ToStringTheory
    Today, Microsoft will be publicly releasing a new version of Azure for public consumption. The web conference, at http://www.meetwindowsazure.com, will be airing at 1 PM PST. They have already released an update to the Service Dashboard, which can be accessed by going to http://manage.windowsazure.com. I have some images of the new dashboard here that I have gathered and removed any PII from. Let me know what you think!

    Images: you should be able to click any of the images for a full-resolution version.
    Tutorial: the first thing you get after signing in is the tutorial.
    Landing: after the tutorial completes, you get a screen with the services active on your account on the left, and a list of ALL services (db/blob/SQL Azure) on the right. I like the quick access to services across any of my subscriptions.
    Service Information: these are images from a running web site with several roles. I love how easy they have made many of the features.
    SQL Azure: they have given some great quick functionality for looking at your DB information.
    Storage: here is the basic information that they give you for any storage accounts you have.
    Adding Services: super quick and easy to add services with the new UI.

    Conclusion: I am EXCITED! As you may have seen in the left side of my blog, I am an MCPD in Azure Development, and I must say that I am excited to see Microsoft moving forward with the technology and not letting it stagnate. After fighting with the old Azure dashboard as much as I have, I like the friendliness and fluidity of this one. The important thing to note about ALL of the images above: this is HTML, not Silverlight. The responsiveness is FAST on all of the actions I completed, and I believe that this is a big step forward for Azure… So, what do you think?

    Read the article

  • Windows Azure Recipe: Big Data

    - by Clint Edmonson
    As the name implies, what we’re talking about here is the explosion of electronic data that comes from huge volumes of transactions, devices, and sensors being captured by businesses today. This data often comes in unstructured formats and/or arrives too fast for us to effectively process in real time. Collectively, we call these the 4 big data V’s: Volume, Velocity, Variety, and Variability. These qualities make this type of data best managed by NoSQL systems like Hadoop, rather than by conventional relational database management systems (RDBMS). We know that there are patterns hidden inside this data that might provide competitive insight into market trends. The key is knowing when and how to leverage these “NoSQL” tools, combined with traditional business systems such as SQL-based relational databases and warehouses, and other business intelligence tools.

    Drivers
    - Petabyte scale data collection and storage
    - Business intelligence and insight

    Solution
    The sketch below shows one of many big data solutions using Hadoop’s unique highly scalable storage and parallel processing capabilities combined with Microsoft Office’s Business Intelligence Components to access the data in the cluster.

    Ingredients
    Hadoop – this big data industry heavyweight provides both large scale data storage infrastructure and a highly parallelized map-reduce processing engine to crunch through the data efficiently. Here are the key pieces of the environment:
    - Pig – a platform for analyzing large data sets that consists of a high-level language for expressing data analysis programs, coupled with infrastructure for evaluating these programs.
    - Mahout – a machine learning library with algorithms for clustering, classification and batch based collaborative filtering that are implemented on top of Apache Hadoop using the map/reduce paradigm.
    - Hive – data warehouse software built on top of Apache Hadoop that facilitates querying and managing large datasets residing in distributed storage. Directly accessible to Microsoft Office and other consumers via add-ins and the Hive ODBC data driver (see the HiveQL sketch below).
    - Pegasus – a peta-scale graph mining system that runs in a parallel, distributed manner on top of Hadoop and that provides algorithms for important graph mining tasks such as Degree, PageRank, Random Walk with Restart (RWR), Radius, and Connected Components.
    - Sqoop – a tool designed for efficiently transferring bulk data between Apache Hadoop and structured data stores such as relational databases.
    - Flume – a distributed, reliable, and available service for efficiently collecting, aggregating, and moving large amounts of log data to HDFS.
    Database – directly accessible to Hadoop via the Sqoop-based Microsoft SQL Server Connector for Apache Hadoop; data can be efficiently transferred to traditional relational data stores for replication, reporting, or other needs.
    Reporting – provides easily consumable reporting when combined with a database being fed from the Hadoop environment.

    Training
    These links point to online Windows Azure training labs where you can learn more about the individual ingredients described above.
    - Hadoop Learning Resources (20+ tutorials and labs): a huge collection of resources for learning about all aspects of Apache Hadoop-based development on Windows Azure and the Hadoop and Windows Azure ecosystems.
    - SQL Azure (7 labs): Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data.
See my Windows Azure Resource Guide for more guidance on how to get started, including links web portals, training kits, samples, and blogs related to Windows Azure.
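
    To make the Hive ingredient concrete, a small illustrative HiveQL sketch (the table and path are hypothetical) of the kind of query the Office BI components can then consume over the ODBC driver:

        -- define an external table over raw tab-delimited logs already in the cluster
        CREATE EXTERNAL TABLE clicks (ts STRING, url STRING, user_id STRING)
        ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
        LOCATION '/data/clicks';

        -- a simple aggregation that map-reduce executes in parallel across the cluster
        SELECT url, COUNT(*) AS hits
        FROM clicks
        GROUP BY url
        ORDER BY hits DESC
        LIMIT 10;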

    Read the article

  • Windows Azure Platform, latest version?

    - by Vimvq1987
    I searched through the internet but found nothing. The whitepapers for the Windows Azure Platform say things like this: "In its first release, the maximum size of a single database in SQL Azure Database is 10 gigabytes" and "A few things are omitted in the technology’s first release, however, such as the SQL Common Language Runtime (CLR) and support for spatial data. (Microsoft says that both will be available in a future version.)" I want to know whether Microsoft has since updated the Windows Azure Platform and removed these limits. I decided to post this question here instead of Serverfault.com because it's more related to programming than administration. Thank you

    Read the article

  • Windows Azure: need to know the data processing time

    - by veda
    I have stored some files in the form of blobs on Azure, and I have written an application that accesses these blobs. When I host this application as a web role on Azure, it works perfectly and I am happy with that. But now I wanted to know: what is the query time taken to access each blob file? I was searching for this in the Microsoft Azure Storage SLA and found that for the GetBlob request type, the maximum processing time should be within 2 seconds multiplied by the number of MBs transferred in processing the request. I am still unclear. What is the actual processing time of my data query? How can I measure it? Can I speed up the processing time? I understand that the processing time depends on internet speed, the location of the data center where my data is stored, and the location of the data center where my application is hosted. But still, will I be able to speed up my query?
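
    One way to put a number on it is to measure from the web role itself; a sketch, assuming the .NET storage client assembly is available (the assembly name/path varies by SDK version, and the account, container, and blob names are placeholders):

        Add-Type -Path "Microsoft.WindowsAzure.Storage.dll"
        $account = [Microsoft.WindowsAzure.Storage.CloudStorageAccount]::Parse($connectionString)
        $client = $account.CreateCloudBlobClient()
        $container = $client.GetContainerReference("mycontainer")
        $blob = $container.GetBlockBlobReference("myfile.dat")
        $elapsed = Measure-Command {
            $ms = New-Object System.IO.MemoryStream
            $blob.DownloadToStream($ms)   # one GetBlob request, timed end to end
        }
        "GetBlob took {0:N0} ms" -f $elapsed.TotalMilliseconds

    Running the measurement from a role in the same data center as the storage account isolates storage latency from internet latency, which is also the main lever for speeding things up: co-locate the application and the data.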

    Read the article

  • Easiest way to retrofit retry logic on LINQ to SQL migration to SQL Azure

    - by Pat James
    I have a couple of existing ASP .NET web forms and MVC applications that currently use LINQ to SQL with a SQL Server 2008 Express database on a Windows VPS: one VPS for both IIS and SQL. I am starting to outgrow the VPS's ability to effectively host both SQL and IIS and am getting ready to split them up. I am considering migrating the database to SQL Azure and keeping IIS on the VPS. After doing initial research it sounds like implementing retry logic in the data access layer is a must-do when adopting SQL Azure. I suspect this is even more critical to implement in my situation where IIS will be on a VPS outside of the Azure infrastructure. I am looking for pointers on how to do this with the least effort and impact on my existing code base. Is there a good retry pattern that can be applied once at the LINQ to SQL data access layer, as opposed to having to wrap all of my LINQ to SQL operations in try/catch/wait/retry logic?
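
    In the absence of built-in support in LINQ to SQL, the usual pattern is one generic wrapper at the data-access layer; a minimal sketch (the transient-error detection here is deliberately simplified - a real implementation would check the SqlException error numbers SQL Azure documents as transient, or use the Enterprise Library Transient Fault Handling block):

        // usings assumed: System, System.Data.SqlClient, System.Threading
        public static class SqlAzureRetry
        {
            public static T Execute<T>(Func<T> work, int maxAttempts = 3)
            {
                int attempt = 0;
                while (true)
                {
                    try { return work(); }
                    catch (SqlException)
                    {
                        if (++attempt >= maxAttempts) throw;
                        // exponential backoff before retrying the whole unit of work
                        Thread.Sleep(TimeSpan.FromSeconds(Math.Pow(2, attempt)));
                    }
                }
            }
        }

        // usage: wrap the complete query so the retry re-runs it from the start
        // var orders = SqlAzureRetry.Execute(() => db.Orders.Where(o => o.Total > 100).ToList());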

    Read the article

  • Windows Azure worker roles: One big job or many small jobs?

    - by Ryan Elkins
    Is there any inherent advantage to using multiple workers to process pieces of procedural code versus processing the entire load? In other words, if my workflow looks like this:
    1. Get work from queue0 and do A
    2. Store result from A in queue1
    3. Get result from queue1 and do B
    4. Store result from B in queue2
    5. Get result from queue2 and do C
    Is there an inherent advantage to using 3 workers who each do the entire process themselves, versus 3 workers that each do a part of the work (worker 1 does steps 1 & 2, worker 2 does 3 & 4, worker 3 does 5)? If we only care about work being done (finishing step 5), it would seem to scale the same way (once you're using at least 3 workers). Maybe the big job is better because workers with that setup have fewer bottleneck issues?
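
    A sketch of why the small-stage layout is often favoured: each stage deletes its input message only after its output is safely queued, so a worker that dies mid-step loses at most one message lease rather than the whole pipeline's progress. This assumes the classic CloudQueue API, a $queueClient already created from the storage account, and DoB as a placeholder for step B's work:

        // one pipeline stage: read from queue1, do step B, write to queue2
        var input = queueClient.GetQueueReference("queue1");
        var output = queueClient.GetQueueReference("queue2");
        while (true)
        {
            CloudQueueMessage msg = input.GetMessage();
            if (msg == null) { Thread.Sleep(1000); continue; }   // back off when idle
            string result = DoB(msg.AsString);
            output.AddMessage(new CloudQueueMessage(result));
            input.DeleteMessage(msg);   // only after the result is durably stored
        }

    With the big-job layout, the same crash-safety guarantee requires each worker to make all five steps idempotent end-to-end.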

    Read the article
