Search Results

Search found 44703 results on 1789 pages for 'azure web roles'.


  • Miami 311: Built on Windows Azure

    - by Josh Holmes
    This is a cool use of Azure. The city of Miami took their “311” data around potholes, trash pickup issues, recycling issues, broken sidewalks and the like and put that data in Azure. The next step was to leverage Bing Maps and Silverlight to visualize those issues on a map of the city. The solution takes advantage of virtually unlimited storage and processing power, and provides the ability to quickly address service requests and roll out updates even during peak times such as hurricane season. If things change, the City can bring the solution on site or move to a physical facility, all based on need and cost-effectiveness. As a result, residents logging on to Miami 311 can see on average 4,500 issues in progress - not represented as a ‘list’, but located on a map in relation to other projects in their neighborhood. A simple click on the map lets them drill down to more and more specific details if they want. In short, they have turned what used to be a meaningless list of data into useful information, and created actionable and consumable knowledge that is relevant to the citizens of Miami. For Miami, a ‘service call to the city’ becomes an interactive process residents can follow - and the City has a new tool to manage and deliver outcomes. … When the city made the move to the web, they chose tools they knew and software they trust. The Microsoft Windows Azure cloud platform made it easy to do, and they used both Bing mapping and Silverlight to build a user-friendly front end. According to Port25 (Miami 311: Built on Windows Azure - Port 25: The Open Source Community at Microsoft), it took two people 8 days to implement the whole system, and they are going to open source their solution so that other cities can leverage it. I haven’t seen yet where and how they are going to release it, but I’ll keep you posted if I find out.

    Read the article

  • Windows Azure Upgrade Domain

    - by kaleidoscope
    Windows Azure automatically divides your role instances into “logical” domains called upgrade domains. During an upgrade, Azure updates these domains one by one. This is by-design behavior to avoid nasty situations. Among the latest feature additions and enhancements on the platform was the ability to notify your role instances of “environment” changes, adding or removing instances being the most common. In such a case, all your roles get a notification of the change. Imagine if you had 50 or 60 role instances, all getting notified at once and starting various actions to react to the change. It would be a complete disaster for your service. The way this problem is addressed is upgrade domains. During an upgrade, Windows Azure updates them one by one, and only the role instances associated with the domain currently being updated get notified of the changes taking place. Only a small number of your role instances get notified and react, while the rest remain intact, providing a seamless upgrade experience with no service disruption or downtime. http://www.kefalidis.me/archive/2009/11/27/windows-azure-ndash-what-is-an-upgrade-domain.aspx   Lokesh, M
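    For illustration, a minimal sketch of how a role instance observes these notifications via the RoleEnvironment events in Microsoft.WindowsAzure.ServiceRuntime (the handler logic is illustrative, not taken from the linked article):

    using System;
    using System.Linq;
    using Microsoft.WindowsAzure.ServiceRuntime;

    public class WorkerRole : RoleEntryPoint
    {
        public override bool OnStart()
        {
            // Raised before a change is applied; only the instances in the
            // upgrade domain currently being walked receive it.
            RoleEnvironment.Changing += (sender, e) =>
            {
                if (e.Changes.Any(c => c is RoleEnvironmentConfigurationSettingChange))
                {
                    // Set e.Cancel = true to recycle this instance instead of
                    // applying the new configuration while it is running.
                    e.Cancel = false;
                }
            };

            // Raised after the change has been applied to this instance.
            RoleEnvironment.Changed += (sender, e) =>
                Console.WriteLine("Environment change applied.");

            return base.OnStart();
        }
    }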

    Read the article

  • How does one rein in the complexities of web development?

    - by Rahul
    I have been a server-side programmer for most of my career and have only recently started spending more time on web development. I am amazed at the number of things I need to master in order to write a decent web application. To list just a few of the tools/technologies I need to learn:
    - A server-side programming language (Java/JSP, ASP, PHP, Ruby or something else)
    - A decent web framework (for any medium-to-large application)
    - HTML & CSS
    - JavaScript
    - A JavaScript library (jQuery/ExtJS etc., primarily for AJAX) - good to know even if not strictly necessary
    - At least a basic knowledge of web design: layouts, colors, fonts etc.
    - A good understanding of web security
    - A good understanding of performance/scalability issues
    - Testing, browser compatibility issues etc.
    The list goes on. So, my question to seasoned web developers is: how do you manage to learn and keep yourselves updated on so many things? While developing a web application, how do you handle the complexities involved in these areas and still manage to write an application that is well designed, user friendly, secure, performant and scalable? As a web developer, does one have to be a jack of all trades, or should one specialize in one or two areas and leave the rest to other members of the team?

    Read the article

  • Automated backups for Windows Azure SQL Database

    - by Greg Low
    One of the questions that I've often been asked is how you can back up databases in Windows Azure SQL Database. What we have had access to is the ability to export a database to a BACPAC. A BACPAC is basically just a zip file that contains a bunch of metadata along with a set of bcp files for each of the tables in the database. Each table in the database is exported one after the other, so this does not produce a transactionally-consistent backup at a specific point in time. To get a transactionally-consistent copy, you need a database that isn't in use.

    The easiest way to get a database that isn't in use is to use CREATE DATABASE AS COPY OF. This creates a new database as a transactionally-consistent copy of the database that you are copying. You can then use the export options to get a consistent BACPAC created.

    Previously, I've had to automate this process myself. Given there was also no SQL Agent in Azure, I used a job in my on-premises SQL Server to do this, using a linked server configuration.

    Now there's a much simpler way. Windows Azure SQL Database now supports an automated export function. On the Configuration tab for the database, you need to enable the Automated Export function. You can configure how often the operation is performed for you, and which storage account will be used for the backups.

    It's important to consider the cost impact of this as well. You are charged for however many databases are on your server on a given day. So if you enable a daily backup, you will double your database costs. Do not schedule the backups just before midnight UTC, as that could cause you to have three databases each day instead of one.

    This is a much-needed addition to the capabilities. Scott Guthrie also posted about some other notable changes today, including a preview of a new premium offering for SQL Database. In addition to the Web and Business editions, there will now be a Premium edition that has reserved (rather than shared) resources. You can read about it all in Scott's post here: http://weblogs.asp.net/scottgu/archive/2013/07/23/windows-azure-july-updates-sql-database-traffic-manager-autoscale-virtual-machines.aspx
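    As a rough sketch of the copy step described above, issued from C# via System.Data.SqlClient (the server, database names and credentials are placeholders):

    using System.Data.SqlClient;

    class SqlDatabaseCopy
    {
        static void Main()
        {
            // Connect to the logical server's master database (placeholder values).
            const string connectionString =
                "Server=tcp:myserver.database.windows.net;Database=master;" +
                "User ID=admin@myserver;Password=<password>;Encrypt=True;";

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                // Starts a transactionally-consistent copy; the copy completes
                // asynchronously and can be monitored via sys.dm_database_copies.
                using (var command = new SqlCommand(
                    "CREATE DATABASE MyDb_Copy AS COPY OF MyDb", connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }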

    Read the article

  • The tale of how the PowerShell CmdLets got installed with Azure SDK 1.4

    - by Enrique Lima
    I installed the Azure SDK 1.4 while rebuilding my laptop and then ran the installation for the Windows Azure Service Management (WASM) PowerShell CmdLets. I kicked off the installation by locating the path to which the WASM PowerShell CmdLets were deployed and double-clicking the startHere command. This opens the WASM installation dialog. Click Next. Click Next. Notice the red X next to the Azure SDK 1.3 check; the problem is I have SDK 1.4. Here is the workaround: go back to the location of the deployed WASM sources. Go into the setup path, then scripts > dependencies > check. Now locate the CheckAzureSDK.ps1 file, right-click it, and select Edit. This ps1 file checks for a specific version of the Azure SDK; in this case, it is looking for version 1.3.11133.0038. We need it to check for version 1.4.20227.1419 instead. Now save your ps1 file, go back to the open WASM install dialog, and click Rescan. This time it should pass; click Next. A command prompt window will appear; press any key. This completes the installation; click Close.

    Read the article

  • Are web service handler chains possible under IIS / ASP.NET

    - by Mike
    I'm working with a client who wants me to implement a particular design in an IIS/ASP.NET environment. This design was already implemented in Java, but I am not sure it is possible using Microsoft technologies. In a Tomcat/Java environment one can create so-called handler chains. In essence, a handler runs on the server hosting the web service and intercepts the SOAP message coming to the web service. The handler can perform a number of tasks before passing control to the web service, some of which may relate to authentication and authorization. Moreover, one can chain handlers so that they run in a particular sequence before passing control to the web service. This is a very elegant solution, as certain aspects of authentication and authorization can be performed automatically, without the developers of the client application and of the web service having to invest anything in it. The code for the client application and the web service is not affected. You can find a number of articles on this subject by searching Google for "web service handler chain". I searched for web service handlers in IIS or ASP.NET. I get some hits, but apparently handlers in IIS have a different meaning from that described above. My question therefore is: can handler chains (as available in Java and Tomcat) be created in IIS? If so, how (any article, book, forum...)? Either a negative or a positive answer will be greatly appreciated. Mike
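    For comparison, the closest built-in analogue in IIS/ASP.NET is an HTTP module: modules registered in web.config run in registration order, forming a chain that intercepts every request before it reaches the service. A minimal sketch (the module and header names are illustrative, and this works at the HTTP rather than the SOAP level):

    using System;
    using System.Web;

    // Intercepts each request before it reaches the web service. Several
    // modules registered in web.config execute in registration order.
    public class AuthCheckModule : IHttpModule
    {
        public void Init(HttpApplication context)
        {
            context.BeginRequest += (sender, e) =>
            {
                var app = (HttpApplication)sender;
                // Illustrative check: reject requests without a custom header.
                if (app.Request.Headers["X-Auth-Token"] == null)
                {
                    app.Response.StatusCode = 401;
                    app.CompleteRequest(); // short-circuit the rest of the pipeline
                }
            };
        }

        public void Dispose() { }
    }

    For SOAP-level interception closer to the Java handler model, ASMX services offer SoapExtension and WCF offers message inspectors, though neither is shown here.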

    Read the article

  • Send Email from worker role (Azure) with attachment in c#

    - by simplyvaibh
    I am trying to send an email (in C#) from a worker role (Azure) with an attachment (from blob storage). I am able to send the email, but the attachment (a Word document) is blank. The following function is called from the worker role.

    public void sendMail(string blobName)
    {
        InitStorage(); // Initialize the storage
        var storageAccount = CloudStorageAccount.FromConfigurationSetting("DataConnectionString");
        container = blobStorage.GetContainerReference("Container Name");
        CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

        if (File.Exists("demo.doc"))
            File.Delete("demo.doc");

        FileStream fs = new FileStream("demo.doc", FileMode.OpenOrCreate);
        blob.DownloadToStream(fs);

        Attachment attach = new Attachment(fs, "Report.doc");
        System.Net.Mail.MailMessage Email = new System.Net.Mail.MailMessage("[email protected]", "[email protected]");
        Email.Subject = "Text fax send via email";
        Email.Subject = "Subject Of email";
        Email.Attachments.Add(attach);
        Email.Body = "Body of email";

        System.Net.Mail.SmtpClient client = new SmtpClient("smtp.live.com", 25);
        client.DeliveryMethod = SmtpDeliveryMethod.Network;
        client.EnableSsl = true;
        client.Credentials = new NetworkCredential("[email protected]", Password);
        client.Send(Email);

        fs.Flush();
        fs.Close();
        Email.Dispose();
    }

    Please tell me where I am going wrong.
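    One common cause of this exact symptom is that DownloadToStream leaves the FileStream positioned at its end, so the Attachment reads zero bytes. A sketch of the usual fix, reusing the names from the code above:

    blob.DownloadToStream(fs);
    // Rewind the stream before attaching it; otherwise the attachment
    // is read from the current (end) position and comes out empty.
    fs.Seek(0, System.IO.SeekOrigin.Begin);
    Attachment attach = new Attachment(fs, "Report.doc");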

    Read the article

  • Can someone please debug this Windows Azure Application?

    - by Vimvq1987
    Here's the myTODO project from CodePlex: myTODO project. I added all the necessary libraries, added storage, and changed obsolete types/methods, but everything went wrong when I debugged it. An exception was thrown here (in TableStorage.cs):

    public IEnumerable<TElement> ExecuteWithRetries(RetryPolicy retry)
    {
        IEnumerable<TElement> ret = null;
        if (retry == null)
        {
            throw new ArgumentNullException("retry");
        }
        retry(() =>
        {
            try
            {
                ret = _query.Execute();
            }
            catch (InvalidOperationException e)
            {
                if (TableStorageHelpers.CanBeRetried(e))
                {
                    throw new TableRetryWrapperException(e);
                }
                throw;
            }
        });
        return ret;
    }

    I'm using Visual Studio 2008, SQL Server 2008, and Windows Azure SDK v1.1. Can anyone please debug this project for me, or suggest some way to get it working? This request is urgent. Any help is much appreciated. PS: If you can't download these files, please let me know and I'll upload them to another host.

    Read the article

  • Create a real-time web application using the .NET Framework and Azure - very confused

    - by test
    Let us say I would like a simple (yet complex) web application where there is continuous READING AND WRITING to the SQL Azure database. Let us say I am tracking a location, and I would like it to be updated very frequently (let's take the worst case: every second). From the little knowledge I have, I think this involves continuously writing the location to the database, and continuously reading from the database to update another person through a website. Do you have any suggestions as to which technologies I can use? Is there a simple way? I have heard about Node.js and SignalR, but I have no idea how to use them or whether they are really what I need. The last tutorial I checked out simply uses a while(true) loop, but I don't think that's a good way to keep a thread continuously busy. Do I have to create some background task? Do I have to create some web service? This is a school project and I would prefer not to go for the most difficult option, but if there is some sort of solution, challenge accepted :) Can you please help me? I have asked many questions here and yet I have no solution in mind.
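    A push model avoids the while(true) loop entirely: the server broadcasts each new location to connected browsers as it arrives. A minimal sketch of what a SignalR hub for this could look like (assuming classic ASP.NET SignalR; the hub and callback names are illustrative):

    using Microsoft.AspNet.SignalR;

    // The tracked device (or another client) calls UpdateLocation;
    // the hub then pushes the new coordinates to every connected client.
    public class LocationHub : Hub
    {
        public void UpdateLocation(string trackedId, double lat, double lng)
        {
            // 'locationUpdated' is the client-side callback name (illustrative).
            Clients.All.locationUpdated(trackedId, lat, lng);
        }
    }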

    Read the article

  • How to get full query string parameters not UrlDecoded

    - by developerit
    Introduction

    While developing Developer IT's website, we came across a problem when users search for keywords containing special characters, like the plus '+' character. We found it while looking for C++ in our search engine. The request parameter output in ASP.NET was "c ". I found it strange that it removed the '++' and replaced it with a space…

    Analysis

    After a bit of Googling and Reflection, it turns out that ASP.NET calls UrlDecode on each parameter retrieved by the Request("item") method. The Request.Params property is affected by this too, since it mashes all QueryString, Forms and other collections into a single one.

    Workaround

    Finally, I solved the puzzle using the Request.RawUrl property and parsing it with the same RegEx I use in my URL rewriter. The RawUrl is not affected by anything. As its name says, it's raw.

    Published on http://www.developerit.com/
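    A minimal sketch of this workaround (the helper and regex are illustrative, not the site's actual rewriter code):

    using System.Text.RegularExpressions;
    using System.Web;

    public static class RawQueryString
    {
        // Reads a query-string value straight from Request.RawUrl, which is
        // not UrlDecoded, so "?q=c++" still yields "c++".
        public static string Get(HttpRequest request, string name)
        {
            var match = Regex.Match(
                request.RawUrl, "[?&]" + Regex.Escape(name) + "=([^&]*)");
            return match.Success ? match.Groups[1].Value : null;
        }
    }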

    Read the article

  • Using a service registry that doesn’t suck Part III: Service testing is part of SOA governance

    - by gsusx
    This is the third post in this series intended to highlight some of the principles of a modern SOA governance solution. You can read the first two parts here: Using a service registry that doesn't suck part I: UDDI is dead; Using a service registry that doesn't suck part II: Dear registry, do you have to be a message broker? This time I've decided to focus on one of the aspects that drives me ABSOLUTELY INSANE about traditional SOA governance solutions: service testing, or should I say the lack of...(read more)

    Read the article

  • Agile SOA Governance: SO-Aware and Visual Studio Integration

    - by gsusx
    One of the major limitations of traditional SOA governance platforms is the lack of integration with the development process. Tools like HP Systinet or SOA Software are designed to operate on models in which the architects dictate the governance procedures and policies and the rest of the team members follow along. Consequently, those procedures are frequently rejected by developers and testers, given that they can't incorporate them into their daily activities. Having SOA governance products...(read more)

    Read the article

  • Back from TechEd US

    - by gsusx
    It's been a few weeks since I last blogged and, trust me, I am not happy about it :( I have been crazily busy with some of our projects at Tellago, which you are going to hear more about in the upcoming weeks :) I was so busy that I didn't even have time to blog about my sessions at TechEd US last week. This year I ended up presenting three sessions on three different tracks: BIE403 | Real-Time Business Intelligence with Microsoft SQL Server 2008 R2; Session Type: Breakout Session; Real-time business...(read more)

    Read the article

  • Tellago & Tellago Studios at Microsoft TechReady

    - by gsusx
    This week Microsoft is hosting the first edition of its annual TechReady conference. Even though TechReady is an internal conference, Microsoft invited us to present not one but two sessions about some of our recent work. We are particularly proud of the fact that one of those sessions is about our SO-Aware service registry. We see this as recognition of the growing popularity of SO-Aware as the best Agile SOA governance solution on the Microsoft platform. Well, on Tuesday I had the opportunity...(read more)

    Read the article

  • WebCenter Spaces 11g - UI Customization

    - by john.brunswick
    When developing on top of a portal platform to support an intranet or extranet, a portion of the development time is spent adjusting the out-of-box user templates to adapt the look and feel of the platform for your organization. Generally your deployment will not need to look anything like the sites posted on http://cssremix.com/ or http://www.webcreme.com/, but it will meet business needs by adjusting basic elements like navigation, color palette and logo placement. After spending some time doing custom UI development with WebCenter Spaces 11g, I have gathered a few tips that I hope can help speed anyone's efforts to quickly "skin" a WebCenter Spaces deployment. A detailed white paper was released that outlines a technique to quickly update the UI at runtime: http://www.oracle.com/technology/products/webcenter/pdf/owcs_r11120_cust_skins_runtime_wp.pdf. Customizing at "runtime" means using CSS and images to adjust the page layout and feel, which when done creatively can change the pages drastically. WebCenter also provides detailed templates to manage the placement of major page elements like menus, sidebars, etc., but by adjusting only images and CSS we can end up with a fully custom solution. Let's dive right in and take a look at some tools to make our efforts more efficient.

    Read the article

  • Speaking at Microsoft's Dutch DevDays

    - by gsusx
    Last week I had the pleasure of presenting two sessions at Microsoft's Dutch DevDays in The Hague. On Tuesday I presented a session about how to implement real-world RESTful service patterns using WCF, WCF Data Services and ASP.NET MVC 2. During that session I showed a total of 15 small demos that highlighted how to implement key aspects of RESTful solutions such as security, low-REST clients, URI modeling, validation, error handling, etc. As part of those demos I used the OAuth implementation created...(read more)

    Read the article

  • DonXml does WCF in NYC

    - by gsusx
    Tomorrow is WCF day in New York City!!!!! My good friend and Tellago's CTO Don Demsak will be doing a session on WCF Data and RIA Services at the WCF firestarter event hosted at the Microsoft offices in New York City. Don has an encyclopedic knowledge of both technologies and will be sharing lots of best practices learned from applying them in large service-oriented environments. In addition to Don, my crazy Cuban friend Miguel Castro will also be presenting three sessions at the...(read more)

    Read the article

  • The Complementary Roles of PLM and PIM

    - by Ulf Köster
    Oracle Product Value Chain Solutions (aka Enterprise PLM Solutions) are a comprehensive set of product management solutions that work together to provide Oracle customers with a broad array of capabilities to manage all aspects of product life: innovation, design, launch, and supply chain / commercialization processes beyond the capabilities and boundaries of traditional engineering-focused Product Lifecycle Management applications. They support companies with an integrated managed view across the product value chain: From Lab to Launch, From Farm to Fork, From Concept to Product to Customer, From Product Innovation to Product Design and Product Commercialization.

    Product Lifecycle Management (PLM) represents a broad suite of software solutions to improve product-oriented business processes and data. PLM success stories prove that PLM helps companies improve time to market, increase product-related revenue, reduce product costs, reduce internal costs and improve product quality. As a maturing suite of enterprise solutions, PLM is still evolving to realize the promise it can provide across all facets of a business and all phases of the product lifecycle. The vision for PLM includes everything from gathering early requirements for a product through multiple stages of the product lifecycle, from product design through commercialization and eventual product retirement or replacement. In discrete or process industries, PLM is typically more focused on Product Definition as items with respect to the technical view of a material or part, including specifications, bills of material and manufacturing data. With Agile PLM, this is specifically related to capabilities addressing Product Collaboration, Governance and Compliance, Product Quality Management, Product Cost Management and Engineering Collaboration. PLM today is mainly addressing key requirements in the early product lifecycle, in engineering changes or in the “innovation cycle”, and primarily adds value related to the product design, development, launch and engineering change process. In short, PLM is the master for Product Definition, wherever manufacturing takes place.

    Product Information Management (PIM) is a product suite that has evolved in parallel to PLM. PIM can extend the value of PLM implementations by providing complementary tools and capabilities. More relevant in the area of Product Commercialization, the vision for PIM is to manage product information throughout an enterprise and supply chain to improve product-related knowledge management, information sharing and synchronization from multiple data sources. PIM success stories have shown the ability to provide multiple benefits, with particular emphasis on reducing information complexity and information management costs. Product Information in PIM is typically treated as the commercial view of a material or part, including sales and marketing information and categorization. PIM collects information from multiple manufacturing sites and multiple suppliers into its repository, but also provides integration tools to push the information back out to the other systems, serving as an active central repository with the aim of providing a holistic view of any product sold by a company (hence the name “Product Hub”). In short, PIM is the master of commercial Product Information.
    So PIM is quickly becoming mandatory because of its value in optimizing multichannel selling processes and relationships with customers, as you can see from the following table:

    Viewpoint: Product Lifecycle
      PLM current state: Primarily R&D; front-end innovation cycle; change process
      PIM: Primarily the commercial / transactional state of the lifecycle
      Key benefit PIM adds to PLM: Provides a seamless information flow from design and manufacturing through the ultimate selling and servicing of products

    Viewpoint: Data
      PLM current state: Primarily focused on “item” vs. “product” data; product structures; specifications; technical information
      PIM: Repository for all product information; reaches out to the entire enterprise and its various silos of product information and descriptions
      Key benefit PIM adds to PLM: Provides a “trusted source” of accurate product information to the internal organization and trading partners

    Viewpoint: Data Lifecycle
      PLM current state: Repository for all design iterations; historical information
      PIM: Released, current information, with version management and time stamping
      Key benefit PIM adds to PLM: Provides a single location to track and audit historical product information

    Viewpoint: Communication
      PLM current state: PLM releases the finished product to ERP; PLM is the master for Product Definition
      PIM: Captures information from disparate sources, including in-house data stores; recognizes the reality of today’s data “mess” across information silos
      Key benefit PIM adds to PLM: Provides the ability to package product information for its audience in the desired, relevant format to meet their exacting business requirements

    Viewpoint: Departmental
      PLM current state: R&D; manufacturing; quality; compliance; procurement; strategic marketing
      PIM: Focus on marketing and sales; gathering information from other departments, multiple sites, multiple suppliers
      Key benefit PIM adds to PLM: A singular enterprise solution that leverages existing information silos and data stores

    Viewpoint: Supply Chain
      PLM current state: Multi-site internal collaboration; supplier collaboration; customer collaboration
      PIM: Works with customers, exchanges / data pools, and trading partners to provide relevant product information packaged the way the customer desires
      Key benefit PIM adds to PLM: Provides the ability to supply trading partners and internal customers with information in the manner they desire, continuously

    Viewpoint: Tools
      PLM current state: Data management; collaboration; innovation management
      PIM: Cleansing; synchronization; hub functions
      Key benefit PIM adds to PLM: Consistent, clean and complete commercial product information

    The goals of both PLM and PIM, put simply, are to help companies make more profit from their products. PLM and PIM solutions can easily be added together as they share some of the same goals, while coming from two different perspectives: the definition of the product and the commercialization of the product. Both can serve as a form of product “system of record”, but take different approaches to delivering value. Oracle Product Value Chain solutions offer rich new strategies for executives to collectively leverage Agile PLM, Product Data Hub, together with Enterprise Data Quality for Products, and other industry-leading Oracle applications to achieve further incremental value, like Oracle Innovation Management. This is unique on the market today.

    Read the article

  • Microsoft Townhall, An Example for Azure and MVC

    - by Shaun
    Microsoft just released an example named Microsoft Townhall, which was built and deployed on Azure. It uses ASP.NET MVC as its website framework, and SQL Azure plus LINQ to SQL as its database and ORM framework. You can download the source code at the MSDN Code Gallery. Besides the Azure parts, it might be even more useful to learn how they utilized ASP.NET MVC. From just a very quick review, I found it utilizes the Enterprise Library Unity as the main IoC container for controllers, services and repositories, and customizes a lot of ModelBinders, Filters, etc.   Hope this helps, Shaun   All documents and related graphics, codes are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.
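    As an illustration of the IoC wiring mentioned above, here is a minimal Unity-backed controller factory of the kind used with ASP.NET MVC of that era (a sketch under those assumptions, not Townhall's actual code):

    using System;
    using System.Web.Mvc;
    using System.Web.Routing;
    using Microsoft.Practices.Unity;

    // Resolves controllers (and their service/repository dependencies)
    // from the Unity container instead of Activator.CreateInstance.
    public class UnityControllerFactory : DefaultControllerFactory
    {
        private readonly IUnityContainer _container;

        public UnityControllerFactory(IUnityContainer container)
        {
            _container = container;
        }

        protected override IController GetControllerInstance(
            RequestContext requestContext, Type controllerType)
        {
            if (controllerType == null)
                return base.GetControllerInstance(requestContext, null);
            return (IController)_container.Resolve(controllerType);
        }
    }

    // Registered once at startup, e.g. in Global.asax:
    // ControllerBuilder.Current.SetControllerFactory(new UnityControllerFactory(container));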

    Read the article

  • Tellago announces SQL Server 2008 R2 BI quick adoption programs

    - by gsusx
    During the last year, we (Tellago) have been involved in various business intelligence initiatives that leverage some emerging BI techniques such as self-service BI or complex event processing (CEP). Specifically, in the last few months, we have partnered with Microsoft to deliver a series of events across the country where we present the different technologies of the SQL Server 2008 R2 BI stack such as PowerPivot, StreamInsight, Ad-Hoc Reporting and Master Data Services. As part of those events...(read more)

    Read the article

  • SO-Aware sessions in Dallas and Houston

    - by gsusx
    Our WCF registry, SO-Aware, keeps being evangelized throughout the world. This week Tellago Studios' Dwight Goins will be speaking at Microsoft events in Dallas and Houston (https://msevents.microsoft.com/cui/EventDetail.aspx?culture=en-US&EventID=1032469800&IO=ycqB%2bGJQr78fJBMJTye1oA%3d%3d) about WCF management best practices using SO-Aware. If you are in the area and passionate about WCF you should definitely swing by and give Dwight a hard time ;)...(read more)

    Read the article
