Search Results

Search found 2619 results on 105 pages for 'azure diagnostics'.

Page 37/105 | < Previous Page | 33 34 35 36 37 38 39 40 41 42 43 44  | Next Page >

  • How to leverage the internal HTTP endpoint available on Azure web roles?

    - by adelsors
    Imagine you have a Web application using an in-memory collection that changes occasionally, loading it from storage in the Application_Start global.asax event and updating it whenever it changes. If you want to deploy this application on Azure, you need to keep in mind that more than one instance of the application can be running at any time, so you need some mechanism to keep all instances informed of the latest changes. Because communication through internal endpoints between Azure role instances is at no cost, a good solution is to maintain the information in Azure Storage Tables, read its contents in the Application_Start event and propagate changes to all instances using the internal HTTP port available on Azure Web Roles. You need to follow these steps to leverage the internal HTTP endpoint available on Azure web roles:
    1. Define an internal HTTP endpoint in the Web Role properties, for example InternalHttpEndpoint.
    2. Add a new WCF service to the Web Role, for example NotificationServices.svc.
    3. Add a method on the new service to receive notifications from other role instances.
    4. Declare a class that inherits from System.ServiceModel.Activation.ServiceHostFactory and override the method CreateServiceHost to host the internal endpoint. Note that you can use SecurityMode.None because the internal endpoint is private to the instances of the service; this is provided by the platform.
    5. Edit the markup of the service (right-click the svc file and select "View markup") to add the new factory as the factory to be used to create the service.
    6. Now you can notify changes to other instances using code like the sketch below.
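
    A sketch of that final step (assuming the InternalHttpEndpoint endpoint and a one-way INotificationService.Notify contract from the steps above; changedInfo stands for the updated data to broadcast):

        var current = RoleEnvironment.CurrentRoleInstance;
        var endPoints = current.Role.Instances
            .Where(instance => instance != current)
            .Select(instance => instance.InstanceEndpoints["InternalHttpEndpoint"]);
        foreach (var ep in endPoints)
        {
            // Address each sibling instance's private endpoint directly.
            var address = new EndpointAddress(
                String.Format("http://{0}/NotificationServices.svc", ep.IPEndpoint));
            var binding = new BasicHttpBinding(SecurityMode.None);
            var factory = new ChannelFactory<INotificationService>(binding);
            INotificationService instance = factory.CreateChannel(address);
            instance.Notify(changedInfo);
        }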

    Read the article

  • Windows Azure: the 100% online DevCamp today from 3 PM to 7 PM, with goodies to win and a 30-day free trial of the platform

    Windows Azure: a 100% online DevCamp on July 1st, from 3 PM to 7 PM, with goodies to win, a 30-day free trial of Azure and a €150 discount during the trial. Microsoft is organizing a 100% online event around Windows Azure on July 1st. This "DevCamp" on the Cloud platform for developers will run from 3 PM to 7 PM (though it will of course be possible to join at any time). In total, eight 30-minute sessions will cover topics as varied as BI in the Cloud (with SharePoint), identity management in Azure and Office 365, administering a hybrid Cloud infrastructure, Big Data, and backends for multi-device apps.

    Read the article

  • Azure DevCamp: Microsoft's event available as a replay for those who missed the July 1st live stream

    Azure DevCamp: Microsoft's event available as a replay, for those who missed the July 1st live stream. Did you miss the 100% online Windows Azure event Microsoft held this Monday? Good news: a replay of this "DevCamp" on the Cloud platform for developers has been available since today. You will find the eight 30-minute sessions on topics as varied as BI in the Cloud (with SharePoint), identity management in Azure and Office 365, administering a hybrid Cloud infrastructure, Big Data, and backends for multi-device apps...

    Read the article

  • Windows Azure: a 100% online DevCamp on July 1st from 3 PM to 7 PM, with goodies to win and still a 30-day free trial of the platform

    Windows Azure: a 100% online DevCamp on July 1st, from 3 PM to 7 PM, with goodies to win, a 30-day free trial of Azure and a €150 discount during the trial. Microsoft is organizing a 100% online event around Windows Azure on July 1st. This "DevCamp" on the Cloud platform for developers will run from 3 PM to 7 PM (though it will of course be possible to join at any time). In total, eight 30-minute sessions will cover topics as varied as BI in the Cloud (with SharePoint), identity management in Azure and Office 365, administering a hybrid Cloud infrastructure, Big Data, and backends for multi-device apps.

    Read the article

  • How to get the MSOnlineBackup Cmdlets?

    - by gregpakes
    I am trying to manage Azure Online Backup from PowerShell. There is a set of cmdlets called MSOnlineBackup; see technet.microsoft.com/en-us/library/dn249523.aspx. When I try to run: Import-Module MSOnlineBackup I get: "The specified module 'MSOnlineBackup' was not loaded because no valid module file was found in any module directory." The TechNet page states that it is included in 4.0.4.0. If I run: $PSVersionTable.PSVersion it returns: Major: 4, Minor: 0, Build: -1, Revision: -1. I am running Windows 8.1. As you can probably tell, I am no PowerShell expert. I have also tried installing the Azure Backup Agent, but it says it needs a server operating system. Can anyone tell me how I can get the MSOnlineBackup module on my machine so I can start automating Windows Azure Backup?

    Read the article

  • Windows Azure: Server Error, 404 - File or directory not found.

    - by veda
    I want to upload files of around 35 MB to a blob container. I have written code to split the data into blocks, upload them to the blob container and form a blob using PUT. I tested the code with files of around 2 MB and it worked well. But when I tried it with a larger file, it gives me this error:

    Server Error. 404 - File or directory not found. The resource you are looking for might have been removed, had its name changed, or is temporarily unavailable.

    When I try files of around 6 MB, it gives me this error instead:

    Server Error in '/' Application. Runtime Error. Description: An application error occurred on the server. The current custom error settings for this application prevent the details of the application error from being viewed remotely (for security reasons). It could, however, be viewed by browsers running on the local server machine. Details: To enable the details of this specific error message to be viewable on remote machines, please create a <customErrors> tag within a "web.config" configuration file located in the root directory of the current web application. This <customErrors> tag should then have its "mode" attribute set to "Off".

        <!-- Web.Config Configuration File -->
        <configuration>
          <system.web>
            <customErrors mode="Off"/>
          </system.web>
        </configuration>

    Notes: The current error page you are seeing can be replaced by a custom error page by modifying the "defaultRedirect" attribute of the application's <customErrors> configuration tag to point to a custom error page URL.

        <!-- Web.Config Configuration File -->
        <configuration>
          <system.web>
            <customErrors mode="RemoteOnly" defaultRedirect="mycustompage.htm"/>
          </system.web>
        </configuration>

    Can anyone tell me how to solve this?
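
    For reference, a sketch of the block-split-and-commit pattern the question describes, written against the Azure storage client library; the connection string, container and file names are placeholders:

        // Sketch only: upload a large file as 2 MB blocks, then commit the block list.
        using System;
        using System.Collections.Generic;
        using System.IO;
        using Microsoft.WindowsAzure.Storage;
        using Microsoft.WindowsAzure.Storage.Blob;

        class BlockUpload
        {
            static void Main()
            {
                var account = CloudStorageAccount.Parse("<storage connection string>");
                var blob = account.CreateCloudBlobClient()
                                  .GetContainerReference("uploads")
                                  .GetBlockBlobReference("bigfile.dat");

                const int blockSize = 2 * 1024 * 1024;   // 2 MB per block
                var blockIds = new List<string>();

                using (var file = File.OpenRead(@"C:\data\bigfile.dat"))
                {
                    var buffer = new byte[blockSize];
                    int read, index = 0;
                    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        // Block IDs must be Base64 strings of identical length.
                        var id = Convert.ToBase64String(BitConverter.GetBytes(index++));
                        blockIds.Add(id);
                        blob.PutBlock(id, new MemoryStream(buffer, 0, read), null);
                    }
                }

                blob.PutBlockList(blockIds);             // commit the blocks as one blob
            }
        }

    Incidentally, the two thresholds reported above line up with ASP.NET's default 4 MB maxRequestLength (the masked runtime error near 6 MB) and IIS 7's roughly 30 MB default request size limit (the 404 at 35 MB), so raising both limits in web.config is worth checking if the upload is posted through the web application itself.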

    Read the article

  • Detecting a message box launched in another application

    - by richard-assaf
    I am developing a Windows service, in VB.NET, that launches a legacy application to perform some work. The service acts as a wrapper around the legacy app, allowing users to automate an otherwise manual operation. Everything is working great, except that occasionally the legacy app displays a message box. When it does, the process halts until the message box is closed. As the service will be running on a server, there will be no user to close the message box. The service launches the legacy application in a System.Diagnostics.Process. My question is: is there a way to detect that a message box has been displayed by a process that I have started using System.Diagnostics.Process, and is there a way, through code, to close the message box? I've tried to be brief, so if you need more information please let me know. Thanks in advance. Richie
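
    One possible approach, sketched in C# (the same Win32 calls work from VB.NET via P/Invoke): periodically scan the top-level windows of the launched process for the standard dialog class "#32770" and post WM_CLOSE. That the legacy app's message box is a standard Windows dialog is an assumption; adjust the class/title check as needed.

        using System;
        using System.Diagnostics;
        using System.Runtime.InteropServices;
        using System.Text;

        static class DialogWatcher
        {
            [DllImport("user32.dll")]
            static extern bool EnumWindows(EnumWindowsProc callback, IntPtr lParam);
            [DllImport("user32.dll")]
            static extern uint GetWindowThreadProcessId(IntPtr hWnd, out uint pid);
            [DllImport("user32.dll", CharSet = CharSet.Auto)]
            static extern int GetClassName(IntPtr hWnd, StringBuilder name, int max);
            [DllImport("user32.dll")]
            static extern bool PostMessage(IntPtr hWnd, uint msg, IntPtr w, IntPtr l);

            delegate bool EnumWindowsProc(IntPtr hWnd, IntPtr lParam);
            const uint WM_CLOSE = 0x0010;

            // Call this on a timer while the legacy process is running.
            public static void CloseMessageBoxes(Process target)
            {
                EnumWindows((hWnd, _) =>
                {
                    GetWindowThreadProcessId(hWnd, out uint pid);
                    if (pid == (uint)target.Id)
                    {
                        var cls = new StringBuilder(256);
                        GetClassName(hWnd, cls, cls.Capacity);
                        if (cls.ToString() == "#32770")          // standard dialog class
                            PostMessage(hWnd, WM_CLOSE, IntPtr.Zero, IntPtr.Zero);
                    }
                    return true;                                  // keep enumerating
                }, IntPtr.Zero);
            }
        }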

    Read the article

  • Detecting a message box opened in another application

    - by richie
    I am developing a Windows service, in VB.NET, that launches a legacy application to perform some work. The service acts as a wrapper around the legacy app, allowing users to automate an otherwise manual operation. Everything is working great, except that occasionally the legacy app displays a message box. When it does, the process halts until the message box is closed. As the service will be running on a server, there will be no user to close the message box. The service launches the legacy application in a System.Diagnostics.Process. My question is: is there a way to detect that a message box has been displayed by a process that I have started using System.Diagnostics.Process, and is there a way, through code, to close the message box? I've tried to be brief, so if you need more information please let me know. Thanks in advance. Richie

    Read the article

  • Tulsa - Launch 2010 Highlight Events

    - by dmccollough
    Tuesday, May 04, 2010 - Renaissance Tulsa Hotel and Convention Center, Seville II and III, 6808 S. 107th East Avenue, Tulsa, Oklahoma 74133

    For the Developer (1:00 PM – 5:00 PM) - MSDN Events Present: Launch 2010 Highlights. Join your local Microsoft Developer Evangelism team to find out first-hand how the latest features in Microsoft® Visual Studio® 2010 can help boost your development creativity and performance. Learn how to improve the process of refactoring your existing code base and drive tighter collaboration with testers. Explore innovative web technologies and frameworks that can help you build dynamic web applications and scale them to the cloud. And learn about the wide variety of rich application platforms that Visual Studio 2010 supports, including Windows 7, the Web, Windows Azure, SQL Server, and Windows Phone 7 Series. Click here to register.

    For the IT Professional (8:00 AM – 12:00 PM) - TechNet Events Present: Launch 2010 Highlights. Join your local Microsoft IT Pro Evangelism team to find out first-hand what Microsoft® Office® 2010 and SharePoint® 2010 mean for the productivity of you and your people across PC, phone, and browser. Learn how this latest wave of technologies provides a revolutionary user experience and how it takes us into a future of greater productivity. Come and explore the tools that will help you optimize desktop deployment. Click here to register.

    Read the article

  • Clarity of the cloud with Microsoft Learning Experience.

    - by Testas
    While waiting for the Super Bowl, I thought I would write this... 2014 will not only see the release of a new version of SQL Server; accompanying it is the release of courses and certification tracks from Microsoft Learning Experience - formerly Microsoft Learning - that will support education on SQL Server and related technologies. The notable addition in the curriculum is substantial material on the cloud and big data features that pertain to data and business intelligence. There are entire modules/chapters dedicated to Power BI, SQL Azure and HDInsight. Certifications and courses from Microsoft can get stick - sometimes fairly and sometimes unfairly - and whilst I am a massive advocate of community as a way to get information and education, Microsoft's new courses will bring clarity to the burning topics of the moment and help you to understand the capabilities of Power BI and HDInsight. From a business intelligence perspective there will be three courses: 20463C: Data Warehousing in SQL Server 2014; 20466C: Data Models and Reports in SQL Server 2014; 20467A: Designing Self-Service Business Intelligence and Big Data Solutions. These are not the exact titles of the courses, which will be confirmed prior to release. And if you have already completed the SQL Server 2012 or 2008 curriculum, there is an upgrade course, 10977A: Upgrading Business Intelligence Skills from 2008 to 2014. Again, this is not the exact title, but these should give you an idea. Look out for announcements from Microsoft Learning Experience... CHRIS

    Read the article

  • Performance-Driven Development

    - by BuckWoody
    I was reading a blog yesterday about the evils of SELECT *. The author pointed out that it's almost always a bad idea to use SELECT * for a query, but in the case of SQL Azure (or any cloud database, for that matter) it's especially bad, since you're paying for each transmission that comes down the line. A very good point indeed. This got me thinking: shouldn't we treat ALL programming that way? In other words, wouldn't it make sense to pretend that we are paying for every chunk of data - a little less for a bit, a lot more for a BLOB or VARCHAR(MAX), that sort of thing? In effect, we really are paying for that. Which led me to the thought of Performance-Driven Development, or the act of programming with the goal of having the fastest code from the very outset. This isn't an original title, since a quick Bing search shows me a couple of offerings from Forrester and a professional in Israel who have already used it, but the general idea I'm thinking of is assigning a "cost" to each code round-trip, be it network, storage, trip time or other variables, and then rewarding the developers who come up with the fastest code. I wonder what kind of throughput and round-trip times you could get if your developers were paid on a scale of how fast the application performed...

    Read the article

  • Large invoice database structure and rendering

    - by user132624
    Our client has an MS SQL database with 1 million customer invoice records in it. Using the database, our client wants its customers to be able to log into a front-end web site and then view, modify and download their company's invoices. Given the size of the database and the large number of customers who may log into the web site at any time, we are concerned about database engine performance and web page invoice rendering performance. The 1 million invoice database covers just 90 days of sales, so we will remove invoices over 90 days old from the database. Most of the invoices have multiple line items. We can easily convert our invoices into various data formats, so for example it is easy for us to convert to and from SQL to XML with a related schema and XSLT. Any data conversion would be done on another server so as not to burden the web interface server. We have tentatively decided to run the web site on a .NET Framework IIS web server using MS SQL on MS Azure. How would you suggest we structure our database for best performance? For example, should we put all the invoices of all customers located within the same 5-digit or 6-digit zip codes into the same table? Or could we set up a separate home directory for each customer on IIS and place each customer's invoices in that customer's home directory in XML format? And secondly, what would you suggest as the best method to render customer invoices on a web page and allow customers to modify them, for best performance? The ADO.NET XML DataSet looks intriguing to us as a method, but we have never used it.

    Read the article

  • Best practices for caching search queries

    - by David Esteves
    I am trying to improve the performance of my ASP.NET Web API by adding a data cache, but I am not sure exactly how to go about it, as it seems more complex than most caching scenarios. For example, I have a table of Locations and an API to retrieve locations via search, for an autocomplete: /api/location/Londo, and the query would be something like SELECT * FROM Locations WHERE Name LIKE 'Londo%'. These locations change very infrequently, so I would like to cache them to prevent trips to the database for no real reason and to improve the response time. Looking at caching options, I am using the Windows Azure AppFabric system; the problem is that it's just a key/value cache. Since I can only retrieve items based on keys, I couldn't actually use it for this scenario as far as I'm aware. Is what I am trying to do a bad use of a caching system? Should I look into a NoSQL DB which could possibly run as a cache for something like this to improve performance? Should I just cache the entire table/collection under a single key with a specific data structure that could assist with the searching, and then do the search upon retrieval of the data? (A sketch of that last option follows.)
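
    A minimal sketch of that last option, assuming the AppFabric DataCache API; the Location type, cache key and repository call are placeholders:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using Microsoft.ApplicationServer.Caching;

        [Serializable]                              // AppFabric stores serialized copies
        public class Location { public int Id; public string Name; }

        public class LocationSearch
        {
            private static readonly DataCache Cache =
                new DataCacheFactory().GetDefaultCache();   // configured in web.config

            public IEnumerable<Location> Search(string prefix)
            {
                var all = (List<Location>)Cache.Get("locations:all");
                if (all == null)
                {
                    all = LoadAllLocations();               // placeholder database call
                    // Locations change rarely, so a long timeout is acceptable.
                    Cache.Put("locations:all", all, TimeSpan.FromHours(12));
                }
                return all.Where(l =>
                    l.Name.StartsWith(prefix, StringComparison.OrdinalIgnoreCase));
            }

            private List<Location> LoadAllLocations()
            {
                return new List<Location>();                // SELECT * FROM Locations
            }
        }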

    Read the article

  • Presenting the Cloud in a Different Way

    - by BuckWoody
    I had the honor of presenting to the developers at the Portland PASS chapter, and decided to go a different way than just using PowerPoint slides. The point is that when you need to get a point across, it's OK to change tactics to make sure the information sticks. In this case, I decided to make the audience the PowerPoint. I used a few props to show the various paradigms we use to describe what the industry means by the word "cloud". First, we talked about Infrastructure as a Service. I picked a gentleman who didn't quite fit the hard hat and safety vest I picked out for him. Notice our "user" as she accesses our "server" (complete with tray and glass) which has been virtualized. Software as a Service comes next. In this case, the user and potentially even customers just access software (represented here as a Windows ME box...) remotely - everything is virtualized. Finally, Platform as a Service - yup, platform shoes as a necklace, and a tie-dye shirt to represent the 70's, a decade when mainframes used stateless programming as well. Notice also the components of Windows Azure: Compute (keyboard), Application Fabric (toy bus) and Storage (bucket). And at the end of the day, it's all about serving those customers...

    Read the article

  • Asynchronously returning hierarchical data using the .NET TPL... what should my return object "look" like?

    - by makerofthings7
    I want to use the .NET TPL to asynchronously do a DIR /S, searching each subdirectory on a hard drive for a word in each file... what should my API look like? In this scenario I know that each subdirectory will have 0..10000 files or 0..10000 directories. I know the tree is unbalanced and want to return data (in relation to its position in the hierarchy) as soon as it's available. I am interested in getting data as quickly as possible, but also want to update that result if "better" data is found ("better" meaning closer to the root of C:). I may also be interested in finding all matches in relation to their position in the hierarchy (akin to a report). Question: how should I return data to my caller? My first guess is that I need a shared object that will maintain the current "status" of the traversal (started | not started | complete), and I might base it on System.Collections.Concurrent. Another idea I'm considering is the consumer/producer pattern (which the concurrent collections can handle), however I'm not sure what the objects "look" like; see the sketch below. Optional logical constraint: the API doesn't have to address this, but in my "real world" design, if a directory has files, then only one file will ever contain the word I'm looking for. If someone were to literally do a DIR /S as described above, they would need to account for more than one matching file per subdirectory. More information: I'm using Azure Tables to store a hierarchy of data using these TPL extension methods. A "node" is a table. Not only does each node in the hierarchy have a relation to any number of nodes, but it's possible for each node to have a reciprocal link back to any other node. This may have issues with recursion, but I'm addressing that with a shared object in my recursion loop. Note that each "node" also has the ability to store local data unique to that node. It is this information that I'm searching for. In other words, I'm searching for a specific fixed RowKey in a hierarchy of nodes. When I search for the fixed RowKey in the hierarchy I'm interested in getting the results FAST (first node found) but prefer data that is "closer" to the starting point of the hierarchy. Since many nodes may have the particular RowKey I'm interested in, sometimes I may want a report of ALL the nodes that contain this RowKey.
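
    A minimal sketch of one producer/consumer shape this could take (the node lookups are placeholders for the Azure Table calls): matches stream through a BlockingCollection with their depth, so the caller sees the first hit immediately, can keep a "best so far" by smallest depth, or drain everything for the report case. CompleteAdding doubles as the "traversal complete" status, and a shared visited set guards against the reciprocal links.

        using System;
        using System.Collections.Concurrent;
        using System.Collections.Generic;
        using System.Threading.Tasks;

        public class SearchHit
        {
            public string Node;   // where the RowKey was found
            public int Depth;     // distance from the root; lower is "better"
        }

        public static class HierarchySearch
        {
            public static IEnumerable<SearchHit> Find(string root, string rowKey)
            {
                var hits = new BlockingCollection<SearchHit>();
                var visited = new ConcurrentDictionary<string, bool>();

                Task.Run(() =>
                {
                    try { Walk(root, 0, rowKey, hits, visited); }
                    finally { hits.CompleteAdding(); }      // traversal complete
                });

                return hits.GetConsumingEnumerable();       // hits arrive as found
            }

            private static void Walk(string node, int depth, string rowKey,
                BlockingCollection<SearchHit> hits, ConcurrentDictionary<string, bool> visited)
            {
                if (!visited.TryAdd(node, true)) return;    // reciprocal link guard
                if (NodeContains(node, rowKey))             // placeholder table lookup
                    hits.Add(new SearchHit { Node = node, Depth = depth });
                Parallel.ForEach(Children(node),            // placeholder child query
                    child => Walk(child, depth + 1, rowKey, hits, visited));
            }

            private static bool NodeContains(string node, string rowKey) { return false; }
            private static IEnumerable<string> Children(string node) { return new string[0]; }
        }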

    Read the article

  • Node.js running under IIS Express Keeps Crashing

    - by PazoozaTest Pazman
    I recently reinstalled Windows 7 on my machine and went back to downloading and installing the tools to help me continue developing Node.js Windows Azure web applications. I followed the instructions given on the Node.js Azure site: http://www.windowsazure.com/en-us/develop/nodejs/ and, using Web Platform Installer 4.0, it says I have successfully installed these tools: Windows Azure PowerShell; Windows Azure SDK for Node.js - June 2012; Windows Azure SDK for .NET (VS 2012 RC) - June 2012; IIS Recommended Configuration. The problem I am experiencing is that when I run the site using PowerShell, e.g. start-azureemulator -launch, it goes ahead and runs IIS Express, and after several minutes IIS Express crashes with the following information:

    Problem Event Name: APPCRASH
    Application Name: iisexpress.exe
    Application Version: 8.0.8298.0
    Application Timestamp: 4f620349
    Fault Module Name: iiscore.dll
    Fault Module Version: 8.0.8298.0
    Fault Module Timestamp: 4f63b65c
    Exception Code: c0000005
    Exception Offset: 00021767
    OS Version: 6.1.7601.2.1.0.256.28
    Locale ID: 1033
    Additional Information 1: f66d
    Additional Information 2: f66d807b515d6b2dc6f28f66db769a01
    Additional Information 3: 7b2f
    Additional Information 4: 7b2f6797d07ebc2c23f2b227e779722e

    I am running 2 instances each time, and both of them crash one after the other. Is anyone experiencing something similar, and how did you fix it? Is there an upgrade I need to do? I've run Windows Update but it says I've got all the latest updates. Can I tell the PowerShell cmdlet to use IIS 7 instead of IIS Express? I'm guessing it's something to do with IIS Express on my machine. I did some hunting around and found this person who experienced a similar problem: https://github.com/tjanczuk/iisnode/issues/149. I've got a cron job running every 1 second to check if any website totals need to be updated. Could this be causing IIS Express to crash? Cheers

    Read the article

  • Oracle and Microsoft Expand Choice and Flexibility in Deploying Oracle Software in the Cloud

    - by Gene Eun
    Oracle and Microsoft have entered into a new partnership that will help customers embrace cloud computing by providing greater choice and flexibility in how to deploy Oracle software. Here are the key elements of the partnership:
    - Effective today, our customers can run supported Oracle software on Windows Server Hyper-V and in Windows Azure
    - Effective today, Oracle provides license mobility for customers who want to run Oracle software on Windows Azure
    - Microsoft will add Infrastructure Services instances with popular configurations of Oracle software, including Java, Oracle Database and Oracle WebLogic Server, to the Windows Azure image gallery
    - Microsoft will offer fully licensed and supported Java in Windows Azure
    - Oracle will offer Oracle Linux, with a variety of Oracle software, as preconfigured instances on Windows Azure
    Oracle's strategy and commitment is to support multiple platforms, and Microsoft Windows has long been an important supported platform. Oracle is now extending that support to Windows Server Hyper-V and Windows Azure by providing certification and support for Oracle applications, middleware, database, Java and Oracle Linux on Windows Server Hyper-V and Windows Azure. As of today, customers can deploy Oracle software on Microsoft private clouds and Windows Azure, as well as Oracle private and public clouds and other supported cloud environments. For information related to software licensing in Windows Azure, see Licensing Oracle Software in the Cloud Computing Environment. Also, Oracle Support policies as they apply to Oracle software running in Windows Azure or on Windows Server Hyper-V are covered in two My Oracle Support (MOS) notes:
    - MOS Note 1563794.1: Certified Software on Microsoft Windows Server 2012 Hyper-V (new)
    - MOS Note 417770.1: Oracle Linux Support Policies for Virtualization and Emulation (updated)

    Read the article

  • Part 1 - Load Testing In The Cloud

    - by Tarun Arora
    Azure is fascinating, but even more fascinating is the marriage of Azure and TFS!

    Introduction: Recently a client I worked for had 2 major business-critical applications being delivered, with very little time budgeted for performance testing. We immediately hit a bottleneck when the performance testing phase started: the in-house infrastructure team could not support the hardware requirements at short notice. It was suggested that the performance testing be performed on one of the QA environments, which was a fraction of the production environment. This didn't seem right, so the team decided to turn to the cloud. The team took advantage of the elasticity offered by Azure: starting with a single test agent, provisioned and ready for use within 30 minutes, the team scaled up to 17 test agents to perform a very comprehensive performance testing cycle. Issues were identified and resolved, but the highlight was that the cost of running the 'test rig' proved to be less than if it had been hosted on premise by the infrastructure team. Thank you for taking the time out to read this blog post; in this series of posts I'll try to cover, start to end, everything you need to know to use Azure to build your test rig in the cloud.

    But why Azure? I have my own data centre... If the environment is provisioned in your own datacentre, then no matter what level of service agreement you may have with your infrastructure team, there will be downtime when the environment is patched; and how fast can you scale the environments up or down (keeping the enterprise processes in mind)? Administration, cost, flexibility and scalability are the areas you would want to think about when deciding between your own data centre and Azure!

    How is Microsoft's public cloud offering different from Amazon's? Microsoft's offering is a hybrid of Platform as a Service (PaaS) and Infrastructure as a Service (IaaS), which distinguishes it from providers such as Amazon (Amazon only offers IaaS).

    PaaS (Platform as a Service) fills the needs of those who want to build and run custom applications as services. A service provider offers a pre-configured, virtualized application server environment to which applications can be deployed by the development staff. Since the service providers manage the hardware (patching, upgrades and so forth), as well as application server uptime, the involvement of IT pros is minimized. On-demand scalability combined with hardware and application server management relieves developers from infrastructure concerns and allows them to focus on building applications.

    IaaS (Infrastructure as a Service) is similar to traditional hosting, where a business will use the hosted environment as a logical extension of the on-premises datacentre. The servers (physical and virtual) are rented on an as-needed basis, and the IT professionals who manage the infrastructure have full control of the software configuration. This kind of flexibility increases the complexity of the IT environment, as customer IT professionals need to maintain the servers as though they are on-premises. The maintenance activities may include patching and upgrades of the OS and the application server, load balancing, failover clustering of database servers, backup and restoration, and any other activities that mitigate the risks of hardware and software failures.

    The biggest advantage of PaaS is that you do not have to worry about maintaining the environment; you can focus all your time on solving the business problems with your solution rather than worrying about maintaining the environment. If you decide to use a VM Role on Azure, you are asking for IaaS; more on this later. There is a nice blog post here on the difference between SaaS, PaaS and IaaS. Now that we are convinced why we should be turning to the cloud, and to Azure specifically, let's discuss the test rig.

    The Load Test Rig - Topology: Now the moment of truth. Of course, a big part of getting value from cloud computing is identifying the most adequate workloads to take to the cloud, so I've decided to try to make a load testing rig where the agents are running on Windows Azure. I'll talk you through the topology:
    - User: The user kick-starts the load test run from the developer workstation on premise. This passes the request to the Test Controller.
    - Test Controller: The Test Controller is on premise, connected to the same domain as the developer workstation. As soon as the Test Controller receives the request, it makes use of the Windows Azure Connect service to orchestrate the test responsibilities across all the Test Agents. The Windows Azure Connect endpoint software must be active on all Azure instances and on the Controller machine as well. This allows IP connectivity between them and, given that the firewall is properly configured, allows the Controller to send workloads to the agents. In parallel, the Controller collects the performance data from the agents, using the traditional WMI mechanisms.
    - Test Agents: The Test Agents are on the Windows Azure public cloud; as soon as the Test Controller issues instructions, the test agents start executing the load tests. The HTTP requests are issued against the web server on premise, the results are captured by the test agents, and finally the results are passed back to the Controller.
    - Servers: The web server and DB server are hosted on premise in the datacentre; this is usually the case with business-critical applications, where you probably want to manage them yourself.

    Recap and what's next: In this introduction to the series of blog posts on load testing in the cloud, I highlighted why creating a test rig in the cloud is a good idea, the advantages Windows Azure offers, and the test rig topology that I will be using. I would also like to mention that I stumbled upon this [Video] on Azure in a nutshell - a great watch if you are new to Windows Azure. In the next post I intend to start setting up the load test environment and discuss pricing with respect to the test agent machine types that will be used in the test rig. Hope you enjoyed this post. If you have any recommendations on things that I should consider, or any questions or feedback, feel free to add to this blog post. Remember to subscribe to http://feeds.feedburner.com/TarunArora. See you in Part II.

    Read the article

  • Exporting virtual machine from Windows Azure to Amazon

    - by Somebody
    As the documentation says: "You can import VMware ESX and VMware Workstation VMDK images, Citrix Xen VHD images and Microsoft Hyper-V VHD images for Windows Server 2003, Windows Server 2003 R2, Windows Server 2008 and Windows Server 2008 R2." What about a virtual machine with Ubuntu on it? Will it import successfully? Has anyone tried to do it? Is it possible to import a VM directly from Windows Azure, without needing to download the VM to my machine and then upload it to Amazon? Thanks.

    Read the article

  • Windows Azure Directory Sync Generic Failure

    - by Armand
    OK, so I have a domain that I want to sync to Office 365, but when I start the Windows Azure Active Directory Sync tool Configuration Wizard I get an error with the following details:

    System.Management.ManagementException: Generic failure
    at System.Management.ManagementException.ThrowWithExtendedInfo(ManagementStatus errorCode)
    at System.Management.ManagementObjectCollection.ManagementObjectEnumerator.MoveNext()
    at Microsoft.Online.DirSync.Common.MiisAction.GetTargetMA()
    at Microsoft.Online.DirSync.Common.MiisAction.IsSyncInProgress()
    at Microsoft.Online.DirSync.Common.PrerequisiteChecks.ThrowIfSyncInProgress()
    at Microsoft.Online.DirSync.UI.IntroductionWizardPage.PrerequisiteValidation()
    at Microsoft.Online.DirSync.UI.IntroductionWizardPage.OnLoad(EventArgs e)
    at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
    at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
    at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
    at System.Windows.Forms.Control.CreateControl(Boolean fIgnoreVisible)
    at System.Windows.Forms.Control.CreateControl()
    at System.Windows.Forms.Control.WmShowWindow(Message& m)
    at System.Windows.Forms.Control.WndProc(Message& m)
    at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m)
    at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)

    I have searched far and wide to no avail; this happens before I can even enter any details. A few notes: the server is not the domain controller; SharePoint 2013 is installed on this server; the account I log in with and run the application with is a domain and enterprise admin; and I right-click and run as administrator when I start the application. When I click continue on the error and go through the steps, I get one of two outcomes, alternating with no predictable pattern: 1) I just get a "generic failure" error. 2) I get the error "Cannot start service MSOnlineSyncScheduler on computer '.'." Any help?

    Read the article

  • How to leverage the internal HTTP endpoint available on Azure web roles?

    - by Alfredo Delsors
    Imagine you have a Web application using an in-memory collection that changes occasionally but is used very often. The collection gets loaded from storage in the Application_Start global.asax event and is updated whenever its content changes. If you want to deploy this application on Azure, you need to keep in mind that more than one instance of the application can be running at any time, and therefore you need to provide some mechanism to keep all instances informed of the latest changes. Because communication through internal endpoints between Azure role instances is at no cost, a good solution is to maintain the information in Azure Storage Tables, read its contents in the Application_Start event and propagate changes to all other instances using the internal HTTP port available on Azure Web Roles. You need to follow these steps to leverage the internal HTTP endpoint available on Azure web roles and keep all instances up to date.

    1. Define an internal HTTP endpoint in the Web Role properties, for example InternalHttpEndpoint.

    2. Add a new WCF service to the Web Role, for example NotificationService.svc.

    3. Disable multiple site bindings in web.config:

        <serviceHostingEnvironment multipleSiteBindingsEnabled="false">

    4. Add a method on the new service to receive notifications from other role instances.

        namespace Service
        {
            [ServiceContract]
            public interface INotificationService
            {
                [OperationContract(IsOneWay = true)]
                void Notify(Information info);
            }
        }

    5. Declare a class that inherits from System.ServiceModel.Activation.ServiceHostFactory and override the method CreateServiceHost to host the internal endpoint.

        public class InternalServiceFactory : ServiceHostFactory
        {
            protected override ServiceHost CreateServiceHost(Type serviceType, Uri[] baseAddresses)
            {
                var internalEndpointAddress = string.Format(
                    "http://{0}/NotificationService.svc",
                    RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["InternalHttpEndpoint"].IPEndpoint);
                ServiceHost host = new ServiceHost(
                    typeof(NotificationService),
                    new Uri(internalEndpointAddress));
                BasicHttpBinding binding = new BasicHttpBinding(SecurityMode.None);
                host.AddServiceEndpoint(
                    typeof(INotificationService),
                    binding,
                    internalEndpointAddress);
                return host;
            }
        }

    Note that you can use SecurityMode.None because the internal endpoint is private to the instances of the service.

    6. Edit the markup of the service (right-click the svc file and select "View markup") to add the new factory as the factory to be used to create the service:

        <%@ ServiceHost Language="C#" Debug="true" Factory="Service.InternalServiceFactory" Service="Service.NotificationService" CodeBehind="NotificationService.svc.cs" %>

    7. Now you can notify changes to other instances using this code:

        var current = RoleEnvironment.CurrentRoleInstance;
        var endPoints = current.Role.Instances
            .Where(instance => instance != current)
            .Select(instance => instance.InstanceEndpoints["InternalHttpEndpoint"]);
        foreach (var ep in endPoints)
        {
            EndpointAddress address = new EndpointAddress(
                String.Format("http://{0}/NotificationService.svc", ep.IPEndpoint));
            BasicHttpBinding binding = new BasicHttpBinding(SecurityMode.None);
            var factory = new ChannelFactory<INotificationService>(binding);
            INotificationService instance = factory.CreateChannel(address);
            instance.Notify(changedinfo);
        }

    Read the article

  • Which one to select for my future career: Java, C#, Azure or Apex?

    - by user636195
    Hi folks, I am going to study for a Master's in Computer Science in the U.S.A., starting in a week. I have been doing my B.Sc. for the past three years, and after my freshman year I started working on projects (in C# and, very rarely, in Java) for the past two years (i.e. while I was a second- and third-year student). Now I am in a college where all of the programming courses are taught in Java only (using Eclipse); I am going to stay at this college for 8 months on campus and then be fully employed for two years at other companies on CPT. I really love to work on Microsoft products because, for me, they are simple and easy to use and understand. My future plan is to work in cloud computing and to be a cloud-based business owner in the near future. Since the college is going to teach us and have us do every project in Java, I was confused about which programming language to use that will help me and advance my career - and of course I wanted to pick the one I enjoy using. I have also heard a lot about Azure (Microsoft's) and Apex (Salesforce.com's cloud computing programming language). Would you please give me your advice and recommendations based on my situation? Should I study only Java, or should I study C# or Azure on my own beside Java? The reason I ask is that, since I have no clue how Azure works and how long it will take me to learn the language, I am really confused about which one to select (Java vs C#, and Azure vs Apex, or whichever cloud computing language is most popular and widely used). Do you think I can get a job in cloud computing if I study Azure or Apex on my own, without experience? There is also one issue I want to consider, which is a short-term one: salary. Since I have to pay my student loan, I need to get a good job that will let me pay off my loan within two years. But, as I said, my long-term plan is to get experience in cloud computing (from programming to administration, i.e. every area of cloud computing) and then have my own business, maybe within 5-10 years. What do you think? Thank you for your time.

    Read the article

  • System.Diagnostics.Process.Id Isn't the Same Process Id Shown in Task Manager. Why?

    - by LonnieBest
    I'm using C#'s System.Diagnostics.Process object. One of its properties is Id. The Id this produces is not the same as the PID shown in Windows Task Manager. Why is this? You see, once this process is started, it launches two other unmanaged processes, for which I can't explicitly get IDs by object property references. I have to search through all processes to find them by process name via System.Diagnostics.Process.GetProcesses(). I'm trying to find a reliable way to kill this process and all associated processes by PID, the one that shows in Task Manager. Is there a better way? I can't just kill all processes with the associated process names, because that might kill other instances of those processes that have nothing to do with my program.
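
    For the kill-the-tree part, a sketch of one approach (an assumption, not the only one): walk child processes by ParentProcessId via WMI and kill by PID, instead of matching on process names. Requires a reference to System.Management.

        using System;
        using System.Diagnostics;
        using System.Management;

        static class ProcessTree
        {
            public static void KillTree(int pid)
            {
                // Find direct children first, so grandchildren are reached recursively.
                using (var searcher = new ManagementObjectSearcher(
                    "SELECT ProcessId FROM Win32_Process WHERE ParentProcessId = " + pid))
                {
                    foreach (ManagementObject child in searcher.Get())
                        KillTree(Convert.ToInt32(child["ProcessId"]));
                }
                try { Process.GetProcessById(pid).Kill(); }
                catch (ArgumentException) { /* process already exited */ }
            }
        }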

    Read the article

  • PerformanceCounterCategory.Exists is very slow if the category doesn't exist

    - by Shrike
    I have a library which uses a bunch of its own perf counters, but I want the library to work fine even if those perf counters weren't installed. So I've created wrappers around PerformanceCounter, and on first use I check whether the perf counter exists. If it exists, I use the native PerformanceCounter; otherwise I use a wrapper which does nothing. To check perf counter existence I use PerformanceCounterCategory.Exists. The problem is that if there isn't such a category, the PerformanceCounterCategory.Exists call takes (on my machine) about 10 seconds! Needless to say, that's too slow. What can I do? Code to try it yourself:

        using System;
        using System.Diagnostics;

        class Program
        {
            static void Main(string[] args)
            {
                var ts = Stopwatch.StartNew();
                var res = PerformanceCounterCategory.Exists("XYZ");
                Console.WriteLine(ts.ElapsedMilliseconds);
                Console.WriteLine("result:" + res);
            }
        }
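
    One mitigation, sketched under the assumption that falling back to the no-op wrappers until the answer arrives is acceptable: run the expensive Exists() probe once on a background task, so first use never blocks.

        using System;
        using System.Diagnostics;
        using System.Threading.Tasks;

        public static class CounterProbe
        {
            private static readonly Lazy<Task<bool>> Probe =
                new Lazy<Task<bool>>(() =>
                    Task.Run(() => PerformanceCounterCategory.Exists("XYZ")));

            // False (use the no-op wrapper) until the slow probe finishes successfully.
            public static bool CountersAvailable
            {
                get
                {
                    var t = Probe.Value;
                    return t.Status == TaskStatus.RanToCompletion && t.Result;
                }
            }
        }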

    Read the article

  • How can I most accurately calculate the execution time of an ASP.NET page while also displaying it on the page?

    - by henningst
    I want to calculate the execution time of my ASP.NET pages and display it on the page. Currently I'm calculating the execution time using a System.Diagnostics.Stopwatch and then storing the value in a log database. The stopwatch is started in OnInit and stopped in OnPreRenderComplete. This seems to work quite well, and it gives a similar execution time to the one shown in the page trace. The problem now is that I'm not able to display the execution time on the page, because the stopwatch is stopped too late in the life cycle. What is the best way to do this?
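
    A sketch of one common workaround (an assumption, not the only answer): keep the OnInit start, but stop the watch in an overridden Render, the last point where the page can still write its own output, and emit the elapsed time there.

        using System;
        using System.Diagnostics;
        using System.Web.UI;

        public class TimedPage : Page
        {
            private readonly Stopwatch _timer = new Stopwatch();

            protected override void OnInit(EventArgs e)
            {
                _timer.Start();
                base.OnInit(e);
            }

            protected override void Render(HtmlTextWriter writer)
            {
                base.Render(writer);    // render the page content first
                _timer.Stop();          // latest point we can still write output
                writer.Write("<!-- rendered in {0} ms -->", _timer.ElapsedMilliseconds);
            }
        }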

    Read the article
