Search Results

Search found 4410 results on 177 pages for 'azure worker roles'.


  • Microsoft Azure gains two new services for NoSQL and full-text search, not forgetting Apache HBase for Azure HDInsight

    Microsoft has released, as a preview, new features for its Microsoft Azure cloud service: two services, one dedicated to databases and the other to full-text search. The database service has been named Azure DocumentDB; it will complement relational databases with a service...

    Read the article

  • Integrate Bing Search API into ASP.Net application

    - by sreejukg
    A couple of months back, I wrote an article about how to integrate the Bing Search engine (API 2.0) with an ASP.Net website. You can refer to the article here: http://weblogs.asp.net/sreejukg/archive/2012/04/07/integrate-bing-api-for-search-inside-asp-net-web-application.aspx

    Things are changing rapidly in the tech world and Bing has also changed! The Bing Search API 2.0 will work until August 1, 2012; after that it will not return results. Shocked? Don't worry: the API has moved to the Windows Azure Marketplace and is available for you to sign up and continue using, and there is a free version available based on your usage. In this article, I am going to explain how you can integrate the new Bing API that is available in the Windows Azure Marketplace with your website. You can access the Windows Azure Marketplace from the link below: https://datamarket.azure.com/

    There are lots of applications available for you to subscribe to and use, and Bing is one of them. You can find the new Bing Search API at the link below: https://datamarket.azure.com/dataset/5BA839F1-12CE-4CCE-BF57-A49D98D29A44

    To get access to the Bing Search API, first you need to register an account with the Windows Azure Marketplace. Sign in to the Windows Azure Marketplace site using your Windows Live account; you will see the sign in button at the top right of the page. Clicking on the sign in button takes you to the Windows Live ID authentication page, where you can enter a Windows Live ID to log in. Once logged in, you will see the registration page for the Windows Azure Marketplace. You can agree or disagree to the use of your email address by Microsoft; I believe selecting the check box means you will get email about what is happening in the Windows Azure Marketplace. Click on the continue button once you are done. On the next page you must accept the terms of use; it is not optional. Scroll down the page, select the "I agree" checkbox and click on the Register button. Now you are a registered member of the Windows Azure Marketplace and can subscribe to data applications.

    In order to use the Bing API in your application, you must obtain your Account Key. The previous version of Bing required an API key; the current version uses an Account Key instead. Once you are logged in to the Windows Azure Marketplace, go to the "My Account" section in the top menu. From the My Account section you can manage your subscriptions and Account Keys; Account Keys will be used by your applications to access the subscriptions from the marketplace. Click on the My Account link, then choose Account Keys in the left menu. You can add an account key or use the default account key available. Creating an account key is a very simple process, and you can also remove the account keys you create if necessary.

    The next step is to subscribe to the Bing Search API. At this moment, Bing offers two APIs for search:
    1. Bing Search API - https://datamarket.azure.com/dataset/5ba839f1-12ce-4cce-bf57-a49d98d29a44
    2. Bing Search API – Web Results Only - https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65
    The difference is that the latter gives you only web results, whereas the former lets you specify the source type, such as image, video, web, news, etc.
    Carefully choose the API based on your application requirements. In this article, I am going to use the Web Results Only API, but the steps are similar for both. Go to the API page https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65; you can see the subscription options on the right side, and at the bottom of the page you can see the free option. Since I am going to use the free option, just click the Sign Up link for it, select the "I agree" check box and click on the Sign Up button. You will get a receipt page that details your subscription. Now you are ready to use the Bing Search API – Web Results.

    The next step is to integrate the API into your ASP.Net application. If you go to the Search API page (as well as the receipt page), you can see a .Net C# Class Library link. Click on the link and you will get a code file named "BingSearchContainer.cs". In the following sections I am going to demonstrate the use of the Bing Search API from an ASP.Net application.

    Create an empty ASP.Net web application. Now add the downloaded code file ("BingSearchContainer.cs") to the project: right click your project in Solution Explorer, choose Add -> Existing Item, browse to the downloaded location, select the "BingSearchContainer.cs" file and add it to the project. To build the code file you need to add a reference to the System.Data.Services.Client library; you can find it in the .Net tab when you select Add -> Reference. Try to build your project now; it should build without any errors.

    Add an ASP.Net page to the project. I have included a text box, a button and a GridView on the page. The idea is to search for the text entered and display the results in the GridView. In the button click event handler for the search button, I have used the code shown later in this post (see the SearchPage.aspx.cs listing below). Now run your project, enter some text in the text box and click the Search button; you will see the results coming from Bing. I entered the text "Microsoft" in the textbox, clicked on the button and got results back.

    Searching specific websites: if you want to search a particular website, you pass the site URL with site:<site url name>, and if you have more sites, use a pipe (|). For example, the search query site:microsoft.com | site:adobe.com design will search for the word "design" and return results from Microsoft.com and Adobe.com. The following sample searches only Microsoft.com for the text entered:

        var webResults = bingContainer.Web("site:www.Microsoft.com " + txtSearch.Text, null, null, null, null, null, null);

    Paging the results returned by the API: by default the Bing API returns 100 results for your query, and the default code file that you downloaded from Bing doesn't include any option to change this. You can modify the downloaded code to perform paging. The Bing API supports two parameters, $top (the number of results to return) and $skip (the number of records to skip). So if you want the 3rd page of results with a page size of 10, you need to pass $top=10 and $skip=20. Open BingSearchContainer.cs in the editor; you can see the Web method in it:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options) {

    In the method signature I have added two more parameters:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options, int resultCount, int pageNo) {

    and in the method body you need to pass the parameters to the query variable:

        query = query.AddQueryOption("$top", resultCount);
        query = query.AddQueryOption("$skip", (pageNo - 1) * resultCount);
        return query;

    Note that I didn't perform any validation, but you need to check conditions such as resultCount and pageNo being greater than or equal to 1. If the parameters are not valid, the Bing Search API will throw an error.

    Now see the following code in the SearchPage.aspx.cs file:

        protected void btnSearch_Click(object sender, EventArgs e)
        {
            var bingContainer = new Bing.BingSearchContainer(new Uri("https://api.datamarket.azure.com/Bing/SearchWeb/"));
            // replace this value with your account key
            var accountKey = "your key";
            // the next line configures the bingContainer to use your credentials
            bingContainer.Credentials = new NetworkCredential(accountKey, accountKey);
            var webResults = bingContainer.Web("site:microsoft.com " + txtSearch.Text, null, null, null, null, null, null, 3, 2);
            lstResults.DataSource = webResults;
            lstResults.DataBind();
        }

    This code returns 3 results starting from the second page (by skipping the first 3 results).

    Bing provides complete integration to its offerings. When you develop search-based applications, you can use the power of Bing to perform the search. Integrating the Bing Search API into an ASP.Net application is a simple process, and without investing much time you can develop a good search-based application. Make sure you read the terms of use before designing the application and decide which API usage is suitable for you.

    Further reading:
    BING API Migration Guide http://go.microsoft.com/fwlink/?LinkID=248077
    Bing API FAQ http://go.microsoft.com/fwlink/?LinkID=252146
    Bing API Schema Guide http://go.microsoft.com/fwlink/?LinkID=252151
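    The excerpt above refers to "the modified method" but its full listing is not included here. As a rough reconstruction only (the extra parameters and the $top/$skip lines come from the article; the surrounding body and the suggested validation are assumptions about what the DataMarket-generated service class looks like), the paging-enabled method might look roughly like this:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude,
            Double? Longitude, String WebFileType, String Options, int resultCount, int pageNo)
        {
            // guard against invalid paging values, as the article recommends
            if (resultCount < 1) throw new ArgumentOutOfRangeException("resultCount");
            if (pageNo < 1) throw new ArgumentOutOfRangeException("pageNo");

            // the generated method builds a DataServiceQuery for the "Web" operation and
            // appends the standard Bing parameters (Query, Market, Adult, ...) as query options
            DataServiceQuery<WebResult> query = base.CreateQuery<WebResult>("Web");
            query = query.AddQueryOption("Query", string.Concat("'", System.Uri.EscapeDataString(Query), "'"));
            // ... the remaining optional parameters are appended the same way ...

            // added for paging: $top is the page size, $skip the number of records to skip
            query = query.AddQueryOption("$top", resultCount);
            query = query.AddQueryOption("$skip", (pageNo - 1) * resultCount);
            return query;
        }

    With a page size of 10, asking for the 3rd page therefore sends $top=10 and $skip=20, matching the example in the article.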

    Read the article

  • Assignment of roles in communication when sides could try to cheat

    - by 9000
    Assume two nodes in a peer-to-peer network initiating a communication. In this communication, one node has to serve as a "sender" and the other as a "receiver" (the role names are arbitrary here). I'd like the nodes to assume either role with approximately equal probability: that is, in N communications with various other nodes, a given node would take the "sender" role roughly N/2 times. Since there's no third-party arbiter available, the nodes should agree on their roles by exchanging messages. The catch is that we can encounter a rogue node which would try to become the "receiver" in most or all cases and coax the other side into always serving as the "sender". I'm looking for an algorithm to assign roles to the sides of the communication so that no side can get a predetermined role with high probability. It's OK for the side which is trying to cheat to fail to communicate.

    Read the article

  • Off-the-shelf solutions for migrating data from Azure blob storage to Rackspace Cloud Files

    - by S.C.
    I have large amounts of data (500+ GB) stored in Azure blob storage that I need to transfer to Rackspace Cloud Files. I know it is possible to perform such a migration using the SDKs from both services, but is there a free, standard, one-step process for doing this? I've built a POC utility but would like to avoid having to optimize it to perform the transfer within a reasonable amount of time. Thanks.

    Read the article

  • Multi Level Security via Roles

    - by Geertjan
    I'm simulating a small scenario: Users can be dragged into roles; roles can be dragged into role groups. When a drop is made into a role group, a new role is created (WindowManager.getDefault().setRole("")). Then, when the user logs in, they log into a particular role. Depending on the role they log into, a different role group is assigned, which maps to a certain "role" in NetBeans Platform terms, i.e., the related level of security is applied and the related windows open.

    Read the article

  • Why does the ASP.NET menu control ignore roles in Web.sitemap?

    - by MainMa
    Hi, I have a website with a menu based on a sitemap. ActiveDirectoryRoleProvider is a custom class, and securityTrimmingEnabled is set to true on the sitemap provider. Now, regardless of the roles set in the sitemap file, the site menu displays every sitemap node. So, for example, if I have in the sitemap a node with roles="*", a second one with roles="Administrators" and a third one with roles="Foo", and I log in as a member of the Administrators group but not the Foo group, the site menu will still display all three items. On the other hand, if I have a node which does not specify a roles attribute but has children, this node is never displayed. If I put:

    <%= HttpContext.Current.User.IsInRole("Administrators") ? "Admin" : "Not admin"%>
    <%= HttpContext.Current.User.IsInRole("Foo") ? "Foo" : "Not foo"%>

    before the menu, it displays that I'm Admin, but Not foo, which is just fine. So if it knows that I'm Admin but not Foo, why does it continue to display Foo's sitemap nodes? Note: changing authorizations has no effect on the menu. It continues to show every item, even for the pages I'm unable to access.

    Read the article

  • MySQL search for user and their roles

    - by Jenkz
    I am re-writing the SQL which lets a user search for any other user on our site and also shows their roles. As an example, roles can be "Writer", "Editor", "Publisher". Each role links a user to a publication, and users can take multiple roles within multiple publications. Example table setup:

    "users": user_id, firstname, lastname
    "publications": publication_id, name
    "link_writers": user_id, publication_id
    "link_editors": user_id, publication_id

    Current pseudo SQL:

    SELECT * FROM (
      (SELECT user_id FROM users WHERE firstname LIKE '%Jenkz%')
      UNION
      (SELECT user_id FROM users WHERE lastname LIKE '%Jenkz%')
    ) AS dt
    JOIN (ROLES STATEMENT) AS roles ON roles.user_id = dt.user_id

    At the moment my roles statement is:

    SELECT dt2.user_id, dt2.publication_id, dt2.role FROM (
      (SELECT 'writer' AS role, link_writers.user_id, link_writers.publication_id FROM link_writers)
      UNION
      (SELECT 'editor' AS role, link_editors.user_id, link_editors.publication_id FROM link_editors)
    ) AS dt2

    The reason for wrapping the roles statement in UNION clauses is that some roles are more complex and require a table join to find the publication_id and user_id. As an example, "publishers" might be linked across two tables:

    "link_publishers": user_id, publisher_group_id
    "link_publisher_groups": publisher_group_id, publication_id

    So in that instance, the query forming part of my UNION would be:

    SELECT 'publisher' AS role, link_publishers.user_id, link_publisher_groups.publication_id
    FROM link_publishers
    JOIN link_publisher_groups ON link_publisher_groups.publisher_group_id = link_publishers.publisher_group_id

    I'm pretty confident that my table setup is good (I was warned off the one-table-for-all system when researching the layout). My problem is that there are now 100,000 rows in the users table and up to 70,000 rows in each of the link tables. The initial lookup in the users table is fast, but the joining really slows things down. How can I join on only the relevant roles?

    -------------------------- EDIT ----------------------------------

    EXPLAIN output above (open in a new window to see full resolution). The bottom bit in red is the "WHERE firstname LIKE '%Jenkz%'"; the third row searches WHERE CONCAT(firstname, ' ', lastname) LIKE '%Jenkz%', hence the large row count, but I think this is unavoidable unless there is a way to put an index across concatenated fields. The green bit at the top just shows the total rows scanned from the ROLES STATEMENT. You can then see each individual UNION clause (#6 - #12), which all show a large number of rows. Some of the indexes are normal, some are unique. It seems that MySQL isn't optimizing to use dt.user_id as a comparison for the internals of the UNION statements. Is there any way to force this behaviour? Please note that my real setup is not publications and writers but "webmasters", "players", "teams" etc.

    Read the article

  • Integration Patterns with Azure Service Bus Relay, Part 3.5: Node.js relay

    - by Elton Stoneman
    This is an extension to Part 3 in the IPASBR series; see also:
    Integration Patterns with Azure Service Bus Relay, Part 1: Exposing the on-premise service
    Integration Patterns with Azure Service Bus Relay, Part 2: Anonymous full-trust .NET consumer
    Integration Patterns with Azure Service Bus Relay, Part 3: Anonymous partial-trust consumer

    In Part 3 I said "there isn't actually a .NET requirement here", and this post just follows up on that statement. In Part 3 we had an ASP.NET MVC website making a REST call to an Azure Service Bus service; to show that the REST stuff is really interoperable, in this version we use Node.js to make the secure service call. The code is on GitHub here: IPASBR Part 3.5. The sample code is simpler than Part 3: rather than code up a UI in Node.js, the sample just relays the REST service call out to Azure. The steps are the same as Part 3: a REST call to ACS with the service identity credentials, which returns an SWT; a REST call to Azure Service Bus Relay, presenting the SWT; the request gets relayed to the on-premise service.

    In Node.js the authentication step looks like this:

        var options = {
            host: acs.namespace() + '-sb.accesscontrol.windows.net',
            path: '/WRAPv0.9/',
            method: 'POST'
        };
        var values = {
            wrap_name: acs.issuerName(),
            wrap_password: acs.issuerSecret(),
            wrap_scope: 'http://' + acs.namespace() + '.servicebus.windows.net/'
        };
        var req = https.request(options, function (res) {
            console.log("statusCode: ", res.statusCode);
            console.log("headers: ", res.headers);
            res.on('data', function (d) {
                var token = qs.parse(d.toString('utf8'));
                callback(token.wrap_access_token);
            });
        });
        req.write(qs.stringify(values));
        req.end();

    Once we have the token, we can wrap it up into an Authorization header and pass it to the Service Bus call:

        token = 'WRAP access_token=\"' + swt + '\"';
        //...
        var reqHeaders = {
            Authorization: token
        };
        var options = {
            host: acs.namespace() + '.servicebus.windows.net',
            path: '/rest/reverse?string=' + requestUrl.query.string,
            headers: reqHeaders
        };
        var req = https.request(options, function (res) {
            console.log("statusCode: ", res.statusCode);
            console.log("headers: ", res.headers);
            response.writeHead(res.statusCode, res.headers);
            res.on('data', function (d) {
                var reversed = d.toString('utf8');
                console.log('svc returned: ' + d.toString('utf8'));
                response.end(reversed);
            });
        });
        req.end();

    Running the sample: the usual routine to add your own Azure details into Solution Items\AzureConnectionDetails.xml and "Run Custom Tool" on the .tt files. Build, and you should be able to navigate to the on-premise service at http://localhost/Sixeyed.Ipasbr.Services/FormatService.svc/rest/reverse?string=abc123 and get a string response, going to the service direct. Install Node.js (v0.8.14 at the time of writing), run FormatServiceRelay.cmd, navigate to http://localhost:8013/reverse?string=abc123, and you should get exactly the same response but through Node.js, via Azure Service Bus Relay to your on-premise service. The console logs the WRAP token returned from ACS and the response from Azure Service Bus Relay which it forwards.

    Read the article

  • Multitenant Design for SQL Azure: White Paper Available

    - by Herve Roggero
    Cloud computing is about scaling out all your application tiers, from the web application to the database layer. In fact, the whole promise of Azure is to pay for just what you need. You need more IIS servers? No problemo... just spin up another web server. You expect to double your storage needs for Azure Tables? No problemo; you are covered there too... just pay for your storage needs. But what about the database tier, SQL Azure? How do you add new databases easily, and transparently, so that your application simply uses more of SQL Azure if it needs to? Without changing a single line of code? And what if you need to scale back down? Welcome to the world of database scalability.

    There are many terms that describe database scalability, including data federation, multitenant designs, and even NoSQL, depending on the technical solution you are implementing. Because SQL Azure is a transactional database system, NoSQL is not really an option. However, data federation and multitenant designs offer some very interesting scalability options that are worth considering. Data federation, a feature of SQL Azure that will be offered in the future, offers very interesting capabilities available natively on the SQL Azure platform. More to come in a few weeks...

    Multitenant designs, on the other hand, are design practices and technologies designed to help you reach flexible scalability options not available otherwise. The first incarnation of such a method was made available on CodePlex as an open source project (http://enzosqlshard.codeplex.com). This project was an attempt to provide a sharding library for educational purposes. All that sounds really cool... and really esoteric... almost a form of database "voodoo"... However, after being on multiple Azure projects I am starting to see a real need. Customers want to be able to free themselves from the database tier, so that if they have 10 new customers tomorrow, all they need to do is add 2 more SQL Azure instances. It's that simple. How you achieve this, and suggested application design guidelines, are available in a white paper I just published.

    The white paper offers two primary sections. The first section describes the business and technical problem at hand, and how to classify it according to specific design patterns. For example, I discuss compressed shards through schema separation. The second section offers a method for addressing the needs of a multitenant design using a new library, the big brother of the CodePlex project mentioned previously (which I created earlier this year), complete with a management interface and such. A beta of this platform will be made available within weeks, as soon as the documentation is ready.

    I would like to ask you to drop me a quick email at [email protected] if you are going to download the white paper. It's not required, but it would help me get in touch with you for feedback. You can download the white paper here: http://www.bluesyntax.net/files/EnzoFramework.pdf . Thank you, and I am looking for feedback, thoughts and implementation opportunities.
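    The white paper itself is not reproduced in this excerpt, so purely as an illustration of the idea being described (and not the Enzo framework's actual API; the class and method names below are hypothetical), a multitenant shard-routing layer often boils down to a small directory that hands the application a SQL Azure connection string per tenant, so that adding capacity means adding entries rather than changing code:

        using System;
        using System.Collections.Generic;

        // Hypothetical sketch of tenant-to-database routing for a multitenant design.
        // Not taken from the white paper; names and structure are illustrative only.
        public class ShardDirectory
        {
            // each entry maps a tenant to one of the SQL Azure databases ("shards")
            private readonly Dictionary<string, string> tenantConnections =
                new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase);

            public void AddTenant(string tenantId, string connectionString)
            {
                tenantConnections[tenantId] = connectionString;
            }

            // data-access code asks the directory for a connection string instead of
            // hard-coding one, so new shards can be added without touching that code
            public string GetConnectionString(string tenantId)
            {
                string connectionString;
                if (!tenantConnections.TryGetValue(tenantId, out connectionString))
                {
                    throw new InvalidOperationException("Unknown tenant: " + tenantId);
                }
                return connectionString;
            }
        }

    In this sketch, taking on ten new customers is a matter of provisioning the extra SQL Azure databases and registering them with AddTenant; the rest of the application keeps calling GetConnectionString and never changes.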

    Read the article

  • Moving StarterSTS to the (Azure) Cloud

    - by Your DisplayName here!
    Quite a few people have asked me about an Azure version of StarterSTS. While I kind of knew what I had to do to make the move, I couldn't find the time. Until recently. This blog post briefly documents the necessary changes and design decisions for the next version of StarterSTS, which will work both on-premise and on Azure.

    Provider
    Fortunately StarterSTS is already based on the idea of "providers". Authentication, roles and claims generation are based on the standard ASP.NET provider infrastructure, which makes the migration to different data stores less painful. In my case I simply moved the ASP.NET provider database to SQL Azure and still use the standard SQL Server based membership, roles and profile providers. In addition, StarterSTS has its own providers to abstract resource access for certificates, relying party registration, client certificate registration and delegation, so I only had to supply new implementations. Signing and SSL keys now go in the Azure certificate store, and user mappings (client certificates and delegation settings) have been moved to Azure table storage. The one thing I didn't anticipate when I originally wrote StarterSTS was the need to also encapsulate configuration. Currently configuration is "locked" to the standard .NET configuration system. The new version will have a pluggable SettingsProvider with versions for .NET configuration as well as Azure service configuration. If you want to externalize these settings into e.g. a database, it is now just a matter of supplying a corresponding provider. Moving between the on-premise and Azure versions will be just a matter of using different providers.

    URL Handling
    Another thing that's substantially different on Azure (and in load-balanced scenarios in general) is the handling of URLs. In farm scenarios, the standard APIs like ASP.NET's Request.Url return the current (internal) machine name, but you typically need the address of the external-facing load balancer. There's a hotfix for WCF 3.5 (included in v4) that fixes this for WCF metadata. This was accomplished by using the HTTP Host header to generate URLs instead of the local machine name. I now use the same approach for generating WS-Federation metadata as well as information card files.

    New Features
    I introduced a cache provider. Since we now have slightly more expensive lookups (e.g. relying party data from table storage), it makes sense to cache certain data in the front end. The default implementation uses the ASP.NET web cache and can be easily extended to use products like memcached or AppFabric Caching. Starting with the relying party provider, I now also provide a read/write interface, which allows building management interfaces on top of this provider. I also include a (very) simple web page that allows working with the relying party provider data. I guess I will use the same approach for other providers in the future as well. I am also doing some work on the tracing and health monitoring area, which is especially important for the Azure version. Stay tuned.
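    As a rough illustration of the Host header approach described above (a minimal sketch, not code from StarterSTS; the helper name and shape are assumptions), the idea is to build externally visible URLs from the incoming request's Host header rather than from the machine's own name:

        using System;
        using System.Web;

        // Minimal sketch: generate an external base URL from the HTTP Host header so that
        // URLs emitted behind a load balancer point at the public address rather than the
        // internal machine name. Illustrative only, not taken from StarterSTS.
        public static class ExternalUrl
        {
            public static Uri BaseAddress(HttpRequest request)
            {
                // the Host header carries the name the client actually used,
                // e.g. the load balancer's public DNS name
                var host = request.Headers["Host"];
                var scheme = request.IsSecureConnection ? "https" : "http";
                return new Uri(scheme + "://" + host + "/");
            }
        }

    Metadata or information card URLs can then be composed from BaseAddress(HttpContext.Current.Request), and the same code produces correct addresses whether the request arrived directly or through the load balancer.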

    Read the article

  • Different Azure blob streams when using .Net client vs. REST interface

    - by knightpfhor
    I have encountered an unusual difference in the way that the .Net client for Azure and the direct REST API bring back streams of binary data. If I use the CloundBlob.DownloadToStream() vs. getting the response stream from the HTTP response, I get streams with the same length, but different content. Specifically the REST response seems to 0 out a series of bytes. I've discovered this issue because I'm trying to use the byte range feature for blobs which is currently not supported in the .Net client (if I'm wrong on this point and someone can point at where I can do this it might make the rest of this question irrelevant). If I upload a binary representation of the first 2k unicode characters with this code: Public Sub WriteFoo() Dim Blob As CloudBlob Dim Stream1 As MemoryStream Dim Container As CloudBlobContainer Dim Builder As StringBuilder Dim NextCharacter As String Dim Formatter As BinaryFormatter Container = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudBlobClient.GetContainerReference("testcontainer") Container.CreateIfNotExist() Blob = Container.GetBlobReference("Foo") Stream1 = New MemoryStream() Builder = New Text.StringBuilder() For Index As Integer = 1 To 2000 Select Case Index Case Is <= 9 NextCharacter = ChrW(9) Case Is <= 31 NextCharacter = Environment.NewLine Case 127 NextCharacter = Environment.NewLine Case Else NextCharacter = ChrW(Index) End Select Builder.Append(NextCharacter) Next Formatter = New BinaryFormatter() Formatter.Serialize(Stream1, Builder.ToString()) Stream1.Position = 0 Blob.UploadFromStream(Stream1) End Sub Then try to access it with the following code: Public Sub ReadFoo() Dim Blob As CloudBlob Dim Request As System.Net.HttpWebRequest Dim Response As System.Net.WebResponse Dim ResponseSize As Integer Dim ResponseBuffer As Byte() Dim ResponseStream As Stream Dim Stream1 As MemoryStream Dim Stream2 As MemoryStream Dim Container As CloudBlobContainer Dim Byte1 As Integer Dim Byte2 As Integer Container = CloudStorageAccount.DevelopmentStorageAccount.CreateCloudBlobClient.GetContainerReference("testcontainer") Container.CreateIfNotExist() Blob = Container.GetBlobReference("Foo") Stream1 = New MemoryStream() Stream2 = New MemoryStream() Blob.DownloadToStream(Stream1) Request = DirectCast(System.Net.WebRequest.Create(Blob.Uri), System.Net.HttpWebRequest) Request.Headers.Add("x-ms-version", "2009-09-19") Request.Headers.Add("x-ms-range", String.Format("bytes={0}-{1}", 0, Integer.MaxValue)) Blob.Container.ServiceClient.Credentials.SignRequest(Request) Response = Request.GetResponse() ResponseStream = Response.GetResponseStream() ResponseSize = CInt(Response.ContentLength) ReDim ResponseBuffer(ResponseSize - 1) ResponseStream.Read(ResponseBuffer, 0, ResponseSize) Stream2.Write(ResponseBuffer, 0, ResponseSize) Stream1.Position = 0 Stream2.Position = 0 If Stream1.Length <> Stream2.Length Then System.Diagnostics.Debug.WriteLine(String.Format("Streams a different length. 1: {0}. 2: {1}", Stream1.Length, Stream2.Length)) Else While Stream1.Position < Stream1.Length Byte1 = Stream1.ReadByte() Byte2 = Stream2.ReadByte() If Byte1 <> Byte2 Then System.Diagnostics.Debug.WriteLine(String.Format("Streams differ at position {0}, 1: {1}. 2: {2}", Stream1.Position - 1, Byte1, Byte2)) End If End While End If End Sub Past all certain point all of the data in Stream2 (the data I've retrieved from the REST api) ends up being 0. To make matters even more confusing, when I reverse the order that I put the characters in the string e.g. 
For Index As Integer = 2000 To 1 rather than For Index As Integer = 1 To 2000, it all works OK. Any help is much appreciated. My computer is sick of me swearing at it.
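    As an aside for readers following along (not part of the original question), the byte-range REST call described above looks roughly like this in C#. It is a sketch against the same 2009-09-19 storage API version and the old StorageClient types used in the question, and it reads the response in a loop because a single Stream.Read call is not guaranteed to fill the buffer:

        using System.IO;
        using System.Net;
        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        class BlobRangeSketch
        {
            // download a signed byte range of a blob via the REST API
            static MemoryStream DownloadRange(string containerName, string blobName, long from, long to)
            {
                var container = CloudStorageAccount.DevelopmentStorageAccount
                    .CreateCloudBlobClient()
                    .GetContainerReference(containerName);
                var blob = container.GetBlobReference(blobName);

                var request = (HttpWebRequest)WebRequest.Create(blob.Uri);
                request.Headers.Add("x-ms-version", "2009-09-19");
                request.Headers.Add("x-ms-range", string.Format("bytes={0}-{1}", from, to));
                blob.Container.ServiceClient.Credentials.SignRequest(request);

                var result = new MemoryStream();
                using (var response = request.GetResponse())
                using (var stream = response.GetResponseStream())
                {
                    var buffer = new byte[8192];
                    int read;
                    // Stream.Read may return fewer bytes than requested, so keep reading
                    // until the stream reports no more data
                    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        result.Write(buffer, 0, read);
                    }
                }
                result.Position = 0;
                return result;
            }
        }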

    Read the article

  • Mapping tomcat apache worker

    - by metamorpheus
    I am running an Apache2 server connected with Tomcat 5.5.

    workers.properties:
    workers.tomcat_home=/usr/share/tomcat5.5
    workers.java_home=/usr/lib/jvm/java-6-sun
    ps=/
    worker.list=worker1
    worker.worker1.port=8009
    worker.worker1.host=127.0.0.1
    worker.worker1.type=ajp13
    worker.worker1.lbfactor=1

    The JkMount is defined as follows:
    LoadModule jk_module /usr/lib/apache2/modules/mod_jk.so
    JkWorkersFile /etc/apache2/workers.properties
    JkLogFile /var/log/apache2/mod_jk.log
    JkLogLevel debug
    JkLogStampFormat "[%a %b %d %H:%M:%S %Y] "
    JkMount /jsp-examples worker1
    JkMount /jsp-examples/* worker1
    JkMount /servlets-examples worker1
    JkMount /servlets-examples/* worker1
    JkMount /tcontainer worker1
    JkMount /tcontainer/* worker1

    If I call 127.0.0.1/servlets-examples, I get the examples displayed and executed correctly. If I call [same server as above]/tcontainer, I get the following error: "The requested resource (/tcontainer) is not available." (this is an error provided by Tomcat 5.5). How can I define where to get the sources? I have a configuration file in /usr/share/tomcat-5.5-webapps/tcontainer.xml:

    <Context path="/tcontainer" docBase="/var/www/web96/html/tcontainer" debug="0" privileged="true" allowLinking="true">
    </Context>

    What did I forget to configure, or what is wrong with my definitions? Thanks

    Read the article

  • SQL Azure Roadmap gets a little clearer – announcements from Tech Ed

    - by Eric Nelson
    On Monday at Tech·Ed 2010 we announced new stuff (I like new stuff) that "showcases our continued commitment to deliver value, flexibility and control of data through data cloud services to our customers". OK, that does sound like marketing speak (and it is), but the good news is there is some meat behind it. We have some decent new features coming and we also have some clarity on when we will be able to get our hands on those features.

    SQL Azure Business Edition Extends to 50 GB – June 28th
    The SQL Azure Business Edition database is now extending from 10 GB to 50 GB. The new 50 GB database size will be available worldwide starting June 28th.

    SQL Azure Business Edition Subscription Offer – August 1st
    Starting August 1st, we will have a new discounted SQL Azure promotional offer (SQL Azure Development Accelerator Core). More information is available at http://www.microsoft.com/windowsazure/offers/.

    Public Preview of the Data Sync Service – CTP now
    Data Sync Service for SQL Azure allows for more flexible control over data by deciding which data components should be distributed across multiple datacenters in different geographic locations, based on your internal policies and business needs. Available as a community technology preview after registering at http://www.sqlazurelabs.com.

    SQL Server Web Manager for SQL Azure – CTP this Summer
    SQL Server Web Manager (SSWM) is a lightweight and easy to use database management tool for SQL Azure databases, to be offered this summer.

    Access 2010 Support for SQL Azure – available now
    Yey – at last! Microsoft Office 2010 will natively support data connectivity to SQL Azure – we can now start developing those "departmental apps" with the confidence of a highly available SQL store provisioned in seconds. NB: I don't believe we will support any previous versions of Access talking to SQL Azure.

    The Pre-announced Spatial Data Support to Become Live – live now*
    At MIX in March we announced spatial was coming, and apparently it is now here – although I need to check.

    Related Links
    UK based? Sign up at http://ukazure.ning.com
    SQL Azure Team Blog http://blogs.msdn.com/b/sqlazure/

    Read the article

  • Roger Jennings’ Cloud Computing with the Windows Azure Platform

    - by guybarrette
    Writing and publishing a book about a technology early in its infancy is cruel. You're subjected to many product changes, and your book might be outdated the day it reaches the book stores. I bought Roger Jennings' "Cloud Computing with the Windows Azure Platform" book knowing that it was published in October 2009 and that many changes occurred to the Azure platform in 2009. Right off the bat and from a technology point of view, some chapters are now outdated, but don't reject this book because of that. In the first few chapters, Jennings does a great job of explaining cloud computing and the Azure platform from a business point of view, something that few Azure articles and blogs do right now. You may want to wait for the second edition and read Jennings' outstanding Azure-focused blog in the meantime.

    Read the article

  • SyncToBlog #12 Windows Azure and Cloud Links

    - by Eric Nelson
    Some more "syncing to paper" :)

    Steve Marx wrote a very interesting article about using Hosted Web Core in an Azure Worker Role. Hosted Web Core is a new feature in IIS 7 that enables developers to create applications that load the core IIS functionality.
    Wade Wegner is a new Technical Evangelist for the Windows Azure platform AppFabric. Example from Wade (and how I found him): Host WCF Services in IIS with Service Bus Endpoints.
    Google and vmware "get engaged" over cloud: http://googlecode.blogspot.com/2010/05/enabling-cloud-portability-with-google.html
    A new cloud comparison site – slick but limited coverage (it is not at Azure level, rather BPOS level): www.cloudhypermarket.com
    The Rise of NoSQL Database (devx free registration required)
    Moe Khosravy talks about Codename "Dallas" to my colleague David G (14 min video)
    New videos: Calculating the cost of Azure and Calculating the cost of SQL Azure

    Related Links: Previous SyncToBlog posts; My delicious bookmarks

    Read the article

  • GitHub Integration in Windows Azure Web Site

    - by Shaun
    Microsoft has just announced an update for Windows Azure Web Sites (a.k.a. WAWS). There are four major features added to WAWS: free scaling mode, GitHub integration, custom domains and multiple branches. Since I'm working in Node.js and I would like to have my code in GitHub and deployed automatically to my Windows Azure Web Site once I sync my code, this feature is great news for me.

    It's very simple to establish the GitHub integration in WAWS. First we need a clean WAWS. In its dashboard page click "Set up Git publishing". Currently WAWS doesn't support changing the publishing setting, so if you have an existing WAWS published by TFS or local Git then you have to create a new WAWS and set up Git publishing. Then in the deployment page we can see that WAWS now supports three Git publishing modes:
    - Push my local files to Windows Azure: in this mode we create a new Git repository on the local machine and commit and publish our code to Windows Azure through Git commands or a GUI.
    - Deploy from my GitHub project: in this mode we have a Git repository created on GitHub. Once we publish our code to GitHub, Windows Azure will download the code and trigger a new deployment.
    - Deploy from my CodePlex project: similar to the previous one, but our code lives in a CodePlex repository.

    Now let's go back to GitHub and create a new publish repository. Currently the WAWS GitHub integration only supports public repositories; support for private repositories will be available in several weeks. We can manage our repositories on the GitHub website, but as a Windows geek I prefer a GUI tool, so I opened GitHub for Windows, logged in with my GitHub account, selected the "github" category and clicked the "add" button to create a new repository on GitHub. You can download GitHub for Windows here. I specified the repository name, description and local repository, and did not check "Keep this code private". After a few seconds it created a new repository on GitHub and associated it with that folder on my local machine. We can find this new repository on the GitHub website, and in GitHub for Windows we can also find the local repository by selecting the "local" category.

    Next, we need to associate this repository with our WAWS. Back in the Windows Azure developer portal, open "Deploy from my GitHub project" in the deployment page and click the "Authorize Windows Azure" link. It will bring up a new window on GitHub which lets me allow the Windows Azure application to access my repositories. After we click "Allow", Windows Azure will retrieve all my GitHub public repositories and let me select which one I want to integrate with this WAWS. I selected the one I had just created in GitHub for Windows. So that's all. We have completed the GitHub integration configuration.

    Now let's have a try. In GitHub for Windows, right click the local repository and click "open in explorer". Then I added a simple HTML file.

        <html>
        <head>
        </head>
        <body>
        <h1>
        I came from GitHub, WOW!
        </h1>
        </body>
        </html>

    Save it, go back to GitHub for Windows, commit this change and publish. This uploads our changes to GitHub, and Windows Azure will detect the update and trigger a new deployment. If we go back to the Azure developer portal we can find the new deployment, and our commit message is shown as the deployment description as well. And here is the page deployed to WAWS.

    Hope this helps, Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind.
Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • Windows Azure guidance from the Patterns and Practices team

    - by Eric Nelson
    The P&P team have started to share guidance on the Windows Azure Platform. They plan to group their efforts around:
    1. Moving to the Cloud
    2. Integrating with the Cloud
    3. Leveraging the Cloud
    First up is a document which explains the capabilities and limitations of Enterprise Library 5.0 Beta 2 in terms of use within .NET applications designed to run with the Windows Azure platform. You can download it here.

    Related Links:
    UK Azure Online Community – join today.
    UK Windows Azure Site
    Start working with Windows Azure

    Read the article

  • Azure Storage Explorer

    - by kaleidoscope
    Azure Storage Explorer – another way to work with your services on the cloud. Azure Storage Explorer is a useful GUI tool for inspecting and altering the data in your Azure cloud storage projects, including the logs of your cloud-hosted applications. All three types of cloud storage can be viewed: blobs, queues, and tables. You can also create or delete blob/queue/table containers and items. Text blobs can be edited, and all data types can be imported/exported between the cloud and local files. Table records can be imported/exported between the cloud and spreadsheet CSV files.

    Why Azure Storage Explorer?
    Azure Storage Explorer is a licensed CodePlex project provided by Neudesic, a Microsoft partner. It is a simple UI that requires you to input your blob storage name, access key and endpoints in the Storage Settings dialog. For more details please refer to the link: http://azurestorageexplorer.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=35189

    Anish, S

    Read the article

  • Can't RDP into new (or old) Azure VM

    - by Raif
    I have an Azure account with a VM on it. I haven't used it in about 8 months. I tried to connect today but it won't take my credentials. Now, I'm not entirely sure that I have my password correct (pretty sure, but not entirely). So I created a new VM and set the password, clicked the Connect button in the portal window, tried to connect and was rejected using the password I know to be correct. I have disabled my local machine's firewall and antivirus.

    Read the article

  • New release of the Windows Azure SDK and Tools (March CTP)

    - by kaleidoscope
    From now on, you only have to download the Windows Azure Tools for Microsoft Visual Studio and the SDK will be installed as part of that package.

    What's new in the Windows Azure SDK:
    - Support for developing managed full trust applications, including support for native code via P/Invoke and spawning native processes.
    - Support for developing FastCGI applications, including support for rewrite rules via the URL Rewrite Module.
    - Improved support for the integration of development storage with Visual Studio, including enhanced performance and support for SQL Server (local instance only).

    What's new in the Windows Azure Tools for Visual Studio:
    - Combined installer includes the Windows Azure SDK
    - Addressed top customer bugs
    - Native code debugging
    - Update notification of future releases
    - FastCGI template

    http://blogs.msdn.com/jnak/archive/2009/03/18/now-available-march-ctp-of-the-windows-azure-tools-and-sdk.aspx

    Ritesh, D

    Read the article

  • Azure cloud app subdomain pointing to actual domain

    - by Amit Aggarwal
    Say we have a domain xyz.com registered with some registrar. We pointed that domain to the name servers of our dedicated server, where the DNS for that domain will be hosted. Now, we just want that dedicated server to handle the incoming email, while the domain points to abc.cloudapp.net (an Azure cloud app; they don't provide any static IP, only a public URL). Could someone please help me edit/create the DNS zone file on our dedicated server to make sure things work properly? If possible, paste here the minimum settings we need in the DNS file to make sure mail stays on the dedicated server and the app runs on the cloud... Thanks, Amit

    Read the article

  • Links from UK TechDays 2010 sessions on Entity Framework, Parallel Programming and Azure

    - by Eric Nelson
    [I will do some longer posts around my sessions when I get back from holiday next week]

    Big thanks to all those who attended my 3 sessions at TechDays this week (April 13th and 14th, 2010). I really enjoyed both days and watched some great sessions – my personal fave being the Silverlight/Expression session by my friend and colleague Mike Taulty. The following links should help get you up and running on each of the technologies.

    Entity Framework 4
    Entity Framework 4 Resources http://bit.ly/ef4resources
    Entity Framework Team Blog http://blogs.msdn.com/adonet
    Entity Framework Design Blog http://blogs.msdn.com/efdesign/

    Parallel Programming
    Parallel Computing Developer Center http://msdn.com/concurrency
    Code samples http://code.msdn.microsoft.com/ParExtSamples
    Managed Team Blog http://blogs.msdn.com/pfxteam
    Tools Team Blog http://blogs.msdn.com/visualizeparallel
    My code samples http://gist.github.com/364522
    And PDC 2009 session recordings to watch.

    Windows Azure Platform
    UK Site http://bit.ly/landazure
    UK Community http://bit.ly/ukazure (http://ukazure.ning.com)
    Feedback www.mygreatwindowsazureidea.com
    Azure Diagnostics Manager - A client for Windows Azure Diagnostics
    Cloud Storage Studio - A client for Windows Azure Storage
    SQL Azure Migration Wizard http://sqlazuremw.codeplex.com

    Read the article

  • Microsoft Cuts Windows Azure Compute and Storage Pricing

    The savings begin with Microsoft's Windows Azure Storage Pay-As-You-Go service, which now costs $0.125 per GB as opposed to $0.14 per GB, a savings of 12 percent. Microsoft also slashed the pricing for Windows Azure Storage's 6 Month Plans as much as 14 percent across all tiers. Lastly, compute customers can now enjoy Windows Azure Extra Small Compute pricing of $0.02 per hour instead of $0.04 per hour, a savings of 50 percent. To exhibit the cost advantages offered by Windows Azure, Microsoft noted in a blog post that a 24x7 Extra Small Compute instance with a 100MB SQL Azure database can b...

    Read the article

  • SQL Azure maximum database size rises from 10GB to 50GB in June

    - by Eric Nelson
    At MIX we announced that we will be offering a new 50 GB size option in June. If you would like to become an early adopter of this new size option before it is generally available, send an email to [email protected] and it will auto-reply with instructions to fill out a survey to nominate your application that requires greater than 10 GB of storage.

    Other announcements included:
    - MARS in April: execute multiple batches in a single connection
    - Spatial data in June: geography and geometry types
    - SQL Azure Labs: SQL Azure Labs provides a place where you can access incubations and early preview bits for products and enhancements to SQL Azure. Currently: OData Service for SQL Azure.

    Related Links:
    SQL Azure Announcements at MIX
    http://ukazure.ning.com

    Read the article

  • Azure VM with many IPs or SSL certificates

    - by timmah.faase
    I am looking to move our hosting environment to Azure, and to figure things out I have created a sandpit VM. We host around 300-400 websites in IIS, and about 2% of these sites have unique, non-wildcard certificates, each requiring a unique public IP in our current setup. Can you get a range of IPs pointing to one VM/endpoint? Or is it possible to create an SSL proxy? I've never created an SSL proxy but like the idea of it. I'd need advice here on how to proceed if this is the best option. Sorry if this has been answered, and sorry also if my question isn't worded eloquently.

    Read the article
