Search Results

Search found 6770 results on 271 pages for 'azure storage'.


  • Windows Azure Queues, WCF, MSMQ integration

    - by user104295
    Hi there, I have a scenario where I need a desktop console app to communicate with a Windows Azure Queue... the most important thing is that the message is received by the server eventually. Also, the desktop app may be disconnected from the Internet sometimes. In the traditional WCF+MSMQ approach you'd be able to send a message which would be cached in MSMQ until MSMQ could reach the Server's MSMQ and send the message. What's the equivalent when Windows Azure is the server-side? Is it possible for the same approach to be used, where MSMQ just communicates with a Windows Azure Queue rather than an MSMQ on a Windows Server? Maybe Windows Azure Queue is the wrong approach? I have heard about something called message buffer, but don't know what this is (yet!). thanks for your help Kris
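
    One hedged sketch of an equivalent: MSMQ itself only forwards to other MSMQ hosts, but the desktop app can approximate MSMQ's store-and-forward on its own by buffering messages to local disk while offline and draining them to a Windows Azure Queue once connectivity returns. This is a minimal sketch assuming the current Azure.Storage.Queues SDK rather than the StorageClient library of this era; the OutboxSender class and the local outbox folder are illustrative, not from the post.

        using System;
        using System.IO;
        using Azure.Storage.Queues;

        class OutboxSender
        {
            private readonly QueueClient _queue;
            private readonly string _outboxDir; // local buffer while offline

            public OutboxSender(string connectionString, string queueName, string outboxDir)
            {
                _queue = new QueueClient(connectionString, queueName);
                _outboxDir = outboxDir;
                Directory.CreateDirectory(_outboxDir);
            }

            // Try the cloud first; on any failure, park the message locally.
            public void Send(string message)
            {
                try
                {
                    _queue.CreateIfNotExists();
                    _queue.SendMessage(message);
                }
                catch (Exception)
                {
                    File.WriteAllText(Path.Combine(_outboxDir, Guid.NewGuid() + ".msg"), message);
                }
            }

            // Call periodically (e.g. from a timer) to drain the local buffer.
            public void Flush()
            {
                foreach (var file in Directory.GetFiles(_outboxDir, "*.msg"))
                {
                    _queue.SendMessage(File.ReadAllText(file));
                    File.Delete(file);
                }
            }
        }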

    Read the article

  • Does VSTO run on Windows Azure?

    - by Joni
    I have a Web application which will be deployed to Windows Azure, and I'm looking for alternatives to generate Excel spreadsheets. Can I use VSTO to programmatically generate an Excel spreadsheet in a Web Role running on Windows Azure?... If yes, how should I deploy the application to Windows Azure? What assemblies should I include? Thanks!
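
    As a point of comparison, a hedged sketch: VSTO automates a desktop Office installation, which a Windows Azure web role does not have, so server-side generation is usually done with a file-format library instead. Below is a minimal .xlsx written with the Open XML SDK; this is an alternative technique sketched here, not something from the original question.

        using DocumentFormat.OpenXml;
        using DocumentFormat.OpenXml.Packaging;
        using DocumentFormat.OpenXml.Spreadsheet;

        static class ExcelWriter
        {
            // Creates a one-cell workbook without needing Excel on the machine.
            public static void CreateWorkbook(string path)
            {
                using (var doc = SpreadsheetDocument.Create(path, SpreadsheetDocumentType.Workbook))
                {
                    var wbPart = doc.AddWorkbookPart();
                    wbPart.Workbook = new Workbook();

                    var wsPart = wbPart.AddNewPart<WorksheetPart>();
                    var sheetData = new SheetData();
                    wsPart.Worksheet = new Worksheet(sheetData);

                    // one row with a single inline-string cell
                    var row = new Row();
                    row.Append(new Cell
                    {
                        DataType = CellValues.InlineString,
                        InlineString = new InlineString(new Text("Hello from a web role"))
                    });
                    sheetData.Append(row);

                    var sheets = wbPart.Workbook.AppendChild(new Sheets());
                    sheets.Append(new Sheet
                    {
                        Id = wbPart.GetIdOfPart(wsPart),
                        SheetId = 1U,
                        Name = "Report"
                    });
                }
            }
        }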

    Read the article

  • Windows Azure - Automatic Load Balancing - partitioning

    - by veda
    I was going through some videos and found that Windows Azure groups blobs into partitions based on a partition key and automatically load-balances these partitions across its servers. The partition key for a blob is the blob name, so Azure partitions automatically by blob name. My question: can I make Azure partition based on the container name instead? I want my partition key to be the container name. For example, I have a storage account with 2 containers named container1 and container2. In container1 I have 1000 files named 1.txt, 2.txt, 3.txt, ..., 999.txt, 1000.txt, and in container2 I have another 1000 files named 1001.txt, 1002.txt, 1003.txt, ..., 1999.txt, 2000.txt. Will Windows Azure generate 2000 partitions based on the blob names and serve me through several servers? Wouldn't it be better if Azure partitioned based on the container name: container1 on one server and container2 on another?

    Read the article

  • Keep local MS SQL 2008 DB table and remote SQL Azure DB table in sync

    - by Boomerangertanger
    Hi there, I have a dedicated server which hosts a Windows Service that does a lot of heavy-load work and populates a number of SQL Server database tables. However, of all the database tables it populates and works with, I want only one to be synchronised with a remote SQL Azure DB table. This is because that table holds what I call Resolved data, which is the end result of the Windows Service's work. I would like to keep a SQL Azure database table in sync with it. As far as I understand, my options are: (1) move everything onto Azure (but that involves a massive development overhead and risk), or (2) have another Windows Service on the dedicated server which essentially looks at the records changed since the last update and then manually updates the SQL Azure table.
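
    A hedged sketch of what option (2) could look like, assuming a rowversion column on the source table as the change marker; the table and column names (Resolved, Id, Payload, RowVer) are illustrative, not from the post.

        using System.Data.SqlClient;

        static class ResolvedSync
        {
            // Copies rows changed since the last pass and returns the new high-water mark.
            public static byte[] SyncChangedRows(string localCs, string azureCs, byte[] lastVersion)
            {
                using (var src = new SqlConnection(localCs))
                using (var dst = new SqlConnection(azureCs))
                {
                    src.Open();
                    dst.Open();

                    var read = new SqlCommand(
                        "SELECT Id, Payload, RowVer FROM Resolved WHERE RowVer > @v ORDER BY RowVer", src);
                    read.Parameters.AddWithValue("@v", lastVersion);

                    using (var reader = read.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            // upsert each changed row into the SQL Azure copy
                            var upsert = new SqlCommand(
                                @"MERGE Resolved AS t
                                  USING (SELECT @id AS Id, @p AS Payload) AS s ON t.Id = s.Id
                                  WHEN MATCHED THEN UPDATE SET Payload = s.Payload
                                  WHEN NOT MATCHED THEN INSERT (Id, Payload) VALUES (s.Id, s.Payload);", dst);
                            upsert.Parameters.AddWithValue("@id", reader.GetInt32(0));
                            upsert.Parameters.AddWithValue("@p", reader.GetString(1));
                            upsert.ExecuteNonQuery();

                            lastVersion = (byte[])reader[2]; // remember how far we got
                        }
                    }
                }
                return lastVersion; // persist this for the next pass
            }
        }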

    Read the article

  • Windows azure deployment

    - by smoothe
    I just built a simple hello-world Windows Azure service containing just one web role, using Visual Studio 2008 and Windows Azure Tools for VS 1.2. I am pretty new to this and I have been trying to deploy an application all afternoon. I'm in Australia, deploying to the region "Asia anywhere". I have pretty much followed the info provided on MSDN; it says uploaded 95%, then after about ten minutes the deployment disappears. I have tried using the old Windows Azure developer portal, and 30 minutes later I still cannot access the service; its status is either busy or stopped. I have the introductory offer for an extra small compute instance on the subscription I am deploying to. Can anyone with Windows Azure experience elaborate on deploying apps and on the status of my application? I am very keen to get into the platform and this issue has just about spoiled my weekend.

    Read the article

  • Best practices for (over)using Azure queues

    - by John
    Hi, I'm in the early phases of designing an Azure-based application. One of the things that attracts me to Azure is the scalability, given the variability of the demand I'm likely to expect. As such I'm trying to keep things loosely coupled so I can add instances when I need to. The recommendations I've seen for architecting an application for Azure include keeping web role logic to a minimum, and having processing done in worker roles, using queues to communicate and some sort of back-end store like SQL Azure or Azure Tables. This seems like a good idea to me as I can scale up either or both parts of the application without any issue. However I'm curious if there are any best practices (or if anyone has any experiences) for when it's best to just have the web role talk directly to the data store vs. sending data by the queue? I'm thinking of the case where I have a simple insert to do from the web role - while I could set this up as a message, send it on the queue, and have a worker role pick it up and do the insert, it seems like a lot of double-handling. However I also appreciate that it may be the case that this is better in the long run, in case the web role gets overwhelmed or more complex logic ends up being required for the insert. I realise this might be a case where the answer is "it depends entirely on the situation, check your perf metrics" - but if anyone has any thoughts I'd be very appreciative! Thanks John
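
    For the simple-insert case this question weighs, a minimal sketch of the two paths, assuming the current Azure.Storage.Queues SDK; the table, queue and class names are illustrative.

        using System.Data.SqlClient;
        using Azure.Storage.Queues;

        static class CommentWriter
        {
            // Path 1: the web role talks straight to the store. One hop, simplest code.
            public static void InsertDirect(string sqlCs, string text)
            {
                using (var conn = new SqlConnection(sqlCs))
                {
                    conn.Open();
                    var cmd = new SqlCommand("INSERT INTO Comments (Body) VALUES (@b)", conn);
                    cmd.Parameters.AddWithValue("@b", text);
                    cmd.ExecuteNonQuery();
                }
            }

            // Path 2: the web role only enqueues; a worker role dequeues and inserts.
            // Double-handling, but the front end stays fast under load and the insert
            // logic can grow in the worker without touching the web role.
            public static void EnqueueForWorker(string storageCs, string text)
            {
                var queue = new QueueClient(storageCs, "comment-inserts");
                queue.CreateIfNotExists();
                queue.SendMessage(text);
            }
        }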

    Read the article

  • Use Azure Appfabric on a local system

    - by Mimefilt
    Hi there, we want to use Azure AppFabric for our software, but not every customer wants to buy an expensive Azure account. Is it possible to define and use an Azure interface in our software but connect the server and client locally? Mimefilt

    Read the article

  • What is the correct install process to setup Node.js with Windows Azure Emulator

    - by PazoozaTest Pazman
    This question is related to: Node.js running under IIS Express Keeps Crashing, and I need help reinstalling and getting Node.js up and running in the Windows emulator. Hello, I am reinstalling my machine: a Toshiba laptop, 2 GB RAM, 32-bit processor. What is the correct procedure from start to finish to get Node.js development working? So far nothing has worked, and the emulator (IIS Express) worker process keeps crashing; no matter how many instances, they all end up crashing. Up until two weeks ago my Node.js development was working fine, but I had to do a reinstall, and since then I haven't been doing any Node.js development on the Windows emulator because the latest June 2012 Azure SDK for Node.js is buggy. These are the steps I have taken:

    1) Reformat HD
    2) Insert Windows 7 N SP1 CD
    3) Reboot machine into CD installation
    4) Follow and wait until Windows 7 is installed
    5) Run Add/Remove programs + enable IIS + IIS management tools
    6) Run Windows Update (installed about 53 updates)
    7) Go to http://www.windowsazure.com/en-us/develop/nodejs/
    8) Click Windows Installer June 2012 and install Windows Azure SDK for Node.js - June 2012
    9) Run Azure PowerShell
    10) Navigate to c:\node\testSite\webrole1
    11) Launch site: start-azureemulator -launch
    12) Play around on website (then crash!)

    Problem signature:
    Problem Event Name: APPCRASH
    Application Name: iisexpress.exe
    Application Version: 8.0.8298.0
    Application Timestamp: 4f620349
    Fault Module Name: iiscore.dll
    Fault Module Version: 8.0.8298.0
    Fault Module Timestamp: 4f63b65c
    Exception Code: c0000005
    Exception Offset: 00021767
    OS Version: 6.1.7601.2.1.0.256.28
    Locale ID: 1033
    Additional Information 1: f66d
    Additional Information 2: f66d807b515d6b2dc6f28f66db769a01
    Additional Information 3: 7b2f
    Additional Information 4: 7b2f6797d07ebc2c23f2b227e779722e

    Am I missing a step in my reinstall process? Do I have all the required files for Node.js Windows Azure emulator development? Why is IIS Express crashing all the time? Can I still do Node.js Windows Azure emulator development without IIS Express, using the local IIS 7.x that ships with Windows 7 N (SP1)?

    Read the article

  • Node.js Adventure - Host Node.js on Windows Azure Worker Role

    - by Shaun
    In my previous post I demonstrated how to develop and deploy a Node.js application on Windows Azure Web Site (a.k.a. WAWS). WAWS is a new feature of the Windows Azure platform. It is low-cost, and it provides the IIS and IISNode components so that we can host our Node.js application through Git, FTP and WebMatrix without any configuration or component installation. But sometimes we need to use Windows Azure Cloud Service (a.k.a. WACS) and host our Node.js application in a worker role. Below are some benefits of using a worker role:

    - WAWS leverages IIS and IISNode to host Node.js applications, which run in x86 WOW mode. This reduces performance compared with x64 in some cases.
    - A WACS worker role does not need IIS, hence there are no IIS restrictions, such as the 8,000 concurrent requests limitation.
    - WACS provides more flexibility and control to developers. For example, we can RDP to the virtual machines of our worker role instances.
    - WACS provides service configuration features which can be changed while the role is running.
    - WACS provides more scaling capability than WAWS. In WAWS we can have at most 3 reserved instances per web site, while in WACS we can have up to 20 instances in a subscription.
    - Since with a WACS worker role we start Node.js ourselves in a process, we can control the input, output and error streams. We can also control the version of Node.js.

    Run Node.js in a Worker Role

    Node.js can be started with just its execution file. This means that in Windows Azure we can have a worker role containing "node.exe" and the Node.js source files, and start it in the Run method of the worker role entry class. Let's create a new Windows Azure project in Visual Studio and add a new worker role. Since we need the worker role to execute "node.exe" with our application code, we need to add "node.exe" to our project. Right-click the worker role project and add an existing item. By default Node.js is installed in the "Program Files\nodejs" folder, so we can navigate there and add "node.exe". Then we need to create the entry code for Node.js. In WAWS the entry file must be named "server.js", because the application is hosted by IIS and IISNode, and IISNode only accepts "server.js". But here, since we control everything, we can choose any file as the entry code. For example, I created a new JavaScript file named "index.js" in the project root. Since we created a C# Windows Azure project we cannot create a JavaScript file from the "Add new item" context menu; we have to create a text file and then rename it with the JavaScript extension. After adding these two files we should set their "Copy to Output Directory" property to "Copy Always" or "Copy if Newer"; otherwise they will not be included in the package when deployed. Let's paste a very simple piece of Node.js code into "index.js", as below. As you can see, I created a web server listening on port 12345.

        var http = require("http");
        var port = 12345;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then we need to start "node.exe" with this file when our worker role starts. This can be done in the role's Run method: locate Node.js and the entry JavaScript file, then create a new process to run them. Our worker role will wait for the process to exit. If everything is OK, once our web server is up the process will stay there listening for incoming requests and should not terminate. The code in the worker role looks like this:

        public override void Run()
        {
            // This is a sample worker implementation. Replace with your logic.
            Trace.WriteLine("NodejsHost entry point called", "Information");

            // retrieve the node.exe and entry node.js source code file name.
            var node = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot\node.exe");
            var js = "index.js";

            // prepare the process starting of node.exe
            var info = new ProcessStartInfo(node, js)
            {
                CreateNoWindow = false,
                ErrorDialog = true,
                WindowStyle = ProcessWindowStyle.Normal,
                UseShellExecute = false,
                WorkingDirectory = Environment.ExpandEnvironmentVariables(@"%RoleRoot%\approot")
            };
            Trace.WriteLine(string.Format("{0} {1}", node, js), "Information");

            // start the node.exe with entry code and wait for exit
            var process = Process.Start(info);
            process.WaitForExit();
        }

    Then we can run it locally. In the compute emulator UI the worker role starts and executes Node.js, and the Node.js window appears. Open the browser to verify the website hosted by our worker role. Next let's deploy it to Azure, which requires some additional steps. First, we need to create an input endpoint. By default there is no endpoint defined in a worker role, so we open the role property window in Visual Studio and create a new input TCP endpoint on the port we want our website to use; in this case I will use 80. Even though we created a web server, we add a TCP endpoint to the worker role, since Node.js always listens on TCP rather than HTTP. Then change "index.js" so our web server listens on 80:

        var http = require("http");
        var port = 80;

        http.createServer(function (req, res) {
            res.writeHead(200, { "Content-Type": "text/plain" });
            res.end("Hello World\n");
        }).listen(port);

        console.log("Server running at port %d", port);

    Then publish it to Windows Azure, and in the browser we can see our Node.js website running on a WACS worker role. Note that we may encounter an error if we try to run our Node.js website on port 80 in the local emulator. This is because the compute emulator registers port 80 and maps the endpoint to 81, but Node.js cannot detect this, so when it tries to listen on 80 it fails, since 80 is already in use.

    Use NPM Modules

    When using WAWS to host Node.js, we can simply install the modules we need and then publish or upload all files to WAWS. But if we are using a WACS worker role, we have to take some extra steps to make the modules work. Assume we plan to use "express" in our application. First we download and install the module through the NPM command. After the installation finishes the files are on disk but not included in the worker role project; if we deployed the worker role right now the module would not be packaged and uploaded to Azure. Hence we need to add them to the project. In the Solution Explorer window click the "Show all files" button, select the "node_modules" folder, and choose "Include In Project" from the context menu. But that is not enough. We also need to set every file in the module to "Copy always" or "Copy if newer", so that they are uploaded to Azure along with "node.exe" and "index.js". This is a painful step, since there may be many files in a module. So I created a small tool which updates a C# project file and marks all of its items as "Copy always". The code is very simple:

        static void Main(string[] args)
        {
            if (args.Length < 1)
            {
                Console.WriteLine("Usage: copyallalways [project file]");
                return;
            }

            var proj = args[0];
            File.Copy(proj, string.Format("{0}.bak", proj));

            var xml = new XmlDocument();
            xml.Load(proj);
            var nsManager = new XmlNamespaceManager(xml.NameTable);
            nsManager.AddNamespace("pf", "http://schemas.microsoft.com/developer/msbuild/2003");

            // add the output setting to copy always
            var contentNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:Content", nsManager);
            UpdateNodes(contentNodes, xml, nsManager);
            var noneNodes = xml.SelectNodes("//pf:Project/pf:ItemGroup/pf:None", nsManager);
            UpdateNodes(noneNodes, xml, nsManager);
            xml.Save(proj);

            // remove the namespace attributes
            var content = xml.InnerXml.Replace("<CopyToOutputDirectory xmlns=\"\">", "<CopyToOutputDirectory>");
            xml.LoadXml(content);
            xml.Save(proj);
        }

        static void UpdateNodes(XmlNodeList nodes, XmlDocument xml, XmlNamespaceManager nsManager)
        {
            foreach (XmlNode node in nodes)
            {
                var copyToOutputDirectoryNode = node.SelectSingleNode("pf:CopyToOutputDirectory", nsManager);
                if (copyToOutputDirectoryNode == null)
                {
                    var n = xml.CreateNode(XmlNodeType.Element, "CopyToOutputDirectory", null);
                    n.InnerText = "Always";
                    node.AppendChild(n);
                }
                else
                {
                    if (string.Compare(copyToOutputDirectoryNode.InnerText, "Always", true) != 0)
                    {
                        copyToOutputDirectoryNode.InnerText = "Always";
                    }
                }
            }
        }

    Please be careful when using this tool. I created it only for the demo, so do not use it directly in a production environment. Unload the worker role project, execute the tool with the worker role project file name as the command-line argument, and it will set all items to "Copy always". Then reload the worker role project. Now let's change "index.js" to use express:

        var express = require("express");
        var app = express();

        var port = 80;

        app.configure(function () {
        });

        app.get("/", function (req, res) {
            res.send("Hello Node.js!");
        });

        app.get("/User/:id", function (req, res) {
            var id = req.params.id;
            res.json({
                "id": id,
                "name": "user " + id,
                "company": "IGT"
            });
        });

        app.listen(port);

    Finally, publish it and have a look in the browser.

    Use Windows Azure SQL Database

    We can use Windows Azure SQL Database (a.k.a. WASD) from Node.js on worker role hosting as well. Since we can control the version of Node.js, here we can use the x64 version of "node-sqlserver". This is better than hosting Node.js on WAWS, which only supports x86. Just install the "node-sqlserver" module from NPM, copy "sqlserver.node" from the "Build\Release" folder to the "Lib" folder, include them in the worker role project, and run my tool to set them to "Copy always". Finally, update "index.js" to use WASD:

        var express = require("express");
        var sql = require("node-sqlserver");

        var connectionString = "Driver={SQL Server Native Client 10.0};Server=tcp:{SERVER NAME}.database.windows.net,1433;Database={DATABASE NAME};Uid={LOGIN}@{SERVER NAME};Pwd={PASSWORD};Encrypt=yes;Connection Timeout=30;";
        var port = 80;

        var app = express();

        app.configure(function () {
            app.use(express.bodyParser());
        });

        app.get("/", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    conn.queryRaw("SELECT * FROM [Resource]", function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/text/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "SELECT * FROM [Resource] WHERE [Key] = '" + key + "' AND [Culture] = '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.get("/sproc/:key/:culture", function (req, res) {
            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var key = req.params.key;
                    var culture = req.params.culture;
                    var command = "EXEC GetItem '" + key + "', '" + culture + "'";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.json(results);
                        }
                    });
                }
            });
        });

        app.post("/new", function (req, res) {
            var key = req.body.key;
            var culture = req.body.culture;
            var val = req.body.val;

            sql.open(connectionString, function (err, conn) {
                if (err) {
                    console.log(err);
                    res.send(500, "Cannot open connection.");
                }
                else {
                    var command = "INSERT INTO [Resource] VALUES ('" + key + "', '" + culture + "', N'" + val + "')";
                    conn.queryRaw(command, function (err, results) {
                        if (err) {
                            console.log(err);
                            res.send(500, "Cannot retrieve records.");
                        }
                        else {
                            res.send(200, "Inserted Successful");
                        }
                    });
                }
            });
        });

        app.listen(port);

    Publish to Azure, and now our Node.js works with WASD through the x64 version of "node-sqlserver".

    Summary

    In this post I demonstrated how to host Node.js in a Windows Azure Cloud Service worker role. By using a worker role we can control the version of Node.js as well as the entry code, and it's possible to do some preparatory jobs before the Node.js application starts. It also removes the IIS and IISNode limitations. I personally recommend a worker role for Node.js hosting. But there are some problems with the approach I described here. The first is that we need to set all JavaScript files and module files to "Copy always" or "Copy if newer" manually. The second is that this way we cannot retrieve the cloud service configuration information. For example, we defined the endpoint in the worker role properties, but we also hard-coded the listening port in Node.js. It should be changed so that Node.js can retrieve the endpoint, but I can tell you it won't work here. In the next post I will describe another way to execute "node.exe" and the Node.js application, so that we can get the cloud service configuration in Node.js. I will also demonstrate how to use Windows Azure Storage from Node.js using the Windows Azure Node.js SDK.

    Hope this helps,
    Shaun

    All documents and related graphics and code are provided "AS IS" without warranty of any kind. Copyright © Shaun Ziyan Xu. This work is licensed under the Creative Commons License.

    Read the article

  • SQL Azure news at TechEd 2010

    - by guybarrette
    More Azure news from TechEd US 2010. This time, it's from the SQL Azure team:

    - 50GB databases available on June 28th
    - Support for Spatial Data
    - Data Sync Service for SQL Azure
    - Microsoft SQL Server Web Manager
    - Access 2010 Support for SQL Azure

    Read about it here

    Read the article

  • Microsoft Azure gains two new services, for NoSQL and full-text search, plus Apache HBase for Azure HDInsight

    Microsoft Azure gains two new services, one for NoSQL and one for full-text search, not to mention Apache HBase for Azure HDInsight. Microsoft has released previews of new capabilities for its Microsoft Azure cloud service: two services, one dedicated to databases and the other to full-text search. The database service has been named Azure DocumentDB; it will allow relational databases to be complemented with a service...

    Read the article

  • What happens to the storage capacity when I uninstall Ubuntu?

    - by shole1202
    I used the Wubi installer for Ubuntu 12.04. After having trouble getting the operating system to boot, I tried uninstalling it with Wubi. From 'My Computer' (in Windows 7), I noticed the maximum capacity of my hard drive dropped from 256 GB to 238 GB. I have tried some command-prompt methods to locate the missing storage, but Windows now only recognizes 238 GB on the disk instead of the original 256. Is there any way to recover that space?

    Read the article

  • VMWare ESX, storage over 2TB

    - by Phliplip
    Hi, first off, I'm a web developer and my server experience lies in setting up FreeBSD web servers. I'm working on a project for a photographer, and I'm hired to develop a new online photo ordering system where users can, of course, view their photos :) They have a massive need for storage, so we have bought an HP G6 and 8x 1TB SATA HDDs. Our plan is to install VMware ESX 4.0, running multiple virtual machines: FreeBSD 8 for the webserver and some Windows servers. Already done that. Then mount one big storage volume to the BSD VM and share it through Samba to the Windows servers. The RAID is set up with one 2x1TB array to handle the VMs, and the rest as three 2x1TB arrays to handle the photo data; thus 2.73TB for photo data (the arrays are RAID 1+0). Now, if we add a datastore in ESX and add the 3 LUNs, we can get a datastore of 2.74TB. But I don't see how I can attach this datastore directly to the VM, and only the BSD VM needs access to it. The only way seems to be to create a virtual disk, with a max of 2TB (8MB block size), because the datastore where we save the virtual disk has a maximum file size of 2TB, and then add it as a hard disk to the BSD VM. In the 'Add Harddisk' pane for the VM, I see an option for Raw Device Mapping, which I think gives access to the datastore or the RAID directly. Only problem is that it's greyed out! Can I access the data storage directly from the BSD VM, without creating and adding a virtual disk?

    Read the article

  • Dedicated server with a lot of storage and good support - and cost-effective

    - by Martin Burger
    Hello, I am from Germany and looking for a dedicated server located in the US with a lot of storage: 750-1500 GB. CPU speed and amount of memory are secondary; the server will host large amounts of media files via HTTP and FTP, and the basic task is to help people exchange media files. In Germany there are some good offers, like the "Root Server EQ6" at www.hetzner.de. For example, that company provides high-quality support, and their plans are very cost-effective. The plan mentioned above costs about $90 per month and provides two 1500 GB SATA-II HDDs (software RAID 1). In the US, I found (amongst others) Go Daddy and Rackspace. Go Daddy offers some "Storage Monster" plans that include 2 x 1,000 GB hard drives for about $180 per month, already twice as much as Hetzner above. However, I found some blog and forum entries that complain about the support provided by Go Daddy. Rackspace seems to provide decent support, but they are very "upscale": their dedicated servers are customizable and start at $419, thus about 4.5 times as much as Hetzner. Can anybody recommend a solution or plan that is comparable to the one by Hetzner? Or are prices for dedicated servers in general much higher than in Germany? Regards, Martin

    Read the article

  • Multi-petabyte scale out storage solution [closed]

    - by Alex Yuriev
    Let's say that I have a need for a single-namespace, multi-petabyte object store with a file-system-like wrapper. What is currently out there that supports the following: a single namespace that can take 1B files; support for multiple entry points using NFS; at least node-level replication (preferably node- and file-level replication); online software upgrades; no "magic sauce" on the storage layer. The following has been evaluated: Gluster & Lustre - just ick - fundamental lack of understanding of why online upgrades are mandatory. OneFS - we have it, and it is smelling more and more like it hides a dead body under the hood. Other than MapR and ZFS, am I missing anything? P.S. Oh yes, I keep forgetting that the forums are for people to discuss whether a 2TB drive actually stores 2TB of info. My bad. Seriously though, how can "meets the following requirements" be considered a "debate"? P.P.S. I did not throw an idiotic insult; I pointed out that this is actually an interesting question compared to a conversation about the storage capacity of a 2TB hard drive. It is not a question of what works better; it asks whether I have missed any currently existing products which fit the criteria, where the criteria are clearly outlined. I got one answer below which included something that I have not looked at in a long time, and which looks quite a bit more grown up than when I briefly looked at it before.

    Read the article

  • Integrate Bing Search API into ASP.Net application

    - by sreejukg
    A couple of months back, I wrote an article about how to integrate the Bing Search engine (API 2.0) with an ASP.Net website. You can refer to the article here: http://weblogs.asp.net/sreejukg/archive/2012/04/07/integrate-bing-api-for-search-inside-asp-net-web-application.aspx Things are changing rapidly in the tech world and Bing has also changed! The Bing Search API 2.0 will work until August 1, 2012; after that it will not return results. Shocked? Don't worry: the API has moved to the Windows Azure Marketplace, where you can sign up and continue using it, and there is a free tier available based on your usage. In this article, I am going to explain how you can integrate the new Bing API available in the Windows Azure Marketplace with your website. You can access the Windows Azure Marketplace at https://datamarket.azure.com/ There are lots of applications available for you to subscribe to and use; Bing is one of them. You can find the new Bing Search API at https://datamarket.azure.com/dataset/5BA839F1-12CE-4CCE-BF57-A49D98D29A44

    To get access to the Bing Search API, first you need to register an account with the Windows Azure Marketplace. Sign in to the Windows Azure Marketplace site using your Windows Live account; once signed in, you need to register for a Windows Azure Marketplace account. From the Windows Azure Marketplace you will see the sign-in button at the top right of the page. Clicking the sign-in button takes you to the Windows Live ID authentication page, where you can enter a Windows Live ID to log in. Once logged in you will see the registration page for the Windows Azure Marketplace. You can agree or disagree to email usage by Microsoft; I believe selecting the check box means you will get email about what is happening in the Windows Azure Marketplace. Click the Continue button once you are done. On the next page you must accept the terms of use; it is not optional. Scroll down the page, select the I agree checkbox and click the Register button. Now you are a registered member of the Windows Azure Marketplace and can subscribe to data applications.

    In order to use the Bing API in your application, you must obtain your Account Key. In the previous version of Bing you required an API key; the current version uses an Account Key instead. Once you are logged in to the Windows Azure Marketplace, you can see "My Account" in the top menu; go to the "My Account" section. From the My Account section you can manage your subscriptions and Account Keys. Account Keys are used by your applications to access your subscriptions from the marketplace. Click the My Account link, then Account Keys in the left menu, and then add an account key, or use the default Account Key available. Creating an account key is a very simple process, and you can remove the account keys you create if necessary. The next step is to subscribe to the Bing Search API. At this moment, Bing offers 2 APIs for search:

    1. Bing Search API - https://datamarket.azure.com/dataset/5ba839f1-12ce-4cce-bf57-a49d98d29a44
    2. Bing Search API - Web Results Only - https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65

    The difference is that the latter gives you only web results, whereas with the other you can specify the source type, such as image, video, web, news, etc. Carefully choose the API based on your application requirements. In this article I am going to use the Web Results Only API, but the steps are similar for both. Go to the API page https://datamarket.azure.com/dataset/8818f55e-2fe5-4ce3-a617-0b8ba8419f65, where you can see the subscription options on the right side and, at the bottom of the page, the free option. Since I am going to use the free option, just click its Sign Up link, select the I agree check box and click the Sign Up button. You will get a receipt page that details your subscription. Now you are ready to use the Bing Search API - Web Results.

    The next step is to integrate the API into your ASP.Net application. If you go to the Search API page (as well as the receipt page), you can see a .Net C# Class Library link; click the link and you will get a code file named "BingSearchContainer.cs". In the following sections I am going to demonstrate the use of the Bing Search API from an ASP.Net application. Create an empty ASP.Net web application. Now add the downloaded code file ("BingSearchContainer.cs") to the project: right-click your project in Solution Explorer, Add -> Existing Item, then browse to the downloaded location, select the "BingSearchContainer.cs" file and add it to the project. To build the code file you need to add a reference to the following library: System.Data.Services.Client. You can find the library in the .Net tab when you select Add -> Reference. Try to build your project now; it should build without any errors. Add an ASP.Net page to the project. I have included a text box, a button, and a GridView on the page; the idea is to search the text entered and display the results in the GridView. In the click event handler for the search button, I have used code along the lines shown further below. Now run your project, enter some text in the text box and click the Search button, and you will see the results coming from Bing. I entered the text "Microsoft" in the textbox, clicked the button, and got results back.

    Searching Specific Websites

    If you want to search a particular website, you pass the site URL as site:<site url name>, and if you have more sites, use a pipe (|). E.g. the following search query: site:microsoft.com | site:adobe.com design will search for the word "design" and return results from Microsoft.com and Adobe.com. This is the sample code that searches only Microsoft.com for the text entered in the example above:

        var webResults = bingContainer.Web("site:www.Microsoft.com " + txtSearch.Text, null, null, null, null, null, null);

    Paging the results returned by the API

    By default the Bing API returns 100 results for your query, and the code file you downloaded from Bing doesn't include any option for paging. You can modify the downloaded code to add it. The Bing API supports two parameters: $top (the number of results to return) and $skip (the number of records to skip). So if you want the 3rd page of results with page size = 10, you pass $top = 10 and $skip = 20. Open BingSearchContainer.cs in the editor. You can see the Web method in it:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options)

    In the method signature, I have added two more parameters:

        public DataServiceQuery<WebResult> Web(String Query, String Market, String Adult, Double? Latitude, Double? Longitude, String WebFileType, String Options, int resultCount, int pageNo)

    and in the method body you need to pass the parameters to the query variable:

        query = query.AddQueryOption("$top", resultCount);
        query = query.AddQueryOption("$skip", (pageNo - 1) * resultCount);
        return query;

    Note that I didn't perform any validation, but you need to check conditions such as resultCount and pageNo being greater than or equal to 1. If the parameters are not valid, the Bing Search API will throw an error. Now see the following code in the SearchPage.aspx.cs file:

        protected void btnSearch_Click(object sender, EventArgs e)
        {
            var bingContainer = new Bing.BingSearchContainer(new Uri("https://api.datamarket.azure.com/Bing/SearchWeb/"));
            // replace this value with your account key
            var accountKey = "your key";
            // the next line configures the bingContainer to use your credentials.
            bingContainer.Credentials = new NetworkCredential(accountKey, accountKey);
            var webResults = bingContainer.Web("site:microsoft.com " + txtSearch.Text, null, null, null, null, null, null, 3, 2);
            lstResults.DataSource = webResults;
            lstResults.DataBind();
        }

    This code returns 3 results starting from the second page (by skipping the first 3 results). Bing provides complete integration with its offerings: when you develop search-based applications, you can use the power of Bing to perform the search. Integrating the Bing Search API into an ASP.Net application is a simple process, and without investing much time you can develop a good search-based application. Make sure you read the terms of use before designing the application, and decide which API usage is suitable for you.

    Further reading:
    Bing API Migration Guide: http://go.microsoft.com/fwlink/?LinkID=248077
    Bing API FAQ: http://go.microsoft.com/fwlink/?LinkID=252146
    Bing API Schema Guide: http://go.microsoft.com/fwlink/?LinkID=252151

    Read the article

  • HTML5 web storage: can different websites overwrite each other’s data on a user’s computer?

    - by Deepak Mahalingam
    I have a few questions regarding the concept of HTML5 storage. I went through the W3C specification, books and tutorials on the subject, but I am still a bit unclear about certain concepts. Assume that I access Website A, and some JavaScript runs in my browser that sets a key-value pair, say ('username', 'deepak'). Then I access Website B, which also adds a key-value pair in localStorage as ('username', 'mahalingam'). How will the two be differentiated? Will Website B override the value set by Website A in my localStorage? How can we ensure that a website cannot erase all of my localStorage?

    Read the article

  • Article about Sun ZFS Storage Appliances

    - by Owen Allen
    Sun ZFS Storage Appliances are versatile storage systems. Discovering and managing them in Ops Center, though, makes them even more versatile. If you discover a Sun ZFS Storage Appliance in Ops Center 12c, you can create iSCSI and Fibre Channel LUNs, and make the LUNs available to server pools and virtualization hosts as a storage library. Barbara Higgins has written an excellent article that walks you through the process of setting up a Sun ZFS Storage Appliance and discovering and managing it in Ops Center. If you're looking into ways to make a Sun ZFS Storage Appliance work for you, it's worth a look.

    Read the article

  • Join Our Call: Sun Storage 2500-M2 Announcement

    - by user797911
    Oracle's Sun Storage 2500-M2 array brings together the latest Fibre Channel (FC) and SAS2 technologies with Oracle's Sun Storage Common Array software to create a robust solution that's equally adept in an entry-level storage area network (SAN) for the mid-size business and when integrating into an existing storage network within the enterprise. The Sun Storage 2500-M2 replaces Sun's Storage 2500 array product line and is designed so that the customer may have a quick qualification time for fast and easy deployment in traditional 2500 environments. Jun Jang, Oracle Principal Product Manager, will be hosting this 1-hour live call (a recording will be available); please join us to find out more:

    Event Date: 24-JUN-11
    Event Time: 08:00 am PST/PDT / 4pm UK time
    Web Registration and Access: http://oukc.oracle.com/static09/opn/login/?t=livewebcast|c=1031672594
    Access for Mobile Devices: http://my.oracle.com/content/web/cnt636926
    Call Provider: Intercall
    International Participant Dial-In Number: 706-634-8508
    Additional International Dial-In Numbers Link: http://www.intercall.com/national/oracleuniversity/gdnam.html
    Dial-In Passcode: 96395

    Read the article

  • Azure Blob storage defrag

    - by kaleidoscope
    Blob Storage is really handy for storing temporary data structures during scaled-out distributed processing. Yet the lifespan of those data structures should not exceed that of the underlying operation; otherwise clutter and dead data can start filling up your Blob Storage. Temporary data in cloud computing is very similar to memory in object-oriented languages: when collection is not done automatically by the framework, temp data tends to leak. In particular, in cloud computing it's pretty easy to end up with storage leaks due to collection omission, app crashes, and service interruptions. All those events cause garbage to accumulate in your Blob Storage. It must also be noted that for most cloud apps, I/O costs are usually predominant compared to pure storage costs, so enumerating through your whole Blob Storage to clean the garbage is likely to be an expensive solution. Lokesh, M
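
    A hedged sketch of such a cleanup pass, assuming the current Azure.Storage.Blobs SDK (not the StorageClient library of this era) and an illustrative "temp-data" container whose blobs are safe to delete after a fixed lifespan.

        using System;
        using Azure.Storage.Blobs;
        using Azure.Storage.Blobs.Models;

        static class BlobGarbageCollector
        {
            // Deletes temporary blobs older than the given lifespan. Note that the
            // listing itself is I/O, which is why sweeping a huge, uncurated
            // container gets expensive.
            public static void Sweep(string connectionString, TimeSpan lifespan)
            {
                var container = new BlobContainerClient(connectionString, "temp-data");
                var cutoff = DateTimeOffset.UtcNow - lifespan;

                foreach (BlobItem blob in container.GetBlobs())
                {
                    if (blob.Properties.LastModified < cutoff)
                    {
                        container.DeleteBlob(blob.Name);
                    }
                }
            }
        }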

    Read the article

  • How to Use Windows 8's Storage Spaces to Mirror & Combine Drives

    - by Chris Hoffman
    “Storage Spaces” is a new feature in Windows 8 that can combine multiple hard drives into a single virtual drive. It can mirror data across multiple drives for redundancy or combine multiple physical drives into a single pool of storage. You can even create pools of storage larger than the amount of physical storage space you have available. When the physical storage fills up, you can plug in another drive and take advantage of it with no additional configuration required. Storage Spaces is similar to RAID or LVM on Linux.

    Read the article

  • Oracle OpenWorld 2012 - Managing Storage in the Cloud

    - by jwalker
    At Oracle OpenWorld this year attendees will get experience using the Sun ZFS Storage Appliance during the Managing Storage in the Cloud Hands-On-Lab. Using Sun ZFS Storage, we will be provisioning Oracle Enterprise Linux Virtual Machines and filesystem shares that can be used with Oracle Database. We will also be using Oracle DTrace Analytics to analyze I/O workloads and drill down to see how the storage is really being used. Hope you can join us! Session ID: HOL10034 Session Title: Managing Storage in the Cloud Speakers: Brian Haskins, Nagendran J, Paul Johnson, Karlheinz Vogel and Jim Walker Venue and Room: Marriott Marquis - Salon 14/15 Date and Times: Monday October 1 - 3:15-4:15PM, Tuesday October 2 - 5:00-6:00PM Oracle OpenWorld Storage Sessions

    Read the article

  • Off-the-shelf solutions for migrating data from azure blobstorage to rackspace cloud files

    - by S.C.
    I have large amounts of data (500+ GB) stored in azure blob storage that I need to transfer to rackspace cloud files. I know it is possible to perform such a migration using the SDKs from both services, but is there a free, standard, 1-step process for doing this? I've built a POC utility but would like to avoid having to optimize it to perform the transfer within a reasonable amount of time. Thanks.
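
    As a hedged sketch of what the Azure half of a hand-rolled migration could look like, assuming the current Azure.Storage.Blobs SDK; the Rackspace Cloud Files upload side would use their own SDK and is omitted here.

        using System.IO;
        using Azure.Storage.Blobs;

        static class BlobExporter
        {
            // Downloads every blob in a container to a local staging folder,
            // from which a second pass would push the files to Cloud Files.
            public static void DownloadContainer(string connectionString, string containerName, string localDir)
            {
                var container = new BlobContainerClient(connectionString, containerName);
                Directory.CreateDirectory(localDir);

                foreach (var item in container.GetBlobs())
                {
                    var path = Path.Combine(localDir, item.Name.Replace('/', '_'));
                    container.GetBlobClient(item.Name).DownloadTo(path);
                }
            }
        }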

    Read the article
