Search Results

Search found 13808 results on 553 pages for 'remote storage'.

Page 101/553

  • What is the best filesystem for storing thousands of files in one dictionary-like id-blob structure?

    - by Ivan
    What filesystem best suits my needs? Thousands or even millions of files in one directory. Good reliability (at or near the ext4/NTFS level, including fault tolerance) and good access speed. I don't actually need directories or descriptive names - just a dictionary-like structure of id-blob pairs. No links, attributes, or access-control features are needed either. The purpose is a file store where all the metadata (data describing what each file contains and who can access it) is kept in a MySQL database. As far as I know, common filesystems like NTFS and ext3/4 can go dead slow if too many files are placed in one directory - that's why I ask.
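    For illustration, a minimal C# sketch of the usual mitigation when a filesystem dislikes huge flat directories: fan the id-blob pairs out into hashed subdirectories so no single directory holds more than a few thousand files. The BlobStore class, the two-level layout and the .blob extension are assumptions made up for the example, not anything from the question.

        using System;
        using System.IO;

        // Sketch: map a numeric id onto a two-level hashed directory so that no
        // single directory ever holds more than a few thousand files.
        public class BlobStore
        {
            private readonly string _root;

            public BlobStore(string root)
            {
                _root = root;
            }

            // e.g. id 123456789 -> <root>/cd/15/123456789.blob
            public string PathFor(long id)
            {
                string bucket = (id & 0xFFFF).ToString("x4");   // 65536 buckets
                string dir = Path.Combine(_root, bucket.Substring(0, 2), bucket.Substring(2, 2));
                Directory.CreateDirectory(dir);                 // no-op if it already exists
                return Path.Combine(dir, id + ".blob");
            }

            public void Write(long id, byte[] blob)
            {
                File.WriteAllBytes(PathFor(id), blob);
            }

            public byte[] Read(long id)
            {
                return File.ReadAllBytes(PathFor(id));
            }
        }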

    Read the article

  • What is the value/cost of enabling "spread spectrum clocking" on my hard drives?

    - by Stu Thompson
    I'm building up a biggish NAS box (10x WD RE4 2TB SATA RAID10) and ran into some problems. During the course of my research, debugging, investigations, etc., I discovered a jumper on the physical drives labeled "spread spectrum clocking". After some googling, it seems to be a feature that some suggest enabling (without reference or explanation) in 'a storage configuration', and that it makes the drive less susceptible to EMI. But why? I've got three core questions: Why is this feature not enabled by default? What are the actual benefits? Are there any costs?

    Read the article

  • Swap files in Cloud Infrastructures

    - by ffeldhaus
    At our company we have set up an OpenStack cloud and are currently creating internal guidelines for the creation of OS templates / images. One controversial topic was whether we should provide swap inside the VM templates. Therefore I'd like to ask the following questions: From an elastic cloud provider point of view, does it make sense to offer swap partitions / files in the VM templates, or is swap not needed when a VM can be resized? Which scenarios necessarily demand a swap file to be present? What kind of storage should be used for swap files (e.g. local / central, FC / iSCSI / NFS)? Are there any best practices for offering swap files in a performant way in cloud infrastructures?

    Read the article

  • Suggestions for hosted file sharing services

    - by Jon
    Before I pose my question, I will give some background on my scenario: I work for a small business (cost is an important factor); our bandwidth is limited and would not support an in-house FTP server; and we need to share files (mostly PDF, InDesign and Illustrator documents) with our clients. As we expand, we are finding that our current locally-hosted FTP solution is too slow and is becoming a detriment to our sales team. What we need is a remotely hosted solution to share files with our clients, specifically with the following features: more than 100 GB of secure storage; the ability to distribute unique login credentials to clients, granting access to a personalized directory or folder while limiting access to other files on the server; and a relatively simple web-based UI for clients with limited computer knowledge. We have considered a dedicated remote server and web-based services (box.net, yousendit.com, onehub.com, filesanywhere.com), but I am unsure about the direction we should be taking - have I left another solution out? What would you suggest? Thanks in advance.

    Read the article

  • Should you archive documents before backing up to the cloud?

    - by gabbsmo
    I'm planning to add cloud storage to my personal backup strategy, but now I wonder if it is really worth the trouble of compressing my documents and photos. The Open XML formats already use zip compression, and JPEG is a lossy image format, so there really isn't much benefit in compressing: 20 MB of documents becomes about 17 MB at the ULTRA preset of 7-Zip. One benefit I can imagine is that archiving the folders can shorten upload time, since it minimizes the number of requests that need to be sent to the server on upload and download. So what are your thoughts and experience on this issue? Should you archive your documents before backing them up to the cloud?
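    For illustration, a small C# sketch for measuring how much compression actually buys you on a given folder before deciding. The folder path is a placeholder, and the built-in ZipFile class (.NET 4.5, System.IO.Compression.FileSystem) is used purely as an example; it will give a rougher ratio than 7-Zip's ULTRA preset.

        using System;
        using System.IO;
        using System.IO.Compression;   // reference System.IO.Compression.FileSystem
        using System.Linq;

        class CompressionCheck
        {
            static void Main()
            {
                string source = @"C:\Users\me\Documents";   // placeholder folder
                string archive = Path.Combine(Path.GetTempPath(), "backup-test.zip");

                // Total size of the uncompressed files.
                long original = new DirectoryInfo(source)
                    .EnumerateFiles("*", SearchOption.AllDirectories)
                    .Sum(f => f.Length);

                if (File.Exists(archive)) File.Delete(archive);
                ZipFile.CreateFromDirectory(source, archive, CompressionLevel.Optimal, false);
                long compressed = new FileInfo(archive).Length;

                Console.WriteLine("Original:   {0:N0} bytes", original);
                Console.WriteLine("Compressed: {0:N0} bytes ({1:P1} of the original)",
                                  compressed, (double)compressed / original);
            }
        }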

    Read the article

  • Safest snapshot of a failing harddrive?

    - by ironfroggy
    I have a headless machine that stopped booting, so I pulled it out for diagnostics and got a message that one of the hard drives was about to fail. I pulled them all out, and I need to get everything off before figuring out which one to get rid of. I wasn't sure which drive was failing, because the message only said "Harddrive 1" and I don't know which drive that refers to. I'm wondering what the best way is to get everything off. I'm worried that if I just copy everything, I could get corrupt data and not realize some files are wrong until the drive is completely out of commission. What are my best options to get everything off in a way I can safely move to new storage?

    Read the article

  • Can different drive speeds and sizes be used in a hardware RAID configuration w/o affecting performance?

    - by R. Dill
    Specifically, I have a RAID 1 array configuration with two 500gb 7200rpm SATA drives mirrored as logical drive 1 (a) and two of the same mirrored as logical drive 2 (b). I'd like to add two 1tb 5400rpm drives in the same mirrored fashion as logical drive 3 (c). These drives will only serve as file storage with occasional but necessary access, and therefore, space is more important than speed. In researching whether this configuration is doable, I've been told and have read that the array will only see the smallest drive size and slowest speed. However, my understanding is that as long as the pairs themselves aren't mixed (and in this case, they aren't) that the array should view and use all drives at their actual speed and size. I'd like to be sure before purchasing the additional drives. Insight anyone?

    Read the article

  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure.  Today’s new capabilities include: Storage: Import/Export Hard Disk Drives to your Storage Accounts HDInsight: General Availability of our Hadoop Service in the cloud Virtual Machines: New VM Gallery, ACL support for VIPs Web Sites: WebSocket and Remote Debugging Support Notification Hubs: Segmented customer push notification support with tag expressions TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services Developer Analytics: New Relic support for Web Sites + Mobile Services Service Bus: Support for partitioned queues and topics Billing: New Billing Alert Service that sends emails notifications when your bill hits a threshold you define All of these improvements are now available to use immediately (note that some features are still in preview).  Below are more details about them. Storage: Import/Export Hard Disk Drives to Windows Azure I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we’ll automatically transfer the data to or from your Windows Azure Storage account.  This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth). Encrypted Transport Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it).  The drive preparation tool we are shipping today makes setting up bitlocker encryption on these hard drives easy. How to Import/Export your first Hard Drive of Data You can read our Getting Started Guide to learn more about how to begin using the import/export service.  You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Server Management APIs. It is really easy to create a new import or export job using the Windows Azure Management Portal.  Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don’t have this tab make sure to sign-up for the Import/Export preview): Then click the “Create Import Job” or “Create Export Job” commands at the bottom of it.  This will launch a wizard that easily walks you through the steps required: For more comprehensive information about Import/Export, refer to Windows Azure Storage team blog.  You can also send questions and comments to the [email protected] email address. We think you’ll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects.  We hope you like it. HDInsight: 100% Compatible Hadoop Service in the Cloud Last week we announced the general availability release of Windows Azure HDInsight. 
HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure.  This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios. HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running this provides a super cost effective way to use them).  You can create Hadoop clusters using either the Windows Azure Management Portal (see below) or using our PowerShell and Cross Platform Command line tools: The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data.  We’ve also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs.  You can find out more about how to get started with HDInsight here. Virtual Machines: VM Gallery Enhancements Today’s update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud.  You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal: The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use: Search: You can now easily search and filter images using the search box in the top-right of the dialog.  For example, simply type “SQL” and we’ll filter to show those images in the gallery that contain that substring. Category Tree-view: Each month we add more built-in VM images to the gallery.  You can continue to browse these using the “All” view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog.  For example, by selecting “Oracle” in the tree-view you can now quickly filter to see the official Oracle supplied images. MSDN and Supported checkboxes: With today’s update we are also introducing filters that makes it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners. Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we’re providing some additional sort options, like “Newest,” to customize the image list for what suits you best. Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery. The above improvements make it even easier to use the VM Gallery and quickly create launch and run Virtual Machines in the cloud. 
Virtual Machines: ACL Support for VIPs A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today’s release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance: This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM’s network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren’t on a virtual network you could alternatively limit traffic from public IPs that can access your workloads: Here is the default behaviors for ACLs in Windows Azure: By default (i.e. no rules specified), all traffic is permitted. When using only Permit rules, all other traffic is denied. When using only Deny rules, all other traffic is permitted. When there is a combination of Permit and Deny rules, all other traffic is denied. Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level.  So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well. Web Sites: Web Sockets Support With today’s release you can now use Web Sockets with Windows Azure Web Sites.  This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier).  Higher level programming libraries like SignalR and socket.io are also now supported with it. You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to “on”: Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications.  Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it. Web Sites: Remote Debugging Support The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today’s Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites. With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud. Remote Debugging of a Windows Azure Web Site using VS 2013 Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy.  Start by opening up your web application’s project within Visual Studio. Then navigate to the “Server Explorer” tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer.  Then right-click and choose the “Attach Debugger” option on it: When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure.  
The debugger will then stop the web site’s execution when it hits any break points that you have set within your web application’s project inside Visual Studio.  For example, below I set a breakpoint on the “ViewBag.Message” assignment statement within the HomeController of the standard ASP.NET MVC project template.  When I hit refresh on the “About” page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio: Note above how we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session above I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property.  When we click the the “Continue” button (or press F5) the app will continue execution and the Web Site will render the content back to the browser.  This makes it super easy to debug web apps remotely. Tips for Better Debugging To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio’s Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio.  You can find this option on the Web Publish dialog on the Settings tab: When you ultimately deploy/run the application in production we recommend using the “Release” configuration setting – the release configuration is memory optimized and will provide the best production performance.  To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide. Notification Hubs: Segmented Push Notification support with tag expressions In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end.  Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises. Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans with a single API call respectively. New support for using tag expressions to enable advanced customer segmentation With today’s release we are adding support for even more advanced customer targeting.  You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step farther and reach more granular segments. This opens up a variety of scenarios, for example: Offers based on multiple preferences—e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian Push content to multiple segments in a single message—e.g. 
rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinal fan Avoid presenting subsets of a segment with irrelevant content—e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. “Loc:Boston”) and interest tags (e.g. “Follows:RedSox”, “Follows:Cardinals”), and then a notification can be sent by your back-end to “(Follows:RedSox || Follows:Cardinals) && Loc:Boston” in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below: var notification = new WindowsNotification(messagePayload); hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston"); In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!).  Some other cool use cases for tag expressions that are now supported include: Social: To “all my group except me” - group:id && !user:id Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 … Hours: Send notifications at specific times. E.g. Tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android For help on getting started with Notification Hubs, visit the Notification Hub documentation center.  Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions.  They are really powerful and enable a bunch of great new scenarios. TFS & GIT: Continuous Delivery Support for Web Sites + Cloud Services With today’s Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services.  Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support.  It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio. With today’s Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services.  This enables a workflow where when code is checked in, built successfully on an automated build server, and all tests pass on it – I can automatically have the app deployed on Windows Azure with zero manual intervention or work required. The below screen-shots demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services. Enabling Continuous Delivery to Windows Azure with Team Foundation Services The project I’m going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I’m hosting using Team Foundation Services.  
I did this by creating a “SimpleContinuousDeploymentTest” repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it.  Below is a screen-shot of the Git repository hosted within Team Foundation Services: I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks).  Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository: The cool thing about this is that I don’t have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings.  This build server (and automated testing) support now works with both TFS and Git based source control repositories. Connecting a Team Foundation Services project to Windows Azure Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass).  Enabling this is now really easy.  To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal.  This will create a dialog like below.  I gave the web site a name and then made sure the “Publish from source control” checkbox was selected: When we click next we’ll be prompted for the location of the source repository.  We’ll select “Team Foundation Services”: Once we do this we’ll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is “scottguthrie”): When we click the “Authorize Now” button we’ll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account.  Once we do this we’ll be prompted to pick the source repository we want to connect to.  Starting with today’s Windows Azure release you can now connect to both TFS and Git based source repositories.  This new support allows me to connect to the “SimpleContinuousDeploymentTest” respository we created earlier: Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services.  Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution , and if they pass the app will be automatically deployed to our Web Site in Windows Azure.  You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site: This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way. Developer Analytics: New Relic support for Web Sites + Mobile Services With today’s Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Site and Windows Azure Mobile Services.  We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure. 
Enabling New Relic with a Windows Azure Web Site Enabling New Relic support with a Windows Azure Web Site is now really easy.  Simply navigate to the Configure tab of a Web Site and scroll down to the “developer analytics” section that is now within it: Clicking the “add-on” button will display some additional UI.  If you don’t already have a New Relic subscription, you can click the “view windows azure store” button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything): Clicking the “view windows azure store” button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal.  You can use this to browse from a variety of great add-on services – including New Relic: Select “New Relic” within the dialog above, then click the next button, and you’ll be able to choose which type of New Relic subscription you wish to purchase.  For this demo we’ll simply select the “Free Standard Version” – which does not cost anything and can be used forever:  Once we’ve signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site’s configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site.  We can do this by simply selecting it from the “add-on” dropdown (it is automatically populated within it once we have a New Relic subscription in our account): Clicking the “Save” button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site: Deploying the New Relic Agent as part of a Web Site The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app.  We can do this within Visual Studio by right-clicking on our web project and selecting the “Manage NuGet Packages” context menu: This will bring up the NuGet package manager.  You can search for “New Relic” within it to find the New Relic agent.  Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site’s “Configuration” tab within the Windows Azure Management Portal): Once we install the NuGet package we are all set to go.  We’ll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application Monitoring a Web Site using New Relic Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it.  We can do this by clicking on the “Add Ons” tab in the left-hand side of the Windows Azure Management Portal.  Then select the New Relic add-on we signed-up for within it.  The Windows Azure Management Portal will provide some default information about the add-on when we do this.  Clicking the “Manage” button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account: When we do this a new browser tab will launch with the New Relic admin tool loaded within it: We can now see insights into how our app is performing – without having to have written a single line of monitoring code.  
The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see: Performance times (including browser rendering speed) for the overall site and individual pages.  You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify. Information about where in the world your customers are hitting the site from (and how performance varies by region) Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc) Error information including call stack details for exceptions that have occurred at runtime SQL Server profiling information – including which queries executed against your database and what their performance was And a whole bunch more… The cool thing about New Relic is that you don’t need to write monitoring code within your application to get all of the above reports (plus a lot more).  The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these.  This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort. If you haven’t tried New Relic out yet with Windows Azure I recommend you do so – I think you’ll find it helps you build even better cloud applications.  Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes. Service Bus: Support for partitioned queues and topics With today’s release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning enables you to achieve a higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic.  The  multiple messaging stores will also provide higher availability. You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic: Read this article to learn more about partitioned queues and topics and how to take advantage of them today. Billing: New Billing Alert Service Today’s Windows Azure update enables a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure.  This makes it easier to manage your bill and avoid potential surprises at the end of the month. With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total.  To set up an alert first sign-up for the free Billing Alert Service Preview.  Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available: The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit.  For example, by clicking the “add alert” button above I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month: The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS.  Try out the new Billing Alert Service Preview today and give us feedback. 
Summary Today’s Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today.  Then visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • No native symbols in symbol file

    - by Martin
    I am trying to remote debug a Windows Forms app with VS2008. Attaching to the process works fine (Remote (Native only with no authentication)), but when I open the Modules window and try to load symbols I get "No native symbols in symbol file". I realise it has something to do with how the app was built, but I cannot figure out what.

    Read the article

  • C# remote web request certificate error

    - by Ben
    Hi. I am currently integrating a payment gateway into an application. Following a successful transaction, the remote payment gateway posts details of the transaction back to my application (ITN). It posts to an HttpHandler that is used to read and validate the data. Part of the validation performed is a POST made by the handler to a validation service provided by the payment gateway. This effectively posts some of the original form values back to the payment gateway to ensure they are valid. The URL that I am posting back to is "https://sandbox.payfast.co.za/eng/query/validate", and the code I am using:

        /// <summary>
        /// Posts the data back to the payment processor to validate the data received
        /// </summary>
        public static bool ValidateITNRequestData(NameValueCollection formVariables)
        {
            bool isValid = true;
            try
            {
                using (WebClient client = new WebClient())
                {
                    string validateUrl = (UseSandBox) ? SandboxValidateUrl : LiveValidateUrl;
                    byte[] responseArray = client.UploadValues(validateUrl, "POST", formVariables);

                    // get the response and replace the line breaks with spaces
                    string result = Encoding.ASCII.GetString(responseArray);
                    result = result.Replace("\r\n", " ").Replace("\r", "").Replace("\n", " ");

                    if (result == null || !result.StartsWith("VALID"))
                    {
                        isValid = false;
                        LogManager.InsertLog(LogTypeEnum.OrderError, "PayFast ITN validation failed", "The validation response was not valid.");
                    }
                }
            }
            catch (Exception ex)
            {
                LogManager.InsertLog(LogTypeEnum.Unknown, "Unable to validate ITN data. Unknown exception", ex);
                isValid = false;
            }
            return isValid;
        }

    However, on calling WebClient.UploadValues the following exception is raised:

        System.Net.WebException: The underlying connection was closed: Could not establish trust relationship for the SSL/TLS secure channel. ---> System.Security.Authentication.AuthenticationException: The remote certificate is invalid according to the validation procedure.
           at System.Net.Security.SslState.StartSendAuthResetSignal(ProtocolToken message, AsyncProtocolRequest asyncRequest, Exception exception)
           at ...

    For the sake of brevity I haven't listed the full call stack (I can do if anyone thinks it will help). The remote certificate does appear to be valid. To get around the problem I did try adding a new RemoteCertificateValidationCallback that always returned true, but that just ended up giving the following exception:

        System.Security.SecurityException: Request for the permission of type 'System.Security.Permissions.SecurityPermission, mscorlib, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.
           at System.Security.CodeAccessSecurityEngine.Check(Object demand, StackCrawlMark& stackMark, Boolean isPermSet)
           at System.Security.CodeAccessPermission.Demand()
           at System.Net.ServicePointManager.set_ServerCertificateValidationCallback(RemoteCertificateValidationCallback value)
           at NopSolutions.NopCommerce.Payment.Methods.PayFast.PayFastPaymentProcessor.ValidateITNRequestData(NameValueCollection formVariables)
        The action that failed was: Demand
        The type of the first permission that failed was: System.Security.Permissions.SecurityPermission
        The Zone of the assembly that failed was: MyComputer

    So I am not sure this will work in medium trust? Any help would be much appreciated. Thanks, Ben

    Read the article

  • MSDeploy: error using runCommand provider to call remote .cmd file (timeout)

    - by mjw.
    We are running into an error when trying to use the MSDeploy "runCommand" provider to execute a .cmd file on a remote machine. The expected run time should be about 10 seconds, but MSDeploy only runs it for about 2-3 seconds, after which time error details are returned. Here is the complete MSDeploy "runCommand" command line text I am using:

        msdeploy.exe -verb:sync -source:runCommand="D:\web deploy tester\test_cmd.cmd",dontUseCommandExe=false,waitAttempts=5,waitInterval=1000 -dest:auto,computername=http://test-machine:89/MsDeployAgentService/,userName=aaa,password=bbb

    Here are the error details returned:

        Error 'Error: (4/21/2010 12:19:25 PM) An error occurred when the request was processed on the remote computer.
        Error: The process 'C:\WINDOWS\system32\cmd.exe' (command line '/c "D:\web deploy tester\test_cmd.cmd"') was terminated because it exceeded the wait time.
        Error count: 1.
        ' occurred in call to RunCommand

    Any ideas as to why this is happening and how to resolve it?

    Read the article

  • How Do I 'git fetch' and 'git merge' from a Remote Tracking Branch (like 'git pull')

    - by kaybenleroll
    I have set up some remote tracking branches in git, but I never seem to be able to merge them into the local branch once I have updated them with git fetch. For example, suppose I have a remote branch called 'an-other-branch'. I set that up locally as a tracking branch using:

        git branch --track an-other-branch origin/an-other-branch

    So far, so good. But if that branch gets updated (usually by me moving machine and committing from that machine) and I want to update it on the original machine, I'm running into trouble with fetch/merge:

        git fetch origin an-other-branch
        git merge origin/an-other-branch

    Whenever I do this, I get an 'Already up-to-date' message and nothing merges. However, a git pull origin an-other-branch always updates it like you would expect. Also, running git diff origin/an-other-branch shows that there are differences, so I think I have my syntax wrong. What am I doing wrong?

    Read the article

  • Recycle remote IIS app pool

    - by Abhijeet Patel
    I would like to use DirectoryServices to list and recycle app pools hosted on any machine in my workgroup. My approach is similar to some of the answers posted to this question, but in my case I'd like to do this for a remote machine running IIS 6. I'm prototyping this as a console app, but I will eventually provide a web interface to allow recycling a selected app pool on a specified machine. Where can I specify the credentials to use when making the DirectoryServices call to a remote machine? I hope I'm phrasing this correctly.
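    For illustration, a minimal C# sketch of one place credentials can go with System.DirectoryServices: the DirectoryEntry constructor that takes a username and password. The machine name, pool name and account below are placeholders, and the sketch assumes the remote box exposes the IIS 6 ADSI provider.

        using System;
        using System.DirectoryServices;   // reference System.DirectoryServices.dll

        class AppPoolRecycler
        {
            static void Main()
            {
                // IIS 6 metabase path for a named application pool on a remote machine.
                string path = "IIS://RemoteBox/W3SVC/AppPools/MyAppPool";

                using (var pool = new DirectoryEntry(path, @"RemoteBox\Administrator", "password"))
                {
                    pool.Invoke("Recycle", null);   // calls IIsApplicationPool.Recycle
                    Console.WriteLine("Recycle requested for {0}", pool.Name);
                }

                // Listing pools works the same way: bind to IIS://RemoteBox/W3SVC/AppPools
                // with the same credential overload and enumerate its Children collection.
            }
        }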

    Read the article

  • Running a command on a remote machine using Perl

    - by Bharath Kumar
    Hi all, I'm using the following code to connect to a remote machine and try to execute one simple command on it:

        [root@localhost]# cat tt.pl
        #!/usr/bin/perl
        use strict;
        use warnings;
        use Net::Telnet;

        my $telnet = new Net::Telnet ( Timeout => 2, Errmode => 'die' );
        $telnet->open('172.168.12.58');
        $telnet->waitfor('/login:\s*/');
        $telnet->print('admin');
        $telnet->waitfor('/password:\s*/');
        $telnet->print('Blue');
        $telnet->cmd('ver > C:\log.txt');
        $telnet->cmd('mkdir gy');

    But when I execute this script it throws an error message:

        [root@localhost]# perl tt.pl
        command timed-out at tt.pl line 12
        [root@localhost]#

    Please help me with this.

    Read the article

  • "SSH server" in Windows?

    - by Benjamin Oakes
    I have some command-line commands to execute on a Windows machine. The programs I need to run are only available on Windows. Is there a way to easily do something like I would with SSH? An example of what I mean: ssh [email protected] "remote command to execute" ...or do I have to use Remote Desktop just to do this? (I'd like to run the commands programmatically from another computer rather than running them by hand.)

    Read the article

  • Can't SSH to remote server, how to avoid this

    - by snow8261
    From time to time we suffer problems where we cannot connect to our servers remotely via SSH, so we have to send someone on site to restart the machine. It causes a lot of pain. The situation is that we have to remotely connect to our servers, which are very important - like the database server, the application server and so on. We have met problems like ssh hanging: for example, the command ssh [email protected] gets no response. When using ssh -v debug mode, it says:

        debug1: Connection established.
        debug1: identity file /.ssh/identity type -1
        debug1: identity file /.ssh/id_rsa type -1
        debug1: identity file /.ssh/id_dsa type -1
        debug1: loaded 3 keys

    and we have met this situation many times with no clue how to solve it. Is there any log which can identify this problem? Or is there a tool for it? Help needed! Any ideas are appreciated.

    Read the article

  • sql database connection on remote server

    - by arti pathak
    I created my web site in ASP.NET, and I created the database in SQL Server using the tooling built into Visual Studio. The steps I used to create the database were: right-click in Solution Explorer, then add a new SQL database, then OK. If I connect to it in code it works on my local PC, but I am actually building a web site, so I want to store the database on my web server. When I upload the database (.mdf) file to the web server where all my other files are uploaded, it shows this error: "A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified)". So please help me and send me all the steps for how I should create the database, and what code I should use to connect to it on the remote server. Thank you.
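    For illustration, a hedged C# sketch of connecting with a server-based connection string instead of attaching the local .mdf file; shared hosts normally give you a SQL Server name, database name and login to use, and the values below are placeholders only.

        using System;
        using System.Data.SqlClient;

        class RemoteSqlTest
        {
            static void Main()
            {
                // Placeholder values - use the server name, database and login
                // supplied by the hosting provider.
                const string connectionString =
                    "Server=sql.example-host.com;Database=MySiteDb;User Id=myuser;Password=mypassword;";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand("SELECT @@VERSION", connection))
                {
                    connection.Open();                          // error 26 here means the
                    Console.WriteLine(command.ExecuteScalar()); // server/instance was not reachable
                }
            }
        }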

    Read the article

  • CPU and profiling not supported for remote jvisualvm session

    - by yawn
    When monitoring a remote app (using jstatd) I can neither profile nor monitor cpu consumption. Heap monitoring (provided I do not use G1) works. jvisualvm provides the message "Not supported for this JVM." in the CPU graph window. Is there anything missing in my setup? The google showed up next to no results. The local environment (Mac OS X 10.6): java version "1.6.0_15" Java(TM) SE Runtime Environment (build 1.6.0_15-b03-219) Java HotSpot(TM) 64-Bit Server VM (build 14.1-b02-90, mixed mode) The remote environment (Linux version 2.6.16.27-0.9-smp (gcc version 4.1.0 (SUSE Linux))): java version "1.6.0_16" Java(TM) SE Runtime Environment (build 1.6.0_16-b01) Java HotSpot(TM) 64-Bit Server VM (build 14.2-b01, mixed mode) Local monitoring works as advertised.

    Read the article

  • Optimal storage of data structure for fast lookup and persistence

    - by Mikael Svenson
    Scenario: I have the following methods:

        public void AddItemSecurity(int itemId, int[] userIds)
        public int[] GetValidItemIds(int userId)

    Initially I'm thinking of storage on the form:

        itemId -> userId, userId, userId
        userId -> itemId, itemId, itemId

    AddItemSecurity is based on how I get data from a third-party API; GetValidItemIds is how I want to use it at runtime. There are potentially 2000 users and 10 million items. Item ids are on the form 2007123456, 2010001234 (10 digits, where the first four represent the year). AddItemSecurity does not have to perform super fast, but GetValidItemIds needs to be sub-second. Also, if there is an update to an existing itemId I need to remove that itemId for users no longer in the list. I'm trying to think about how I should store this in an optimal fashion - preferably on disk (with caching), but I want the code maintainable and clean. If the item ids had started at 0, I thought about creating a byte array of length MaxItemId / 8 for each user, and setting a true/false bit if the item was present or not. That would limit the array length to a little over 1 MB per user and give fast lookups as well as an easy way to update the list per user. By persisting this as memory-mapped files with the .NET 4 framework I think I would get decent caching as well (if the machine has enough RAM) without implementing caching logic myself. Parsing the id, stripping out the year, and storing an array per year could be a solution. The itemId -> userId[] list can be serialized directly to disk and read/written with a normal FileStream in order to persist the list and diff it when there are changes. Each time a new user is added, all the lists have to be updated as well, but this can be done nightly.

    Question: Should I continue to try out this approach, or are there other paths which should be explored as well? I'm thinking SQL Server will not perform fast enough, and it would add overhead (at least if it's hosted on a different server), but my assumptions might be wrong. Any thoughts or insights on the matter are appreciated. And I want to try to solve it without adding too much hardware :)

    [Update 2010-03-31] I have now tested with SQL Server 2008 under the following conditions: a table with two columns (userid, itemid), both int; a clustered index on the two columns; ~800,000 items added for 180 users (a total of 144 million rows); 4 GB of RAM allocated to SQL Server; a dual-core 2.66 GHz laptop with an SSD disk; a SqlDataReader used to read all the itemids into a List, looping over all users. If I run one thread it averages 0.2 seconds. When I add a second thread it goes up to 0.4 seconds, which is still OK. From there on the results get worse: adding a third thread brings a lot of the queries up to 2 seconds, a fourth thread up to 4 seconds, and a fifth spikes some of the queries up to 50 seconds. The CPU is roofing while this is going on, even on one thread. My test app takes some of it due to the speedy loop, and SQL the rest. Which leads me to the conclusion that it won't scale very well - at least not on my tested hardware. Are there ways to optimize the database, say by storing an array of ints per user instead of one record per item? But this makes it harder to remove items.
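    For illustration, a compact C# sketch of the "one bit per item, partitioned by year" idea described above, kept in memory only; persisting each byte array through a FileStream or MemoryMappedFile, and clearing bits when an item's user list shrinks, are left out. The class and field names are invented for the example.

        using System;
        using System.Collections.Generic;

        public class ItemSecurityIndex
        {
            private const int ItemsPerYear = 1000000;   // the last six digits of the id

            // key: (userId, year) -> one bit per possible item id in that year (125 KB each)
            private readonly Dictionary<Tuple<int, int>, byte[]> _bitmaps =
                new Dictionary<Tuple<int, int>, byte[]>();

            public void AddItemSecurity(int itemId, int[] userIds)
            {
                int year = itemId / ItemsPerYear;    // first four digits, e.g. 2007
                int index = itemId % ItemsPerYear;
                foreach (int userId in userIds)
                {
                    var key = Tuple.Create(userId, year);
                    byte[] bits;
                    if (!_bitmaps.TryGetValue(key, out bits))
                        _bitmaps[key] = bits = new byte[ItemsPerYear / 8];
                    bits[index / 8] |= (byte)(1 << (index % 8));
                }
            }

            public int[] GetValidItemIds(int userId)
            {
                var result = new List<int>();
                foreach (var entry in _bitmaps)
                {
                    if (entry.Key.Item1 != userId) continue;
                    byte[] bits = entry.Value;
                    for (int i = 0; i < ItemsPerYear; i++)
                        if ((bits[i / 8] & (1 << (i % 8))) != 0)
                            result.Add(entry.Key.Item2 * ItemsPerYear + i);
                }
                return result.ToArray();
            }
        }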

    Read the article

  • HP-UX remote cpio copy

    - by soField
    REMOTE SERVER:

        remsh remoteserverhostname -l remoteusername find /tmp/a1/ | cpio -o > /tmp/paketr.cpio

    LOCAL SERVER:

        rcp remoteserverhostname:/tmp/paketr.cpio /tmp/aaa
        cpio -idmv < /tmp/paketr.cpio

    I'm trying to copy a directory structure from a remote server to the local server. I can do this with the command list above, but I wonder if I can do it with just one command by running cpio in pass-through mode:

        remsh remoteserverhostname find /tmp/a1 | cpio -pd /tmp
        current </tmp/tmp/a1/b1/y1> newer
        current </tmp/tmp/a1/b1/z1> newer
        current </tmp/tmp/a1/b2/l2smc> newer
        "/tmp/a1/b3": No such file or directory
        Cannot stat </tmp/a1/b3>.
        0 blocks

    So when I use the cpio -pd option I expect it to create the directories for me, but it does not. I was using rcp before, but it does not preserve symbolic links :( What can I do? This is on HP-UX.

    Read the article

  • A device specific alpha bitmap fails after switching resolutions in remote desktop

    - by Bob
    All my alpha bitmaps, created using CreateCompatibleBitmap(..), start to fail with error code 87 after someone signs in with Remote Desktop. I am assuming that this is because the resolution changed and I am using device-specific bitmaps. I am wondering what the best route is to fix this issue without migrating to device-independent bitmaps? Some options are: 1) Detect the remote desktop change and flag all bitmaps to be reloaded (I have done this but it does not work as well as I would like). 2) Wait for error code 87 to happen on an AlphaBlend image that previously worked, and then reload it (was going to try this next; I'm sure it will work, but it's a little hacky). 3) Detect some event such as WM_DISPLAYCHANGE or _ that tells me when I should do this (i.e. a device change event I'm guessing, or maybe something more specific?) 4) _? Thanks for any help in advance
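    For illustration, a small C# sketch of option 3 using the managed SystemEvents wrappers: DisplaySettingsChanged roughly corresponds to WM_DISPLAYCHANGE, and SessionSwitch reports remote-desktop connects and disconnects. Only the event idea and the "flag bitmaps for reload" approach come from the question; the class and event names below are assumptions.

        using System;
        using Microsoft.Win32;   // SystemEvents (needs a running message loop)

        static class BitmapInvalidationHooks
        {
            // Subscribers recreate their CreateCompatibleBitmap handles when this fires.
            public static event Action BitmapsInvalidated = delegate { };

            public static void Install()
            {
                // Fired for display mode changes (WM_DISPLAYCHANGE at the Win32 level).
                SystemEvents.DisplaySettingsChanged += (s, e) => BitmapsInvalidated();

                // Fired when a remote desktop session attaches or detaches.
                SystemEvents.SessionSwitch += (s, e) =>
                {
                    if (e.Reason == SessionSwitchReason.RemoteConnect ||
                        e.Reason == SessionSwitchReason.RemoteDisconnect ||
                        e.Reason == SessionSwitchReason.ConsoleConnect)
                    {
                        BitmapsInvalidated();
                    }
                };
            }
        }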

    Read the article

  • Dropdown menus not showing when I use my WPF application via remote connections

    - by Bijoy Das Yesudas
    When I connect to my machine using Remote Desktop, the dropdown menus and comboboxes used in my WPF application do not show up. And after closing the session, when I come back to my development machine (where the application actually runs), the same issue happens there as well. I tried changing the resolution and so on; sometimes it works, but most of the time the result is negative. The only temporary solution I have found is rebooting the system, which is not practical. Could anyone please help me fix this issue? Thanks in advance.
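    For illustration, a hedged C# sketch of one workaround that is often suggested for WPF popup and dropdown glitches over Remote Desktop: forcing the process into software rendering before any window is shown. Whether this cures the issue described here is an assumption to verify, and the window below is only a stand-in for the real application shell.

        using System;
        using System.Windows;
        using System.Windows.Interop;
        using System.Windows.Media;

        public static class SoftwareRenderingDemo
        {
            [STAThread]
            public static void Main()
            {
                // Disable hardware acceleration for the whole process; set this
                // before the first window is created.
                RenderOptions.ProcessRenderMode = RenderMode.SoftwareOnly;

                var app = new Application();
                app.Run(new Window { Title = "Software-rendered WPF window" });
            }
        }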

    Read the article

  • The remote server returned an error: NotFound.

    - by xscape
    Hi, I'm trying to retrieve a string from my old web service, but it gives me an error of "The remote server returned an error: NotFound.", and its InnerException is:

        {System.Net.WebException: The remote server returned an error: NotFound. ---> System.Net.WebException: The remote server returned an error: NotFound.
           at System.Net.Browser.BrowserHttpWebRequest.InternalEndGetResponse(IAsyncResult asyncResult)
           at System.Net.Browser.BrowserHttpWebRequest.<>c__DisplayClass5.<>b__4(Object sendState)
           at System.Net.Browser.AsyncHelper.<>c__DisplayClass2.<>b__0(Object sendState)
           --- End of inner exception stack trace ---
           at System.Net.Browser.AsyncHelper.BeginOnUI(SendOrPostCallback beginMethod, Object state)
           at System.Net.Browser.BrowserHttpWebRequest.EndGetResponse(IAsyncResult asyncResult)
           at System.ServiceModel.Channels.HttpChannelFactory.HttpRequestChannel.HttpChannelAsyncRequest.CompleteGetResponse(IAsyncResult result)}

    This is the method where the error is raised (the service call returns a string):

        void client_ValidateUserEncryptedCompleted(object sender, DummyWS.ValidateUserEncryptedCompletedEventArgs e)
        {
            object token = e.Result;
            client = new DummyWS.MachineHistoryWSSoapClient();
            if (token != null)
            {
                client.GetSummaryXMLAsync(token, "", "");
            }
        }

    I am currently using Silverlight 4.0, and my ServiceReferences.ClientConfig is:

        <configuration>
          <system.serviceModel>
            <bindings>
              <basicHttpBinding>
                <binding name="MachineHistoryWSSoap" maxBufferSize="2147483647" maxReceivedMessageSize="2147483647">
                  <security mode="None" />
                </binding>
              </basicHttpBinding>
            </bindings>
            <client>
              <endpoint address="http://localhost/MHVwsModified/MachineHistoryWS.asmx"
                        binding="basicHttpBinding" bindingConfiguration="MachineHistoryWSSoap"
                        contract="DummyWS.MachineHistoryWSSoap" name="MachineHistoryWSSoap" />
            </client>
          </system.serviceModel>
        </configuration>

    My Web.config in my web service is:

        <configuration xmlns="http://schemas.microsoft.com/.NetConfiguration/v2.0">
          <system.web>
            <compilation debug="true">
              <assemblies>
                <add assembly="System.Windows.Forms, Version=2.0.0.0, Culture=neutral, PublicKeyToken=B77A5C561934E089" />
              </assemblies>
            </compilation>
            <authentication mode="Windows" />
          </system.web>
          <system.webServer>
            <directoryBrowse enabled="true" />
          </system.webServer>
        </configuration>

    Any help will be appreciated, thank you.

    Read the article
