Search Results

Search found 10384 results on 416 pages for 'plan cache'.


  • What should I know before considering a VPS or dedicated server?

    - by Corey Sarnia
    I have a plan for a future application and web service. The client application will send requests to a server-side Java back-end for processing, and the server should also be able to host a website, preferably on a WAMP setup (which is what I'm used to; I have very little *nix knowledge). I cannot provide any hard stats because this is only a plan still in the discussion stage. However, we fully expect it to scale enough to need some type of dedicated hosting. My question is this: what should I know about before looking into hosting? What should I be asking the hosting providers before I decide on a purchase? And when is it appropriate to switch from a VPS to a fully dedicated server?

    Read the article

  • ASP Fails with 500 Error

    - by VinceM
    We have a server set up as an IIS box hosting some static pages, with a few ASP pages that handle the form submissions. The ASP is really VBScript that sends a CDO message. After moving these pages to the new server, the form will not submit; it gives a 500 error, and the following shows in Event Viewer: Error: The Template Persistent Cache initialization failed for Application Pool 'DefaultAppPool' because of the following error: Could not create a Disk Cache Sub-directory for the Application Pool. The data may have additional error codes. I can't seem to find any info on this anywhere. I was thinking it may have something to do with the fact that we created this server from an image of another server. Thanks for your help in advance... Vince

    Read the article

  • How can I incrementally back up a large amount of data [with rsync]?

    - by Annan
    A website contains ~40 GB of images and files which need to be backed up. Rollbacks need to be possible daily for the last 30 days, and the backup server has less than 1.2 TB of space. My idea is to keep one full backup from 30 days ago, plus incremental backups for the last 30 days. Each day, the oldest incremental backup is merged into the full backup and a new incremental backup is added. Can this strategy be implemented with rsync, and if so, how? Are there any problems with this plan? Is there a better plan? PS: I mean incremental backups, not backing up incrementally (which rsync does automatically)
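
    A minimal sketch of one common way to get daily rollbacks with rsync is hard-linked snapshots via --link-dest, which sidesteps merging increments altogether; the paths and retention policy below are illustrative assumptions, not taken from the question:

        #!/bin/bash
        # Hard-linked daily snapshots: unchanged files cost (almost) no extra space.
        SRC=/var/www/site/
        DEST=/backup/site
        TODAY=$(date +%F)
        YESTERDAY=$(date -d yesterday +%F)

        # Files identical to yesterday's snapshot are hard-linked, not copied,
        # so every day looks like a full backup but stores only the changes.
        rsync -a --delete --link-dest="$DEST/$YESTERDAY" "$SRC" "$DEST/$TODAY"

        # Prune snapshots older than 30 days to stay inside the space budget.
        find "$DEST" -mindepth 1 -maxdepth 1 -type d -mtime +30 -exec rm -rf {} +

    Each dated directory is then a complete tree you can roll back to directly.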

    Read the article

  • VMware Workstation 7/8/9 does not generate /etc/vmware/networking upon installation

    - by dash17291
    When I install VMware Workstation on Arch Linux, virtual ethernet is not working.

        $ sudo tail /var/log/vnetlib
        Aug 28 22:20:33 VNLFileExists - Cannot check for file or directory: /etc/vmware/networking , error: No such file or directory
        Aug 28 22:20:33 VNLNetCfgLoad - Import file does not exist
        Aug 28 22:20:33 VNL_Load - Error loading the vnet configuration, file used: /etc/vmware/networking
        Aug 28 22:20:33 VNLNetCfgUnload - Requested cache is not loaded
        Database file is not present. Failed to initialize
        Aug 28 22:20:41 VNLFileExists - Cannot check for file or directory: /etc/vmware/networking , error: No such file or directory
        Aug 28 22:20:41 VNLNetCfgLoad - Import file does not exist
        Aug 28 22:20:41 VNL_Load - Error loading the vnet configuration, file used: /etc/vmware/networking
        Aug 28 22:20:41 VNLNetCfgUnload - Requested cache is not loaded
        Required modules compiled.

    Previously I copied that file (or directory, I don't remember which) from a working installation, but now I need a real solution. It's strange; it may also be a hardware issue, because the same thing happens with Ubuntu on the same computer.
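
    A fix that is often suggested for this symptom (an assumption here, not verified against this machine) is to let Workstation regenerate its networking database instead of copying the file from another install:

        # Hypothetical recovery steps; vmware-networks ships with Workstation
        sudo vmware-networks --postinstall type=custom,0,0   # rebuilds /etc/vmware/networking
        sudo /etc/rc.d/vmware restart                        # restart the VMware services (pre-systemd Arch init path)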

    Read the article

  • Best USB storage for my router, Asus RT-AC66U?

    - by Jason94
    I have the ASUS RT-AC66U and I want to add USB storage to it. It has 2x USB ports, and I'm already using one for my printer. The last one I want to use to attach USB storage, and I've read some reviews stating the throughput over USB could be up to 18 MB/s. So with regard to USB storage, should I care about hard disk cache? Simple USB-powered drives seem to have 8 MB of cache, while externally powered ones have 16 MB, for instance.

    Read the article

  • Odd behavior of permissions and icons

    - by Urban
    After a virus infected my Windows 7, I did a complete format and reinstalled the OS. I was just installing applications and copying back some data when I noticed some shortcuts changing their icons to something I couldn't recognize (they show as yellow icons). Also, a few exe files which previously did not ask for user permission are now asking for it. Wondering if this was an icon cache problem, I cleared the cache by deleting IconCache.db in AppData/Local, but the problem persists. Although I did a full system scan with MS Security Essentials, I'm not sure if this is another virus or some other problem. I would appreciate any suggestions you might have. Edit: Now even Firefox needs permission to launch. Its icon hasn't changed, but it has the 'shield' overlay on it like the other yellow icons.

    Read the article

  • Squid throws error: The requested URL could not be retrieved

    - by Supratik
    Hi,

    Sometimes I get the following error:

        The requested URL could not be retrieved
        While trying to retrieve the URL: http://groups.google.com/
        The following error was encountered: Unable to determine IP address from host name for groups.google.com
        The dnsserver returned: Refused: The name server refuses to perform the specified operation.
        This means that: The cache was not able to resolve the hostname presented in the URL. Check if the address is correct.
        Your cache administrator is root.

    What could be the reason for this error?

    Regards, Supratik

    Read the article

  • When does a cached website refresh?

    - by user142485
    If I go to example.com [placeholder for different website], the browser creates numerous cache entries for different items on the page, plus two entries for example.com itself. One of them expires in 1.5 hours and the other has 'No expiration time.' What I am wondering is: when does the browser display the cached page, and when does it get the newest version from the server (when re-visiting the site)? What do the two different expiration times for the top-level domain mean? A web page I went to was temporarily redirecting to a different page. After it was back up, I was still being redirected to the temp page even though the correct page was up. Would this have eventually resolved itself based on the expiration times, or does the cache need to be cleared?
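
    For what it's worth, those expiration times come from the response headers the server sends; a quick way to see the caching policy for any page (example.com is the asker's placeholder above) is:

        # HEAD request: show the headers browsers use to decide freshness vs. revalidation
        curl -sI http://example.com/ | grep -iE '^(cache-control|expires|etag|last-modified)'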

    Read the article

  • Efficiently installing fully-patched Windows XP, IE, and Office 2007 on an isolated PC

    - by JPaget
    I have been tasked to install Windows XP, IE, and Office 2007 on a computer that will become part of a standalone network not connected to the Internet. What is a good way to install all of the security updates? I'm installing from CDs of Windows XP SP2 and MS Office 2007. Next I plan to download Windows XP SP3 and Office 2007 SP2, burn them to CDs, and install both service packs. Finally I plan to go to the Microsoft Download Center, download all applicable security updates, burn them to CD, and install them. I estimate that there are over 100 of these updates. Is there a more efficient way to do this?

    Read the article

  • RAID10 Without BBU, With UPS

    - by Richard
    My datacenter says that each rack has primary and backup power. I assume this means there is a UPS for each server. Given that, do I have any need for a BBU with the following setup? Intel Cherry 520 SSD x 4 in RAID 10 on an LSI-9260 with writeback cache enabled. I have heard that without a BBU the data in the cache could be lost. Since my needs aren't mission-critical, I can afford to lose some data, but would the rest of the data on the drives be corrupted?

    Read the article

  • What is the best Linux distro for a PHP web server? [on hold]

    - by benjisail
    We are planning to upgrade our hardware, and at the same time we plan to reinstall our whole web server from a fresh OS. Currently our web server is running CentOS 4.7 on a dedicated server. We are using Apache, MySQL, PHP, SVN, FTP and all the needed tools for a web server, managed through SSH. We plan to use a cloud server for the new web server. I don't know which Linux distro to choose for it. Should I stay with CentOS and just take the latest release, 5.4, or should I switch to something else, like a Debian-based distro (Ubuntu Server)? The thing I didn't like with CentOS was the unavailability of the latest versions of PHP and Apache through yum. This makes it harder to keep our web server updated with the latest technologies... Thanks for your help!

    Read the article

  • Chrome Residual Redirect to Login Page

    - by Shadow503
    My college redirects people in the dorms to a login page when using an ethernet (or wifi) connection. I am now at home, and certain domains keep redirecting to this login page. I've tried running ipconfig /flushdns, and I flushed Chrome's local DNS cache as described here: How to clear/flush the DNS cache in Google Chrome?. Interestingly enough, while http://www.reddit.com redirects to the login page, http://www.reddit.com/r/funny works. Firefox works fine for both URLs. Is there a way to fix this without deleting all of my cookies? Thanks!

    Read the article

  • Hosting company that does Linux VPS and MS SQL

    - by danielmcq
    I'm looking for hosted solutions, but there are so many companies that finding the right one through a Google search is a bit overwhelming. Ideally I would like a hosting company with the following options:

    - Linux VPSs: individual VPSs should be fairly cheap, since I plan on putting one or two services per VPS, i.e. a web server on one (httpd and ColdFusion), an SVN server on another, etc.
    - Managed MS SQL databases: my company already has data in MS SQL databases and a lot of ColdFusion code written with MS SQL-specific commands in it.
    - Individually purchased dedicated IP addresses
    - Preferably located in North America

    My plan would be to set up one Linux VPS as a gateway/firewall/VPN server and route all of my traffic through it, so that my other servers would not use up bandwidth talking to each other. The trick is finding a company that does Linux VPS AND MS SQL databases. Does anybody know of hosting companies offering what I'm looking for? Let me know if I need to add more details.

    Read the article

  • Is using the hosts file to resolve a SQL Server name more performant?

    - by Ice
    Hi, we have a legacy application which uses an access.mdb with hundreds of ODBC-connected tables on a SQL Server. The access.mdb contains nothing but these ODBC connections. Now we are considering using a virtual SQL Server name for these ODBC connections and resolving it in the local hosts file with the IP address of the real SQL Server. This way we can easily switch between a test database server and the production server by changing one single entry in hosts. EVERYTHING works fine, and now comes the question: could it be that this is more performant, because there is one single point for resolving the SQL Server (name or IP address)? Is there something like a network cache / DNS cache? peace, Ice
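
    For illustration, the kind of hosts entry being described would look like this (the name and address are hypothetical):

        # C:\Windows\System32\drivers\etc\hosts on each client:
        # point the virtual name at whichever SQL Server is active (test or production)
        192.168.0.50    virtualsql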

    Read the article

  • How to change memory for a DomU at runtime

    - by saffron
    I have a Xen server with xen-4.1.3, linux-image-3.2.0-3-amd64, Debian Squeeze and 16 GB of RAM. Domain-0 has 1 GB of RAM; the rest of the memory belongs to the hypervisor. I want to start a guest domain with a minimal amount of memory and increase it at runtime later. When I start a guest domain with 256 MB of RAM and run xm mem-set domu 4Gb, I get only ~3 GB in the domU, and free in the guest says:

        root@test:~# free
                     total       used       free     shared    buffers     cached
        Mem:       2830620      72868    2757752          0       2432      43504
        -/+ buffers/cache:      26932    2803688
        Swap:      1048572          0    1048572

    And the guest's dmesg says:

        [ 0.000000] Memory: 175912k/2883584k available (3527k kernel code, 448k absent, 2707224k reserved, 3210k data, 612k init)

    When I start a guest domain with 2 GB of RAM, I can run xm mem-set domu 7Gb and get ~7 GB of RAM in the guest:

        root@test:~# free
                     total       used       free     shared    buffers     cached
        Mem:       6828228      74944    6753284          0       1328      12568
        -/+ buffers/cache:      61048    6767180
        Swap:      1048572          0    1048572

    And the guest's dmesg:

        [ 0.000000] Memory: 1674960k/16651264k available (3527k kernel code, 448k absent, 14975856k reserved, 3210k data, 612k init)

    How can I start a guest domain with a minimal amount of RAM (256 MB) and increase it later, up to 15 GB?
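
    A sketch of the usual approach (an assumption, not tested on this setup): the balloon driver can only grow a domU up to the ceiling it booted with, so start small but set maxmem high in the domU config:

        # domU config sketch, values illustrative
        memory = 256      # initial allocation in MB
        maxmem = 15360    # ceiling that xm mem-set can balloon up to

    With that in place, xm mem-set domu 15360 should be able to reach the full amount.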

    Read the article

  • Auth failed running command from shell script

    - by CSchulz
    I try to run the following command from a shell script:

        svn checkout http://url/ --username user --password password --non-interactive --no-auth-cache .

    It always fails with the following error:

        svn: OPTIONS of 'http://url/': authorization failed: Could not authenticate to server: rejected Basic challenge (http://url)

    Here is the call in my script:

        $(svn $command $url $auth --non-interactive --no-auth-cache .)

    Running the same command from the terminal works fine. What is the difference between running from a shell script and from the terminal?

    EDIT: Some version information: OS: Porteus 1.0, based on Slackware 13.3. Subversion: subversion-1.6.16-i486-1
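
    One detail worth noting, judging from the snippet alone: the $( ) wrapper runs svn inside command substitution, which captures its output and is unnecessary here. A hedged rewrite using the script's own variables:

        # Run svn directly rather than inside $( ) command substitution.
        # $auth is deliberately left unquoted: it expands to '--username user --password password'.
        svn "$command" "$url" $auth --non-interactive --no-auth-cache .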

    Read the article

  • Best server for mailing application [closed]

    - by Cyber Junkie
    My application is similar to a reminder service that reminds users of events they scheduled. I'm sending emails to users through a PHP script. I'm not sending one email to multiple recipients; each recipient receives a different message. I plan to use a cron job every minute and expect the application to send roughly 200 individual emails per hour (for a small user base that may grow). I don't have hosting experience with this type of application. I plan to start on a shared host and move up in the future to a VPS or dedicated server. Most shared hosts I looked into allow 50-100 emails per hour, with delays between mailings. Please kindly inform me what I should look for in a web host for this kind of application.
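
    For reference, the every-minute schedule described would be a single crontab entry along these lines (the script and log paths are hypothetical):

        # m h dom mon dow  command
        * * * * * php /home/user/app/send_reminders.php >> /home/user/reminders.log 2>&1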

    Read the article

  • Adding expire headers to content served from CDN?

    - by mdolon
    I'm using MaxCDN to serve content for my blog through W3 Total Cache. The problem I run into when evaluating my site with Google Page Speed and YSlow! is that expire headers are not being sent on content delivered from the CDN, nor is it coming from a cookieless domain. Is this completely in the hands of my CDN, or is it something I can fix in my server configuration? Some info about my setup: nginx with php-fastcgi, WordPress 3.0, W3 Total Cache 0.9a (dev release), MaxCDN. The site: http://devgrow.com/
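
    Since a pull CDN generally mirrors the caching headers it receives from the origin, one thing worth checking (a sketch, not specific to this site's config) is whether nginx itself sends far-future expires headers for static assets:

        # nginx origin config sketch: far-future expiry on static assets
        location ~* \.(css|js|png|jpe?g|gif|ico)$ {
            expires 30d;
            add_header Cache-Control "public";
        }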

    Read the article

  • Will ReadyBoost speed up a secondary partition or hard drive?

    - by Sebastian
    I'm building a new computer and plan on using a few spare USB sticks with ReadyBoost to cache disk reads. I'll be running Windows 7 Enterprise SP1 x64. I have a single 2 TB disk and plan to make a partition for Windows (100 GB) and use the rest for data. I know ReadyBoost will work nicely for the C drive, but I am unable to find any information on whether it will accelerate a second drive or partition. Just to clarify, I'm NOT trying to use the hard drive instead of the USB sticks; I'm trying to speed up the second partition using the USB sticks. So, will ReadyBoost work for a secondary partition or hard drive?

    Read the article

  • Alternative to cPanel (for email, etc.)

    - by Dboy1612
    I'm currently setting up a VPS for the first time. The standard I've always worked with on shared hosting is cPanel, but as the majority of the work I plan on doing from now on will use NodeJS and Python/Flask, I'd like to avoid needing to install Apache/MySQL/PHP. What would be my best bet for managing a mail server other than cPanel? Or even other specific server settings that may come in handy later? I plan on using Ubuntu, if that counts for anything.

    Read the article

  • Why won't Windows XP 64 install on my machine?

    - by user272671
    I have an HP xw8200 workstation and would like to install Windows XP 64-bit on it. The problem I seem to be having is that when I reboot the machine and try to boot from the CD, nothing happens. It seems the CD drive is not being located, or does not run my CD to start the install. I have already verified that Windows XP 64-bit can be installed on my machine and ensured I have the right BIOS and drivers for the install; the boot just does not take place. Does anyone have any idea how I can solve this problem? http://www.hp.com/workstations/pws/xw8200/xw8200.pdf http://h20331.www2.hp.com/Hpsub/cache/286710-0-0-225-121.html http://h20331.www2.hp.com/Hpsub/cache/286707-0-0-225-121.html Thanks in advance.

    Read the article

  • nginx location pathing issue

    - by Michael Jefferys
    I've got a pretty much default sites-enabled setup in my nginx on Debian Squeeze, and I'm now trying to get it to serve my Munin graphs at myhost/munin/. Here is the location I've added to the config:

        location /munin {
            root /var/cache/munin/www/;
            index index.htm index.html;
        }

    And here is the error I receive:

        2012/07/09 23:52:03 [error] 3598#0: *13 "/var/cache/munin/www/munin/index.htm" is not found (2: No such file or directory), client: 93.*.*.*, server: , request: "GET /munin/ HTTP/1.1", host: ""

    This setup used to 'just work' in Apache. I'm new to nginx, so I'm a bit lost as to why it's adding the extra /munin when looking for the path. Any advice?
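
    The doubled path comes from how root works: nginx appends the full request URI (/munin/...) to the root value. One common fix is alias, which substitutes the matched prefix instead of appending to it:

        # alias maps /munin/ straight onto the docroot, avoiding /munin/munin
        location /munin/ {
            alias /var/cache/munin/www/;
            index index.html index.htm;
        }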

    Read the article

  • Hardware needed for 2000 users? [closed]

    - by Trcx
    I have a school assignment that is fairly well defined, requiring us to come up with a plan for an environment serving dynamic web applications to 2000 users, and it should be able to scale up to six thousand. I have done plenty of research on load balancing, redundancy, UPSs, etc., but am having a hard time figuring out how much hardware is actually needed in terms of physical servers, RAM, processing power, etc. The assignment states that the server will have a lot of dynamic code, and that email and a database are required, all using the appropriate Microsoft services (MS SQL, Exchange, IIS). I already plan on splitting them onto separate servers, but can't even fathom the hardware requirements of something that large. Could someone with experience weigh in on this, or point me to some good articles?

    Read the article

  • Ubuntu 11.04 update fail!

    - by Robertini
    I tried to update Ubuntu Natty Narwhal, but I got this error:

        dpkg: regarding .../xserver-xorg-core_2%3a1.9.99.901+git20110131.be3be758-0ubuntu6_i386.deb containing xserver-xorg-core:
         xserver-xorg-core breaks xserver-xorg-video-8
          nvidia-current provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-cirrus provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-ark provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-tdfx provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-sisusb provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-rendition provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-vesa provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-fbdev provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-savage provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-vmware provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-openchrome provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-s3virge provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-voodoo provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-apm provides xserver-xorg-video-8 and is present and installed.
          xserver-xorg-video-sis provides xserver-xorg-video-8 and is present and installed.
        dpkg: error processing /var/cache/apt/archives/xserver-xorg-core_2%3a1.9.99.901+git20110131.be3be758-0ubuntu6_i386.deb (--unpack):
         installing xserver-xorg-core would break existing software
        (Reading database ... 178633 files and directories currently installed.)
        Preparing to replace xserver-xorg-video-cirrus 1:1.3.2-2ubuntu3 (using .../xserver-xorg-video-cirrus_1%3a1.3.2-2ubuntu5_i386.deb) ...
        Unpacking replacement xserver-xorg-video-cirrus ...
        Processing triggers for man-db ...
        Errors were encountered while processing:
         /var/cache/apt/archives/xserver-xorg-core_2%3a1.9.99.901+git20110131.be3be758-0ubuntu6_i386.deb
        E: Sub-process /usr/bin/dpkg returned an error code (1)

    Read the article

  • How about a new platform for your next API… a CMS?

    - by Elton Stoneman
    Originally posted on: http://geekswithblogs.net/EltonStoneman/archive/2014/05/22/how-about-a-new-platform-for-your-next-apihellip-a.aspx

    Say what? I'm seeing a type of API emerge which serves static or long-lived resources, which are mostly read-only and have a controlled process to update the data that gets served. Think of something like an app configuration API, where you want a central location for changeable settings. You could use this server side to store database connection strings and keep all your instances in sync, or it could be used client side to push changes out to all users (and potentially drive A/B or MVT testing).

    That's a good candidate for a RESTful API which makes proper use of HTTP expiration and validation caching to minimise traffic, but really you want a front-end UI where you can edit the current config that the API returns and publish your changes. Sounds like a Content Management System would be a good fit? I've been looking at that, and it's a great fit for this scenario. You get a lot of what you need out of the box, the amount of custom code you need to write is minimal, and you get a whole lot of extra stuff from using a CMS which is very useful, but probably not something you'd build if you had to put together a quick UI over your API content (like a publish workflow, fine-grained security and an audit trail).

    You typically use a CMS for HTML resources, but it's simple to expose JSON instead – or to do content negotiation to support both, so you can open a resource in a browser and see a nice visual representation, or request it with Accept=application/json and get the same content rendered as JSON for the app to use.

    Enter Umbraco

    Umbraco is an open source .NET CMS that's been around for a while. It has very good adoption, a lively community and a good release cycle. It's easy to use, has all the functionality you need for a CMS-driven API, and it's scalable (although you won't necessarily put much scale on the CMS layer).

    In the rest of this post, I'll build out a simple app config API using Umbraco. We'll define the structure of the configuration resource by creating a new Document Type and setting custom properties; then we'll build a very simple Razor template to return configuration documents as JSON; then create a resource and see how it looks. And we'll look at how you could build this into a wider solution.

    If you want to try this for yourself, it's ultra easy – there's an Umbraco image in the Azure Website gallery, so all you need to do is create a new Website, select Umbraco from the image and complete the installation. It will create a SQL Azure database to store all the content, as well as a Website instance for editing and accessing content. They're standard Azure resources, so you can scale them as you need. The default install creates a starter site for some HTML content, which you can use to learn your way around (or just delete).

    1. Create Configuration Document Type

    In Umbraco you manage content by creating and modifying documents, and every document has a known type defining what properties it holds. We'll create a new Document Type to describe some basic config settings. In the Settings section of the left navigation (spanner icon), expand Document Types and Master, hit the ellipsis and select to create a new Document Type. This will base your new type off the Master type, which gives you some existing properties that we'll use – like the Page Title, which will be the resource URL.
    In the Generic Properties tab for the new Document Type, you set the properties you'll be able to edit and return for the resource. Here I've added a text string where I'll set a default cache lifespan, an image which I can use for a banner display, and a date which could show the user when the next release is due. This is the sort of thing that sits nicely in an app config API. It's likely to change during the life of the product, but not very often, so it's good to have a centralised place where you can make and publish changes easily and safely. It also enables A/B and MVT testing, as you can change the response each client gets based on your set logic, and their apps will behave differently without needing a release.

    2. Define the response template

    Now we've defined the structure of the resource (as a document), in Umbraco we can define a C# Razor template to say how that resource gets rendered to the client. If you only want to provide JSON, it's easy to render the content of the document by building each property in the response (Umbraco uses dynamic objects, so you can specify document properties as object properties), or you can support content negotiation with very little effort. Here's a template to render the document as HTML or JSON depending on the Accept header, using JSON.NET for the API rendering:

        @inherits Umbraco.Web.Mvc.UmbracoTemplatePage
        @using Newtonsoft.Json
        @{
            Layout = null;
        }
        @if (UmbracoContext.HttpContext.Request.Headers["accept"] != null
             && UmbracoContext.HttpContext.Request.Headers["accept"] == "application/json")
        {
            Response.ContentType = "application/json";
            @Html.Raw(JsonConvert.SerializeObject(new {
                cacheLifespan = CurrentPage.cacheLifespan,
                bannerImageUrl = CurrentPage.bannerImage,
                nextReleaseDate = CurrentPage.nextReleaseDate
            }))
        }
        else
        {
            <h1>App configuration</h1>
            <p>Cache lifespan: <b>@CurrentPage.cacheLifespan</b></p>
            <p>Banner Image: </p>
            <img src="@CurrentPage.bannerImage">
            <p>Next Release Date: <b>@CurrentPage.nextReleaseDate</b></p>
        }

    That's a rough-and-ready example of what you can do. You could make it completely generic and just render all the document's properties as JSON, but having a specific template for each resource gives you control over what gets sent out. And the templates are evaluated at run-time, so if you need to change the output – or extend it, say to add caching response headers – you just edit the template and save, and the next client request gets rendered from the new template. No code to build and ship.

    3. Create the content

    With your document type created, in the Content pane you can create a new instance of that document, where Umbraco gives you a nice UI to input values for the properties we set up on the Document Type. Here I've set the cache lifespan to an xs:duration value, uploaded an image for the banner and specified a release date. Each property gets the appropriate input control – text box, file upload and date picker. At the top of the page is the name of the resource – myapp in this example. That specifies the URL for the resource, so if I had a DNS entry pointing to my Umbraco instance, I could access the config with a URL like http://static.x.y.z.com/config/myapp. The setup is all done now, so when we publish this resource it'll be available to access.

    4. Access the resource

    Now if you open that URL in the browser, you'll see the HTML version rendered, complete with the image and formatted date.
    Umbraco lets you save changes and preview them before publishing, so the HTML view could be a good way of showing editors their changes in a usable view before they confirm them. If you browse the same URL from a REST client, specifying the Accept=application/json request header, you get the JSON rendering instead. That's the exact same resource, with a managed UI to publish it, being accessed as HTML or JSON with a tiny amount of effort.

    5. The wider landscape

    If you have fairly stable content to expose as an API, I think this approach is really worth considering. Umbraco scales very nicely, but in a typical solution you probably wouldn't need it to. When you have additional requirements, like logging API access requests – but doing it out-of-band so clients aren't impacted – you can put a very thin API layer on top of Umbraco and cache the CMS responses in your API layer. The API does a passthrough to the CMS, so the CMS still controls the content, but the API caches the response. If the response is cached for 1 minute, then Umbraco only needs to handle 1 request per minute (multiplied by the number of API instances), so if you need to support 1000s of requests per second, you're scaling a thin, simple API layer rather than having to scale the more complex CMS infrastructure (including the database). This arrangement also supports logging, by asynchronously publishing a message to a queue (Redis in this case), which can be picked up later and persisted by a different process.

    Does it work? Beautifully. Using Azure, I spiked the solution above (including the Redis logging framework, which I'll blog about later) in half a day. That included setting up different roles in Umbraco to demonstrate a managed workflow for publishing changes, and a couple of document types representing different resources.

    Is it maintainable? We have three moving parts, which are all managed resources in Azure – an Azure Website for Umbraco, which may need a couple of instances for HA (or may not, depending on how long the content can be cached), a message queue (Redis is in preview in Azure, but you can easily use Service Bus Queues if performance is less of a concern), and the Web Role for the API. Two of the components are off-the-shelf, from open source projects, and the only custom code is the API, which is very simple.

    Does it scale? Pretty nicely. With a single Umbraco instance running as an Azure Website, and with 4x instances for my API layer (Standard sized Web Roles), I got just under 4,000 requests per second served reliably, with a Worker Role in the background saving the access logs. So we had a nice UI to publish app config changes, with a friendly Web preview and a publishing workflow, capable of supporting 14 million requests in an hour, with less than a day's effort. Worth considering if you're publishing long-lived resources through your API.

    Read the article
