Search Results

Search found 6412 results on 257 pages for 'intersystems cache'.

Page 6/257

  • how to set cache control to public in iis 7.5

    - by ivymike
    I'm trying to set cache control header to max age using the following snippet in my web.config: <system.webServer> <staticContent> <clientCache cacheControlMode="UseMaxAge" cacheControlMaxAge="1.00:00:00" /> </staticContent> </system.webServer> Some how this isn't being reflected in the response. Instead I see a Cache-Control: private header on the responses. I'm using NancyFx framework (which is a layer on top of Asp.net). Is there any thing else I need to do ? Below are the reponse headers I receive: HTTP/1.1 200 OK\r\n Cache-Control: private\r\n Content-Type: application/x-javascript\r\n Content-Encoding: gzip\r\n Last-Modified: Mon, 19 Mar 2012 16:42:03 GMT\r\n ETag: 8ced406593e38e7\r\n Vary: Accept-Encoding\r\n Server: Microsoft-IIS/7.5\r\n Nancy-Version: 0.9.0.0\r\n Set-Cookie: NCSRF=AAEAAAD%2f%2f%2f%2f%2fAQAAAAAAAAAMAgAAADxOYW5jeSwgVmVyc2lvbj0wLjkuMC4wLCBDdWx0dXJlPW5ldXRyYWwsIFB1YmxpY0tleVRva2VuPW51bGwFAQAAABhOYW5jeS5TZWN1cml0eS5Dc3JmVG9rZW4DAAAAHDxSYW5kb21CeXRlcz5rX19CYWNraW5nRmllbGQcPENyZWF0ZWREYXRlPmtfX0JhY2tpbmdGaWVsZBU8SG1hYz5rX19CYWNraW5nRmllbGQHAAcCDQICAAAACQMAAADTubwoldTOiAkEAAAADwMAAAAKAAAAAkpT5d9aTSzL3BAPBAAAACAAAAACPUCyrmSXQhkp%2bfrDz7lZa7O7ja%2fIg7HV9AW6RbPPRLYLAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA%3d; path=/; HttpOnly\r\n X-AspNet-Version: 4.0.30319\r\n Date: Tue, 20 Mar 2012 09:44:20 GMT\r\n Content-Length: 1624\r\n
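    A possible workaround (a sketch, not a confirmed fix): the IIS clientCache element typically only affects responses served by IIS's static-file handler, so if Nancy is generating these responses the header can be set from Nancy's own pipeline instead. The bootstrapper name and the one-day max-age below are illustrative assumptions, and the after-request hook is assumed to be available in the Nancy version in use.

        // Sketch: overwrite Cache-Control on every response Nancy produces.
        // CustomBootstrapper and the max-age value are placeholders, not from the question.
        using Nancy;
        using Nancy.Bootstrapper;
        using Nancy.TinyIoc;

        public class CustomBootstrapper : DefaultNancyBootstrapper
        {
            protected override void ApplicationStartup(TinyIoCContainer container, IPipelines pipelines)
            {
                base.ApplicationStartup(container, pipelines);

                pipelines.AfterRequest.AddItemToEndOfPipeline(ctx =>
                {
                    // Replaces the Cache-Control: private header seen in the responses above.
                    ctx.Response.Headers["Cache-Control"] = "public, max-age=86400";
                });
            }
        }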

    Read the article

  • Can varnish cache files without specific extension or residing in specific directory

    - by pataroulis
    I have a varnish installation to cache (MANY) images that my service serves. It is about 200 images of around 4k per second and varnish happily serves them according to the following rule: if (req.request == "GET" && req.url ~ "\.(css|gif|jpg|jpeg|bmp|png|ico|img|tga|wmf)$") { remove req.http.cookie; return(lookup); } Now, the thing is that I recently added another service on the same server that creates thumbnails to serve but it does not add a specific extension. The files are of the following filename pattern: http://www.example.com/thumbnails/date-of-thumbnail/xxxxxxxxx.xx where xx are numbers, so xxxxxxxxx.xx could be 6482364283.73 (two numbers at the end) (actually this is the timestamp so I can keep extra info in the filename) That has the side effect that varnish does not cache them and I see them constantly being served by apache itself. Even though I can change the format from now on to create thumbs ending in .jpg, is there a way to change the vcl file of my varnish daemon to either cache everything under a directory (the thumbnails directory) or everything with two numbers at its extension? Let me know if I can provide any additional info ! Thanks!
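    A sketch of one way to extend the VCL above (the path and the regex are assumptions based on the URL pattern described, not tested against this setup): either match everything under the thumbnails directory, or treat an all-digit "extension" as cacheable.

        # Sketch only: in vcl_recv, alongside the existing extension rule.
        # Matches /thumbnails/... paths or URLs ending in a purely numeric extension.
        if (req.request == "GET" &&
            (req.url ~ "^/thumbnails/" || req.url ~ "\.[0-9]+$")) {
            remove req.http.cookie;
            return(lookup);
        }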

    Read the article

  • disk write cache buffer and separate power supply

    - by HugoRune
    Windows has a setting to turn off the write-cache buffer (see image) Turn off Windows write-cache buffer flushing on the device To prevent data loss, do not select this check box unless the device has a separate power supply that allows the device to flush its buffer in case of power failure. Is it feasible and economical to get such a "separate power supply" for the internal SATA drives of a non-server PC? Under what name is such a power supply sold? I know that there are UPS devices that can be connected to external drives, but what is required to be able to switch this setting on safely for an internal disk? The setting has different descriptions in different versions of Windows. Windows XP: Enable write caching on the disk This setting enables write caching in Windows to improve disk performance, but a power outage or equipment failure might result in data loss or corruption. Windows Server 2003: Enable write caching on the disk Recommended only for disks with a backup power supply. This setting further improves disk performance, but it also increases the risk of data loss if the disk loses power. Windows Vista: Enable advanced performance Recommended only for disks with a backup power supply. This setting further improves disk performance, but it also increases the risk of data loss if the disk loses power. Windows 7 and 8: Turn off Windows write-cache buffer flushing on the device To prevent data loss, do not select this check box unless the device has a separate power supply that allows the device to flush its buffer in case of power failure. This article by Raymond Chen has some more detailed information about what the setting does.

    Read the article

  • Gain Quick Access to the Cache in Firefox

    - by Asian Angel
    Are you looking for a quick and simple way to view the contents of the cache in Firefox? Then you will definitely want to see how easy it can be using the CacheViewer extension. Note: CacheViewer is a front-end app for easily accessing and searching the memory cache. Before Viewing the cache in Firefox using “about:cache” provides some information about the contents but may not be the most efficient method available for some people. CacheViewer in Action Once you have installed the extension there are three easy ways to access your new cache viewer. The first is using the “CacheViewer Command” available in the “Tools Menu” and the second is using the keyboard shortcut “Ctrl + Shift + C”. The third way is by adding a “Toolbar Button” to your browser’s UI. All three work equally well…choose the method that best suits your personal needs. When you access the “CacheViewer Window” this is what it will look like. You may decide to resize it and move (or hide) some of the columns for the best viewing. You can easily scroll through the cache contents and preview images if desired as shown here. If you keep the “CacheViewer Window” open you can refresh it as you browse using the “Refresh Button” in the lower right corner. This is a nice, quick, and very simple way to access the cache on demand and save items to your hard-drive if desired. Note: The “CacheViewer” can also be set to open in a new tab instead (see “Options”). Options Choose whether “CacheViewer” opens in a separate window (default) or in a new tab. Conclusion If you want a quick and simple way to view the cache in Firefox then the CacheViewer extension is just what you have been looking for. Link Download the CacheViewer extension (Mozilla Add-ons)

    Read the article

  • Clarification On Write-Caching Policy, Its Underlying Options And How It Applies To Hard Drives And Solid-State Drives

    - by Boris_yo
    Over the last week, after doing more research on the subject, I have been wondering why I neglected all those years to understand write-caching policy, always leaving it on the default setting. Write-caching policy improves write performance and consists of write-back caching and write-cache buffer flushing. This is how I understand all the above, but correct me if I erred somewhere: Write-through caching itself is not a part of the write-caching policy per se; data is written to both the cache and the storage device, so if Windows needs that data again later, it is retrieved from the cache and not from the storage device, which improves only read performance since there is no need to wait for the storage device to read the required data again. Since data is still written to the storage device, write performance isn't improved, and there is no risk of data loss or corruption in case of power failure or system crash; only the data in the cache is lost. This option seems to be enabled by default and is recommended for removable devices, with no need for the user to use "Safely Remove Hardware". Write-back caching is similar to the above, but data is not written to the storage device immediately; it is periodically released from the cache and written to the storage device when it is idle. In my opinion this option improves both read and write performance, but it carries risk: if a power failure or system crash occurs, not only is data that was eventually to be written to the storage device lost, but file inconsistencies or a corrupted file system can result. Write-back caching cannot be enabled together with write-through caching, and it is not recommended if no backup power supply is available. Write-cache buffer flushing, I reckon, is similar to write-back caching but enables immediate release and writing of data from the cache to the storage device right before a power outage occurs, though I don't know whether it also applies to an occasional system crash. This option seems to be complementary to write-back caching, reducing or potentially eliminating the risk of data loss or file-system corruption. I have questions about the relevance of the last two options to today's modern SSDs, with the goal of getting the best performance with less wear: I know that traditional hard drives come with an onboard cache (I wonder what type of cache that is), but do SSDs also come with a cache? Assuming they do, is this cache faster than their NAND flash and system RAM, and worth taking the risk of utilizing it by enabling write-back caching? I read somewhere that a storage device's cache is generally faster than RAM, but I want to be sure. Additionally, I read that write caching should be enabled, since data that is to be written later to NAND flash is kept for a while in the cache, and provided there is data that gets modified a lot before finally being written, holding this data and releasing it periodically reduces the number of writes to the SSD, thereby reducing wear. Now, regarding write-cache buffer flushing, I heard that SSD controllers are so fast by themselves that enabling this option is not required, because they manage flushing. However, once again, I don't know whether SSDs have their own onboard cache and whether or not it is faster than their NAND flash and system RAM, because if it is, keeping this option enabled would make sense.
    Recently I posted a question about an issue with my Intel 330 SSD 120GB, which was the main reason to do this deeper research: I suspect the write-caching policy is the culprit behind the SSD's freezing issue, assuming the data being released is what causes the freezes. Currently I have write caching enabled and write-cache buffer flushing disabled, because I believe the SSD controller's management of write-cache flushing and Windows write-cache buffer flushing are conflicting with each other. Since I want to troubleshoot in small steps to finally determine the source of the issue, I have decided to start with the write-caching policy, then move on to drivers, switch to AHCI later on, and finally disable DIPM (device-initiated power management) through a registry modification, thanks to @TomWijsman.

    Read the article

  • What's So Smart About Oracle Exadata Smart Flash Cache?

    - by kimberly.billings
    Want to know what's so "smart" about Oracle Exadata Smart Flash Cache? This three minute video explains how Oracle Exadata Smart Flash Cache helps solve the random I/O bottleneck challenge and delivers extreme performance for consolidated database applications. Exadata Smart Flash Cache is a feature of the Sun Oracle Database Machine. With it, you get ten times faster I/O response time and use ten times fewer disks for business applications from Oracle and third-party providers. Read the whitepaper for more information.

    Read the article

  • Can I copy large files faster without using the file cache?

    - by Veazer
    After adding the preload package, my applications seem to speed up, but if I copy a large file, the file cache grows by more than double the size of the file. When I transfer a single 3-4 GB VirtualBox image or video file to an external drive, this huge cache seems to push all the preloaded applications out of memory, leading to increased load times and general performance drops. Is there a way to copy large, multi-gigabyte files without caching them (i.e. bypassing the file cache)? Or a way to whitelist or blacklist specific folders from being cached?

    Read the article

  • From HttpRuntime.Cache to Windows Azure Caching (Preview)

    - by Jeff
    I don’t know about you, but the announcement of Windows Azure Caching (Preview) (yes, the parentheses are apparently part of the interim name) made me a lot more excited about using Azure. Why? Because one of the great performance tricks of any Web app is to cache frequently used data in memory, so it doesn’t have to hit the database, a service, or whatever. When you run your Web app on one box, HttpRuntime.Cache is a sweet and stupid-simple solution. Somewhere in the data fetching pieces of your app, you can see if an object is available in cache, and return that instead of hitting the data store. I did this quite a bit in POP Forums, and it dramatically cuts down on the database chatter. The problem is that it falls apart if you run the app on many servers, in a Web farm, where one server may initiate a change to that data, and the others will have no knowledge of the change, making it stale. Of course, if you have the infrastructure to do so, you can use something like memcached or AppFabric to do a distributed cache, and achieve the caching flavor you desire. You could do the same thing in Azure before, but it would cost more because you’d need to pay for another role or VM or something to host the cache. Now, you can use a portion of the memory from each instance of a Web role to act as that cache, with no additional cost. That’s huge. So if you’re using a percentage of memory that comes out to 100 MB, and you have three instances running, that’s 300 MB available for caching. For the uninitiated, a Web role in Azure is essentially a VM that runs a Web app (worker roles are the same idea, only without the IIS part). You can spin up many instances of the role, and traffic is load balanced to the various instances. It’s like adding or removing servers to a Web farm all willy-nilly and at your discretion, and it’s what the cloud is all about. I’d say it’s my favorite thing about Windows Azure. The slightly annoying thing about developing for a Web role in Azure is that the local emulator that’s launched by Visual Studio is a little on the slow side. If you’re used to using the built-in Web server, you’re used to building and then alt-tabbing to your browser and refreshing a page. If you’re just changing an MVC view, you’re not even doing the building part. Spinning up the simulated Azure environment is too slow for this, but ideally you want to code your app to use this fantastic distributed cache mechanism. So first off, here’s the link to the page showing how to code using the caching feature. If you’re used to using HttpRuntime.Cache, this should be pretty familiar to you. Let’s say that you want to use the Azure cache preview when you’re running in Azure, but HttpRuntime.Cache if you’re running local, or in a regular IIS server environment. Through the magic of dependency injection, we can get there pretty quickly. First, design an interface to handle the cache insertion, fetching and removal. 
Mine looks like this: public interface ICacheProvider {     void Add(string key, object item, int duration);     T Get<T>(string key) where T : class;     void Remove(string key); } Now we’ll create two implementations of this interface… one for Azure cache, one for HttpRuntime: public class AzureCacheProvider : ICacheProvider {     public AzureCacheProvider()     {         _cache = new DataCache("default"); // in Microsoft.ApplicationServer.Caching, see how-to      }         private readonly DataCache _cache;     public void Add(string key, object item, int duration)     {         _cache.Add(key, item, new TimeSpan(0, 0, 0, 0, duration));     }     public T Get<T>(string key) where T : class     {         return _cache.Get(key) as T;     }     public void Remove(string key)     {         _cache.Remove(key);     } } public class LocalCacheProvider : ICacheProvider {     public LocalCacheProvider()     {         _cache = HttpRuntime.Cache;     }     private readonly System.Web.Caching.Cache _cache;     public void Add(string key, object item, int duration)     {         _cache.Insert(key, item, null, DateTime.UtcNow.AddMilliseconds(duration), System.Web.Caching.Cache.NoSlidingExpiration);     }     public T Get<T>(string key) where T : class     {         return _cache[key] as T;     }     public void Remove(string key)     {         _cache.Remove(key);     } } Feel free to expand these to use whatever cache features you want. I’m not going to go over dependency injection here, but I assume that if you’re using ASP.NET MVC, you’re using it. Somewhere in your app, you set up the DI container that resolves interfaces to concrete implementations (Ninject call is a “kernel” instead of a container). For this example, I’ll show you how StructureMap does it. It uses a convention based scheme, where if you need to get an instance of IFoo, it looks for a class named Foo. You can also do this mapping explicitly. The initialization of the container looks something like this: ObjectFactory.Initialize(x =>             {                 x.Scan(scan =>                         {                             scan.AssembliesFromApplicationBaseDirectory();                             scan.WithDefaultConventions();                         });                 if (Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.IsAvailable)                     x.For<ICacheProvider>().Use<AzureCacheProvider>();                 else                     x.For<ICacheProvider>().Use<LocalCacheProvider>();             }); If you use Ninject or Windsor or something else, that’s OK. Conceptually they’re all about the same. The important part is the conditional statement that checks to see if the app is running in Azure. If it is, it maps ICacheProvider to AzureCacheProvider, otherwise it maps to LocalCacheProvider. Now when a request comes into your MVC app, and the chain of dependency resolution occurs, you can see to it that the right caching code is called. A typical design may have a call stack that goes: Controller –> BusinessLogicClass –> Repository. 
Let’s say your repository class looks like this: public class MyRepo : IMyRepo {     public MyRepo(ICacheProvider cacheProvider)     {         _context = new MyDataContext();         _cache = cacheProvider;     }     private readonly MyDataContext _context;     private readonly ICacheProvider _cache;     public SomeType Get(int someTypeID)     {         var key = "somename-" + someTypeID;         var cachedObject = _cache.Get<SomeType>(key);         if (cachedObject != null)         {             _context.SomeTypes.Attach(cachedObject);             return cachedObject;         }         var someType = _context.SomeTypes.SingleOrDefault(p => p.SomeTypeID == someTypeID);         _cache.Add(key, someType, 60000);         return someType;     } ... // more stuff to update, delete or whatever, being sure to remove // from cache when you do so  When the DI container gets an instance of the repo, it passes an instance of ICacheProvider to the constructor, which in this case will be whatever implementation was specified when the container was initialized. The Get method first tries to hit the cache, and of course doesn’t care what the underlying implementation is, Azure, HttpRuntime, or otherwise. If it finds the object, it returns it right then. If not, it hits the database (this example is using Entity Framework), and inserts the object into the cache before returning it. The important thing not pictured here is that other methods in the repo class will construct the key for the cached object, in this case “somename-“ plus the ID of the object, and then remove it from cache, in any method that alters or deletes the object. That way, no matter what instance of the role is processing the request, it won’t find the object if it has been made stale, that is, updated or outright deleted, forcing it to attempt to hit the database. So is this good technique? Well, sort of. It depends on how you use it, and what your testing looks like around it. Because of differences in behavior and execution of the two caching providers, for example, you could see some strange errors. For example, I immediately got an error indicating there was no parameterless constructor for an MVC controller, because the DI resolver failed to create instances for the dependencies it had. In reality, the NuGet packaged DI resolver for StructureMap was eating an exception thrown by the Azure components that said my configuration, outlined in that how-to article, was wrong. That error wouldn’t occur when using the HttpRuntime. That’s something a lot of people debate about using different components like that, and how you configure them. I kinda hate XML config files, and like the idea of the code-based approach above, but you should be darn sure that your unit and integration testing can account for the differences.

    Read the article

  • Cached ObjectDataSource not firing Select Event even Cache Dependecy Removed

    - by John Polvora
    I have the following scenario. A page with a DetailsView bound to an ObjectDataSource with caching enabled. The SelectMethod is assigned in the Page_Load event, depending on my user-level logic. After assigning the SelectMethod and parameters for the ODS, if the cache entry does not exist, the ODS result will be cached the first time. The next time, the cache will be applied to the ODS and the Select event doesn't need to fire since the data result is cached. The problem is, the ODS cache works fine, but I have a Refresh button to clear the cache and rebind the DetailsView. Am I doing this correctly? Below is my code. <asp:DetailsView ID="DetailsView1" runat="server" DataSourceID="ObjectDataSource_Summary" EnableModelValidation="True" EnableViewState="False" ForeColor="#333333" GridLines="None"> </asp:DetailsView> <asp:ObjectDataSource ID="ObjectDataSource_Summary" runat="server" SelectMethod="" TypeName="BL.BusinessLogic" EnableCaching="true"> <SelectParameters> <asp:Parameter Name="idCompany" Type="String" /> </SelectParameters> </asp:ObjectDataSource> <asp:ImageButton ID="ImageButton_Refresh" runat="server" OnClick="RefreshClick" ImageUrl="~/img/refresh.png" /> And here is the code-behind: public partial class Index : Page { protected void Page_Load(object sender, EventArgs e) { ObjectDataSource_Summary.SelectMethod = ""; ObjectDataSource_Summary.SelectParameters[0].DefaultValue = ""; switch (this._loginData.UserLevel) // this is a struct I use to control permissions and page behaviour { case OperNivel.SysAdmin: case OperNivel.SysOperator: { ObjectDataSource_Summary.SelectMethod = "SystemSummary"; ObjectDataSource_Summary.SelectParameters[0].DefaultValue = "0"; break; } case OperNivel.CompanyAdmin: case OperNivel.CompanyOperator: { ObjectDataSource_Summary.SelectMethod = "CompanySummary"; ObjectDataSource_Summary.SelectParameters[0].DefaultValue = this._loginData.UserLevel.ToString(); break; } default: break; } } protected void Page_LoadComplete(object sender, EventArgs e) { if (Cache[ObjectDataSource_Summary.CacheKeyDependency] == null) { this._loginData.LoginDatetime = DateTime.Now; Session["loginData"] = _loginData; Cache[ObjectDataSource_Summary.CacheKeyDependency] = _loginData; DetailsView1.DataBind(); } } protected void RefreshClick(object sender, ImageClickEventArgs e) { Cache.Remove(ObjectDataSource_Summary.CacheKeyDependency); } } Can anyone help me? The Select() event of the ObjectDataSource is not firing even after I remove the CacheKeyDependency entry.
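    One detail worth checking (a sketch of a possible approach, not a verified fix): in the code shown, CacheKeyDependency is never assigned, so the key being removed may be empty. Explicitly naming the dependency key, seeding it before the data is cached, and rebinding right after removing it is one combination to try; "SummaryCacheKey" below is an illustrative name, not from the question.

        // Hypothetical sketch: give the ODS an explicit dependency key, make sure the
        // key exists before the result is cached, and force a rebind on refresh.
        protected void Page_Load(object sender, EventArgs e)
        {
            ObjectDataSource_Summary.CacheKeyDependency = "SummaryCacheKey";
            if (Cache["SummaryCacheKey"] == null)
            {
                Cache["SummaryCacheKey"] = DateTime.Now; // any object can serve as the dependency
            }
            // ... existing SelectMethod / SelectParameters logic ...
        }

        protected void RefreshClick(object sender, ImageClickEventArgs e)
        {
            Cache.Remove("SummaryCacheKey");          // invalidates the ODS cached result
            Cache["SummaryCacheKey"] = DateTime.Now;  // re-seed before the next select is cached
            DetailsView1.DataBind();                  // rebind in this postback with fresh data
        }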

    Read the article

  • RAID 0 Volatile Volume Cache Mode configuration

    - by SnippetSpace
    I discovered that in IRST there is an option to set a cache mode for my 3-SSD RAID 0 array. I've read the documentation from Intel and have some questions: Are there any overall benefits/risks from enabling a cache mode? As I'm on a laptop, would write-back be recommended? I read it increases the chance of data loss on power interruption. What is the difference between how Windows handles data integrity and how the Intel driver does? Read-only mode seems to have the benefit of faster reads; does it have any downsides? Thanks for your help guys!

    Read the article

  • tweak windows 7 virtual memory and cache / caching settings

    - by bortao
    I'm on Windows 7 64-bit with 4 GB of memory. Whenever I copy or deal with a big amount of data, Windows swaps everything out of memory to the virtual-memory swap file to make room for the data cache. The problem is: I don't really need caching of the data I'm copying; it's being copied only once, so caching it won't help me. On the other hand, swapping out the programs gives me a big lag whenever I want to use those open programs again. What I want: restrict the data cache to a certain amount, let's say 1 GB, or reserve a certain amount of memory, let's say 2 GB, exclusively for running programs. My swap file is on a separate partition, but I still have problems with swapping time.

    Read the article

  • How to place SuperFetch cache on an SSD?

    - by Ian Boyd
    I'm thinking of adding a solid state drive (SSD) to my existing Windows 7 installation. I know I can (and should) move my paging file to the SSD: Should the pagefile be placed on SSDs? Yes. Most pagefile operations are small random reads or larger sequential writes, both of which are types of operations that SSDs handle well. In looking at telemetry data from thousands of traces and focusing on pagefile reads and writes, we find that Pagefile.sys reads outnumber pagefile.sys writes by about 40 to 1, Pagefile.sys read sizes are typically quite small, with 67% less than or equal to 4 KB, and 88% less than 16 KB. Pagefile.sys writes are relatively large, with 62% greater than or equal to 128 KB and 45% being exactly 1 MB in size. In fact, given typical pagefile reference patterns and the favorable performance characteristics SSDs have on those patterns, there are few files better than the pagefile to place on an SSD. What I don't know is if I even can put a SuperFetch cache (i.e. ReadyBoost cache) on the solid state drive. I want to get the benefit of Windows being able to cache gigabytes of frequently accessed data on a relativly small (e.g. 30GB) solid state drive. This is exactly what SuperFetch+ReadyBoost (or SuperFetch+ReadyDrive) was designed for. Will Windows offer (or let) me place a ReadyBoost cache on a solid state flash drive connected via SATA? A problem with the ReadyBoost cache over the ReadyDrive cache is that the ReadyBoost cache does not survive between reboots. The cache is encrypted with a per-session key, making its existing contents unusable during boot and SuperFetch pre-fetching during login. Update One I know that Windows Vista limited you to only one ReadyBoost.sfcache file (I do not know if Windows 7 removed that limitation): Q: Can use use multiple devices for EMDs? A: Nope. We've limited Vista to one ReadyBoost per machine Q: Why just one device? A: Time and quality. Since this is the first revision of the feature, we decided to focus on making the single device exceptional, without the difficulties of managing multiple caches. We like the idea, though, and it's under consideration for future versions. I also know that the 4GB limit on the cache file was a limitation of the FAT filesystem used on most USB sticks - an SSD drive would be formatted with NTFS: Q: What's the largest amount of flash that I can use for ReadyBoost? A: You can use up to 4GB of flash for ReadyBoost (which turns out to be 8GB of cache w/ the compression) Q: Why can't I use more than 4GB of flash? A: The FAT32 filesystem limits our ReadyBoost.sfcache file to 4GB Can a ReadyBoost cache on an NTFS volume be larger than 4GB? Update Two The ReadyBoost cache is encrypted with a per-boot session key. This means that the cache has to be re-built after each boot, and cannot be used to help speed boot times, or latency from login to usable. Windows ReadyDrive technology takes advantage of non-volatile (NV) memory (i.e. flash) that is incorporated with some hybrid hard drives. This flash cache can be used to help Windows boot, or resume from hibernate faster. Will Windows 7 use an internal SSD drive as a ReadyBoost/*ReadyDrive*/SuperFetch cache? Is it possible to make Windows store a SuperFetch cache (i.e. ReadyBoost) on a non-removable SSD? Is it possible to not encrypt the ReadyBoost cache, and if so will Windows 7 use the cache at boot time? See also SuperUser.com: ReadyBoost + SSD = ? Windows 7 - ReadyBoost & SSD drives? 
Support and Q&A for Solid-State Drives Using SDD as a cache for HDD, is there a solution? Performance increase using SSD for paging/fetch/cache or ReadyBoost? (Win7) Windows 7 To Boost SSD Performance How to Disable Nonvolatile Caching

    Read the article

  • Strange Unclearable Cache Issue with Gmail and Google Apps

    - by Brian
    I am having a strange issue with Gmail and Google Apps... have a look at this screenshot: http://cld.ly/f51ume. Notice the missing images for the rounded corners? That by itself is not such a problem, but something resembling a cache issue is causing this as well as a missing background image, and most importantly, chat and other "clickable" features aren't working. I've already cleared my cache multiple times and quit and restarted Firefox with no change. Everything is OK in other browsers. Any other debugging suggestions?

    Read the article

  • 503 error Varnish cache when eAccelerator is started

    - by Netismine
    I have a Magento installation running on an x-large Amazon server. I have Varnish, memcached and eAccelerator installed on the server. At first everything was working fine, but then at some point it stopped working, throwing a 503 error with the Varnish cache stamp below it. When I disable eAccelerator, the error is gone and the site works. This is my eAccelerator config: extension="eaccelerator.so" eaccelerator.shm_size = "512" eaccelerator.cache_dir = "/var/cache/php-eaccelerator" eaccelerator.enable = "1" eaccelerator.optimizer = "1" eaccelerator.debug = 0 eaccelerator.log_file = "/var/log/httpd/eaccelerator_log" eaccelerator.name_space = "" eaccelerator.check_mtime = "1" eaccelerator.filter = "" eaccelerator.shm_ttl = "0" eaccelerator.shm_prune_period = "0" eaccelerator.shm_only = "0" eaccelerator.allowed_admin_path = "" Any hints?

    Read the article

  • Force Windows to cache executables without running them?

    - by Josh Einstein
    Is there a way to force Windows to pre-load certain EXE/DLL binaries into its prefetch/superfetch cache as if they had been executed? I have a particular application that loads pretty slowly on first run, but if it's "warm" (recently executed) it starts pretty quickly. I'd like to prime the cache early in the background before the application is needed. But since it shows a UI, I'm looking for a way to do this silently, so simply launching the application isn't ideal. Thank you in advance. Prompted by David's suggestion in the comments, I wrote a PowerShell script to memory-map the files, seek to the end, and close them. I haven't done any controlled tests yet and it could just be my imagination, but Sublime Text (the application in question) appeared to load much more quickly this time around and I haven't used it for several hours.

    Read the article

  • using one disk as cache for others

    - by HugoRune
    Hi. Given a PC with several hard drives: is it possible to use one fast disk as a giant file cache? I.e. automatically copying frequently accessed data to that one disk, and transparently redirecting reads and writes to that disk, so that the other drives would only have to be accessed occasionally (writes would have to be forwarded to the other disks after a while, of course). Advantages: the other drives could be powered down most of the time, reducing power, heat, and noise; the speed of the other drives would not matter much; the cache disk could be solid state. How can I set such a system up? What OS supports these options? Is this possible at all using Windows or Linux?

    Read the article

  • Google Chrome not using local cache

    - by Steve
    Hi. I've been using Google Chrome as a substitute for Firefox, which can't handle having lots of tabs open at the same time. Unfortunately, it looks like Chrome is having the same problem. Freakin useless. I had to end Chrome as my whole system had slowed to a crawl. When I restarted it, I opted to restore the tabs that were last open. At this stage, every one of the 20+ tabs started downloading the pages they had previously had open. My question is: why can't they open a locally stored/saved copy of the web page from the cache? Does Google Chrome store pages in a cache? Also: after most of the pages had completed their downloading, I clicked on each tab to view the page. Half of them only display a white page, and I have to reload the page manually. What is causing this? Thanks for your help.

    Read the article

  • HTTP Cache Control max-age, must-revalidate

    - by nyb
    I have a couple of queries related to Cache-Control. If I specify Cache-Control "max-age=3600, must-revalidate" for a static html/js/images/css file, with a Last-Modified header defined in the HTTP headers: a. Does a browser/proxy cache (like Squid/Akamai) go all the way to the origin server to validate before max-age expires? Or will it serve content from the cache until max-age expires? b. After max-age expiry (that is, expiry from the cache), is there an IMS (If-Modified-Since) check, or is content re-downloaded from the origin server without an IMS check?

    Read the article

  • MySQL query cache and PHP variables

    - by Saif Bechan
    I have seen the following statement made about the query cache: // query cache does NOT work $r = mysql_query("SELECT username FROM user WHERE signup_date >= CURDATE()"); // query cache works! $today = date("Y-m-d"); $r = mysql_query("SELECT username FROM user WHERE signup_date >= '$today'"); So the query cache only works on the second query. I was wondering if the query cache will also work on this: define('__TODAY',date("Y-m-d")); $r = mysql_query("SELECT username FROM user WHERE signup_date >= '".__TODAY."'");

    Read the article

  • How do I take advantage of Android's "Clear Cache" button

    - by Jay Askren
    In Android's settings, in the "Manage Applications" activity when clicking on an app, the data is broken down into Application, Data, and cache. There is also a button to clear the cache. My app caches audio files and I would like the user to be able to clear the cache using this button. How do I store them so they get lumped in with the cache and the user can clear them? I've tried storing files using both of the following techniques: newFile = File.createTempFile("mcb", ".mp3", context.getCacheDir()); newFile = new File(context.getCacheDir(), "mcb.mp3"); newFile.createNewFile(); In both cases, these files are listed as Data and not Cache.

    Read the article

  • how to disable web page cache throughout the servlets

    - by Kurt
    To disable caching of a web page, I did something like this in a method of the Java controller servlet: public ModelAndView home(HttpServletRequest request, HttpServletResponse response) throws Exception { ModelAndView mav = new ModelAndView(ViewConstants.MV_MAIN_HOME); mav.addObject("testing", "Test this string"); mav.addObject(request); response.setHeader("Cache-Control", "no-cache, no-store"); response.setHeader("Pragma", "no-cache"); response.setDateHeader("Expires", 0); return mav; } But this only works for a particular response object. I have many similar methods in a servlet. And I have many servlets too. If I want to disable caching throughout the application, what should I do? (I do not want to add the above code for every single response object.) Thanks in advance.

    Read the article

  • SQL Server Manageability Series: how to change the default path of .cache files of a data collector? #sql #mdw #dba

    - by ssqa.net
    How do you change the default path of the .cache files of a data collector after the Management Data Warehouse (MDW) has been set up? This was the question asked by one of the DBAs at a client's site; I instantly enquired whether any folder had been specified while setting up the MDW, and the obvious answer was no, as it was left at the default. This means all the .CACHE files are stored under the %C\TEMP directory, which may pose an out-of-disk-space problem on the server where the MDW is set up to collect. Going back...

    Read the article

  • Would there be any negative side-effects of sharing /var/cache/apt/ between two systems?

    - by ændrük
    In the interest of conserving bandwidth, I'm considering mounting a VirtualBox host's /var/cache/apt as /var/cache/apt in the guest. Both host and guest are Ubuntu 10.10 32-bit. Would there be any negative consequences to doing this? I'm aware of the more robust solutions like apt-proxy, but I'd prefer this simpler solution if it's possible in order to spare the host the overhead of running extra services.

    Read the article

  • jboss cache as hibernate 2nd level - cluster node doesn't persist replicated data

    - by Sergey Grashchenko
    I'm trying to build an architecture basically as described in the user guide http://www.jboss.org/file-access/default/members/jbosscache/freezone/docs/3.2.1.GA/userguide_en/html/cache_loaders.html#d0e3090 (replicated caches with each cache having its own store), but with JBoss Cache configured as the Hibernate second-level cache. I've read the manual for several days and played with the settings, but could not achieve the result: the data in memory (JBoss Cache) gets replicated across the hosts, but it is not persisted in the datasource/database of the target (non-original) cluster host. I had hoped that a node might become persistent at eviction, so I wrote a cache listener and attached it to the @NodeEvicted event. I found that though I could adjust the eviction policy to fully control it, no persistence takes place. Then I thought I could try to modify the CacheLoader to set "passivate" to true, but I found that in my case (Hibernate 2nd-level cache) I don't have a way to access a loader. I wonder if replicated data persistence is possible at all through configuration tuning? If not, will it work for me to add some manual persistence in the CacheListener (I could check whether the eviction event is local, and if not, persist it to the Hibernate datasource somehow)? I've used the mvcc-entity configuration with cacheMode modified to REPL_ASYNC. I've also played with the eviction policy configuration. The last thing to mention is that I've tested entity persistence and replication in a project that was generated with Seam. I guess it's not important though.

    Read the article
