Search Results

Search found 49453 results on 1979 pages for 'memory mapped files'.

Page 110/1979

  • Windows 7 offline files - work temporarily offline even if network connection works

    - by Robert
    Sometimes I am connected via VPN to a network containing the server whose files are cached by the Windows Offline Files feature. Sometimes the connection works well and working this way is not a problem; at other times it is quite a pain because of high latency when working with the files in Windows Explorer. Is there an interactive way for a user (with admin permissions) to temporarily suspend online use of offline files? I have already activated the "Transparent caching" group policy (Computer Configuration > Policies > Administrative Templates > Network > Offline Files) with a network latency of 200 ms, but in my experience online use is still quite tenacious even when ping times to the file server are under 40 ms. Setting a lower latency threshold makes the offline files toggle frequently, which causes problems for applications that work with several files and need them to stay consistent (such as an SVN client).

    Read the article

  • How to make FileZilla open all the required files with one click

    - by Omar Tariq
    Is there any way of configuring FileZilla so that I can open all the files on a server that I need to edit with just one click? For example, if the files are:
    /home/abc/def/one.txt
    /home/abc/def/yet/another/directory/two.txt
    /home/abc/def/ghi/yet/another/directory/three.txt
    then it is very time-consuming to navigate into each directory and open the required files. These are only 3 files, but what about 10 to 20? Yes, copying the directory paths helps somewhat, but a built-in button such as "open all the required files of this connection" that opens them all in the editor (as set in FileZilla preferences) would be great!

    Read the article

  • Adobe Illustrator Saving to PSD: "Not enough memory to save the file"

    - by fiskfisk
    This is on CS5.5 under Windows XP Professional. There seems to be a known issue (thoroughly discussed elsewhere) with saving large Adobe Illustrator files to PSD, where the exporter complains "Not enough memory to save the file". It happens regardless of the memory available on the computer and appears to be a limitation of the PSD exporter itself. The only workaround so far seems to be copying and pasting each layer separately from the Illustrator file into the open Photoshop file. We need to keep the layers intact (not merged), so selecting all the layers at once doesn't work. Does anyone have a workaround for the actual, original export issue, or a way to get the layer information into Photoshop without handling each layer separately?

    Read the article

  • Disable "System Memory Testing" via OMSA 6.4.0

    - by EGr
    Is it possible to disable system memory testing via OMSA 6.4.0? I can only find ways to do it using newer versions of OMSA, and I can't even see the setting in 6.4.0. I have quite a few machines on which I want to disable this (BIOS) setting, but I don't want to have to install the newer OMSA and reboot. My intention is to disable the setting so that when the systems are rebooted in the future they don't have to go through the memory test. If this can be done another way, without OMSA and without changing the BIOS settings manually, I would be open to that as well.

    Read the article

  • Reboot VPS by reaching memory limit

    - by Ali
    When a server uses more memory than the available RAM, the host will shut the virtual machine down. It is then only possible to boot it from outside (via the VPS control panel, e.g. vePortal or SolusVM). It should therefore be possible to schedule a reboot before such a shutdown happens. What is the best practical method to check the used memory and reboot the system upon reaching, say, 90% of the allowed RAM? Is there a common program or script to do so? I am using Debian/Ubuntu.
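    A minimal sketch of the kind of watchdog the question asks for, assuming a Linux guest where /proc/meminfo is readable and the script runs as root (for example from a one-minute cron job); the 90% threshold and the shutdown command are illustrative choices:

        #!/usr/bin/env python3
        # Memory watchdog sketch: reboot when used memory exceeds a threshold.
        # Assumes Linux (/proc/meminfo) and root privileges for the reboot call.
        import subprocess

        THRESHOLD = 0.90  # reboot when 90% of RAM is in use

        def meminfo_kib(field):
            """Read a field (e.g. 'MemTotal') from /proc/meminfo, in KiB."""
            with open("/proc/meminfo") as f:
                for line in f:
                    if line.startswith(field + ":"):
                        return int(line.split()[1])
            raise KeyError(field)

        total = meminfo_kib("MemTotal")
        try:
            # MemAvailable is the kernel's estimate of memory usable without swapping.
            available = meminfo_kib("MemAvailable")
        except KeyError:
            # Very old kernels lack MemAvailable; MemFree is a cruder fallback.
            available = meminfo_kib("MemFree")

        if 1.0 - available / total >= THRESHOLD:
            # Reboot cleanly before the host kills the VPS.
            subprocess.run(["/sbin/shutdown", "-r", "now"], check=False)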

    Read the article

  • What is stored in %Windir%\System32\LogFiles\WMI\RtBackup?

    - by Helge Klein
    I occasionally notice in Resource Monitor hard-disk activity related to ETL files in the folder C:\Windows\System32\LogFiles\WMI\RtBackup. Which process or service creates these ETL files, and what is their purpose? Resource Monitor shows "System" as the process, which is correct since ETW traces (that is what ETL files are) are written by the kernel, but I am interested in the process that causes the traces to be created. This happens on Windows 7, by the way.

    Read the article

  • Is there a way to communicate with a DBMS via raw memory blocks or binaries?

    - by darkcminor
    I am trying to connect a numerical matrix library like LAPACK to a DBMS. Is it possible to send and receive complete matrices as binary data, or as direct memory pointers, so that the outside library can process data stored in the DBMS, do some heavy matrix computation, and hand the result back as a memory block or binary value? The main purpose is speed: avoiding a round trip through a flat file and, not least, using the library to do efficiently some operations that DBMSs are not designed for. Do Oracle, SQL Server, or MySQL support this technique?
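    A minimal sketch of the binary-blob half of this, using SQLite from the Python standard library purely as a stand-in (Oracle, SQL Server and MySQL expose the same idea through their BLOB types, though the client APIs differ); NumPy stands in for the external LAPACK-style library, and the float64 row-major layout is an assumption:

        # Move a matrix between a DBMS and numerical code as a raw binary blob,
        # avoiding the flat-file round trip mentioned above.
        import sqlite3
        import numpy as np

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE matrices (name TEXT PRIMARY KEY, rows INT, cols INT, data BLOB)")

        # Write: serialize the matrix buffer directly, with its shape alongside.
        a = np.random.rand(500, 500)
        conn.execute("INSERT INTO matrices VALUES (?, ?, ?, ?)",
                     ("a", a.shape[0], a.shape[1], a.tobytes()))

        # Read: rebuild the matrix from the blob without touching disk.
        rows, cols, blob = conn.execute(
            "SELECT rows, cols, data FROM matrices WHERE name = ?", ("a",)).fetchone()
        b = np.frombuffer(blob, dtype=np.float64).reshape(rows, cols)

        # The numerical library (LAPACK via NumPy here) then works on the in-memory copy.
        eigenvalues = np.linalg.eigvals(b)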

    Read the article

  • How to compare mp3, flac audio data in a file, ignoring header data (ID3 tag) etc.?

    - by Rob
    I backed up some audio files in two places and added ID3 tags to one backup but not the other. Since then my memory has faded on whether the backups are actually the same, but now one has ID3 data and the other doesn't, so a basic binary compare will fail and manual inspection would be cumbersome. Is there a tool to compare just the audio data (not the header/ID3 tag) in MP3s, FLAC files, and other formats that carry header data such as ID3? I started a thread on Beyond Compare here: http://www.scootersoftware.com/vbulletin/showthread.php?t=7413 but would consider other comparison software that does this.
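    For plain MP3s the idea reduces to stripping the ID3v2 block at the front and the 128-byte ID3v1 block at the end, then hashing what remains. A rough sketch with hypothetical file paths (it ignores rarities such as ID3v2 footers or APE tags, and FLAC would need its own header handling):

        # Hash only the audio payload of an MP3, skipping ID3v2/ID3v1 tags,
        # so two copies that differ only in tags compare as equal.
        import hashlib

        def audio_md5(path):
            with open(path, "rb") as f:
                data = f.read()
            start, end = 0, len(data)
            # ID3v2: 10-byte header, size stored in four 7-bit ("synchsafe") bytes.
            if data[:3] == b"ID3":
                size = (data[6] << 21) | (data[7] << 14) | (data[8] << 7) | data[9]
                start = 10 + size
            # ID3v1: fixed 128-byte block at the very end of the file.
            if data[-128:-125] == b"TAG":
                end -= 128
            return hashlib.md5(data[start:end]).hexdigest()

        print(audio_md5("backup1/song.mp3") == audio_md5("backup2/song.mp3"))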

    Read the article

  • Best memory-efficient web browser for Ubuntu?

    - by Steve K
    I've installed Ubuntu 10.10 on an old laptop with only 756 MB of RAM and a Pentium M 1.6 GHz processor. I'm using Google Chrome 11.0 (dev channel) for web browsing, and it appears to be using up most of my memory and processor time. Does anyone know of a browser lighter than Chrome on Ubuntu for an older computer like mine? I'm new to Ubuntu, so there may also be tweaks I can make to my existing system to make it perform better, but right now it's pretty slow when I have 5-10 tabs open. Related question: memory-efficient web-browser

    Read the article

  • Why can't gif images copy at a reasonable speed on this dell laptop with XP?

    - by alt234
    I've got a somewhat old Dell Latitude D810. Strangest thing: if I try to copy anything that contains GIF files, the GIFs take forever, a few minutes per file regardless of size. Everything else copies fine. I notice this when copying files off our network, copying from multiple external drives, and even when files are copied during an installation process. I'm on Windows XP Pro Service Pack 3. I've never seen anything like this before. Anyone else?

    Read the article

  • Linux: don't use file system cache under a directory

    - by GetFree
    For a PHP website I'm monitoring, I need to see which files are used each time the browser makes a request. I thought of using find . -type f -amin 1; with that I get all files read in the last minute (it's a development server, so only I am using the website). I took care of removing the noatime attribute from the mount point. However, something else must be preventing the kernel from reading the actual files on disk, because the access time is not updated when I read a file. I guess it must be the file-system cache serving the files from memory. Is there a way to disable file caching under a specific directory (public_html in my case)? I also read somewhere that there is a nobh mount attribute that apparently disables file caching under that mount point, but I'm not sure.
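    A per-directory cache switch does not appear to exist, but one common workaround for a test like this is to drop the page cache system-wide before each run, so the next reads have to hit the disk again. A minimal sketch, assuming root (equivalent to sync; echo 3 > /proc/sys/vm/drop_caches):

        # Drop the kernel page cache (and dentry/inode caches) before a test run.
        # This is system-wide; there is no per-directory variant.
        import os

        os.sync()  # flush dirty pages first so nothing is lost
        with open("/proc/sys/vm/drop_caches", "w") as f:
            f.write("3\n")  # 1 = page cache, 2 = dentries and inodes, 3 = both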

    Read the article

  • frequent errors with subversion repository on fat32 on USB memory stick

    - by sal
    I keep a copy of a Subversion repository on a USB memory stick formatted with FAT32. I am using TortoiseSVN on XP and command-line svn 1.6.x on Ubuntu and OS X with this memory stick. I notice that I need to run svn cleanup just about every time, or updates and commits will not work. I routinely get errors with .lock and *.svn/text-base/* files getting corrupted; the errors tend to be "parameter is incorrect" or "lock file can not be read". Sometimes svn cleanup works and sometimes chflags -R nouchg * does. Is there anything I can do to prevent this?

    Read the article

  • Transparently cache files from a network drive in Linux

    - by Vadim
    We have a Linux server that reads files from a network drive and processes them. In a common scenario, a user will log in and access the same files over and over again. The size of the files varies, but the larger ones can be around 50+ MB. The files seldom change. I was wondering if it is somehow possible to cache the files transparently. I can't (and don't want to) change the program that reads the files, nor do I control the protocol by which the files are accessed. I just want something to detect that I access a certain path, copy the file locally (if needed) and then read the file from the local drive. I've read about bcache but can't figure out if it's what I need. Do you have any suggestions? Thanks, Vadim.
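    The copy-locally-then-read idea described above reduces to a few lines, though it is not transparent in the strict sense asked for, since something still has to be pointed at the wrapper; the cache directory, the staleness check (size plus mtime) and the example path below are all illustrative. For a genuinely transparent option, FS-Cache (cachefilesd) with the fsc NFS mount option may be closer to what is wanted.

        # Return a local copy of a network file, refreshing it only when it changed.
        import os
        import shutil

        def cached_path(network_path, cache_dir="/var/cache/netfiles"):
            local = os.path.join(cache_dir, os.path.basename(network_path))
            src = os.stat(network_path)
            if (not os.path.exists(local)
                    or os.path.getsize(local) != src.st_size
                    or os.path.getmtime(local) < src.st_mtime):
                os.makedirs(cache_dir, exist_ok=True)
                shutil.copy2(network_path, local)  # copies data and timestamps
            return local

        # The reader then works from the fast local disk:
        with open(cached_path("/mnt/share/data/bigfile.bin"), "rb") as f:
            header = f.read(1024)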

    Read the article

  • Spurious hardware memory 'errors' being generated on HP DL380 G5s

    - by friedchicken
    Hi all, I've got two new HP DL380 G5 servers running HP's ESXi 4 patched to 219382. They have both been brought up to the latest HP firmware levels (firmware CD 8.7), and both are running 32 GB (4 x 8 GB sticks). Both servers show the same symptoms: the memory lights come on for two (random) DIMMs on the front of the server and the health LED turns red. Sometimes the server stays up and running with no problems; other times it locks up dead and only a power reset can bring it back. There is nothing in the iLO logs and nothing in the VMware hardware monitoring. The memory has been replaced. I've got other customers running DL380 G5s without any issues on ESX 3.5; this is our first vSphere deployment with them. These are meant to go live soon, so any advice would be great. Thanks in advance.

    Read the article

  • Hardware imposed 32-bit limit

    - by knittl
    I'm thinking about converting my OS (Ubuntu) to the 64-bit version to use the last bit of memory (4 GB); OK, it's really a reinstall. Will this work as expected, or are there possible limits imposed by the mainboard, memory controller, or some other component that would keep me from fully utilizing my RAM? If so, are there benefits from upgrading anyway?

    Read the article

  • IIS7 doesn't monitor changes across symlinks

    - by Matt Hensley
    I've used the mklink utility to create a symlink to a directory of web content. IIS7 doesn't "see" changes to any classic ASP files in this linked directory without an iisreset. I've disabled caching, and file changes are picked up for other static files (such as .html), but .asp files are ignored.

    Read the article

  • Why is my computer using so much RAM?

    - by Ian
    I have 16 GB of RAM installed in my computer, and nearly 70% of it is in use at all times. Looking at the processes in Task Manager, the memory used by everything listed doesn't even add up to 2 GB. I have RAPID mode enabled on my SSD, which may be using 2 GB at most, so I would understand that much. I am thinking there might be a memory leak somewhere, but I don't really know how I would be able to tell. My performance and process list is below:
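    One way to see how much memory is held outside the visible process list (kernel, drivers, and caches such as the SSD's RAPID cache) is to sum per-process usage and compare it against what the OS reports. A rough sketch using the third-party psutil package; note that summing RSS double-counts shared memory, so the figures are only estimates:

        # Compare OS-reported memory usage with the sum over all processes.
        import psutil

        vm = psutil.virtual_memory()
        process_total = 0
        for p in psutil.process_iter(attrs=["memory_info"]):
            mem = p.info.get("memory_info")
            if mem is not None:           # None when access to the process is denied
                process_total += mem.rss  # resident set size, in bytes

        print(f"OS reports used : {vm.used / 2**30:.1f} GiB ({vm.percent}%)")
        print(f"Sum of processes: {process_total / 2**30:.1f} GiB")
        print(f"Unaccounted for : {(vm.used - process_total) / 2**30:.1f} GiB")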

    Read the article

  • mod_pagespeed using up too many resources?

    - by OC2PS
    I have found several threads from 2010 and early 2011 about excessive resource usage by mod_pagespeed, but I haven't seen any recent ones. I am facing excessive memory usage, which is causing the server to crash several times a day. I am wondering whether the issue was ever fixed before mod_pagespeed 1.0 was released. Any tips and suggestions on how to fix the issue are welcome. My server configuration: 4 quad-core processors (likely oversold), 3 GB memory, Xen, CentOS 6 64-bit, Apache, cPanel, PHP, MySQL.

    Read the article

  • Would it be better to have 10 GB PC5300 or 8 GB PC6400?

    - by jayrdub
    My development machine is heavy on memory usage. It currently has only 4 GB of PC5300. I'm going to buy 8 GB of RAM; should I take out all my existing memory and install just the PC6400, or keep some of the original and run everything at PC5300 speed? I don't know if I'll ever need more than 8 GB (I might). I'm just wondering whether the speed difference between PC5300 and PC6400 is worth trashing the 2 or 3 GB of old RAM I would still have room for after adding the new RAM.

    Read the article

  • Moving files with batch files from one PC to a server, to another PC - worried about disk corruption

    - by AnchientAnt
    I use scheduled tasks that call a batch file, which calls more batch files, to move about three files from a PC to a server and then on to multiple other PCs. It all happens very quickly, as they are small files. Are there any pitfalls in how fast these transfers happen? I'm mildly concerned about somehow causing disk corruption. The logic is roughly:
    1. Call MapToPc; if files exist, move them to a folder on the server; disconnect.
    2. Call SendtoPCs; if files exist (the files just moved to the server), call MapToPCs, move all files, disconnect.
    All of this happens in about 2 seconds or less. Edit: this is on Windows 7, Server 2003 and XP respectively.
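    Speed by itself is not the usual cause of corruption; the more common risk is a transfer that is interrupted mid-copy and leaves a half-written file behind. One defensive pattern for the flow above (sketched here in Python rather than batch, with illustrative paths and file names) is to copy under a temporary name and then rename into place, since a rename on the same volume is atomic:

        # Copy to a temporary name on the destination, then rename into place,
        # so the receiving side never sees a partially written file.
        import os
        import shutil

        def safe_transfer(src, dst_dir):
            dst = os.path.join(dst_dir, os.path.basename(src))
            tmp = dst + ".partial"
            shutil.copy2(src, tmp)  # write under a temporary name first
            os.replace(tmp, dst)    # atomic rename on the same volume
            os.remove(src)          # remove the source only after the rename succeeds

        for name in ("report1.csv", "report2.csv", "report3.csv"):
            src = os.path.join(r"\\server\share\outbox", name)
            if os.path.exists(src):
                safe_transfer(src, r"C:\inbox")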

    Read the article
