Search Results

Search found 39784 results on 1592 pages for 'ignore files'.

Page 424/1592 | < Previous Page | 420 421 422 423 424 425 426 427 428 429 430 431  | Next Page >

  • Overcrowded Windows XP Folders

    - by BlairHippo
    I know that, technically, an individual Windows XP directory can hold an immense number of files (over 4.29 billion, according to a quick Google search). However, is there a practical ceiling where too many files in one directory start having an impact on read performance for those files? If so, what factors would exacerbate or help the issue? I ask because my employer has several hundred XP machines in the field at client sites, and the performance on some of the older ones is getting "sludgy." The machines download and display client-defined images, and my supervisor and I suspect that our slacktastic approach to cache management could be to blame. (Some of the directories have tens of thousands of images in them.) I'm trying to gather evidence to support or contest the theory before spending time on a coding fix.
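
    Not part of the original question, but as a cheap way to gather that kind of evidence: timing a bare directory listing of one of the large cache folders, and again after pruning it, gives a rough number to argue with. This is only a sketch and the cache path is a placeholder.

        @echo off
        REM crude timing of a directory enumeration on a suspect cache folder (path is hypothetical)
        echo Start: %time%
        dir /b "C:\ImageCache" > nul
        echo End:   %time%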

    Read the article

  • Address bar in Finder?

    - by wag2639
    I'm used to knowing where all my files are (and I'm anal about it -- I don't need Mr Jobs thinking he knows best about where my files should go). Is there a way to get an address bar to show up in Finder in OS X (10.5+), like in Explorer in Windows or Nautilus in Gnome? Edit: I also want to be able to copy the address bar. Perhaps the workflow is different on a Mac, but I'm used to thoroughly sorting my files under many layers of folders, and then when I need to upload or download something, or access a file on the command line, etc., I can copy and paste that path directly into the file dialog. To clarify, my goal is to have an experience like in Windows: press Ctrl + D (CMD + L) and Ctrl + C.
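
    Not from the question, but two partial workarounds often suggested (assumptions about whether they fit this workflow): Cmd+Shift+G opens a "Go to Folder" box that accepts a pasted POSIX path, and the defaults key below makes Finder show the full POSIX path of the current folder in the window title, though that title still isn't a selectable, copyable field.

        defaults write com.apple.finder _FXShowPosixPathInTitle -bool true
        killall Finder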

    Read the article

  • I started getting a weird message "Encrypting file system - Back up your file encryption key"

    - by Ove
    I started getting a strange message when I start my computer. An icon appears in the system tray, and a popup tells me "Encrypting file system - Back up your file encryption key". I know what EFS is, but I don't use it. To my knowledge, I don't have any encrypted files on my partition. I have searched using Total Commander on all the partitions for files that have the "encrypted" attribute, but I found nothing. So I don't have any encrypted files. Does anyone know what I did to get this message?
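
    Not part of the original question, but another way to double-check from the command line: cipher.exe ships with Windows, and "cipher /u /n" lists every EFS-encrypted file on the local drives without touching any keys.

        cipher /u /n > "%USERPROFILE%\Desktop\efs-files.txt"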

    Read the article

  • Windows command line built-in compression/decompression tool?

    - by Will Marcouiller
    I need to write a batch file to unzip files to their current folder from a given root folder.

        Folder 0
        |----- Folder 1
        |      |----- File1.zip
        |      |----- File2.zip
        |      |----- File3.zip
        |
        |----- Folder 2
        |      |----- File4.zip
        |
        |----- Folder 3
               |----- File5.zip
               |----- FileN.zip

    So, I want my batch file to be launched like so:

        ocd.bat /d="Folder 0"

    Then, from within the batch file, it should iterate through all of the subfolders and unzip the files exactly where the .zip files are located. So here's my question: does Windows (from XP at least) have a command-line interface for its built-in zip tool? Otherwise, should I stick to a third-party util?
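
    Not an answer from the question itself: Windows XP's built-in zip support has no command-line interface, so the sketch below assumes the 7-Zip command-line client (7z.exe) is installed. It takes the root folder as its first argument rather than a /d switch, and the 7-Zip path is a placeholder.

        @echo off
        setlocal
        set "ROOT=%~1"
        REM recurse through ROOT and extract each archive into the folder it sits in
        for /R "%ROOT%" %%Z in (*.zip) do (
            pushd "%%~dpZ"
            "C:\Program Files\7-Zip\7z.exe" x -y "%%Z"
            popd
        )
        endlocal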

    Read the article

  • "I/O Error Occurred" in vSphere Client working with ESXi

    - by Chris
    I have a datastore set up in ESXi where I put all my ISOs. Somehow, something broke (I don't know what) and now I can't upload files to that (or any other) datastore. For large, ISO-sized files, the "Uploading..." dialog pops up, hangs for a while, and then the "I/O Error Occurred" message displays. For smaller files (in the 10 MB neighborhood), the "Uploading..." dialog comes up, a progress bar starts going, and it estimates a time remaining. Then it hangs at 1 second remaining for a while, and the same "I/O Error Occurred" message comes up. Has anyone seen a problem like this?

    Read the article

  • How to change drag & drop behaviour in Windows 7's explorer?

    - by Pekka
    I have a new touch screen, and am playing around with its functionality. The most productive use for me is organizing files (literally) by hand. It's fun working through a list of files, dragging and dropping them to the right locations using your index finger. It feels better on the wrist than mouse-clicking, too. The only problem is that when I drag & drop files across drives in Windows 7, the default behaviour is to copy the file instead of moving it. I know I can influence this using right-click, but that is of course not an option in my situation. How can I change the default drag & drop behaviour in Windows 7's Explorer?
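
    This is not an officially documented setting, just a registry tweak that is commonly suggested for exactly this case (assumption: it applies to Windows 7; 2 = move, 1 = copy). Saved as a .reg file and imported, it asks Explorer to treat every drop as a move by default.

        Windows Registry Editor Version 5.00

        [HKEY_CLASSES_ROOT\*]
        "DefaultDropEffect"=dword:00000002

        [HKEY_CLASSES_ROOT\AllFilesystemObjects]
        "DefaultDropEffect"=dword:00000002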

    Read the article

  • How to use BT or emule across 2 or more hard drives?

    - by the searcher
    One difficulty with BT or eMule is that, when the hard drive is full, we constantly need to move older files to a new hard drive so that we can download newer files. We can change BT or eMule's settings so that the download folder points to the new hard drive, but then, what if eMule hasn't finished downloading some files that are hard to find, and one is 92% done? In that case, we would like to keep the old setting so that when the last 8% arrives, it goes into the correct file (and the same for BT, if we haven't finished some file or if we want to seed something later). So is there a good way to let BT or eMule point to 2 hard drives, or somehow let the new hard drive "merge" into the existing hard drive / folder?
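
    Not from the question, but a sketch of the "merge" idea using an NTFS junction (assumes Vista/7 and an elevated command prompt; the paths are made up): the folder keeps its old location as far as BT/eMule is concerned, while the data physically lives on the new drive.

        REM move the completed-downloads folder to the new drive, then let a junction keep the old path working
        move "C:\Downloads\Complete" "E:\Downloads\Complete"
        mklink /J "C:\Downloads\Complete" "E:\Downloads\Complete"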

    Read the article

  • How to do simple multitasked loop processing over filenames with PowerShell?

    - by Ville Koskinen
    I'm batch transcoding some 50 GB of video files on a USB hard disk which is connected to a wlan router. The drive is mapped as a network drive on my Windows 7 laptop. The speed handicap of the wlan causes some parts of the processing to become unnecessarily slow, so I would like to do the following with PowerShell:

    1. List the names of the files on the network drive to be transcoded
    2. Copy the first file to a temporary folder on my laptop
    3. Simultaneously:
       - Transcode the file in the folder
       - Begin copying the next file from the network drive to the temporary folder
    4. After transcoding and copy have both ended:
       - Delete the file which has been transcoded from the temporary folder
       - Begin transcoding the next file in the temporary folder
    5. Loop until all files have been processed

    How would I be able to do this with PowerShell? The multitasking part is an obstacle for my skill/persistence combination.
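
    Not a tested script, just a sketch of that overlap using PowerShell background jobs; transcode.exe, its arguments, the extension filter and the paths are all placeholders.

        $source = '\\router\usbdrive\videos'   # UNC/mapped path to the wlan drive (placeholder)
        $temp   = 'C:\TranscodeTemp'           # local staging folder (placeholder)
        $files  = @(Get-ChildItem -Path $source -Filter *.m4v | ForEach-Object { $_.FullName })

        # prime the pipeline: copy the first file locally before the loop starts
        Copy-Item -Path $files[0] -Destination $temp

        for ($i = 0; $i -lt $files.Count; $i++) {
            $local = Join-Path $temp (Split-Path $files[$i] -Leaf)

            # start copying the next file in the background while this one is transcoded
            $copyJob = $null
            if ($i + 1 -lt $files.Count) {
                $copyJob = Start-Job -ScriptBlock {
                    param($src, $dst)
                    Copy-Item -Path $src -Destination $dst
                } -ArgumentList $files[$i + 1], $temp
            }

            # transcode in the foreground (placeholder tool and arguments)
            & 'C:\Tools\transcode.exe' $local

            # wait for the background copy to finish, then drop the file just processed
            if ($copyJob) { Wait-Job $copyJob | Out-Null; Remove-Job $copyJob }
            Remove-Item $local
        }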

    Read the article

  • Clean URLs on Hiawatha

    - by Botto
    I am using the Hiawatha web server and running Drupal on a FastCGI PHP server. The Drupal site uses imagecache, which requires either private files or clean URLs. The issue I am having with clean URLs is that requests for files are being rewritten to index.php as well. My current config is:

        UrlToolkit {
            ToolkitID = drupal
            RequestURI exists Return
            Match (/files/*) Rewrite $1
            Match ^/(.*) Rewrite /index.php?q=$1
        }

    The above does not work. Drupal's Apache setup is:

        <Directory /var/www/example.com>
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </Directory>

    Read the article

  • 3ware RAID 10 (4-drive) suggested stripe size?

    - by dasko
    I looked around on the site but found nothing really concrete on my question. I will have about 120 GB of data total; the files are mostly 5 MB files (Excel, Word) plus about 25 .pst files that are about 1.2 GB each. Yes, they use .pst files over the network; even though this is not recommended, it is a legacy setup that has run without issue, so we will continue to support it for another year or so. I need to know what you think about a stripe size of 256 KB for the RAID 10 based on the above requirements. I did try to bench with these settings and it seems alright without any real issue; I'm just trying to rule out anything I might have missed. Thanks.

    Read the article

  • Apache only transferring partial content from a Samba share

    - by thaBadDawg
    I have an Apache server running on CentOS 5.3. It currently hosts 12 sites with no known issues. (I say this to point out that up to this point my Apache installation has performed flawlessly) I'm adding a new site where the DocumentRoot of the new VirtualHost is a Samba share. When at the command line of the server I can cp video.m4v ~ and the whole file is copied properly to my home directory. But when I try to access the file from IE/Firefox/Safari/Chrome it only passes back a partial result of 33k. The same thing is happening with my image and audio files. If I make the files local to the server by copying them from the share and then serving them up then the files transfer. Any ideas?
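
    Not from the question, but a common first thing to try when Apache serves truncated files from a network filesystem: the sendfile and mmap optimizations are known to misbehave over SMB/NFS mounts, and they can be disabled for just the share-backed DocumentRoot. This is a sketch; the path is a placeholder and Apache 2.2 syntax is assumed.

        <Directory "/mnt/media-share">
            EnableSendfile Off
            EnableMMAP Off
        </Directory>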

    Read the article

  • Extract and install Word 2003 standalone without full CD

    - by pcampbell
    Given a proper Office 2003 CD, is it possible to extract just the files that are needed for one application... i.e. Word or Excel? Browsing the CD, you can see WORD11.MSI. The goal here is to extract just the necessary bits to install the one app. Disk space isn't the concern, but rather the larger question of 'is it possible' and how? Is it possible to copy those files from the CD to another location to allow the installation of just one application? What files would be required from the CD to accomplish this?

    Read the article

  • Blocking a specific URL by IP (a URL create by mod-rewrite)

    - by Alex
    We need to block a specific URL for anyone not on a local IP (anyone without a 192.168.x.x address). However, we cannot use Apache's

        <Directory /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Directory>

        <Files /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Files>

    because these would block specific files or directories, and we need to block a specific URL that is created by mod_rewrite, with the page generated dynamically by PHP. Any ideas would be greatly appreciated.
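
    Not from the question itself, but a possible direction: a <Location> block matches the request URL rather than the filesystem, so it can cover a rewritten, PHP-generated path. This is a sketch in Apache 2.2-style access-control syntax; the URL path is a placeholder.

        <Location /members/secret-report>
            Order deny,allow
            Deny from all
            Allow from 192.168
        </Location>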

    Read the article

  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Win 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is once the redirect is set it also affects /specialdir - even if I right click on that directory and select "content should come from ... local directory" that change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting - IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?
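
    Not from the question, just an untested sketch of one way this is sometimes handled in IIS 6: the redirect lives in the HttpRedirect metabase property, which can be set at the site root and then deleted again on the subdirectory so that specialdir falls back to serving local content. The site ID (1) and the script path are assumptions.

        cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs SET W3SVC/1/ROOT/HttpRedirect "http://www.bar.org/AwesomePage.html, EXACT_DESTINATION"
        cscript %SystemDrive%\Inetpub\AdminScripts\adsutil.vbs DELETE W3SVC/1/ROOT/specialdir/HttpRedirect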

    Read the article

  • SBSMonitoring.mdf reached limit

    - by Bastien974
    I have SBS 08 Standard. I am getting some errors in my Event Viewer from MSSQL$SBSMONITORING, event IDs 1105 and 1827:

        Could not allocate space for object 'dbo.EventLog'.'PK_EventLog' in database 'SBSMonitoring' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup.

        CREATE DATABASE or ALTER DATABASE failed because the resulting cumulative database size would exceed your licensed limit of 4096 MB per database.

    I tried to shrink the database; that worked for SBSMonitoring_log.LDF, but did nothing for SBSMonitoring.mdf, which is still 4096 MB. I don't know how to reinstall the monitoring. Thanks for your help.

    Read the article

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes library) into a JungleDisk cloud. If a few files change there, it's no big deal; the file gets re-uploaded. For larger files, JungleDisk backup isn't helpful: things like movies and VMware images change a lot, but I want them backed up, just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync only the change, or delta, to my personal server on a schedule. That way I can keep a copy of my larger things without paying JungleDisk a ton more, since they are in the range of many gigabytes. Right now these few items are backed up over FTP and take forever. Both the client and server are Windows environments.
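
    Not part of the question, but as an illustration of the delta idea: rsync transfers only the changed blocks of a file, and Windows ports such as cwRsync or DeltaCopy expose the same command line. A sketch; the host, user and paths are made up.

        rsync -avz --partial --progress "/cygdrive/c/VMs/" backupuser@colo.example.com:/backups/VMs/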

    Read the article

  • IIS 6.0 FTP Folder Permissions

    - by Beuy
    I have an IIS FTP website set up like this: \ftp\users\domain\public\public. Software that runs on client computers logs into the FTP server by specifying domain\public, moves to public, and then uploads or downloads files and folders in that area. I want to restrict permissions on \ftp\users\domain\public so that nothing/nobody can write files or folders there, only to \ftp\users\domain\public\public. I set up the NTFS permissions on the folder so that domain\users, public and server\users no longer have modify rights, yet I can still upload and modify files. I have also disabled inheritance from the parent folder of \ftp\users\domain\public. Any ideas on what I'm missing here? P.S. I know this is a stupid setup and makes no sense; it's some bizarre legacy application that I need to migrate to a safer environment until it can be replaced.

    Read the article

  • wsgi - narrow user permissions.

    - by Tomasz Wysocki
    I have the following Apache configuration and my application is working fine:

        <VirtualHost *:80>
            ServerName ig-test.example.com
            WSGIScriptAlias / /home/ig-test/src/repository/django.wsgi
            WSGIDaemonProcess ig-test user=ig-test
        </VirtualHost>

    But I want to protect my files from other users, so I do:

        chown ig-test /home/ig-test/ -R
        chmod og-rwx /home/ig-test/ -R

    And the application stops working:

        (13)Permission denied: /home/ig-test/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable

    Is it possible to achieve what I'm doing with WSGI? If I have to give read permissions to some files, that is fine. But there are files I have to protect (like the file with the DB configuration, or the application's business logic).
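
    Not from the question, but one common pattern (a sketch; the web-server account is "apache" on Red Hat-style systems and "www-data" on Debian/Ubuntu): keep everything unreadable to "other", add the web-server user to ig-test's group, and give the group only what it needs to traverse the tree and read the WSGI entry point. Setting AllowOverride None for that path would also stop Apache from probing for the .htaccess file in the first place.

        chown -R ig-test:ig-test /home/ig-test
        chmod -R o-rwx /home/ig-test
        usermod -a -G ig-test apache                 # web-server account joins the owner's group
        chmod g+x /home/ig-test /home/ig-test/src    # group may traverse the parent directories
        chmod -R g+rX /home/ig-test/src/repository   # group may read files and enter subdirectories here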

    Read the article

  • How to configure nginx to serve static contents from RAM?

    - by Vijayendra Tripathi
    I want to set up nginx as my web server. I want to have image files cached in memory (RAM) rather than on disk. I am serving a small page and want a few images always served from RAM. I don't wish to use varnish (or any other such tool) for this, as I believe nginx has the capability to cache contents in RAM. I am not sure how to configure nginx for this; I did try a few combinations but they didn't work, and nginx uses the disk all the time to get the images. For example, when I tried ApacheBench to test with the following command -

        ab -c 500 -n 1000 http://localhost/banner.jpg

    I get the following error -

        socket: Too many open files (24)

    I guess this means nginx is trying to open too many files simultaneously from the disk and the OS is not allowing this operation. Can anyone please suggest a correct configuration? Thanks for considering this message.
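
    Not from the question, but a sketch of the usual knobs: nginx does not keep file contents in a RAM cache of its own; hot files end up in the OS page cache, while open_file_cache only keeps descriptors and metadata around, and worker_rlimit_nofile raises nginx's own per-worker descriptor limit (the ab error above may equally be the benchmarking shell's ulimit, which is worth checking first). Values are illustrative only.

        worker_rlimit_nofile 10240;

        http {
            open_file_cache          max=1000 inactive=60s;
            open_file_cache_valid    120s;
            open_file_cache_min_uses 1;
            open_file_cache_errors   on;
        }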

    Read the article

  • Apache - building extensions with apxs

    - by Brian
    Hello, pardon the newbie question - I haven't worked with manually compiling Apache modules (or anything) before. I am trying to get the mod_concat module going. It seems simple enough - it just requires downloading the mod_concat.c file and then running:

        apxs -c mod_concat.c

    This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files - mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo - along with a new subfolder called .libs/ that contains mod_concat.so and some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my apache modules directory with the rest? Thanks for the help, Brian
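
    For what it's worth, a hedged sketch of the usual next step (assuming apxs belongs to the same Apache installation the module should load into): apxs can compile, copy the .so into the server's modules directory, and add the LoadModule line in one pass, after which Apache needs a restart.

        apxs -i -a -c mod_concat.c    # -c compile, -i install into the modules dir, -a activate (add LoadModule)
        apachectl restart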

    Read the article

  • Windows 7 - sharing file - slow seek time?

    - by progtick
    I do not want to copy the files. I simply have video files (.flv) on one computer, and I would like a user on another computer to watch them without copying. Playback is fine, but seek time (as in moving the cursor to skip some portion of the video) takes forever! I thought wireless speed might be the culprit, so I wired the two computers. Maybe I saw some improvement, but it is still very bad, and the link is 1 Gbps! (I know real speed will vary, but before I monitor actual throughput and such, is this a reasonable issue to have? Or am I bound to have very slow seek times?) What is going on? I must mention that some of these files are huge!

    Read the article

  • SQL Maintenance Cleanup Task Working but Not Deleting

    - by Alex
    I have a Maintenance Plan that is supposed to go through the BACKUP folder and remove all .bak files older than 5 days. When I run the job, it gives me a success message, but older .bak files are still present. I have tried the step from the question "SQL Maintenance Cleanup Task 'Success' But not deleting files"; the result is that column IsDamaged = 0. I have also checked the question "Maintenance Cleanup Task(s) running 'successfully' but not deleting back up files", and that is not my issue. I have also tried deleting the Job and Maintenance Plan and recreating them, but to no avail. Any ideas?
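
    Not from the question, but a hedged way to see what the task is actually doing: the cleanup task ultimately calls the undocumented xp_delete_file procedure, and running it by hand (folder, extension with no leading dot, cutoff date; the path below is a placeholder) sometimes exposes the mismatch. A common culprit is entering ".bak" instead of "bak" in the task's extension field.

        EXECUTE master.dbo.xp_delete_file 0, N'D:\Backups\', N'bak', N'2010-06-01T00:00:00';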

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, then if there is some form of corruption, it will only hurt a smaller number of files. But if you archive, let's say, all of your documents, then even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?

    Read the article

  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a 3 year old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with Windows 7 and a fresh install. Before I do the fresh install I want to create a Virtual Image of the laptop that I can keep and potentially run on my desktop machine should I ever need to access any of the old files/projects that it contains currently. I know that most people will say just copy the files over to your desktop, but my concern is the configuration of the laptop. I used to use it for development and it has older versions of Visual Studio, SQL Server, Active X controls etc, etc than I currently use so I really want to preserve the environment not just the files. So really I am asking what is the best tool-set/method to achieve this? I understand there are free VM tools available but I have never done this before and would appreciate any help.

    Read the article

  • Git pull with unstaged changes

    - by Peter
    Attempting a git pull when you have unstaged changes will fail, saying you can commit or stash them. I suppose a workaround is to git stash, git pull, then git stash pop. However, is there an alternative way to do this? I would like to forcefully git pull even if there are unstaged changes, but only if the files being brought down do not overwrite the modified files. That is, if I have a repo with the files "derp1", "derp2", "derp3" and modify "derp1" locally, a git pull should bring down and overwrite everything except the "derp1" file. I assume a git stash + pull + stash pop achieves this already? Is there a better way? I suppose this could also work differently if it occurs in a submodule.
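
    For reference, a sketch of the stash round-trip described above (plain git, nothing extra; a conflict only appears if the pulled commits also touch derp1):

        git stash          # shelve the local edit to derp1
        git pull           # bring down the updates to derp2, derp3, etc.
        git stash pop      # reapply the derp1 edit on top of the new HEAD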

    Read the article
