Search Results

Search found 39577 results on 1584 pages for 'temp files'.

Page 452/1584

  • How do I re-apply compression options?

    - by Gooberpatrol66
    Shortly after I got my computer, I enabled NTFS compression on it, selecting the option to compress "all files and subfolders". I recently noticed that several folders on my PC are not compressed anymore, including "Program Files" and "Windows". I suspect this happened when I installed Windows 8.1. The problem is, the only way I can think of to fix this would be to uncheck the tick box under "Properties" for my drive, thus decompressing everything on my drive, and then re-check it with the "all files and subfolders" option. Is there a way to compress all the uncompressed folders without first decompressing the compressed folders?
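
    One hedged approach, using the built-in compact utility from an elevated Command Prompt: compressing a tree that is already partly compressed leaves the compressed files as they are, so nothing needs to be decompressed first. The paths below are illustrative; system folders hold files in use that compact will skip.

        rem /c = compress, /s = recurse into subfolders,
        rem /i = continue past errors (e.g. files in use), /q = quiet
        compact /c /s:"C:\Program Files" /i /q
        compact /c /s:"C:\Windows" /i /q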

    Read the article

  • How can I organize my video collection and update metadata?

    - by Pieter Breed
    I have a large collection of downloaded video files containing different movies, TV shows and music videos. I have a FreeNAS box set up that uses Fuppes as a UPnP media server. My media player on Windows correctly detects this UPnP collection and can stream from it fine. However, all of my music videos, TV shows and movies are sorted under the same 'Videos' group. I would like to separate the different types of video files so that they can correctly go under 'Recorded TV' or whatever the case may be. Any ideas? I guess I am looking for something like an MP3 tagger, but for video files.

    Read the article

  • Does anything bad happen if I change /etc/ldap/slapd.d/cn=config.ldif manually?

    - by HVNSweeting
    Since 2.3, OpenLDAP has used a configuration engine called slapd-config, which is said to let all LDAP configuration be changed on the fly. This is the header of /etc/ldap/slapd.d/cn=config.ldif: # AUTO-GENERATED FILE - DO NOT EDIT!! Use ldapmodify. I've changed data in it and in some other files which have that header, and after restarting slapd my changes took effect. Does anything else happen if I change those files manually? If I don't need "change on the fly", should I edit those files manually instead of using ldapmodify? Which application generated those files, and when? NOTE: I'm using openldap-2.4.28 on Ubuntu 12.04.
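
    For reference, a minimal sketch of the ldapmodify route the header recommends; the attribute changed here is illustrative, and on Ubuntu the config database is normally reachable as root over the ldapi socket:

        # change.ldif - raise slapd's logging in cn=config
        dn: cn=config
        changetype: modify
        replace: olcLogLevel
        olcLogLevel: stats

        # apply with: ldapmodify -Y EXTERNAL -H ldapi:/// -f change.ldif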

    Read the article

  • Is there a filesystem that is "friendly" to both Windows and Linux?

    - by Somebody still uses you MS-DOS
    I'm planning to install Ubuntu 10.04 alongside Windows 7. (I'm new to Linux but have to use it at work, so I'm planning to install it at home to learn more.) I plan to use a partition for my Windows system files (C:), the partition for my personal files that already exists (D:), and a new partition for Linux. What I want is to have a partition for my personal files that works across these systems, so that whether I start Windows or Linux, there are the same "Videos", "Pictures" and "Projects" folders. Is it possible? Is there a hard-drive filesystem capable of taking writes from both systems without too much risk of corruption?

    Read the article

  • Windows File Checksums - Is my system hacked?

    - by rism
    I would like to know if there is a utility to verify the checksums of every Windows file on my Win 7 Ultimate system. It seems, on the surface, such an obvious utility, but I don't ever remember seeing one. I had a very weird experience while surfing earlier today and now I'm not entirely sure my system is secure. I have a collection of tools in the WSCC suite, but these tools no doubt just make system calls to the Win32 API, and if that has been subverted then the tools are practically useless. How do I know my Win 7 files are actually Win 7 files? I am particularly interested in verifying the integrity of all network TCP/IP files.
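
    One hedged starting point: Windows ships with the System File Checker, which compares protected system files against known-good copies, with the caveat the question itself raises, namely that a tool running on a compromised system can be lied to. From an elevated prompt:

        rem verify protected system files without repairing anything
        sfc /verifyonly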

    Read the article

  • Unable to delete a file from the server?

    - by kvijayhari
    I have a file called picture-list.php on my website. When I look at it through the FTP client, it shows two files with different file sizes:

        File name             File size
        picture-list.php          19818
        picture-list.php           9063

    When I select the file with size 9063 and delete it using FTP, it deletes the file with size 19818. I then used the command prompt to list files and happened to see that there were actually two files: one with the original name, and another with a space before the filename (" picture-list.php"). I have tried to move and delete that file, but nothing has been successful. What may be the issue?
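
    A hedged sketch, assuming you have shell access to the server: quoting the name preserves the leading space that trips up many FTP clients.

        # the quotes keep the leading space in the filename
        rm -- " picture-list.php"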

    Read the article

  • Exceptions from automongobackup, yet script completes

    - by chakram88
    I am using automongobackup to, well, automate the backups of mongodb. Output from the script (to STDERR) contains the following exceptions, yet the backup completes and the dump files are created:

        ###### WARNING ######
        STDERR written to during mongodump execution.
        The backup probably succeeded, as mongodump sometimes writes to STDERR,
        but you may wish to scan the error log below:

        exception: connect failed
        exception: connect failed
        exception: connect failed
        exception: connect failed
        exception: HostAndPort: bad port #
        exception: connect failed
        exception: connect failed
        exception: connect failed
        exception: connect failed
        exception: connect failed
        exception: connect failed

    I know that the host and port are correct. If I run mongodump --host=127.0.0.1:27017 --journal (which is the effective command from automongobackup, based on the options set and my reading of the source code), everything runs cleanly without any error reporting, and the dump files are created as expected. Why would automongobackup report connection errors, even though it does create the dump files, yet a straight call to mongodump does not? Debian 6.0 Lenny (from Linode image: Latest 3.2 (3.2.1-x86_64-linode23)), AutoMongoBackup VER 0.9, mongodb v 2.0.2.
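
    A hedged place to look (the variable names are assumed from AutoMongoBackup 0.9; verify against your copy of the script): the "HostAndPort: bad port #" line suggests that some connection attempt inside the script, separate from the main dump, is built from a host or port setting that is empty or malformed.

        # near the top of automongobackup.sh - names assumed, check locally
        DBHOST="127.0.0.1"   # host that mongodump connects to
        DBPORT="27017"       # a blank or stray value here can yield "bad port #"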

    Read the article

  • Stream tar.gz file from FTP server

    - by linker
    Here is the situation: I have a tar.gz file on an FTP server which can contain an arbitrary number of files. Now what I'm trying to accomplish is to have this file streamed and uploaded to HDFS through a Hadoop job. The fact that it's Hadoop is not important; in the end, what I need to do is write a shell script that would take this file from FTP with wget and write the output to a stream. The reason why I really need to use streams is that there will be a large number of these files, and each file will be huge. It's fairly easy to do if I have a plain gzipped file, with something like this:

        wget -O - "ftp://${user}:${pass}@${host}/$file" | zcat

    But I'm not even sure if this is possible for a tar.gz file, especially since there are multiple files in the archive. I'm a bit confused about what direction to take for this; any help would be greatly appreciated.
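
    A minimal sketch of one direction, with the HDFS destination path purely illustrative. One caveat: tar's -O flag concatenates every member's contents onto stdout, so per-file boundaries are lost; this works cleanly when each archive holds a single large file.

        # stream from FTP, extract member contents to stdout, pipe into HDFS
        # ("-" makes the Hadoop CLI read from stdin)
        wget -O - "ftp://${user}:${pass}@${host}/$file" \
          | tar -xzO \
          | hadoop fs -put - "/data/incoming/${file%.tar.gz}"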

    Read the article

  • Apache2: How do I restrict access to a directory, but allow access to one file within it?

    - by Nick
    I've inherited a poorly designed web app which has a certain file that needs to be publicly accessible, but that file is inside a directory which should not be. In other words, I need a way to block all files and sub-directories within a directory, but override that for a single file. I'm trying this:

        # No one needs to access this directly
        <Directory /var/www/DangerousDirectory/>
            Order Deny,allow
            Deny from all
            # But this file is OK:
            <Files /var/www/DangerousDirectory/SafeFile.html>
                Allow from all
            </Files>
        </Directory>

    But it's not working: it just blocks everything, including the file I want to allow. Any suggestions?
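
    A hedged sketch of the usual fix: <Files> matches on a file's name, not its full path, so giving it a path inside the <Directory> block matches nothing. Using the bare filename is the conventional form:

        <Directory /var/www/DangerousDirectory/>
            Order Deny,Allow
            Deny from all
            # <Files> matches by name only, never by path
            <Files "SafeFile.html">
                Allow from all
            </Files>
        </Directory>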

    Read the article

  • Make MS Word activate the window of a file that is already open when I double-click the file's icon?

    - by barlop
    When I try to open a Word document that is already open, I want it to just activate the window where the file is open. How can I do that? Sometimes it takes time to check whether an MS Word document is already open, and I don't want to have to look through a bunch of open Word documents to see if it's among them. I just want to double-click the icon of a Word document and, if it's open, go to it; if not, open it. With the doc/docx, or a shortcut to the doc/docx, I have some files where double-clicking activates the window when the file is already open, and other files where double-clicking brings up a "file in use" dialog box. I can't find the cause. I want it to always activate the window rather than reopen the file. Update: maybe activating when already open is the default behaviour, and it stopped after a crash I had; I'll try deleting the working files and starting MS Word again (idea from here).

    Read the article

  • Using git with cgit for decentralized/centralized development

    - by polemon
    I plan to use git for hosting my projects on my server. I've read about cgit and git-daemon, and I have more or less decided to use those tools, but general use is still kind of confusing for me. What do I need to set up on the server to push my files onto it? And when the files on the server are newer than the files on my computer, how do I merge them? Also, say I develop on two computers: how do I merge from one computer to the other? And when two people are working on the same project, how do they merge their local repos with one another? As you can probably tell by now, I come from SVN, but I've worked with Mercurial, and now I'd like to test git.
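
    A minimal sketch of the usual arrangement, assuming SSH access to the server (paths and the branch name are illustrative):

        # on the server: create a bare repository to push to
        git init --bare /srv/git/myproject.git

        # on each computer: clone it, work, then sync
        git clone user@server:/srv/git/myproject.git
        cd myproject
        # ... edit, commit ...
        git pull origin master   # merge newer changes down from the server
        git push origin master   # publish your commits back to it

    Two people (or your two computers) then merge with each other indirectly, by pulling from and pushing to the shared bare repository; cgit only needs read access to that repository to serve the web view.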

    Read the article

  • How to migrate Notepad++ settings?

    - by NoCatharsis
    I am trying to make every program I use portable if possible, and Notepad++ is on the list. The only problem is that I've had a native installation until now, so I'm not totally sure which settings files need to be moved to the portable directory. Surely there's a function tucked away somewhere in NPP exactly for this purpose, or some plugin out there? I mean, the developers have literally thought of everything else, yet this is the one thing I cannot find anywhere in the NPP wiki or otherwise, and I don't want to miss an important file. Here is the closest I've gotten: "Notepad++'s configuration files" and "Where are all the files?". Should I just copy every configuration file listed in the first link?
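
    A hedged sketch of the manual migration (file and folder names assumed from the wiki pages linked above; verify against your version). Per-user settings live under %APPDATA%\Notepad++, and an empty doLocalConf.xml in the program folder tells Notepad++ to read its settings locally:

        rem copy per-user settings into the portable directory
        xcopy "%APPDATA%\Notepad++\*.xml" "X:\Portable\Npp\" /Y
        rem the presence of doLocalConf.xml switches NPP to local settings
        type nul > "X:\Portable\Npp\doLocalConf.xml"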

    Read the article

  • Apache doesn't send MP3 files even when I use the direct address to the file

    - by user1728307
    Apache doesn't send MP3 files even when I use the direct address to the file. I can play them with Flash audio players on my web pages, but when I try to download one from its direct address on my server, I get "Error 101 (net::ERR_CONNECTION_RESET): The connection was reset", or sometimes a file with an .mp3 extension that is just 13 bytes, and when I open that file in gedit/Notepad it contains only: <html></html>. I don't have any problem with PHP files and images, but MP3 files are never sent to the browser for download or playback. I added this line to httpd.conf: AddType audio/mpeg .mp3 but it made no difference! Thanks in advance.
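
    One hedged diagnostic step: compare the response headers for a file type that works against one that doesn't. The tiny <html></html> body suggests something is answering in Apache's place (a module, a security filter, or a proxy in front), and the headers usually show who:

        # compare headers; the URLs are placeholders
        curl -I http://example.com/images/photo.jpg
        curl -I http://example.com/audio/song.mp3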

    Read the article

  • Win 2003 Junction Point to Remote Unix Share

    - by Pogrindis
    Env: a Windows Server 2003 box with already established shared folders over the local domain via a Windows DC and AD, and a Linux box being used as a file server, with the folder /files/share readable and writable by all domain users; this is not a problem. I have already transferred the files from the Windows box to /files/share on the Linux box, but I now want to create a junction point in order to prevent users saving to the Windows box. I have tried the File Server administration tools on Windows Server 2003, but they will not let me create junctions to remote servers. I have tried mounting the remote filesystem as a drive and proceeding that way, but no joy. Does anyone have any suggestions?

    Read the article

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files allows me to search multiple files for termA OR termB. What I would like to do instead is search for termA AND termB. They do not have to be on the same line, as long as the two terms exist within the same file. Essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files:

        grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB"

    Hopefully the above isn't the only way to do this. It would be extra nice if this could work on Windows (I have Cygwin) and Linux. I don't mind installing a tool to perform this task.
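
    A hedged single-pass alternative (GNU awk, since it relies on nextfile), which prints the names of files containing both terms:

        awk 'FNR == 1 { a = b = 0 }
             /termA/  { a = 1 }
             /termB/  { b = 1 }
             a && b   { print FILENAME; nextfile }' *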

    Read the article

  • Store and encrypt data over the Internet

    - by sotsec
    I am trying to build a system where I will be able to access my files remotely. I want to set up an external hard drive or a NAS that I will access over the Internet, and I want every file that is stored on that system to be encrypted. Could you please suggest the best way of doing that? Or, if you have any knowledge: what is the best way to access your files remotely with maximum safety, while at the same time the storage the files occupy is protected against theft by encryption? Thank you.

    Read the article

  • Folder sync application which can sync over the Internet (the other machine specified by an IP)?

    - by Adal
    I need to sync some folders between two Win 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know of any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Or maybe you know of some app which can efficiently transfer a large number of files between two machines on the Internet?
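
    A hedged sketch, assuming rsync can be installed on both machines (on Windows 7 that usually means Cygwin or cwRsync) with an SSH server on one side; the address and paths are placeholders. rsync transfers only changed files over a single connection, which avoids FTP's per-file setup cost:

        # delta-transfer the folder, keeping partially sent files for resume
        rsync -av --partial --progress /data/share/ user@192.168.1.20:/data/share/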

    Read the article

  • Batch file to disconnect network shares on Windows XP

    - by Robb
    Loosely related to the question "Network Share causing Cygwin to run slowly after 'ls'", I'd like to write a little batch file that I can execute to disconnect the host from any network shares, and subsequently another batch file to reconnect. Ideally, this would be something that I can execute from a PuTTY terminal, SSHed into the box running Cygwin. I'm pretty sure the batch files can be written easily, but I don't know about executing them from a PuTTY terminal. Regardless, I'd still like the batch files anyway. For the sake of simplicity, my process would be:

        1. Log into the server via PuTTY
        2. Run the batch file to disconnect shares
        3. Do what I need to do
        4. Run the batch file to reconnect shares
        5. Exit the session, closing PuTTY
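
    A minimal sketch of the pair, assuming the share is mapped as drive Z: (server and share names are placeholders). From a Cygwin session over PuTTY, they can be invoked with cmd /c disconnect.bat and cmd /c reconnect.bat:

        rem disconnect.bat - drop the mapped share without prompting
        net use Z: /delete /y

        rem reconnect.bat - map it again, non-persistently
        net use Z: \\server\share /persistent:no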

    Read the article

  • Data inaccessible from WHS drive

    - by Eakraly
    Hi all, I had a Windows Home Server machine that crashed. Before doing anything, I decided to get the data off it, so I took the hard drive and connected it to a Windows 7 PC. What I get is that I cannot access almost any file! I do see the directory structure, and I can open small files like .txt and .ini, but bigger files like .iso and video: no go. The same goes for Ubuntu and OS X: I can see the files and even copy them, but they are corrupted. Any ideas as to what the problem is?

    Read the article

  • Persistently retrying and resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit buggy, so I want the download to retry and resume if the connection drops. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget. I could install it, but to do that (unless I have missed something) I need to install Xcode and MacPorts first, which I would like to avoid. curl is available, though, but I don't know how it works or how to use it, really. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up, retrying indefinitely and resuming downloads where they left off?
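
    A minimal sketch, assuming the list lives in files.txt with one URL per line. curl has no literal "retry forever" option, so a large --retry count stands in for it; -C - resumes a partial download and -O saves under the remote filename:

        while read -r url; do
          curl -O -C - --retry 100 --retry-delay 10 "$url"
        done < files.txt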

    Read the article

  • Hard disk failure. Can I recover my "moved" folders?

    - by Doug
    I am in the process of moving all my files from an old laptop to a new one. I just moved 11 GB of data from my old laptop to an external hard drive, and upon moving it off to the new laptop, the external drive is getting a CRC error (Data Error: Cyclic Redundancy Check). Now I am looking for a solution to recover the files that I moved off my old laptop (not the external). I understand that they are just marked for potential overwriting to free up space. I was getting ready to test out GetDataBack, but it says to install it on a healthy Windows system and attach the drive needing recovery as an external one. However, I don't want to turn off my computer without first getting the okay, since it is in a "moved" state. Please help! What can I do to recover the moved files? I haven't touched the computer since the move. What can I use to recover them?

    Read the article

  • How to auto-detect text file encoding?

    - by ???
    There are many plain text files which were encoded in various charsets. I want to convert them all to UTF-8, but before running iconv, I need to know each file's original encoding. Most browsers have an "Auto Detect" option for encodings; however, I can't check those text files one by one because there are too many. Once I know the original encoding, I can convert the text with iconv -f DETECTED_CHARSET -t utf-8. Is there any utility to detect the encoding of plain text files? It doesn't have to be 100% correct, but it should recognize most of them.
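
    A hedged sketch that uses file's encoding guesser to drive iconv; treat the detected charset as a guess and spot-check the converted output:

        # detect each file's encoding and write a UTF-8 copy alongside it
        for f in *.txt; do
          enc=$(file -b --mime-encoding "$f")
          iconv -f "$enc" -t UTF-8 "$f" > "${f%.txt}.utf8.txt"
        done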

    Read the article

  • Our company has hundreds of thousands of photos; how do we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

        folder\1\10000 - 19999.JPG|ORF|TIF  (10,000 files)
        folder\2\20000 - 29999.JPG|ORF|TIF  (10,000 files)
        etc...

    They are stored on four different 2 TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). Problems:

    1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the list takes a long time to generate in the Explorer window, even with icons turned off.

    2) Initial access to the NAS itself is sometimes slow.

    SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we experience problems at the 10K level. All input greatly appreciated, /T

    Read the article

  • Copying large file from SD to HDD via USB failing on Ubuntu

    - by Kent Boogaart
    Hi, I'm attempting to copy some large files from my camera (a Canon EOS 500D) to my laptop, which is running 64-bit Ubuntu 9.04. I am using USB to connect the two devices. For most files, it is simply a matter of Ctrl-C and Ctrl-V, and I have done this successfully many times with both photos and small movies (e.g. 180 MB). However, when I attempt this with very large files (e.g. 3 GB), the copy starts with a lot of activity on both the camera and the laptop, but after 10 minutes or so the camera is automatically unmounted and the copy fails to complete. I have read that this might be due to the device not mounting as a mass storage device, but I cannot see any obvious way to change this behavior. Can anyone offer any direction here? I'll get a USB card reader if necessary, but I'd prefer to be able to just plug my camera in. Thanks
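
    One hedged diagnostic step: watch the kernel log during the copy, which usually reveals whether the camera's USB connection is resetting partway through (the log path can vary by release):

        tail -f /var/log/kern.log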

    Read the article
