Search Results

Search found 61297 results on 2452 pages for 'open files'.

  • Why does yum index get corrupted?

    - by TomOnTime
    Occasionally yum's cache gets corrupted and we see errors like this:

        error: db3 error(-30974) from dbenv->failchk: DB_RUNRECOVERY: Fatal error, run database recovery
        error: cannot open Packages index using db3 - (-30974)
        error: cannot open Packages database in /var/lib/rpm

    The workaround is rm -f /var/lib/rpm/__db* and then the next yum command regenerates the data. My question is: what is likely to be causing this? Is there some common task that ignores locks, or some other problem, that causes this? We have hundreds of CentOS machines and there is no pattern to which of them see the problem. It could be a "one in a million" issue which, at our scale, is simply seen often. NOTE: I realize this is a very open-ended question, but if an answer finds the cause, I will go back and turn the question into something more canonical that directly relates to the specific issue.
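
    A minimal recovery sketch along the lines of the workaround described above; the rpm --rebuilddb and yum clean all steps are additions beyond the bare rm -f the question mentions:

        # remove the stale Berkeley DB environment files, then rebuild the rpm database
        rm -f /var/lib/rpm/__db*
        rpm --rebuilddb
        # the next yum run regenerates its own caches; cleaning them first is optional
        yum clean all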

  • How to restore Win32 .exe file associations

    - by Ahmed Ezz
    I did something wrong: I right-clicked an .exe file, chose "Open with", selected "Choose a program", and picked the wrong program to open it with. I also checked the box labelled [Always use the selected program to open this kind of file]. The problem: the file association for all .exe files changed to that wrong program, so every .exe file now opens with it. My question: how do I get all .exe files back to working normally? Thanks in advance :)

  • Store and encrypt data over the Internet

    - by sotsec
    I am trying to build a system where I will be able to access my files remotely. I want to set up an external hard drive or a NAS that I will access over the Internet, and I want every file stored on that system to be encrypted. Could you please suggest the best way of doing that? In other words, what is the best way to access your files remotely with maximum safety, while at the same time the space the files occupy is protected against theft by encryption? Thank you.

  • Searching Multiple Terms

    - by nevets1219
    I know that grep -E 'termA|termB' files allows me to search multiple files for termA OR termB. What I would like to do instead is search for termA AND termB. They do not have to be on the same line, as long as the two terms exist within the same file. Essentially a "search within results" feature. I know I can pipe the results of one grep into another, but that seems slow when going over many files:

        grep -l "termA" * | xargs grep -l "termB" | xargs grep -E -H -n --color "termA|termB"

    Hopefully the above isn't the only way to do this. It would be extra nice if this could work on Windows (I have Cygwin) and Linux. I don't mind installing a tool to perform this task.
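
    One alternative, sketched here as a plain shell loop rather than a dedicated tool; it assumes file names without spaces so the final xargs stays simple:

        # list files that contain both terms, then show the matching lines in colour
        for f in *; do
            [ -f "$f" ] || continue                                   # skip directories
            grep -q "termA" "$f" && grep -q "termB" "$f" && echo "$f"
        done | xargs grep -E -H -n --color "termA|termB"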

  • Folder sync application which can sync over the Internet (the other machine specified by IP)?

    - by Adal
    I need to sync some folders between two Windows 7 machines. While they are connected to the same LAN, they can't see each other over Windows networking, since sharing is disabled on both of them (for security reasons). Do you know any sync app which can work over IP? The folder I need to sync has 500,000 files in it (80 GB in total), so the sync app should be pretty efficient. At the moment I copy the files from one machine to the other over FTP, but it takes forever, since a separate connection is opened for each file. Alternatively, maybe you know some app which can efficiently transfer a large number of files between two machines over the Internet?
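
    One possible approach, sketched here on the assumption that an rsync port (e.g. via Cygwin) plus an SSH server is available on the receiving machine; the paths and the 192.168.1.20 address are placeholders:

        # push the folder to the other machine by IP; only changed files are re-sent on later runs
        rsync -a --partial --progress /cygdrive/d/data/ user@192.168.1.20:/cygdrive/d/data/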

  • Batch file to disconnect network shares on Windows XP

    - by Robb
    Loosely related to the question "Network Share causing Cygwin to run slowly after 'ls'", I'd like to write a little batch file that I can execute to disconnect the host from any network shares, and subsequently another batch file to reconnect them. Ideally, this would be something that I can execute from a PuTTY terminal, SSHed into the box running Cygwin. I'm pretty sure the batch files can be written easily, but I don't know about executing them from a PuTTY terminal. Regardless, I'd still like the batch files anyway. For the sake of simplicity, my process would be:

        1. Log into the server via PuTTY
        2. Run the batch file to disconnect the shares
        3. Do what I need to do
        4. Run the batch file to reconnect the shares
        5. Exit the session, closing PuTTY
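
    A minimal sketch of the two batch files, assuming the shares are mapped as X: and Y: to \\server\share1 and \\server\share2 (placeholders); since net.exe is on the PATH, the same commands can also be typed straight into a Cygwin shell reached over PuTTY/SSH:

        :: disconnect_shares.bat - drop the mapped drives
        net use X: /delete /y
        net use Y: /delete /y

        :: reconnect_shares.bat - map them again
        net use X: \\server\share1 /persistent:no
        net use Y: \\server\share2 /persistent:no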

  • Data inaccessible from WHS drive

    - by Eakraly
    Hi all, I had a Windows Home Server machine that crashed. Before doing anything else I decided to get the data off it, so I took the hard drive and connected it to a Windows 7 PC. The result is that I cannot access almost any file! I do see the directory structure, and I can open small files like .txt and .ini, but bigger files like .iso and video files are a no-go. The same goes for Ubuntu and OS X: I can see the files and even copy them, but the copies are corrupted. Any ideas what the problem is?

  • Persistently retrying and resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit buggy, so I want it to retry and resume if the connection is dropped. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget. I could install it, but to do that (unless I have missed something) I need to install Xcode and MacPorts first, which I would like to avoid. curl is available, though, but I don't really know how it works or how to use it. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up, i.e. retry indefinitely and resume downloads where it left off?
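
    A rough sketch of one way to do this with a shell loop, assuming the list is saved as urls.txt; --retry and -C - are standard curl options, and the outer until-loop is just one way to keep trying after curl's own retries are exhausted:

        # keep retrying each URL until curl succeeds, resuming partial downloads with -C -
        while read -r url; do
            until curl -O -C - --retry 10 --retry-delay 5 "$url"; do
                echo "retrying $url ..." >&2
                sleep 5
            done
        done < urls.txt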

  • Hard disk failure. Can I recover my moved folders?

    - by Doug
    I am in the process of moving all my files from an old laptop to a new one. I just moved 11 GB of data from my old laptop to an external hard drive, and upon moving it from there to the new laptop, the external drive is giving a CRC error (Data Error, Cyclic Redundancy Check). Now I am looking for a way to recover the files that were moved off my old laptop (not the external drive). I understand that they are just marked as deleted and available for overwriting to free up space. I was getting ready to try GetDataBack, but it says to install it on a healthy Windows system and attach the drive to be recovered as an external drive. However, I don't want to turn off my computer without first getting the okay, since the data is in a "moved" state, and I haven't touched the computer since the move. What can I do, or what can I use, to recover the moved files?

  • How to auto-detect text file encoding?

    - by ???
    There are many plain text files which were encoded in various charsets. I want to convert them all to UTF-8, but before running iconv I need to know each file's original encoding. Most browsers have an Auto Detect option for encodings; however, I can't check those text files one by one because there are too many. Only once I know the original encoding can I convert the text with iconv -f DETECTED_CHARSET -t utf-8. Is there any utility to detect the encoding of plain text files? It doesn't have to be 100% correct, but it should recognize most of them.
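
    A minimal sketch of the kind of batch conversion being described, assuming GNU file's --mime-encoding detection is good enough for the files in question (uchardet or enca could be substituted for the detection step):

        # detect each file's charset, then convert it to UTF-8 in place
        for f in *.txt; do
            enc=$(file -b --mime-encoding "$f")      # e.g. iso-8859-1, utf-16le, ...
            [ "$enc" = "utf-8" ] && continue         # already UTF-8, skip
            iconv -f "$enc" -t utf-8 "$f" > "$f.utf8" && mv "$f.utf8" "$f"
        done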

  • Our company has hundreds of thousands of photos; how can we store and browse/find them efficiently?

    - by tobefound
    We currently store our photos in a structure like this:

        folder\1\10000 - 19999.JPG|ORF|TIF   (10 000 files)
        folder\2\20000 - 29999.JPG|ORF|TIF   (10 000 files)
        etc...

    They are stored on four different 2 TB D-Link NASes, attached and shared on our office network (\\nas1, \\nas2, and so on). Problems: 1) When a client (Windows only, Vista and 7) browses, say, the \\nas1\folder\1\ folder, performance is quite poor: the list takes a long time to generate in the Explorer window, even with icons turned off. 2) Initial access to the NAS itself is sometimes slow. SAN disks are too expensive for us, even with iSCSI interface/switch technology. I've read a lot of tech pages saying that storing 100,000+ files in one single folder shouldn't be a problem, but we don't dare go there now that we experience problems at the 10K level. All input greatly appreciated, /T

  • Copying large file from SD to HDD via USB failing on Ubuntu

    - by Kent Boogaart
    Hi, I'm attempting to copy some large files from my camera (a Canon EOS 500D) to my laptop, which is running 64-bit Ubuntu 9.04, with the two devices connected over USB. For most files it is simply a matter of Ctrl-C and Ctrl-V, and I have done this successfully many times with both photos and small movies (e.g. 180 MB). However, when I attempt this with very large files (e.g. 3 GB), the copy seems to start with a lot of activity on both the camera and the laptop, but after 10 minutes or so the camera is automatically unmounted and the copy fails to complete. I have read that this might be due to the device not mounting as a mass storage device, but I cannot see any obvious way to change this behavior. Can anyone offer any direction here? I'll get a USB card reader if necessary, but I'd prefer to be able to just plug my camera in. Thanks

  • Is there a way to get a shared spreadsheet to update without closing and reopening?

    - by Mike
    Using Excel 2010, I have a spreadsheet that is used by three different people at any one time, but if one person has the spreadsheet open on their PC, the other people can only open it as read-only. I have since shared the workbook and put the spreadsheet on a shared network drive, and now they can all view and edit it at the same time. The problem is that nobody can see the changes the other users have made unless they close the spreadsheet and open it up again. I have checked the settings of the shared workbook, and on the Advanced tab I have ticked the option that updates the information every 5 minutes, but the information does not update until you close the spreadsheet and open it back up again. How can I fix this problem?

  • rsync invocation to replace symlinks pointing to source?

    - by bdbaddog
    Currently I'm moving a big filesystem to a new server, as the original file server is no longer able to handle the filesystem writes. To make this quick, I made symlinks on the target filesystem pointing to the original filesystem. Initially:

        /company/release        (mountpoint of the original filesystem)

    After migration:

        /company/release.old    (points to the original filesystem after an automount map update)
        /company/release        (points to the new fileserver/filesystem after an automount map update)

    In /company/release there are symlinks like the following:

        /company/release/product-1.0.tar.gz -> /company/release.old/product-1.0.tar.gz
        /company/release/product-1.0        -> /company/release.old/product-1.0   (this is a tree of files)

    Using symlinks allowed me to move the writes to the new filesystem quickly. Now I'd like to slowly migrate the existing files and directories to the new filesystem. The problem I'm running into is that, since the symlinks point back at the original files, rsync doesn't see any difference, so it doesn't actually copy the file(s) or directory(s) and remove/overwrite the symlinks. Is there a set of rsync flags which will do what I want?
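
    Not an answer to the exact flag question, but one hedged sketch of a workaround: remove the top-level symlinks that point back into the old tree so rsync has nothing to be fooled by, then let it copy the real files (paths as in the question; test with rsync's -n/--dry-run first):

        # drop top-level symlinks that point into the old filesystem...
        find /company/release -maxdepth 1 -type l \
             -lname '/company/release.old/*' -delete
        # ...then copy the real files over from the old tree
        rsync -av /company/release.old/ /company/release/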

  • Would SSD drives benefit from a non-default allocation unit size?

    - by davebug
    The default allocation unit size recommended when formatting a drive in our current set-up is 4096 bytes. I understand the basics of the pros and cons of larger and smaller sizes (performance boost vs. space preservation), but it seems the benefits of a solid state drive (seek times massively lower than hard disks) may create a situation where a much smaller allocation size is not detrimental. Were this the case, it would at least partially help to overcome the disadvantage of SSDs (massively higher prices per GB). Is there a way to determine the 'cost' of smaller allocation sizes specifically related to seek times? Or are there any studies or articles recommending a change from the default based on this newer tech? (Assume an average scattering of file sizes: program files, OS files, data, MP3s, text files, etc.)

  • Linux: how to build and install software from source (tar.gz)?

    - by snakec
    First of all, thanks to you all for your support. I'm quite new to Linux. I know how to install software, but I don't know: 1) how to install library .a or .so files; 2) how to install from a tar.gz. I use the usual ./configure, make, make install method, but most of the time I get the message "nothing to make". Lots of tar.gz archives have no installation document, no Makefile and no configure script, which leaves me quite confused about how to install or run them. Now I have the CUDA sample source code, which I got in tar.gz form. When I extract it I find folders within folders, like C, doc, shared, etc., and inside each of those more folders and files such as src, doc, common and lib, containing source files, header files, libraries and Makefiles. I don't know how to build this kind of project. Can it be installed on the system? How do I run it? There is no .run file or script and no configure script. Can anyone explain how to compile, run and install such projects?
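
    A rough sketch of the usual decision process when unpacking a source tarball (package names and paths are placeholders; projects such as the CUDA SDK samples typically skip configure and are driven by the top-level Makefile alone):

        tar xzf package.tar.gz
        cd package
        ls                          # look for README, INSTALL, configure, Makefile
        less README                 # the build steps are usually documented here
        # if there is a configure script:
        ./configure && make && sudo make install
        # if there is only a Makefile (common for SDK samples):
        make                        # binaries usually end up in a bin/ or release/ subdirectory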

  • Use SSH reverse tunnel to bypass VPN [on hold]

    - by John J. Camilleri
    I have shell access to a server M, but I need to log into a VPN on my machine L in order to access it. I want to be able to get around this VPN, and I've heard I can do this by creating a reverse SSH tunnel through an intermediate server E (which I can access without the VPN). This is what I am trying:

        1. Turn on the VPN on L and open an SSH session to M
        2. On M, execute the command: ssh -f -N -T -R 22222:localhost:22 user@E
        3. From L, try to open an SSH session to E on port 22222, hoping to end up at M

    Step 2 seems to work without any complaint, but on step 3 I keep getting "connection refused". I have made sure that port 22222 is open on E:

        7    ACCEPT     tcp  --  anywhere             anywhere            tcp dpt:22222

    I'm pretty new to SSH tunnelling and not sure what the problem could be. Any ideas what I can try?
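
    A sketch of the two usual ways around the most common cause of this symptom (by default sshd binds remote forwards to E's loopback interface only, so the forwarded port is not reachable from L unless GatewayPorts is enabled); host and user names are placeholders:

        # option 1: hop through E, connecting to the loopback-bound forward from E itself
        ssh -t user@E ssh -p 22222 user@localhost

        # option 2: have sshd on E expose the forwarded port on all interfaces
        #   in /etc/ssh/sshd_config on E:  GatewayPorts yes   (then restart sshd)
        # and re-create the tunnel from M binding to all interfaces:
        ssh -f -N -T -R '*:22222:localhost:22' user@E
        # now, from L:
        ssh -p 22222 user@E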

  • Is there a way to save "work sessions" in Linux?

    - by mike
    I regularly work on different projects, using different software. For project 1, for example, I need to open FileZilla, gedit and Nautilus (set to a specific folder); for project 2 I need to open GIMP and Nautilus (set to another specific folder); and so on. What I would like is a kind of session manager where I could create entries "project 1", "project 2", etc., and with one click or command open all the software I need. Perhaps there's an easy way to write a batch file for this? Any idea is welcome :)
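
    A minimal sketch of the kind of per-project launcher script being asked about (the application names and folder paths are examples, not anything mandated by the question):

        #!/bin/sh
        # project1.sh - open everything needed for project 1 in one go
        filezilla &
        gedit ~/projects/project1/notes.txt &
        nautilus ~/projects/project1 &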

  • How can I exclude a file in a folder from basic auth (regex help)?

    - by simon180
    Hi, I have a folder on my site which contains admin files, and I've added basic auth following a little unwanted attention. This works fine; however, a couple of the admin functions won't work through basic auth as they handle file uploads, so I want to exclude these files from the auth. It shouldn't have any security implications, as any rogue user wouldn't be able to access the pages that could create a session to use these functions. I am using the following basic code to exclude a file:

        <FilesMatch "(index.php\/myadminfolder\/myurl\/myaction/someotherstuff?)$">
            Satisfy Any
            Order allow,deny
            Allow from all
            Deny from none
        </FilesMatch>

    The URL exclusion is not working. The URL to exclude is in the form index.php/directory/subdirectory/action/uniqueid/blah. What is the correct string to add to FilesMatch to exclude anything that starts with the pattern index.php/directory/subdirectory/action, regardless of what comes after action? Thanks, Simon
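
    One hedged sketch of an alternative, on the assumption that the problem is FilesMatch matching only the file name (index.php) and never the path-info that follows it; LocationMatch works on the full URL path instead (Apache 2.2 syntax, to match the directives used in the question):

        <LocationMatch "^/index\.php/directory/subdirectory/action">
            Satisfy Any
            Order allow,deny
            Allow from all
        </LocationMatch>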

  • Solaris NFS: user permissions

    - by cjavapro
    I am very new to NFS, and I would like to make sure I am clear on this. If the NFS server shares a directory rw, and all the files in the directory have permissions 700 with user/group root/root, then on the client you would have to log in as root to see them. Is this correct? I am aware that a non-root user on the client could make a direct connection to override this (as in: don't use the mount, just use an NFS client hack). It really seems like anyone who has access to the client machine should have access to the files, and that the client machine should ignore permissions; only the server should handle permissions. Am I correct in my understanding? Is it normal to have this type of layout? Is there a way to ignore the permissions on the client side?

  • Unable to read DVD on laptop

    - by usabilitest
    I have a DVD that I created on an HP laptop. I believe it was set up to be accessed as a USB device. I no longer have that laptop, and I would still like to access the files on that DVD from my current Dell laptop, but it cannot open the disc. My guess is that the session was not closed on the disc, and for some reason the new laptop cannot read it. My question is: is there any way I can access the files using the new laptop, or do I need to find a similar HP machine to try to access the files and possibly close the session? Are there any utilities out there that could help me? Thanks.

  • vsftpd per group configuration

    - by roqs
    I want to configure vsftpd on a per-group basis instead of per-user. Is that possible? Suppose I have two groups, groupA and groupB; my goal is:

        - users in groupA have permission (rwx) to all files in directory dir1
        - users in groupB have permission (rwx) to all files in directory dir2
        - users of the system have permission (rwx) to all files in directory dir3

    For example:

        ftp@test:/home/ftp# ls -l
        drwxrwxr-x 16 root groupA 4096 Jun  3 10:45 dir1
        drwxrwxr-x  2 root groupB 4096 Jun  3 10:56 dir2
        drwxrwxr-x  8 root users  4096 Jun  3 11:01 dir3

    How can I do that with vsftpd?
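
    vsftpd's own overrides are per-user (user_config_dir) rather than per-group, so one common approach, sketched here with placeholder user names, is to let the filesystem group permissions shown above do the work and simply put users in the right groups:

        # put alice in groupA and bob in groupB; directory ownership is as in the ls -l above
        usermod -aG groupA alice
        usermod -aG groupB bob
        # keep files created over FTP group-writable (in vsftpd.conf):
        #   local_umask=002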

  • Launching multiple applications with a single command/script/shortcut

    - by Bill
    I realized a few days ago that every time I sit down at work, I do the same few things after unlocking my computer: first I open Firefox, then Chrome, then I log in to Digsby. I figured I could save repeating this daily by writing a small batch script to open Firefox and Chrome, but I couldn't figure out how to make it work, and since the whole effort is to save time I don't want to bash my head against the Windows command prompt to do it. I also tried this in PowerShell but ran into a bunch of security nonsense. Is there a way to do this that I am missing? Bonus points if somebody has figured out how to manipulate Digsby via COM, scripting, or Python =)
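
    A minimal sketch of the kind of batch file being described, assuming default install paths for the browsers (adjust to the actual machine); Digsby is left out, since driving it via COM or scripting is the open part of the question:

        @echo off
        rem startup.bat - launch the usual applications in one go
        start "" "C:\Program Files (x86)\Mozilla Firefox\firefox.exe"
        start "" "C:\Program Files (x86)\Google\Chrome\Application\chrome.exe"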

  • Unix/Linux permissions setup for shared hosting with Apache

    - by weiyin
    I'm in the process of setting up a server from a clean CentOS 5 install. What is the best permission structure (users, groups, Unix permissions) for running a single instance of Apache for multiple users? Ideally, it should satisfy these requirements:

        - Each user's websites are stored in a subdirectory of their home directory.
        - Users can edit files and permissions.
        - Apache can read the websites of all users.
        - No user can read the website files of other users.

    Bonus question: how do I add PHP and/or Perl and/or Ruby to Apache without allowing any user to access any other user's files?
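
    A sketch of one common layout that meets the four listed requirements (the user name, group name and paths are placeholders; the bonus question about PHP/Perl/Ruby isolation is a separate problem, typically tackled with suexec/suPHP or per-user FastCGI):

        useradd -m alice
        chmod 711 /home/alice                     # others may traverse, but not list, the home dir
        mkdir /home/alice/public_html
        chown alice:apache /home/alice/public_html
        chmod 2750 /home/alice/public_html        # alice: rwx, apache group: r-x, other users: none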
