Search Results

Search found 45987 results on 1840 pages for 'copy files'.


  • Making Windows 7 default to "All Files" when opening a document?

    - by krebshack
    I'm taking over an old coworker's job, and part of that means I'll be working on various IFS tickets without any real ERP experience. I'm working with one user who is trying to check documents in to IFS. Right now, IFS defaults to JPEGs, and she has to select "All Files" in order to find PDFs. Example: http://i.imgur.com/xK0iAfF.png As much as I'd like to say "it's one extra step, come on", the user's manager has insisted it interrupts her workflow and asked us to get on it. I've spoken with our IFS experts and they're unaware of any setting that would make the open dialog default to All Files in IFS. I've also searched Google for any Windows 7 setting that would do this, but without success; I keep getting results about changing which program opens a specific file type.

    Read the article

  • Can NFS be forced to refresh stale files/directories when not using noac on the mount?

    - by johnnycrash
    We mount without using noac. I have a file that I append to once every 20 minutes; it is then read with mmap about 5,000 times a minute, and we only mmap a couple of blocks for each read. Needless to say, noac just kills the access performance, so we don't use it. I add data to the end of the file using a mount with noac and read from a mount without noac. The mounts that are reading are not seeing the new data. I want to know if there is a function I can call from C to refresh the attributes of a path and all its files. EDIT: I should add that we cannot mount and unmount, since there are 16 servers running on each system and they are constantly accessing the files. Well... maybe we could mount and unmount if each server used its own mount, but I'd like to avoid that if possible. Thanks!
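
    One workaround that is sometimes suggested (not from the question; a sketch only) leans on NFS close-to-open consistency: re-opening a file forces the client to revalidate its cached attributes against the server. The same open(2)/fstat(2)/close(2) sequence can be issued from C; it is shown here in Python for brevity:

        import os

        def nfs_revalidate(path):
            # Opening the file makes the NFS client issue a fresh GETATTR to the
            # server (close-to-open consistency), refreshing the cached attributes.
            fd = os.open(path, os.O_RDONLY)
            try:
                os.fstat(fd)       # attributes now reflect the server's view
            finally:
                os.close(fd)

        def revalidate_tree(directory):
            # Re-open every file under `directory` so each one gets revalidated.
            for root, _dirs, files in os.walk(directory):
                for name in files:
                    nfs_revalidate(os.path.join(root, name))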

    Read the article

  • Can files deleted on an encrypted drive be restored?

    - by roddik
    Hi. There are ways to restore files that have been deleted normally from a system; I'm not sure exactly how they work, but I assume they read content that has not yet been overwritten. On the other hand, there are programs (e.g. TrueCrypt) that encrypt disks, claiming that without the password it is impossible to tell random data and file contents apart on such a disk. I therefore think that files deleted from such disks can't be restored. Is that correct? I know one way to find out would be to try it, but there is a chance I would just pick the wrong recovery software. Moreover, I'm more interested in a theoretical explanation of why it would or wouldn't be possible. Thanks

    Read the article

  • Is there a Utility that will scan selected locations and return all files older than a certain date?

    - by CT
    Can anyone recommend a utility that can scan specified directory locations (network shares, specifically) and return all files older than a certain date? I am looking to implement a data retention policy at my workplace. As our amount of data grows, it puts a large strain on our backup routines, and I would like to move old data to some sort of archival system. Extra points for the ability to move the matched files to another location for archiving, and for the ability to schedule when this occurs. Many thanks. EDIT: Windows shop, mostly Windows Server 2003.
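
    If nothing off the shelf fits, the query itself is simple enough to script. Below is a minimal Python sketch (the share and archive paths are hypothetical) that lists files older than a cutoff, can optionally move them, and could be run on a schedule with Windows Task Scheduler:

        import os, shutil
        from datetime import datetime, timedelta

        SHARES = [r"\\fileserver\projects", r"\\fileserver\scans"]   # hypothetical shares
        ARCHIVE = r"\\archiveserver\old-data"                        # hypothetical archive location
        CUTOFF = datetime.now() - timedelta(days=3 * 365)            # "older than ~3 years"

        def old_files(roots, cutoff):
            for root in roots:
                for dirpath, _dirs, files in os.walk(root):
                    for name in files:
                        path = os.path.join(dirpath, name)
                        try:
                            mtime = datetime.fromtimestamp(os.path.getmtime(path))
                        except OSError:
                            continue                     # skip unreadable files
                        if mtime < cutoff:
                            yield path, mtime

        if __name__ == "__main__":
            for path, mtime in old_files(SHARES, CUTOFF):
                print(f"{mtime:%Y-%m-%d}  {path}")
                # Uncomment to archive instead of just listing:
                # shutil.move(path, os.path.join(ARCHIVE, os.path.basename(path)))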

    Read the article

  • How do I get these permissions working right so Apache can work with the files?

    - by cosmicbdog
    I am having a go at setting up my own Apache server and can't seem to get my head around the permissions. Let's say I grab a file from somewhere off the web and it has permissions of 600. I then upload this file via FTP to a user directory, which is also an Apache virtual site, and the file keeps its permissions of 600. This means the user can read the file, but Apache can't: requests for it will be forbidden. What is the simplest solution so that Apache can read and write whatever files end up in the user's directory? Can Apache be granted some sort of root power over files in a directory?
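
    A sketch of one common approach, assuming Apache runs under a group such as www-data (often apache on Red Hat-style systems): give that group read access to the upload tree rather than giving Apache any root-like power. Write access would additionally need g+w or different ownership. Shown here in Python; the equivalent chgrp/chmod shell commands work just as well:

        import os, shutil

        SITE_ROOT = "/var/www/site1"   # hypothetical vhost document root
        APACHE_GROUP = "www-data"      # assumption: the group Apache runs as

        # Give the Apache group traversal/read access without making anything
        # world-writable; run as root or as the owner of the files.
        for dirpath, dirs, files in os.walk(SITE_ROOT):
            shutil.chown(dirpath, group=APACHE_GROUP)
            os.chmod(dirpath, 0o750)   # rwxr-x---: owner plus the Apache group
            for name in files:
                path = os.path.join(dirpath, name)
                shutil.chown(path, group=APACHE_GROUP)
                os.chmod(path, 0o640)  # rw-r-----: uploaded 600 files become readable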

    Read the article

  • Microsoft Office tracking license marks/keys in saved files (doc/docx/xls etc.)?

    - by pawel159
    Does Microsoft Office put any marks or keys in saved files that could identify the computer or the software license with which the file was saved? E.g.: I use Office at work and save files on a pen drive. When I'm at home I sometimes need to check something and make small modifications, so I would save the file under my MS Office Home and Student license; but at work I will be redistributing this file, and it should look as if it was saved at work under the company's license (not my private one). Will saving it once again at work wipe any license marks left from home (if any exist)?
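
    One way to see at least the visible metadata a given document carries is to look inside the OOXML package itself: a .docx/.xlsx file is a ZIP archive whose docProps/core.xml and docProps/app.xml parts record things like author, last-modified-by, company, and application version. This does not prove the absence of other marks, and it does not apply to the older binary .doc/.xls formats. A short Python sketch:

        import sys, zipfile

        def dump_office_metadata(path):
            # Print the document-properties parts of an OOXML package.
            with zipfile.ZipFile(path) as pkg:
                for member in ("docProps/core.xml", "docProps/app.xml"):
                    print(f"--- {member} ---")
                    try:
                        print(pkg.read(member).decode("utf-8"))
                    except KeyError:
                        print("(not present)")

        if __name__ == "__main__":
            dump_office_metadata(sys.argv[1])   # e.g. python dump_meta.py report.docx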

    Read the article

  • How to restrict access to the files outside document root in apache?

    - by Bakhtiyor
    I have virtual hosts in the /var/www/site1 and /var/www/site2 folders. I want to restrict access to files outside a virtual host's document root in Apache, i.e. site1 should not be able to access site2's files. Right now this script in /var/www/site1 works fine, which is not good:
        $filename = "/var/www/site2/somefile";
        $handle = fopen($filename, "r");
        $contents = fread($handle, filesize($filename));
        echo $contents;
    How can I solve this problem? Thank you very much!

    Read the article

  • Windows 7 Mapped Network Drive Multiplying to Create Duplicates all the way to Z:

    - by bendiy
    A strange issue came in today from some users: at least two Windows 7 x64 boxes have duplicate mappings of a network drive. The drive is not mapped with a login script but manually, through "Map Network Drive". Everything had been fine for months, but all of a sudden Explorer looks like this:
        Files (\\fileServerPath) (S:)
        Files (\\fileServerPath) (T:)
        Files (\\fileServerPath) (U:)
        Files (\\fileServerPath) (V:)
        Files (\\otherServerPath) (W:)
        Files (\\fileServerPath) (X:)
        Files (\\fileServerPath) (Y:)
        Files (\\fileServerPath) (Z:)
    There are some other network drives mixed in there that did not duplicate. The drive is normally mapped to S:\, but it has made its way all the way to Z:. What is going on here? I've found this and will be trying it soon: http://social.technet.microsoft.com/Forums/en/w7itpronetworking/thread/b5647cc3-15d0-4776-bb00-a869bd8f930b

    Read the article

  • Is there a "pattern" or a group that defines *rcs files in *nix environments?

    - by Somebody still uses you MS-DOS
    I'm starting to use the command line a little more, and I see there are a lot of ways to configure the config files in my $HOME. This is good, since you can customize things the way you really like. Unfortunately, for beginners, having too many options is a little confusing. For example, I created .bash_alias for some aliases I'm using; I didn't even know this option existed, as I'm used to simply editing .bashrc. Is there a pattern or "good practice", aiming at flexibility and modularity, for how rc files should be structured? Is there a standardization group for this, or does everybody just create their own configuration setup?

    Read the article

  • Proper setup of shared folders for users

    - by user221486
    First of all, thanks for helping. I have a huge problem setting up the proper permissions for shared folders. I have:
    Windows 7 x64 Ent., name backupfb, joined to the domain, with a shared folder on drive E: (E:\backup)
    50 clients/laptops running TSM Tivoli FastBack for Workstations, which save files to that shared folder
    I need to configure the permissions on my shared folders so that only the owner of a folder can access it. The folder structure is:
    E:\backup <- shared as "backup" (\\backupfb\backup\)
    E:\backup\BackupAdmin <- this directory is used by the Tivoli Storage Manager FastBack for Workstations client to download revisions and configurations. Nodes require read-only access to these directories.
    E:\backup\RealTimeBackup <- client accounts must be able to create directories that are accessible only by the account that created them. As a result, the directory that contains data for a node is not created until that node connects to the server.
    So the permissions should look like this (taken from the instructions), with inheritable permissions from the object's parent disabled.
    Permission entries for \\backupfb\backup\BackupAdmin:
        Allow Users - Read & Execute (this folder, subfolders, and files): Traverse Folder / Execute File, List Folder / Read Data, Read Attributes, Read Extended Attributes, Delete Subfolders and Files, Delete, Read Permissions
        Allow Administrators - Full Control (this folder, subfolders, and files)
    Both folders have the option "Apply these permissions to objects and/or containers within this container only" enabled. Here everything works fine.
    Permission entries for \\backupfb\backup\RealTimeBackup:
        Allow Administrators - Full Control (this folder, subfolders, and files)
        Allow CREATOR OWNER - Full Control (this folder, subfolders, and files)
        Allow Users (from the domain) - Special (this folder only): Traverse Folder / Execute File, List Folder / Read Data, Read Attributes, Read Extended Attributes, Create Files / Write Data, Create Folders / Append Data, Delete Subfolders and Files, Read Permissions
        Allow OWNER RIGHTS - Full Control (this folder, subfolders, and files)
    Here I have a huge problem with CREATOR OWNER: I'm able to set Full Control, but I can only apply it to "Subfolders and files only". When I change the properties to "This folder, subfolders and files" and save, it changes back to "Subfolders and files only". So I tried using icacls to set up the permissions:
        @echo off
        takeown /F E:\backup\ /R /A
        for /D %%i IN (E:\backup\RealTimeBackup\*) DO icacls E:\backup\RealTimeBackup\%%~nxi /grant:r cloud\%%~nxi:F /T /C
        pause
    After that, users are able to create just one folder in \\backupfb\backup\RealTimeBackup\userfolder, but the problem is with subfolders. In the log I have:
        FBW5022E Unable to access the specified file
        Explanation: The specified file cannot be accessed. It may be spelled incorrectly, the path may be bad, or permissions may be lacking.
        User response: Ensure the user has the proper permissions for the file and directories involved and that the file and directory exist.
    Any ideas? Please help ;-) Thanks

    Read the article

  • Saving backup files automatically in (g)Vim after saving a file.

    - by Somebody still uses you MS-DOS
    I had a problem with my gVim: I lost some important modifications after I powered my machine back on following hibernation. To avoid this kind of problem, I would like to know if it's possible to add something to my .vimrc (or a plugin) that automatically backs up every save made to my files. Disk space is not an issue; I can delete these files later. I'm already using:
        set backup
        set backupdir=~/.backup/vim
        set directory=~/.swap/vim
    This creates a myfile.extension~ in my .backup/vim ...but I would like this configuration to add ~ on the first save, ~0 on the second, ~1 on the third, ~2 on the fourth, and so on - something that keeps copies of all the modifications I made to a file. Is this possible? Do you know if there's a plugin for this?

    Read the article

  • Windows 7: would like to recursively check a directory for changes to files and folders since my last visit

    - by user1026169
    I have found variations of this question. Most of them fall under monitoring, and others are said to be buggy. A continuously monitoring program won't help me, and neither will a list of the most recently modified files. I would like to check whether the directory, its subfolders, and its files have changed since my last visit. It seems to me that I'd need a program that maintains an index of that directory and compares it with the current state when I tell it to; that program could then output to a log for my use. This is on Windows 7 over a network share, and the folder(s) I'd like to check hold 4-50 GB of data. I did find this, however I am still learning how to program; I think it describes what I want: http://www.rgagnon.com/javadetails/java-0490.html Thanks for considering.
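
    The snapshot-and-compare idea described above is small enough to sketch in a few lines of Python (the index location and the share path are placeholders): each run records every file's size and modification time to a JSON file and reports what was added, removed, or modified since the previous run.

        import json, os, sys

        INDEX = os.path.expanduser("~/.dir_index.json")   # where the last snapshot is kept

        def snapshot(root):
            # Map every file under `root` to its (size, modification time).
            state = {}
            for dirpath, _dirs, files in os.walk(root):
                for name in files:
                    path = os.path.join(dirpath, name)
                    try:
                        st = os.stat(path)
                    except OSError:
                        continue
                    state[path] = [st.st_size, st.st_mtime]
            return state

        def report(old, new):
            for path in sorted(new.keys() - old.keys()):
                print("ADDED    ", path)
            for path in sorted(old.keys() - new.keys()):
                print("REMOVED  ", path)
            for path in sorted(old.keys() & new.keys()):
                if old[path] != new[path]:
                    print("MODIFIED ", path)

        if __name__ == "__main__":
            root = sys.argv[1]                  # e.g. a UNC path to the share
            current = snapshot(root)
            if os.path.exists(INDEX):
                with open(INDEX) as f:
                    report(json.load(f), current)
            with open(INDEX, "w") as f:
                json.dump(current, f)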

    Read the article

  • `for` loop of Microsoft `cmd`: how can I process only the files with a certain extension?

    - by uvts_cvs
    I have the folder c:\test\ and two files in it, a.txt and b.txtv. I would like to process just the files whose extension is exactly .txt. If I run these commands:
        cd c:\test
        for %f in (*.txt) do echo %f
    I get a result where both a.txt and b.txtv are listed. The same happens with:
        cd c:\test
        dir *.txt
    It seems *.txt also matches .txtv. I have Windows XP SP3 in Italian, and the output of ver is Microsoft Windows XP [Versione 5.1.2600]. The same thing happens on Windows 7 in English (Microsoft Windows [Version 6.1.7601]).
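
    The likely culprit is that the wildcard is also matched against the 8.3 short file names, whose extensions are truncated to three characters, so b.txtv's short name ends in .TXT as well. If scripting outside cmd is an option, an exact-extension check sidesteps this entirely; a minimal Python sketch:

        from pathlib import Path

        # List only files whose extension is exactly .txt (case-insensitive),
        # so b.txtv is excluded even though the *.txt wildcard matches it.
        folder = Path(r"c:\test")
        for path in sorted(folder.iterdir()):
            if path.is_file() and path.suffix.lower() == ".txt":
                print(path.name)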

    Read the article

  • How do I edit files in the console when connecting to windows 7 via ssh?

    - by Alex Waters
    I am using the Tunnelier client and server to connect to a Windows machine. I can get in and have access to all of the files on the computer. I have Vim installed on that Windows machine, but I can't seem to edit anything via the DOS command line. I also tried editing in Notepad, but nothing happens when I enter the command. I think this might be the part where DOS doesn't behave like bash. Would I need to set up Cygwin / OpenSSH to accomplish this? (Boo, Tunnelier is so easy.) Thanks! P.S. I know I could just use SFTP and edit files that way, but it feels dirty.

    Read the article

  • Can I replace a folder with the files it had in it a few hours ago with Time Machine?

    - by 1.21 gigawatts
    I messed up a folder (or the files in that folder) and I want to restore it from a few hours ago. I tried to use Time Machine to "restore", and it prompts me for where to restore to. I want to restore all the files in that directory to the state they were in before. How do I do this? Don't tell me I have to manually replace each file. Edit: It looks like I can restore a folder, but I get the error "The operation can't be completed because the item 'eclipse' is in use."

    Read the article

  • Multiple computers on Ubuntu One

    - by L R Bellmore Jr
    I have added files from 4 computers to Ubuntu One, and one computer has failed. What happens to the files I uploaded from it? Are they still on Ubuntu One? Also, how come when I upload a file from computer A, computer B with the same Ubuntu One account does not sync and download that file? That is what prompted this question: what happens to files uploaded from a computer when that computer is no longer active, has failed, or no longer has Ubuntu One installed? Have I lost the files? The follow-up question is why files uploaded from computer A don't sync to computer B; that is a related question. Finally, I need to shut down a computer, reformat the hard drive, and install Linux on the entire drive instead of dual-booting with Windows XP, since I am going to use virtual machines instead. What happens to the files from the Windows dual boot that were uploaded to Ubuntu One? Are they removed from Ubuntu One, and have I then lost those files if I don't back them up to another service first?

    Read the article

  • Can I use Ubuntu One to sync data files between two remote computers

    - by Sleepy John
    I've got two computers, both running Ubuntu, with files in their home folders sync'd to Ubuntu One. I'd like to know if it's possible to make Ubuntu One automatically download changes that have been uploaded from one computer to the equivalent data file on the other. Clarifying a bit further: I've installed Red Notebook on both computers, so each has its own ~/.rednotebook/data folder containing a series of .txt files corresponding to its monthly entries. These are sync'd so that any changes to those .txt files are uploaded to Ubuntu One. My question is: can I, and if so how do I, make Ubuntu One automatically download and replace those .txt files on the other computer after they've been updated and uploaded from the first computer? I did laboriously manage to download, one by one, all the text files that had been uploaded from the first computer from Ubuntu One to the second computer, but what I want is to automate this process, and that's where I'm stuck. I'm aware that things could get a bit complicated if both my computers were online at the same time and both were simultaneously making different Red Notebook entries, so that's not the scenario I'm trying to cover. All I want to achieve is that whatever updates to the files have been uploaded by one computer will automatically be downloaded to the same-named files on the other computer as soon as that second computer comes online and detects that Ubuntu One is holding matching but more recent sync'd files than the ones it has.

    Read the article

  • How do I remove my Windows 7 setup from a Windows/Ubuntu dual-boot?

    - by Tom
    Previously my OS was Windows 7; one day it began to have problems booting and finally it didn't boot at all. I tried to repair it, but the repair didn't work. Then I installed Ubuntu 14.04 LTS alongside Windows and am very impressed by Ubuntu, so now I want to remove all my Windows files. I searched Google for how to do this and found OS-Uninstaller. I have some doubts before proceeding with OS-Uninstaller: I need to keep my photos, songs, movies, and personal files on my system even after Windows is removed. Normally Windows is installed on the C drive, and my personal files are not on the C drive, so will removing the Windows files affect my personal files? Will OS-Uninstaller affect Ubuntu in any way? Please note that I want to remove only the Windows installation files (the files added to my system by Windows during its installation). I don't want to change the NTFS partition to any other format, since there is a chance I will install a newer version of Windows later.

    Read the article

  • firefox: How to enable local javascript to read/write files on my PC?

    - by Nok Imchen
    Well, since Greasemonkey can't read/write files on the local hard disk, and I've heard people suggesting Google Gears but have no idea about Gears :( , what I've decided is to add <script type="text/javascript" src="file:///c:/test.js"></script>. This test.js will use FileSystemObject to read/write a file. Since "file:///c:/test.js" is a JavaScript file on the local hard disk, it should probably be able to read/write files on my local hard disk. I tried it, but Firefox prevented the "file:///c:/test.js" script from reading/writing files on the local disk :( What I want to know is: is there any setting in Firefox's about:config where I can let a particular script, say from a local file or from xyz.com, have read/write permission on my local disk files?

    Read the article

  • How do I copy a python function to a remote machine and then execute it?

    - by Hugh
    I'm trying to create a construct in Python 3 that will allow me to easily execute a function on a remote machine. Assuming I've already got a Python TCP server running on the remote machine that will run the functions it receives, I'm currently looking at using a decorator like @execute_on(address, port). This would create the necessary context required to execute the function it is decorating and then send the function and context to the TCP server on the remote machine, which then executes it. Firstly, is this somewhat sane? And if not, could you recommend a better approach? I've done some googling but haven't found anything that meets these needs. I've got a quick and dirty implementation of the TCP server and client, so I'm fairly sure that'll work. I can get a string representation of the function (e.g. func) being passed to the decorator with:
        import inspect
        string = inspect.getsource(func)
    which can then be sent to the server where it can be executed. The problem is, how do I get all of the context information that the function requires to execute? For example, if func is defined as follows:
        import MyModule
        def func():
            result = MyModule.my_func()
    MyModule will need to be available to func, either in the global context or in func's local context, on the remote server. In this case that's relatively trivial, but it can get much more complicated depending on when and how import statements are used. Is there an easy and elegant way to do this in Python? The best I've come up with so far is using the ast library to pull out all import statements, using the inspect module to get string representations of those modules, and then reconstructing the entire context on the remote server. That's not particularly elegant, and I can see lots of room for error. Thanks for your time.
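
    As a very rough illustration of the getsource idea (the transport is omitted, the module list is passed by hand rather than discovered via ast, and the helper names make_payload and run_payload are made up for this sketch):

        import importlib
        import inspect
        import json

        def make_payload(func, modules=()):
            # Bundle the function's source with the names of modules it needs.
            # Here the module list is passed by hand; discovering it automatically
            # (e.g. by walking the AST for import statements) is the hard part.
            return json.dumps({
                "source": inspect.getsource(func),
                "name": func.__name__,
                "modules": list(modules),
            })

        def run_payload(payload):
            # Server side: rebuild the context, exec the source, call the function.
            data = json.loads(payload)
            context = {name: importlib.import_module(name) for name in data["modules"]}
            exec(data["source"], context)       # defines the function inside `context`
            return context[data["name"]]()

        # Local demo of the round trip (in reality the payload crosses the TCP link):
        def remote_job():
            import math
            return math.sqrt(2)

        print(run_payload(make_payload(remote_job)))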

    Read the article

  • How do I copy a JavaScript object into another object?

    - by Josh K
    Say I want to start with a blank JavaScript object:
        me = {};
    And then I have an array:
        me_arr = new Array();
        me_arr['name'] = "Josh K";
        me_arr['firstname'] = "Josh";
    Now I want to throw that array into the object so I can use me.name to return "Josh K". I tried:
        for (var i in me_arr) { me.i = me_arr[i]; }
    But this didn't have the desired result. Is this possible? My main goal is to wrap this array in a JavaScript object so I can pass it to a PHP script (via AJAX or whatever) as JSON.

    Read the article
