Search Results

Search found 81082 results on 3244 pages for 'single file'.

  • Setting what-opens-what once and for all (Backing up File Associations)

    - by ldigas
    Every time I switch machines (as in, get a new one, or reinstall an OS, or something like that) my precious file associations get lost, and the next six months pass slowly until I get them set up right again. Is there a program that allows me to: set all the extensions I would like to open with, let's say, Vim, without setting each one of them individually - something like "Vim opens: ... list of extensions ..." - and/or a program which lets me back up my current settings, so that when I copy them to a new machine I just modify the paths where I put the applications in question, and it does the rest (again, associating each program with all the extensions it opened before)?
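
    As a starting point on Windows, the built-in assoc and ftype commands can dump the extension-to-filetype and filetype-to-program mappings to text files and replay them later. A minimal, hedged sketch (the backup file names are arbitrary):

        rem --- on the old machine: back up associations ---
        assoc > assoc-backup.txt
        ftype > ftype-backup.txt

        rem --- on the new machine: restore them (run as administrator) ---
        for /f "delims=" %%a in (assoc-backup.txt) do assoc %%a
        for /f "delims=" %%f in (ftype-backup.txt) do ftype %%f

    Lines in both files already have the key=value form the commands accept (e.g. ".txt=txtfile" and "txtfile=%SystemRoot%\system32\NOTEPAD.EXE %1"), so editing the paths in the backup before replaying covers the "I moved the application" case.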

    Read the article

  • Error while executing an .exe from a Windows command script (.cmd) file

    - by mahesh
    I have the following in the .cmd file, where PathList is a console application (PathList.exe):

        cd D:\Sample
        D:
        PathList 2> file.txt

    This works fine if the file is saved with a .bat extension, but if I save it with a .cmd extension it throws an error saying 'PathList' is not recognized as an internal or external command, operable program or batch file. Can anyone tell me what the issue is with saving it with a .cmd extension?
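
    One low-risk thing to try (a guess, not a confirmed fix): take the PATH and the working directory out of the equation by switching drive and directory in one step and calling the executable by its fully qualified name:

        cd /d D:\Sample
        "D:\Sample\PathList.exe" 2> file.txt

    cmd.exe treats .bat and .cmd scripts almost identically, so if the fully qualified call still fails only under .cmd, something else is probably being picked up first (a conflicting PathList.cmd somewhere on the PATH, for example).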

    Read the article

  • Nautilus file share for multiple users is not working. Only owner gets access.

    - by Niklas
    I have always had trouble setting up Samba shares with Ubuntu. In the past I tried configuring /etc/samba/smb.conf but never achieved what I wanted. Last time I managed to get it working by making a share with Nautilus' built-in file sharing (which uses Samba). Now when I try to do it again, it doesn't work. (I'm running Ubuntu 10.10 Desktop x64.) What I'm trying to achieve is a share which is available to multiple users (those who are in the same group) and not just the owner (who is also in the group). As it is now, I can connect only as the owner; the others get an error when connecting from Windows 7. All the users are in the same group, the folder permissions are 770, and the files and folders have the correct group settings. As far as I can tell there are no restrictions in User Settings blocking the other users, and I checked "make available to other users" (or whatever it says) in the file sharing dialog. What can I do?
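
    For comparison, a minimal hand-written smb.conf share for a whole group might look like the sketch below (share name, path, and group name are placeholders); Nautilus generates something similar through the net usershare mechanism:

        [shared]
            path = /home/user/shared
            valid users = @sharegroup
            force group = sharegroup
            create mask = 0660
            directory mask = 0770
            writable = yes

    If the hand-written version works where the Nautilus share does not, the problem is likely in the generated usershare definition rather than in the filesystem permissions.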

    Read the article

  • What is the best practice for reading a large number of custom settings from a text file?

    - by jawilmont
    So I have been looking through some code I wrote a few years ago for an economic simulation program. Each simulation has a large number of settings that can be saved to a file and later loaded back into the program to re-run the same or a similar simulation. Some of the settings are optional or depend on what is being simulated. The code to read back the parameters is basically one very large switch statement (with a few nested switch statements). I was wondering if there is a better way to handle this situation. One line of the settings file might look like this:

        #RA:1,MT:DiscriminatoryPriceKDoubleAuction,OF:Demo Output.csv,QM:100,NT:5000,KP:0.5 //continues...

    And some of the code that would read that line:

        switch (Character.toUpperCase(s.charAt(0))) {
            case 'R':
                randSeed = Integer.valueOf(s.substring(3).trim());
                break;
            case 'M':
                marketType = s.substring(3).trim();
                System.err.println("MarketType: " + marketType);
                break;
            case 'O':
                outputFileName = s.substring(3).trim();
                break;
            case 'Q':
                quantityOfMarkets = Integer.valueOf(s.substring(3).trim());
                break;
            case 'N':
                maxTradesPerRound = Integer.valueOf(s.substring(3).trim());
                break;
            case 'K':
                kParameter = Float.valueOf(s.substring(3).trim());
                break;
            // continues...
        }
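
    One common alternative (a sketch, using the key names from the question): tokenize each line into a key/value map once, then pull typed values out by key. This replaces the switch with data and makes unknown or optional keys easy to handle:

        import java.util.HashMap;
        import java.util.Map;

        public class SettingsParser {
            // Turns "#RA:1,MT:Foo,OF:Demo Output.csv" into {RA=1, MT=Foo, OF=Demo Output.csv}.
            static Map<String, String> parse(String line) {
                Map<String, String> settings = new HashMap<>();
                // Strip the leading '#' marker, then split the line on commas.
                for (String token : line.substring(1).split(",")) {
                    int sep = token.indexOf(':');
                    if (sep > 0) {
                        settings.put(token.substring(0, sep).trim(),
                                     token.substring(sep + 1).trim());
                    }
                }
                return settings;
            }
        }

    Usage then reads like randSeed = Integer.valueOf(settings.get("RA")), and missing optional keys show up as null instead of falling through a switch. (This assumes values never contain commas; if they can, an established format such as java.util.Properties or JSON is the safer move.)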

    Read the article

  • Delphi 7 - How can I copy a file that is being written to?

    - by Simon
    I have an application that logs information to a daily text file every second on a master PC. A slave PC on the network running the same application would like to copy this text file to its local drive. I can see there are going to be file access issues. The files should be no larger than 30-40 MB each, and the network is 100 Mbit Ethernet. The copying process could take longer than 1 second, meaning the logging PC will need to open the file for writing while it is being read. What is the best method for the file writing (logging) and file copying procedures? I know there is the standard Windows CopyFile() procedure, but this has given me file access problems. There is also TFileStream using the fmShareDenyNone flag, but this also very occasionally gives me an access problem (roughly once per week). What is the best way of accomplishing this task? My current file logging:

        procedure FSWriteline(Filename, Header, s: String);
        var
          LogFile: TFileStream;
          line: String;
        begin
          if not FileExists(Filename) then
          begin
            LogFile := TFileStream.Create(Filename, fmCreate or fmShareDenyNone);
            try
              LogFile.Seek(0, soFromEnd);
              line := Header + #13#10;
              LogFile.Write(line[1], Length(line));
              line := s + #13#10;
              LogFile.Write(line[1], Length(line));
            finally
              LogFile.Free;
            end;
          end
          else
          begin
            line := s + #13#10;
            LogFile := TFileStream.Create(Filename, fmOpenWrite or fmShareDenyNone);
            try
              LogFile.Seek(0, soFromEnd);
              LogFile.Write(line[1], Length(line));
            finally
              LogFile.Free;
            end;
          end;
        end;

    My file copy procedure:

        { fi, fo, cnt, max, dod, did, Percent and BLOCKSIZE are declared elsewhere }
        procedure DoCopy(infile, Outfile: String);
        begin
          ForceDirectories(ExtractFilePath(Outfile)); // ensure folder exists
          if FileAge(infile) = FileAge(Outfile) then
            Exit; // they have the same modified time
          try
            { open existing destination }
            fo := TFileStream.Create(Outfile, fmOpenReadWrite or fmShareDenyNone);
            fo.Position := 0;
          except
            { otherwise create destination }
            fo := TFileStream.Create(Outfile, fmCreate or fmShareDenyNone);
          end;
          try
            { open source }
            fi := TFileStream.Create(infile, fmOpenRead or fmShareDenyNone);
            try
              cnt := 0;
              fi.Position := cnt;
              max := fi.Size;
              { start copying }
              repeat
                dod := BLOCKSIZE; // block size
                if cnt + dod > max then
                  dod := max - cnt;
                if dod > 0 then
                  did := fo.CopyFrom(fi, dod);
                cnt := cnt + did;
                Percent := Round(cnt / max * 100);
              until (dod = 0);
            finally
              fi.Free;
            end;
          finally
            fo.Free;
          end;
        end;

    Read the article

  • Benchmarking a file server

    - by Joel Coel
    I'm working on building a new file server... a simple Windows Server box with a few terabytes of disk space to share on the LAN. Pain of current hard drive prices aside :( -- I would like to get some benchmarks for this device under load compared to our old server. The old server was installed in 2005 and had five 136 GB 10K disks in RAID 5. The new server has eight 1 TB disks in two RAID 10 volumes (plus a hot spare for each volume), but they're only 7.2K rpm, and of course with a much larger cache size. I'd like to get an idea of the performance expectations of the new server relative to the old. Where do I get started? I'd like to know both the raw potential under different kinds of load for each server, as well as an idea of what our real-world load looks like and how it will translate. Will disk load even matter, or will performance be more driven by the network connection? I could probably fumble through some disk I/O and wait counters in Performance Monitor, but I don't really know what to look for, which counters to watch, or for how long and when. FWIW, I'm expecting a nice improvement because of the benefits of having two different volumes and the better RAID 10 performance vs RAID 5, in spite of using slower disks... but I'd like to get an idea of how much.
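
    One hedged starting point for the "which counters, for how long" part: capture a small set of disk and network counters with the built-in typeperf tool during a representative workday on the old box, then repeat the same capture against the new box under a synthetic load:

        typeperf "\PhysicalDisk(_Total)\Avg. Disk sec/Read" ^
                 "\PhysicalDisk(_Total)\Avg. Disk sec/Write" ^
                 "\PhysicalDisk(_Total)\Current Disk Queue Length" ^
                 "\Network Interface(*)\Bytes Total/sec" ^
                 -si 15 -o baseline.csv

    Disk latency (Avg. Disk sec/Read and sec/Write) staying in the low milliseconds while Bytes Total/sec approaches the NIC's ceiling would suggest the network, not the disks, is the limiting factor.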

    Read the article

  • Spurious alleged file corruption on Windows 7

    - by Johannes Rössel
    Recently my laptop sometimes warns about corrupted files on the hard drive (a Samsung SSD PB22-JS3 TM). So far this has only happened when updating (or checking out) an SVN repository with either TortoiseSVN or the command line Subversion client. The fun thing is that the corrupted file has always been a .svn directory (although the directory entry may contain files in that directory too, if they're small enough, which should be the case with SVN). However, when looking into the warned-about directory I notice nothing strange or unusual, I don't get any more warnings about it, and another attempt at updating the working copy works (SVN stops updating once that error occurs; TortoiseSVN even gives an appropriate error message). Well, mostly; sometimes it happens again, albeit with a different directory. Since the laptop is only a few months old I doubt the SSD is failing already; five months of normal usage shouldn't be too surprising. Also, it has (so far) occurred only with SVN updates on a large repository. Maybe that's too many writes in a short time, and some part between the software and the hardware doesn't quite catch up fast enough? I don't know enough about this to make an informed guess. Does anyone know what's up here? ETA: I've run chkdsk (it seems to schedule itself anyway when this happens) and it didn't find anything out of the ordinary.

    Read the article

  • Slow File Copy observed copying 40GB files across network to iSCSI device

    - by Rick
    Here's a curious one for the gurus. Setup: Source machine: Windows Server 2003 R2 with a local hard drive holding a 40 GB VHD file; one 1 Gbps network card, Cat6 cable, switch. Target machine: Windows Server 2008 R2 with an iSCSI connection to an iSCSI target on a separate machine (1 TB, RAID 5); one 1 Gbps network card, Cat6 cable, connected to the same switch as the source machine, plus a second 1 Gbps network card, Cat6 cable, connected via an isolated switch to the iSCSI target. The switches are Netgear JGS524 models (web managed). If I copy from the Win2003R2 machine to the Win2008R2 machine's local drive, I get 40 GB in 45 minutes, 36 seconds. If I copy from the Win2008R2 machine to the iSCSI target (local drive to iSCSI target), I get 40 GB in 37 minutes, 56 seconds. If I copy from the Win2003R2 machine to the iSCSI target via the Win2008R2 machine, I get 40 GB in 3 hours, 50 minutes, 24 seconds. All copies were done via the following command issued on the Win2008R2 box:

        XCOPY <source> <target> /J

    (/J copies using unbuffered I/O; recommended for very large files.) So, what's the bit I'm missing here? Why does a back-to-back copy take 1 hour, 23 minutes, 32 seconds in total when a straight-through copy takes almost 3 times as long? The switches show no errors, and the network hovers around the 3% utilisation mark for the duration of the copy (whereas the back-to-back copies are around the 25% utilisation mark). What have I missed?
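
    For scale, some rough arithmetic (treating 40 GB as 40,960 MB): 45 min 36 s is 2,736 s, about 15 MB/s; 37 min 56 s is 2,276 s, about 18 MB/s; and 3 h 50 min 24 s is 13,824 s, about 3 MB/s. A 1 Gbps link carries roughly 110 MB/s after overhead, so 3 MB/s is under 3% utilisation, which matches the reported figure for the slow copy almost exactly: the straight-through path really is crawling rather than the measurement being off.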

    Read the article

  • Access denied error 3221225578 with file sharing to Windows server

    - by Ian Boyd
    I'm trying to access the shares on a server. The credential box appears, I enter a correct username and password, and I get access denied. The silly thing is that I can Remote Desktop to the server (using the same credentials), and I can check the Security event log for the access denied errors:

        Event Type:     Failure Audit
        Event Source:   Security
        Event Category: Account Logon
        Event ID:       681
        Date:           3/19/2011
        Time:           11:54:39 PM
        User:           NT AUTHORITY\SYSTEM
        Computer:       STALWART
        Description:
        The logon to account: Administrator
        by: MICROSOFT_AUTHENTICATION_PACKAGE_V1_0
        from workstation: HARPAX
        failed. The error code was: 3221225578

    and

        Event Type:     Failure Audit
        Event Source:   Security
        Event Category: Logon/Logoff
        Event ID:       529
        Date:           3/19/2011
        Time:           11:54:39 PM
        User:           NT AUTHORITY\SYSTEM
        Computer:       STALWART
        Description:
        Logon Failure:
        Reason:                 Unknown user name or bad password
        User Name:              Administrator
        Domain:                 stalwart
        Logon Type:             3
        Logon Process:          NtLmSsp
        Authentication Package: NTLM
        Workstation Name:       HARPAX

    Looking up the error code (3221225578), I get a TechNet article, "Audit Account Logon Events" by Randy Franklin Smith, whose Table 1 (Error Codes for Event ID 681) lists 3221225578 as "The username is correct, but the password is wrong." Which would seem to indicate that the username is correct but the password is wrong. I've tried the password many times, uppercase, lowercase, on different user accounts, with and without prefixing the username with servername\username. Why can't I access the server over file sharing, when I can access it over RDP?
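
    (A decoding aside: 3221225578 in hex is 0xC000006A, the NTSTATUS code STATUS_WRONG_PASSWORD, which matches the table's reading: the account exists, but the password actually sent was wrong. With NTLM in the picture, a stale credential stored on the client, e.g. in Control Panel's Credential Manager, is a classic way to send a wrong password without ever being prompted for it.)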

    Read the article

  • Relation between server_name in nginx sites-available, the /etc/hosts file, and A-records

    - by user2818584
    I have the following two server blocks in my config file in sites-available:

        server {
            listen 80;
            server_name www.mydomain.be;
            root /usr/share/nginx/html;
            index index.html index.htm;

            location / {
                try_files $uri $uri/ =404;
            }
        }

        server {
            listen 80;
            server_name sub.mydomain.be;
            root /usr/share/nginx/sub;
            index index.html index.htm;

            location / {
                try_files $uri $uri/ =404;
            }
        }

    I also created an A-record for both www.mydomain.be and sub.mydomain.be with the IP of my server as the value. Yet when I try to reload my nginx configuration with service nginx reload, it fails. When I remove the second server block, it reloads as expected. I know this topic is popular, and there are loads of such [nginx][subdomain] questions here, but none of them seems to discuss explicitly how the following three things hang together: virtual hosts or server blocks in nginx (especially server_name matching), the effect of A-records on how nginx processes requests, and the need to add hosts to /etc/hosts. Right now I have the impression that a lack of knowledge of this bigger picture, rather than of specific nginx configuration details, prevents me from making this work.
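
    A hedged first step: nginx usually says exactly why a reload fails, but service scripts often swallow the message. Running the config test directly shows it:

        sudo nginx -t

    On success this prints that the configuration file syntax is ok; on failure it names the file and line. As for the bigger picture: A-records only tell clients which IP to contact; nginx never consults DNS (or /etc/hosts) to match a request. It picks the server block whose server_name equals the Host header the client sent, so /etc/hosts entries are only needed on the client side, for names that aren't in public DNS yet.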

    Read the article

  • Correct file permissions for trac and git user to access gitolite server repos

    - by klemens
    Hi, this sounds like a stupid question (to me), but I couldn't find any info. On my server I host some git repositories via gitolite, and have a Trac instance for every repository. I have a user called git to push/pull from the server (git clone git@server:repo), and Trac is an Apache vhost with mod_wsgi, which runs as the www-data user. So what riddles me (maybe because I don't have much of a clue about file permissions at all) is: what's the best permissions setup (chown, chmod) for the git repositories (/home/git/repositories/...)? www-data (or Trac) needs at least read permission, I think, and git (or gitolite) obviously needs read/write permission to push changesets. I tried a few things (i.e. adding www-data and/or git to the www-data/git group), but didn't get it right; at least one of the two (git or Trac) always breaks. Any suggestions are highly appreciated. Regards, klemens
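
    A common arrangement (a sketch, not gitolite gospel): keep everything owned by git, make it group-readable, and put the web user in git's group. Gitolite sets the permissions of newly created repository files from an umask configured in ~git/.gitolite.rc, so loosening that keeps future pushes readable too:

        sudo usermod -a -G git www-data            # let Apache/Trac read as a group member
        sudo chmod -R g+rX /home/git/repositories  # group read; X adds traverse on directories only
        # in ~git/.gitolite.rc, set the umask option to 0027 so new objects stay group-readable

    Trac only ever needs read access to the repositories, so write permission can stay with git alone. After adding www-data to the group, Apache needs a restart to pick up the new group membership.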

    Read the article

  • File Open/Save Dialog always 'Not Responding'

    - by Amanda
    I am aware that this question has been asked before, but the solution given didn't work for me. Whenever I go to open or save a file in any program, the dialog does not come up, and the application goes to 'Not Responding'. This goes on for a few minutes, then stops, but the dialog still does not open. On a few occasions the dialog has suddenly worked for a while, but then the problem comes back. I have tried many solutions given around the internet: I have cleaned the machine with CCleaner, defragged the disk, and sorted the index. Nothing works. Does anybody have any idea what the problem is? This is Windows Vista. I'm not quite sure what kind of information you would need about my laptop, but I'll give it to you if you need it. :) Solutions I have tried: deleting the 'Shellicon' folder in the registry (which wasn't even in there), looking for lingering mapped network drives (I haven't come across any), and rebuilding the Windows Search indexes... no difference.

    Read the article

  • Windows 7 SSH file server

    - by Siriss
    Hello all. I have looked at the other posts but have not quite found an answer. I have a question about Windows file sharing over SSH. I have copssh installed and it is working for Remote Desktop connections. I have port 22 forwarded on my router, etc. I connect from a Mac or PuTTY with this command:

        ssh -l copsshusername -L 3391:localhost:3389 [external ip]

    That works fine. I would now like to configure Windows 7 to give the SSH account I log in with access to certain shared folders; I have documents and videos that I would like to be able to download externally. I have done this before on Linux, and a long time ago on XP, but I cannot figure out what I am missing on Windows 7. There is a designated SSH user that copssh uses to run the service and that I use to log in. I have googled and googled and have not found a solution that does everything I need, which is why I am turning here for ideas. I hope I am explaining this correctly. Thank you very much for your help!
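
    A hedged sketch of the usual approach: share the folders normally in Windows 7 (granting the copssh account read permission on them), then tunnel SMB the same way RDP is tunnelled. The wrinkle is that the local machine's own SMB stack already owns port 445, so the tunnel has to listen on a spare port or address:

        ssh -l copsshusername -L 13445:localhost:445 [external ip]

    A Mac client can then connect with Finder's "Connect to Server" using smb://localhost:13445. Windows clients are fussier, because their SMB client grabs port 445 on every local address; the common workaround is to bind the tunnel to a loopback alias (e.g. 10.255.255.1, added via a Microsoft Loopback Adapter) and map the drive against that address instead.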

    Read the article

  • How can I tell why I have access to a file share on Windows Server

    - by Joel
    I have a file share on a Windows 2008 R2 server in an AD domain (call it \\SECURESERVER\STUFF) and I am not sure if I have the share and folder permissions set up right. I noticed the problem when I set up a new server (WORKGROUP\FOREIGNSERVER) that was not joined to the domain and tried to copy some files off of \\SECURESERVER\STUFF. I was surprised to find that it did not prompt me for a username and password and proceeded to give me full access to the files. That worried me, so I tried the same thing on some workstations that were not in the domain, and they did NOT behave the same way (they did prompt for a username/password, as desired and expected). So I think there is something peculiar about FOREIGNSERVER. I am logging into it with a local admin account, but my domain and SECURESERVER should know nothing of this server. I've carefully gone through the share and folder permissions on the share, but I can't find the reason FOREIGNSERVER has access. How can I find out why FOREIGNSERVER has access to SECURESERVER?
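
    One quick way to see who the server thinks is connecting (a diagnostic sketch: run on SECURESERVER in an elevated prompt while the share is open from FOREIGNSERVER):

        net session

    The output lists each inbound SMB session's computer name and user name. A matching local account (the same username and password existing on both machines) authenticates silently without a prompt, and an enabled Guest account shows up here as a Guest session; both are the usual suspects for this symptom.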

    Read the article

  • Website and file/directory permissions

    - by mathiass
    I've been given a task to fix a website. One of its issues is that on one page the images have broken links: the images are not showing, and clicking an image (i.e. the direct link to the image file) results in a 403 (Forbidden) error. I am looking for feedback on the possible cause. The directory where the images are stored has the following permissions (I had to hide the names):

        drwxrws---  www  "group"  10240  Aug 2008  "image directory name"

    I checked the page source code and everything seems to be in place. The rest of the site, and images outside that image directory, are showing fine. I was told there have recently been some changes to the server. I'm assuming there is no fault in the source code, and that the permissions are, or used to be, correct (the site worked before, and no recent changes have been made to the site itself). My only thoughts at the moment are that either: a) the directory permissions should be drwxrws--x (executable for other users), or b) there is a change in the server settings that I don't know of. Is there anything else I should check?
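
    With drwxrws--- the web server can only enter that directory if it runs as the user www or as a member of "group", so a plausible first check (a sketch; process and path names vary by system) is comparing the server's identity against the directory:

        ps aux | grep -E 'httpd|apache|nginx'   # which user do the worker processes run as?
        id some_user                            # substitute that user: is it www, or in "group"?
        ls -ld /path/to/images                  # confirm owner and group on the directory

    If a server update changed the worker user (say, from www to www-data), exactly this symptom appears: pages render, but direct file access under the group-only directory returns 403.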

    Read the article

  • .bat file - Nagios v3.2 service check and start if stopped

    - by LbakerIT
    I'm just barely getting into programming, so I apologize for my ignorance. I'm trying to create a .bat file that will check whether a service is running on XP Pro, along these lines:

        if the service is running, exit 0
        if the service is stopped:
            start the service
            wait 10 seconds (via ping, I'm guessing)
            check again; if running, exit 0
        do this check a total of 3 times
        if the service does not come up within that time, exit 2

    (In Nagios terms: exit 0 = OK, exit 1 = warning, exit 2 = critical, and critical is what alerts.) I need to do this for 3 different services, but I expect it is better to create one script per service; that way you get notified about the specific service that is not coming back up. The goal is that if the service stops, the script will start it, and if after 30 seconds it is still unable to start the service, an alert goes out. The reason I'm trying to do it with a .bat is consistency with all the other scripts; I did not want to complicate things further by adding different kinds of code. Yay for consistency! Again, I apologize for my ignorance; I've been thrown into this project last minute. Thank you for the help and for reading my question! A sketch of one way to do this follows below.
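
    A minimal sketch of such a wrapper (the service name is a placeholder; sc, find, net and ping are standard XP tools, and the ping trick waits roughly 10 seconds):

        @echo off
        setlocal
        set SERVICE=MyServiceName

        for /L %%i in (1,1,3) do (
            sc query "%SERVICE%" | find "RUNNING" >nul
            if not errorlevel 1 exit /b 0
            net start "%SERVICE%" >nul 2>&1
            ping -n 11 127.0.0.1 >nul
        )

        rem final check after the third start attempt
        sc query "%SERVICE%" | find "RUNNING" >nul
        if not errorlevel 1 exit /b 0
        exit /b 2

    The monitoring agent that runs the check sees the exit code directly. Note that "if not errorlevel 1" (true when the errorlevel is 0) works inside the parenthesized for block without delayed expansion, because errorlevel is tested at execution time; %ERRORLEVEL% expansion there is the classic trap.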

    Read the article

  • Using java.util.Scanner to read a file byte by byte

    - by openidsucks
    I'm trying to read a one-line file character by character using java.util.Scanner, but I'm getting this exception:

        Exception in thread "main" java.util.InputMismatchException: For input string: "contents of my file"
            at java.util.Scanner.nextByte(Scanner.java:1861)
            at java.util.Scanner.nextByte(Scanner.java:1814)
            at p008.main(p008.java:18)  // line where I do scanner.nextByte()

    Here's my code:

        public static void main(String[] args) throws FileNotFoundException {
            File source = new File("file.txt");
            Scanner scanner = new Scanner(source);
            while (scanner.hasNext()) {
                System.out.println((char) scanner.nextByte());
            }
            scanner.close();
        }

    Does anyone have any idea what I might be doing wrong? Edit: I realized I wrote hasNext() instead of hasNextByte(). However, if I do that, it doesn't print out anything.
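
    The underlying mismatch, with a hedged alternative: Scanner.nextByte() does not return raw bytes; it parses the next whitespace-delimited token as a number of type byte, so ordinary text throws InputMismatchException (and hasNextByte() is false, hence the silent loop after the edit). For character-by-character reading, a plain Reader is the usual tool:

        import java.io.BufferedReader;
        import java.io.FileReader;
        import java.io.IOException;

        public class ReadChars {
            public static void main(String[] args) throws IOException {
                try (BufferedReader reader = new BufferedReader(new FileReader("file.txt"))) {
                    int c;
                    while ((c = reader.read()) != -1) {  // -1 signals end of file
                        System.out.println((char) c);
                    }
                }
            }
        }

    (For raw bytes rather than characters, FileInputStream.read() works the same way.)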

    Read the article

  • Batch file script to remove special characters from filenames (Windows)

    - by njreed.myopenid.com
    I have a large set of files, some of which contain special characters in the filename (e.g. ä, ö, %, and others). I'd like a script file to iterate over these files and rename them, removing the special characters. I don't really mind what it does, but it could replace them with underscores, for example: Störung%20.doc would be renamed to St_rung_20.doc. In order of preference: a DOS batch file; a Windows script file to run with cscript (VBS); a third-party piece of software that can be run from the command line (i.e. no user interaction required); a script in another language, for which I'd have to install an additional script engine. Background: I'm trying to encrypt these files with GnuPG on Windows, but it doesn't seem to handle special characters in filenames with the --encrypt-files option.
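
    Plain batch has no practical way to edit arbitrary characters out of a string, so here is a hedged sketch in PowerShell (option-4 territory unless it's already installed; -File needs PowerShell 3 or later). The regex keeps printable ASCII except % and swaps everything else for underscores:

        Get-ChildItem -Path C:\myfiles -File |
            Where-Object { $_.Name -match '[^\x20-\x7E]|%' } |
            Rename-Item -NewName { $_.Name -replace '[^\x20-\x7E]|%', '_' }

    Two filenames can collide after replacement (Störung.doc and Stärung.doc both become St_rung.doc), so on a truly large set it is worth a dry run first by adding -WhatIf to Rename-Item.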

    Read the article

  • .NET: IOException for permissions when creating new file?

    - by Rosarch
    I am trying to create a new file and write XML to it:

        FileStream output = File.Create(Path.Combine(PATH_TO_DATA_DIR, fileName));

    The argument evaluates to C:\path\to\Data\test.xml. The exception is: "The process cannot access the file 'C:\path\to\Data\test.xml' because it is being used by another process." What am I doing wrong here? UPDATE: This code throws the same exception:

        StreamWriter writer = new StreamWriter(Path.Combine(PATH_TO_DATA_DIR, fileName));

    UPDATE 2: The file I am trying to create does not exist in the file system. So how can it be in use?
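
    A frequent culprit (offered as a guess, since the surrounding code isn't shown): an earlier stream on the same path in the same process was never disposed, so the OS handle is still open when the next File.Create runs; "does not exist" can simply mean a pending delete that only completes once that handle closes. Wrapping every stream in using makes the handle lifetime explicit:

        string path = Path.Combine(PATH_TO_DATA_DIR, fileName);
        using (FileStream output = File.Create(path))
        {
            // write the XML here; the stream (and the OS handle) is released
            // when the using block ends, even if an exception is thrown
        }

    A tool like Sysinternals Process Explorer (Find Handle) will confirm which process actually holds the file open.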

    Read the article

  • Effective file permissions tool's API in Windows

    - by apoorv020
    Starting with Windows Server 2003, Windows includes a tool which calculates the effective permissions for a user (basically it resolves all group memberships and takes all "deny" permissions into account as well). A case in point: user A belongs to groups B and C; B has been denied read permission on a file F, while C has been allowed read and write permission on the file, and I want to calculate the effective permissions user A has on file F. This tool is available on Windows Server 2003, Vista, 7 and Server 2008 by right-clicking a file and going to Properties - Security - Advanced - Effective Permissions. What I need is a C# API which does the same job. The common file APIs return access rules (FileSystemAccessRule objects), but there seems to be no direct way to calculate effective permissions from that set of access rules. Note: I do not want to process effective permissions in my own code if at all possible, but am ready to do so as a last resort.

    Read the article

  • How to know when a file has finished copying

    - by Yigang Wu
    I'm using ReadDirectoryChangesW to watch a folder. If I copy a large file into the folder, I receive multiple FILE_ACTION_MODIFIED notifications; it seems each time Windows writes a large chunk of the file, you get a modified notification for it. I tried to use the CreateFile API to check whether the file can be opened by my application or not, but some files are always locked by another application. For example, if Outlook is open its PST file keeps being updated, but my application can't access it; we have to use Volume Shadow Copy to open it. So my question is: how do I know when a file has finished copying?
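
    For the plain copy case (not the always-open PST case, where only a shadow copy helps), the usual heuristic is the sketch below: the copy is done when the file can be opened with no sharing allowed, because CopyFile-style writers hold the destination open until they finish. This is a sketch meant to run in a retry loop with a timeout, not a guarantee:

        #include <windows.h>

        BOOL IsFileClosed(LPCWSTR path)
        {
            /* dwShareMode = 0: fail if anyone else still has the file open */
            HANDLE h = CreateFileW(path, GENERIC_READ, 0, NULL,
                                   OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, NULL);
            if (h == INVALID_HANDLE_VALUE)
                return FALSE;          /* ERROR_SHARING_VIOLATION: still being written */
            CloseHandle(h);
            return TRUE;
        }

    A convention-based alternative sidesteps the problem entirely when you control the producer: copy to name.tmp and rename to the final name when done; the rename shows up as a single event and only then does the watcher react.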

    Read the article

  • File I/O OS handling

    - by Albinoswordfish
    This isn't a direct coding question but more a question about OS handling mechanisms. I was reading somebody's earlier question regarding C# and file handling: apparently C# was throwing an exception saying a file was locked when the code tried to access it. So my question is: does C# use an internal lock to handle file I/O between processes, or does the OS apply some type of mutual exclusion for file I/O? From what I learned about operating systems, at least on Unix, the OS doesn't implement any mutual exclusion for processes trying to access the same file.
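
    On Windows the exclusion is enforced by the kernel, not by C#: a FileStream ultimately opens the file through CreateFile, whose sharing mode comes from the FileShare argument, and a conflicting open fails at the OS level. A two-line sketch of the collision:

        using (var first = new FileStream("data.bin", FileMode.Create,
                                          FileAccess.Write, FileShare.None))
        {
            // this second open is rejected by the OS sharing check and
            // surfaces in .NET as an IOException ("being used by another process")
            var second = new FileStream("data.bin", FileMode.Open, FileAccess.Read);
        }

    The Unix observation is right too: POSIX opens don't conflict by default, which is why this behaviour feels Windows-specific.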

    Read the article

  • Crossdomain file edit

    - by Misiur
    Hi there. I need to know where my script is being used from (it's for sale, and I don't want any thieves). I want to write the user's IP, the domain the script was run from, the date, etc. to a file on my server. I've tried fopen and fwrite, but is_writable returned that the file isn't writable. The file's permissions are 777, and its parent directory is 777 too. Now I'm trying something like this:

        <?php
        $file = 'http://www.misiur.com/security/seal.txt';
        $data = date("Y-m-d H:i:s");
        $ip   = $_SERVER['REMOTE_ADDR'];
        $svr  = $_SERVER['SERVER_NAME'];
        $str  = "[$data] Loaded by $ip at $svr\r\n";

        $current  = file_get_contents($file);
        $current .= $str;
        file_put_contents($file, $current);
        ?>

    However, nothing happens. What have I got to do?
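
    The catch: $file here is an HTTP URL, and PHP's http:// stream wrapper is read-only, so file_put_contents to it fails regardless of any chmod on the server. Since the sold script runs on customers' servers, the usual pattern is to phone home to a small endpoint that does the local write. A hedged sketch (log.php is a hypothetical script on your own server):

        <?php
        // in the sold script, on the customer's server: report usage
        @file_get_contents('http://www.misiur.com/security/log.php?host='
            . urlencode($_SERVER['SERVER_NAME']));

        // log.php, on your own server: append one line to a local file
        $line = sprintf("[%s] Loaded by %s at %s\n",
            date('Y-m-d H:i:s'),
            $_SERVER['REMOTE_ADDR'],
            isset($_GET['host']) ? $_GET['host'] : 'unknown');
        file_put_contents(__DIR__ . '/seal.txt', $line, FILE_APPEND | LOCK_EX);
        ?>

    (Anyone can of course spoof or block such a call; it's a tripwire, not real copy protection.)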

    Read the article

  • Increase file upload size limit in IIS 6

    - by JustFoo
    Is there any other place besides the metabase.xml file where the file upload size can be modified? I am currently running a staging server with IIS 6, set up to allow uploading files of up to 20 MB, and this works perfectly fine. I have a new production server where I am trying to set up the same limit. I edited the metabase.xml file and set the value to 20971520, then restarted IIS; that didn't work. I then restarted the entire server; that also didn't work. I can upload files around 2 MB, so it is definitely allowing file sizes larger than the standard 200 KB default, but when I try uploading a 5 MB file my upload.aspx page completely crashes. Is there something else I need to configure? The production server is located on a server farm; could there be limits set on their end? Thanks
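
    One detail worth checking, offered as a likely suspect rather than a certainty: the page is upload.aspx, and ASP.NET enforces its own request size cap independently of the metabase. Its default maxRequestLength is 4096 KB, which lines up neatly with 2 MB uploads working and 5 MB crashing. It is raised in web.config:

        <configuration>
          <system.web>
            <!-- value is in KB: 20480 KB = 20 MB -->
            <httpRuntime maxRequestLength="20480" executionTimeout="360" />
          </system.web>
        </configuration>

    The metabase's AspMaxRequestEntityAllowed setting only governs classic ASP pages, which may be why editing metabase.xml helped on one server and not the other; the staging box likely already has the web.config change.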

    Read the article
