Search Results

Search found 38393 results on 1536 pages for 'war files'.

Page 91 of 1536

  • Awk filtering values between two files when regions intersect (any solutions welcome)

    - by user964689
    This builds on an earlier question, "Awk conditional filter one file based on another (or other solutions)". I have an awk program that outputs a column from rows in a text file, refGene.txt, if values in that row match 2 out of 3 values in another text file. I now need an additional criterion for finding a match between the two files: a row should also count as a match if the range spanned by the two numerical values in a row of file 1 overlaps the range of the two values in a row of refGene.txt.

    Two example lines in file 1:

        chr1 10 20
        chr2 10 20

    and an example line in file 2 (refGene.txt), showing the matching columns ($3, $5, $6):

        chr1 5 30

    Currently the awk program does not treat this as a match, because although the first column matches, neither the 2nd nor the 3rd column does. I would like it to count as a match, because the region 10-20 in file 1 is WITHIN the range 5-30 in refGene.txt. The second line in file 1 should NOT match, however, because its first column does not match, which is still required. A way to include cases where the range in file 1 overlaps the range in refGene.txt in any way would be really helpful; it could replace the conditional below entirely, since it also covers all the cases the current code finds. Please let me know if my question is unclear. Any help is really appreciated, thanks in advance! (Solutions do not have to be in awk.) Rubal

        FILES=/files/*txt
        for f in $FILES ; do
          awk '
            BEGIN { FS = "\t"; }
            FILENAME == ARGV[1] { pair[ $1, $2, $3 ] = 1; next; }
            { if ( pair[ $3, $5, $6 ] == 1 ) { print $13; } }
          ' $(basename $f) /files/refGene.txt > /files/results/$(basename $f)
        done
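    The asker says non-awk answers are welcome, so here is a minimal sketch of the same filter in Python with the overlap test added. The file layout is assumed from the awk program above: whitespace-separated "chrom start end" rows in file 1, and tab-separated refGene.txt rows where columns $3, $5, $6 hold chrom, start, end and $13 is the column to print.

        #!/usr/bin/env python
        # For each refGene.txt row, print column 13 when some row of file 1
        # has the same chromosome and an overlapping [start, end] range.
        import sys
        from collections import defaultdict

        def load_regions(path):
            regions = defaultdict(list)          # chrom -> [(start, end), ...]
            with open(path) as fh:
                for line in fh:
                    fields = line.split()
                    if len(fields) >= 3:
                        regions[fields[0]].append((int(fields[1]), int(fields[2])))
            return regions

        def main(file1, refgene):
            regions = load_regions(file1)
            with open(refgene) as fh:
                for line in fh:
                    f = line.rstrip("\n").split("\t")
                    chrom, start, end = f[2], int(f[4]), int(f[5])   # awk $3, $5, $6
                    for s, e in regions.get(chrom, []):
                        if start <= e and s <= end:                  # ranges overlap
                            print(f[12])                             # awk $13
                            break

        if __name__ == "__main__":
            main(sys.argv[1], sys.argv[2])

    Two intervals [a, b] and [c, d] overlap exactly when a <= d and c <= b, which also covers the case where one range lies entirely within the other.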

    Read the article

  • In IIS6, how to provide authenticated access to static files on remote server

    - by frankadelic
    We have a library of ZIP files that we would like to make available for download on an ASP.NET site. The files sit on a NAS device that is accessible from our web farm. Our initial strategy:

        - Map an IIS virtual directory to the shared drive at the path /zipfiles.
        - Users can download the zip files when given the URL.

    However, if users share links to the files, anyone can download them. We would instead like to use the ASP.NET forms authentication in our site to validate users' requests before initiating the file transfer. A few problems:

        - A request for a zip file is handled by IIS, not ASP.NET, so it is not subject to forms authentication.
        - We don't want ASP.NET to handle the request anyway, because that uses up an ASP.NET thread and does not scale for downloads of large files. So configuring the asp.net dll to handle *.zip requests is not an option.

    Any ideas on this? One idea we've tossed around: the initial download request goes to an ashx handler. After authentication, the handler generates a download token which is saved to a database. The user is then redirected to the file with the token appended in the QueryString (e.g. /files/xyz.zip?token=123456789), and an ISAPI plugin checks the token. The token would also expire after a certain amount of time.

    Any thoughts on this? I have not implemented an ISAPI plugin, so I'm not sure this will even work. I would like to avoid custom coding, since security is an issue and I'd prefer a time-tested solution.
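    The token scheme described above is language-agnostic, so purely as an illustration, here is a minimal Python sketch of a signed, expiring download token. The shared secret, the token format, and the 5-minute lifetime are assumptions made for the example; the real handler and filter would be written for IIS, and a database-backed token as the asker describes would work just as well.

        import hashlib, hmac, time

        SECRET = b"shared-secret-known-to-handler-and-filter"

        def make_token(path, lifetime_seconds=300):
            """Issued by the authenticated download handler."""
            expires = int(time.time()) + lifetime_seconds
            msg = "{}|{}".format(path, expires).encode()
            sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return "{}-{}".format(expires, sig)

        def check_token(path, token):
            """Checked by the filter before the static file is served."""
            try:
                expires, sig = token.split("-", 1)
            except ValueError:
                return False
            if not expires.isdigit() or int(expires) < time.time():
                return False                    # malformed or expired token
            msg = "{}|{}".format(path, expires).encode()
            expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return hmac.compare_digest(sig, expected)

    The handler would redirect to something like /files/xyz.zip?token=<make_token(...)>, and the filter would reject the request unless check_token() returns True. Signing avoids a database round trip, at the cost of not being able to revoke a token before it expires.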

    Read the article

  • How to maintain base files for development environment central while allowing people to change their

    - by Ittai
    Hi, what I'd like to do is keep files in a central location so that when I add people to my development team they can see the base version of these files, while still letting each team member work with their own local version. I know I could just put the files in source control (we use TortoiseSVN) and have the team change their local copies, but I'd rather not: the exclamation mark signaling that a file has been changed and needs to be committed, quite frankly, irritates me greatly. Two examples of what I mean:

        - We use quite a few build.xml files which refer to a single properties file containing many definitions. Some of them can differ between team members (mainly temporary working directories), and I'd like a new team member to be able to get the properties file with the base config but change it if they wish.
        - Keep the Eclipse settings files in SVN, so that when a new team member joins they can just retrieve the files from the server and have a base system running. If they wish, they can change some of these settings.

    Thanks, Ittai

    Read the article

  • Python: slow read & write for millions of small files

    - by Jami
    I am building a directory tree which has tons of subdirectories and files. The total directory count is somewhere around 256^32 subdirectories, with 256 files at each end, each only a few bytes long. I did this so I would have fast access to these files (since I'm not searching, I'm just directly accessing them via a known file path). I have a Python script that builds this file tree and reads and writes those files. The problem is that once I reach more than 1 GB of total file size, the read and write methods become extremely slow.

    Here's the function I have that reads the contents of a file (the file contains an integer string), adds a certain number to it, then writes it back to the original file:

        import shutil

        def addInFile(path, scoreToAdd):
            num = scoreToAdd
            try:
                shutil.copyfile(path, '/tmp/tmp.txt')
                fp = open('/tmp/tmp.txt', 'r')
                num += int(fp.readlines()[0])
                fp.close()
            except:
                pass
            fp = open('/tmp/tmp.txt', 'w')
            fp.write(str(num))
            fp.close()
            shutil.copyfile('/tmp/tmp.txt', path)

    I previously tried doing this with Linux console commands, but it was slower. I copy the file to a temporary file first, then access/modify it, then copy it back, because I found this faster than accessing the file directly. I think the cause of the slowdown is that there are tons of files: performing this function 1000 times sometimes takes a minute now, whereas before, when there were only a few files, 1000 calls took less than a second. How do you suggest I fix this?
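    For comparison, here is a minimal sketch of the same read-add-write cycle done in place, without the round trip through /tmp. Whether it helps depends on whether the extra copies or the sheer number of directory lookups dominate; it only illustrates removing the temporary file, it is not a measured fix.

        import os

        def add_in_file(path, score_to_add):
            num = score_to_add
            try:
                with open(path, 'r') as fp:
                    num += int(fp.read().strip() or 0)
            except (IOError, OSError, ValueError):
                pass                      # missing or unreadable file counts as 0
            with open(path, 'w') as fp:
                fp.write(str(num))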

    Read the article

  • Reading from a database located in the Program Files folder using ODBC

    - by Dabblernl
    We have an application that stores its database files in a subfolder of the Program Files directory. These files are redirected to the VirtualStore in Vista and Windows 7. We present data from the database using Microsoft DataReports (VB6). So far so good.

    We now want to use Crystal Reports XI to present data from the database. Our idea is NOT to pass this data to CR from our program, but to have CR retrieve it from the database itself, using a system DSN through ODBC. That way we hope to give our users more flexibility in designing their own reports. What we do want to ensure, though, is that these system DSNs are configured correctly when the user installs our program or when the program calls the Crystal Report.

    Is there a smart way to do this, using system variables for instance, instead of having to write a routine that checks the OS version, whether UAC is enabled, whether the write restrictions on the Program Files folder have been lifted, etc., and then points the system DSN to either the C:\Program Files\OurApp\Data folder or the C:\Users\User\AppData\VirtualStore\Program Files\OurApp\Data folder? Suggestions for an entirely different approach are welcome too!
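    The check the question would rather avoid hand-coding boils down to picking one of the two folders. A sketch in Python, only to make the two candidate paths concrete (the real application is VB6, and the folder names follow the question):

        import os

        def data_folder(app_subpath=os.path.join('OurApp', 'Data')):
            program_files = os.environ.get('ProgramFiles', r'C:\Program Files')
            local_appdata = os.environ.get('LOCALAPPDATA', '')
            virtual = os.path.join(local_appdata, 'VirtualStore',
                                   'Program Files', app_subpath)
            real = os.path.join(program_files, app_subpath)
            # Prefer the VirtualStore copy when Windows has redirected writes there.
            return virtual if os.path.isdir(virtual) else real

    The chosen folder would then be written into the system DSN that the Crystal Report uses.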

    Read the article

  • TortoiseSVN lists files as modified, but they are identical

    - by BJ Safdie
    I am merging a hot fix from our QA branch back into our Dev branch. Five files have changed. I do a fresh checkout of the Dev branch, then a merge (range of revisions) from QA into the Dev working copy. It brings in five files, and there is a conflict on an external and ignore property, which I resolve by "using local" (dev).

    When I check modifications or commit, I expect to see the five files I merged as the only changes. However, close to 700 "modified" files show up in the commit dialog. If I select one of these files and "Compare with base," WinMerge comes up and says the files are identical. I have tried this with the file dates set to "last committed" and not. Why are all of these files showing up as modified when they are identical? What in the merge is causing this? How do I prevent SVN/TortoiseSVN from getting confused this way in the future?

    Read the article

  • How do I enable mod_deflate for PHP files?

    - by DM.
    I have a Liquid Web VPS account, and I've made sure that mod_deflate is installed and active. I used to gzip my CSS and JS files via PHP, as well as my PHP files themselves. I'm now trying to do this via mod_deflate instead, and it seems to work fine for all files except PHP files (txt files work fine, as do CSS, JS, and static HTML files; just nothing that is generated by a PHP file). How do I fix this?

    I used the "Compress all content" option under "Optimize Website" in cPanel, which creates an .htaccess file in the home directory (not public_html, one level higher than that) with exactly the same text as the "compress everything except images" example at http://httpd.apache.org/docs/2.0/mod/mod_deflate.html.

    The .htaccess file:

        <IfModule mod_deflate.c>
          SetOutputFilter DEFLATE
          <IfModule mod_setenvif.c>
            # Netscape 4.x has some problems...
            BrowserMatch ^Mozilla/4 gzip-only-text/html

            # Netscape 4.06-4.08 have some more problems
            BrowserMatch ^Mozilla/4\.0[678] no-gzip

            # MSIE masquerades as Netscape, but it is fine
            # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html

            # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
            # the above regex won't work. You can use the following
            # workaround to get the desired effect:
            BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html

            # Don't compress images
            SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
          </IfModule>

          <IfModule mod_headers.c>
            # Make sure proxies don't deliver the wrong content
            Header append Vary User-Agent env=!dont-vary
          </IfModule>
        </IfModule>

    Read the article

  • Read header data from files on remote server

    - by rejeep
    Hi! I'm working on a project where I need to read header data from files on remote servers. These are many, large files, so I can't read whole files, just the header data I need. The only solution I have so far is to mount the remote server with FUSE and then read the headers from the files as if they were on my local computer. I've tried it and it works, but it has some drawbacks, especially with FTP (comparing FTP via curlftpfs against SSH):

        - Really slow: from the same server, 90 files were read in 18 seconds over SSH, versus 10 files in 39 seconds over FTP.
        - Not dependable: sometimes the mount point will not unmount. If the server is active and a passive mount is done, the mount point and its parent folder get locked for about 3 minutes.
        - It times out, even while data is being transferred (I guess this is the FTP protocol and not curlftpfs).

    FUSE is a solution, but I don't like it very much because I don't feel I can trust it. So my question is basically whether there are other solutions to the problem. The language is preferably Ruby, but any other will work if Ruby does not support the solution. Thanks!
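    One alternative to mounting is to open each remote file over SFTP and read only the first few hundred bytes. The asker prefers Ruby, and the same idea translates directly; to keep the sketch concrete it is shown here in Python using paramiko, which is an assumption (any SFTP client that can open a remote file and read N bytes would do).

        import paramiko

        def read_headers(host, user, paths, header_bytes=512):
            """Fetch only the first header_bytes of each remote file over SFTP."""
            headers = {}
            client = paramiko.SSHClient()
            client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
            client.connect(host, username=user)
            try:
                sftp = client.open_sftp()
                for path in paths:
                    fh = sftp.open(path, 'rb')
                    try:
                        headers[path] = fh.read(header_bytes)   # header only
                    finally:
                        fh.close()
            finally:
                client.close()
            return headers

    For files served over HTTP, the same effect comes from a Range request; plain FTP has no equally clean equivalent, which matches the FTP drawbacks listed above.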

    Read the article

  • Trying to move files with specific file names from root directory to a subfolder

    - by Justin Reagan
    Hi, I'm still pretty new to PowerShell, so I apologize if I'm asking something that's extremely basic. I have a root directory on a TFTP server that pulls down config files from routers and other equipment every night. The file names look like IPaddress_YYYYMMDD_TA5000. There is a limitation in the equipment: the files can't be made to move out of the root directory on their own. What I want to do is write a PowerShell script that moves only the files with TA5000 in the filename to the subdirectory, and keeps only the 5 most recent files. I looked but couldn't find what I would need to match the filename against that specific string. I already have the portion of the script that deletes files based on age; that part was simple. Any help on getting started would be appreciated.

    Edit: I forgot to post the code I was trying:

        Move-Item c:\tftptransferfiles c:\tftptransferfiles\sca | Where-Object {_.name -like "*TA5000*"}

    I keep getting an error saying that the item at C:\tftptransferfiles is in use.
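    The question is about PowerShell, but purely to pin down the two steps being asked for (filter-and-move, then prune to the five newest), here is a language-neutral sketch in Python. The paths and the "sca" subfolder name follow the question; everything else is illustration.

        import os, shutil

        root = r'c:\tftptransferfiles'
        dest = os.path.join(root, 'sca')
        if not os.path.isdir(dest):
            os.makedirs(dest)

        # Step 1: move only files whose name contains TA5000 into the subfolder.
        for name in os.listdir(root):
            src = os.path.join(root, name)
            if os.path.isfile(src) and 'TA5000' in name:
                shutil.move(src, os.path.join(dest, name))

        # Step 2: keep only the five most recently modified files there.
        files = sorted((os.path.join(dest, n) for n in os.listdir(dest)),
                       key=os.path.getmtime, reverse=True)
        for old in files[5:]:
            os.remove(old)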

    Read the article

  • Trouble using files Globally

    - by Nightforce2
    Recently I ran into trouble when I discovered that Vista restricts what can be installed into the system32 directory, even though I am the administrator of this computer. It will not let me register DLL files so that I can use programs like wget globally, the way programs such as nslookup are used. It keeps giving me this error:

        Regsvr32: The module "C:\Windows\System32\libeay32.dll" failed to load. Make sure the binary is stored at the specified path or debug it to check for problems with the binary or dependent .DLL files. The specified module could not be found.

    Moving the required DLL files to system32 prompts me to confirm that administrator privileges are needed, so I give permission, copy the files to system32, and run wget to confirm. That is where it tells me it cannot find the DLLs it needs, and when I use regsvr32 it says it cannot find the entry point and will not load the DLL, asking me whether it is a valid DLL or OCX file. If I leave the DLLs that came with wget in the same folder as wget, outside of system32, everything works; with them added to system32 alongside the exe, it does not, saying it cannot read those DLL files. Is there a way around this, or do I need to upgrade to Windows 7 to get away from these problems/restrictions?

    Read the article

  • Windows 7 search not finding files

    - by Rob Nicholson
    Can anyone please explain this quirk in Windows 7 search (not a big fan of it; I preferred the XP method, or at least would like both)? With Outlook, you sometimes have to find and delete your OST file, which resides in the user's profile folder. How come searching the entire C: drive for *.ost files works (they are somewhere under c:\Users\rob.nicholson\appdata), but starting the search from c:\Users\rob.nicholson fails to find the files? Cheers, Rob.

    Read the article

  • copy windows registry and/or other locked files

    - by karolrvn
    Hi. While improving my (personal) backup system, I noticed that I cannot copy certain locked files, like the Windows registry files. Is there a way to copy such things? Or a specific solution for the registry? (I know of the regedit File > Export "solution", but that exports to text format and seems slow.) AFAIK, on Linux file locking is advisory while on Windows it is mandatory. Can I somehow bypass the mandatory locking for backup purposes? TIA.

    Read the article

  • How to rip dvd and arrange files

    - by user23950
    I'm ripping a DVD with DVD Shrink. There are lots of anime episodes on it, and when I open the .vob files it seems the episodes are not arranged one per .vob file: a single .vob file holds multiple episodes, but they are not in order as ep 1, ep 2, ep 3, etc. Is it possible to edit those files so that the episodes are arranged in order?

    Read the article

  • Git on DreamHost still balking on big files even after I compiled with NO_MMAP=1

    - by fuzzy lollipop
    I compiled Git 1.7.0.3 on DreamHost with the NO_MMAP=1 option, and I also supplied that option when I ran "make NO_MMAP=1 install". My paths are set up correctly: "which git" reports my ~/bin dir, which is correct, and "git --version" returns the correct version. But when I try to do a "git push origin master" with "big" files (~150 MB), it always fails. Does anyone have any suggestions on how to get DreamHost to accept these "big" files from a git push?

    Read the article

  • Removing duplicate files, keeping only the newest file

    - by pinkie_d_pie_0228
    I'm trying to clean up a photo dump folder, in which several files are duplicated but with different filenames or lost in subfolders. I've looked at tools like rmlint, duff and fdupes, but I can't seem to find a way to have them keep only the file with the most recent timestamp. I suspect I have to postprocess the results, but I don't even know where to start to do this. Can anyone guide me on how to get the duplicate files list and delete everything but the newest file?
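    A minimal sketch of the post-processing idea, assuming content-identical files count as duplicates: hash every file under the folder, group by hash, keep the file with the newest modification time in each group, and remove the rest. It stands in for piping a duplicate-finder's output through a script; the --delete flag is an assumption for the example, and leaving it off previews what would be removed.

        import hashlib, os, sys

        def file_hash(path, chunk=1 << 20):
            h = hashlib.sha256()
            with open(path, 'rb') as fh:
                for block in iter(lambda: fh.read(chunk), b''):
                    h.update(block)
            return h.hexdigest()

        def dedupe(root, delete=False):
            groups = {}
            for dirpath, _, names in os.walk(root):
                for name in names:
                    path = os.path.join(dirpath, name)
                    groups.setdefault(file_hash(path), []).append(path)
            for paths in groups.values():
                if len(paths) < 2:
                    continue
                paths.sort(key=os.path.getmtime, reverse=True)
                keep, extras = paths[0], paths[1:]       # newest file is kept
                for extra in extras:
                    print('duplicate of %s: %s' % (keep, extra))
                    if delete:
                        os.remove(extra)

        if __name__ == '__main__':
            dedupe(sys.argv[1], delete='--delete' in sys.argv)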

    Read the article

  • Cleaning up temp files in Mac OS X

    - by deddebme
    I was a Windows person for more than 10 years. Around 4 months ago I switched to the Mac, and I have never looked back. But one thing bothers me: my Mac partition is slowly and steadily losing free space. I am pretty sure there are a lot of orphaned temporary files lying around on the volume. I know where to find the obsolete temp files on my Windows partition; where do I look in Mac OS X?

    Read the article

  • Aptana show .htaccess files

    - by pygorex1
    I'm using the new Aptana Studio 2.0 and loving it. I'm trying to open and edit .htaccess files in Aptana Studio, but they don't show up in the PHP Explorer pane, which normally shows all files; everything appears except .htaccess. How do I make these files visible?

    Read the article

  • Batch Scripting - Listing files with a specific amount of characters in file name

    - by Jane
    I'm creating a batch script for a class and I've hit a roadblock. I have to list all text files whose names are up to seven characters long on the whole C: drive, make the listing output in a wide format, then append the output to a batch script output file. So far I have:

        dir c:\*.txt /w /o /s /p >> "c:\My Batch Script File Assigment\Output\Batch Script File Output Data.txt"

    This does everything except limit the search to files with only 1-7 characters in their name. If anyone could point me in the right direction I would really appreciate it!
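    The missing piece is the name-length test, which dir's wildcards cannot express on their own. The assignment asks for batch, so the following Python sketch is only there to make the filter precise; it assumes "name" means the base name without the .txt extension, which is an interpretation, not something the question states.

        import os

        def short_named_txt_files(root='C:\\'):
            """Yield .txt files whose base name (without extension) is 1-7 characters."""
            for dirpath, _, names in os.walk(root):
                for name in names:
                    base, ext = os.path.splitext(name)
                    if ext.lower() == '.txt' and 1 <= len(base) <= 7:
                        yield os.path.join(dirpath, name)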

    Read the article
